Unresolved: Information Disorder | Open to Debate
November 18, 2022
Feature Debate Topic

Should tech companies moderate misinformation that their users post?

Does America need a governing body to regulate disinformation?

Was Twitter right to ban Donald Trump?

The age of “information disorder” is upon us. Deep fakes, false political narratives, and flawed COVID rumors are all rampant online, threatening America’s national security, as well as democracy itself. Though bad actors have always had the capacity to deceive, the ease, speed, and degree to which anyone can create misleading information has engendered a dangerous new world. And yet many solutions can also run directly against longstanding western principles, such as free speech and a lack of censorship. Prescriptions, some argue, can be as dangerous as the disorder itself. So, what can be done? In partnership with the Homeland Security Experts Group, Intelligence Squared U.S. debates how to combat this dangerous new phenomenon, termed “information disorder.” Our expert panel takes a look at what the private sector should do, what the public sector can do, and how political actors who spread false information should be handled.

This debate took place in front of a live audience, at the 2022 Homeland Security Enterprise Forum, on October 25, 2022. The debate is presented and produced in partnership with the Homeland Security Experts Group.

12:00 PM Friday, November 18, 2022

  • 00:00:00

    [music playing]

    John Donvan:

    Welcome, everybody, to an Intelligence Squared program that we are calling Unresolved: Information Disorder because as we all know, we are living in a time that is uniquely exposed to the perils of disinformation, which certainly makes it timely to hold a debate asking what should be done about it. And because at times, disinformation is a national security threat, we are delighted to be holding this debate in partnership with the Homeland Security Experts Group. A whole audience of experts, that is really something. And of course, we have the experts on our stage.

    In this debate, we are going to be working through three questions one at a time. And what we would like to do now is to ask you your opinion on these questions. The three questions we’re going to be looking at are, should tech companies moderate misinformation that their users post? Does America need a governing body to regulate disinformation? And finally, was Twitter right to ban Donald Trump?

  • 00:01:00

    I want to say now let’s welcome our debaters to the stage. First, former United States Secretary of Homeland Security and co-author of the Patriot Act, Michael Chertoff.

    [applause]

    Next, internationally recognized expert on disinformation and democratization and author of “How to Lose the Information War,” Nina Jankowicz.

    [applause]

    Now please welcome principal at Cornerstone Government Affairs, visiting fellow at the National Security Institute, former staff member on the House Committee on Homeland Security, and Georgetown University professor, Charles Carithers.

    [applause]

    And finally, former assistant secretary for policy at the U.S. Department of Homeland Security under President George W. Bush, Stewart Baker.

    [applause]

  • 00:02:00

    So let’s get to the debate. We’re going to go through three questions, and our first of three questions is this. Should tech companies moderate misinformation that their users post? Let’s see where each of you is going to argue on this. On that question, Michael, are you a yes or a no?

    Michael Chertoff:

    I’m a yes.

    Robert Rosenkranz:

    Nina?

    Nina Jankowicz:

    I’m also a yes.

    Robert Rosenkranz:

    Charles?

    Charles Carithers:

    I’m a yes.

    Robert Rosenkranz:

    And Stewart?

    Stewart Baker:

    With an asterisk, I’m a no.

    [laughter]

    Robert Rosenkranz:

    So, let’s get to hearing the arguments. On the question should tech companies moderate misinformation that their users post, Michael, you are a yes. Tell us why.

    Michael Chertoff:

    So first of all, again, the important thing is the question doesn’t say they should be required. It simply says they should be allowed to. And that is really the function of anybody who edits any kind of publication. They have the right, protected by the First Amendment, to decide what gets published and what doesn’t get published and to curate accordingly. We don’t make them do it, but we don’t prevent them from doing it.

  • 00:03:00

    In this case, responsible social media platforms should evaluate whether something really is misinformation or disinformation. They’ve got to use their editorial judgment, just like every editor of every news organization does. And if they believe it’s false and misleading and even harmful, particularly, then they should take it down because that is part of their responsibility to their users and their customers. Finally, it’s important to have terms of service that lay out the parameters of what is permitted and what’s not permitted so the people who actually use the site are warned in advance that if they step into a zone that is inappropriate, they will be taken down.

    Robert Rosenkranz:

    Thank you, Michael Chertoff. Nina Jankowicz, you’re next. You’re also a yes on the question.

    Nina Jankowicz:

    I am. You know, disinformation and misinformation have affected the functioning of our democracy. They’ve affected public health and they are affecting public safety.

  • 00:03:59

    Just a couple of weeks ago in Michigan, a man who was radicalized by the QAnon conspiracy theory killed his wife and the family dog and injured his daughter. This has very real-world consequences, and I think we like to think of the things that happen online as staying online. But in reality, disinformation is causing offline consequences. And as Secretary Chertoff said, tech companies are private entities. They all moderate to some degree. They all have terms of service that we sign up to when we log on to share pictures of our kids or our dogs and cats. You know, we sign up to be moderated by them. Even Truth Social moderates. Despite saying otherwise, they’ve been found to actually be taking down some posts that are critical of Pres. Trump, so I think we should all note that.

  • 00:04:44

    And I think it’s also important to note that equal and transparent enforcement of those rules, which the tech companies to this point have not practiced, would really reduce the uproar about moderation on those platforms. So I’d like to see all of that. And, you know, my final point is that content moderation about dis- and misinformation doesn’t have to equal removal of speech. We can put friction on dis-informative posts, reducing their amplification. Freedom of speech does not necessarily mean freedom of reach, right? So we’re not talking about removing speech. Again, there are a lot of other elements to content moderation that we can be talking about, and it is incumbent on the social media platforms to be moderating the speech that users are posting.

    John Donvan:

    Thank you, Nina. Now, Charles, you were going to be next, but given the lineup here, in the interest of challenging monotony, I’m going to have Stewart jump in and then come to you. You good with that?

    Charles Carithers:

    Sure. Sounds good.

    John Donvan:

    So, Stewart, you are a no on this question.

    Stewart Baker:

    I’m a no. Obviously, there is and must be some form of content moderation, and everybody does it, including Truth Social. But the content moderation system we have now is basically four companies that tell us what we can say to our friends, to our family, to our coworkers.

  • 00:06:04

    They decide and they say, as Mike Chertoff said, not only is it something we choose to do, but it’s our First Amendment right to tell you what you cannot say. What the hell is that? This is the First Amendment to censor? That’s the right we’re talking about? And the idea that these four companies, right, Twitter, YouTube, Google, Facebook and TikTok, whose parent company’s CEO apologized to the Chinese Communist Party for not doing a good enough job of censoring the views of people who disagreed with the Chinese Communist Party. Those are the four companies that will tell us what we can say. No.

    John Donvan:

    Thank you, Stewart.

    [applause]

  • 00:06:54

    All right, Charles, it turns to you. You’re a yes.

    Charles Carithers:

    Tech companies have a responsibility not only to their users, but to the nation, to promulgate information that is healthy and truthful; failing to do so severely impacts not only our national security, but the health and safety of U.S. citizens. So for example, we saw with the COVID-19 pandemic that misinformation ran rampant on various social media sites. Certain ones did a better job than others in taking down that misinformation and probably saved a lot of lives. On others, which have a more, you know, political agenda, think 4chan, that content stayed up.

  • 00:07:48

    Also, I would like to say that there’s an economic incentive for these social media companies to, you know, do really well in content moderation. What I mean by that is this. If you have more users who trust your platform, who think the information that you’re promulgating is true, then more than likely, you’re going to have greater ad buys, right? So, it’s within their own self-interest to self-moderate and to self-moderate well.

    John Donvan:

    Thank you, Charles. So, let’s chat about this. Michael, I want to take it back to you since you were our first speaker and bring to you the main contrary point that came from the sole no vote in this conversation. It’s a no with an asterisk, but the no is based on the fact that the content companies would be given enormous power, that they would be acting as a government in their own right, without checks on what they’re doing. So, can you respond to that?

    Michael Chertoff:

    Sure. First of all, let me say this. There are issues about antitrust and whether we have too much of a monopoly by certain companies, which have much broader implications than just moderation. It has to do with commercial advertising, the power over companies that use the media to reach people, and that’s a discussion for another debate.

  • 00:09:06

    But the reality is, it’s like a popular newspaper or a popular channel, let’s say Fox News. I use that as an example. They don’t have an obligation to air whatever somebody says, I want to have my voice heard. They may choose to do that to balance, but they may choose not to do it, and the First Amendment does protect the right of a speaker to curate what his platform is used to speak on.

    Some of you will remember there used to be cases involving license plates. I think there was one, “Live Free or Die,” and some people said, “I don’t agree with that. I want to put a piece of tape over the slogan.” And the courts upheld that under the First Amendment. You can’t be required to propagate a position you disagree with or you know to be false. Now, obviously, the speaker has a right to start his own platform or to go to another platform.

  • 00:09:59

    And as Charles pointed out, there are a number of different platforms out there, some of which are very sketchy. But if you want to put yourself out there, it’ll reach an audience. But the key here is for the companies to be able to exercise their right to control what their platforms are being used for, and if we’re concerned about too much market power, we want to deal with that as a distinct issue.

    John Donvan:

    All right, Stewart, you’re going to get a chance to do more talking as a result of your position, because I would like you to respond to what Michael had to say.

    Stewart Baker:

    Yeah, with respect, that’s not the world we live in. To say let’s not talk about the fact that there’s only four companies that tell us what we can say because that’s an antitrust issue, and we’ll talk about that some other day. The question is, should we allow them to do this? And to say it’s just like Fox News and anybody who wants to walk in is not allowed to broadcast on Fox News, what, is there a shortage of electrons that I didn’t hear about?

  • 00:10:55

    All of this stuff can be transmitted, and they’re not doing it because they don’t have enough bandwidth. They’re doing it because they have chosen certain people that they don’t want to hear speak, and they don’t want us to hear them speak. And I do not think that that is a power that should be conveyed to just four private-sector companies, one of which is not even an American company.

    John Donvan:

    Nina?

    Nina Jankowicz:

    Yeah, I think — and I’m curious to see if Charles would agree with me here. As a woman, I would say that there is definitely not enough moderation happening on the internet. People can say whatever they want. And frankly, when they say it to women or people of color, there’s very little moderation happening in my own experience. I’ve probably sent thousands and thousands of reports to the companies for things that expressly violate terms of service.

    John Donvan:

    Literally thousands and thousands?

    Nina Jankowicz:

    Literally thousands of reports. I don’t know if anybody in this room knows I was recently the subject of a disinformation campaign and hate campaign myself. Literally thousands of reports have been sent to the companies on my behalf.

  • 00:11:59

    And I would say fewer than, I don’t know, 200 have been actioned for things that were violently threatening me and my family, things that were lies about the work that I was meant to do at the Department of Homeland Security, and, you know, again, things that were expressly violative of those terms of service. I don’t think the companies are doing enough.

    [music playing]

    John Donvan:

    More from Intelligence Squared U.S. when we return.

    [music playing]

    Welcome back to Intelligence Squared U.S. Let’s get back to our debate.

    Charles Carithers:

    If we did not allow social media or tech companies to self-moderate, then what would things look like? Every single election season, African Americans are disproportionately targeted by outside actors trying to influence us not to vote, not to participate in the electoral process. That’s shameful, and it happens every two years.

  • 00:13:01

    Thankfully, because of content moderation, some of those posts are taken down. Some are still up, but some are taken down. We can’t allow these social media companies not to moderate. They have to do this because failing to do so goes against our grain of preserving our democracy and our electoral process, and it disenfranchises certain groups of American citizens.

    Nina Jankowicz:

    Can I just add one two-finger there?

    John Donvan:

    Please do.

    Nina Jankowicz:

    Often, the speech that is most moderated, and this is backed up by research, is not that of the majority. It is the speech of people of color. It is the speech of women and those who are not able to take their voice and speak out. It’s those people who are most moderated online, not the folks who are actually breaking the rules.

  • 00:13:52

    John Donvan:

    In about four minutes, I’m going to come to the audience for questions, so please, if you’d like to take part in that, get ready for that. Stewart, all three of the debaters who are taking the opposite side of the question have used the word responsibility. And we’ve now just heard two examples of what that responsibility would be. Now, you argued that the enormity of these companies, to some degree, concerns you about their having this role. But the enormity of these companies would seem to amplify their responsibility as well. So I’d like you to take on your perception of what the responsibility of these companies is to their users, and actually, to the culture.

    Stewart Baker:

    So that takes me to the asterisk that I mentioned at the beginning. You know, we take it for granted that this is a brand-new problem. It turns out that when the telegraph was invented, and Western Union became our telegraph service provider, they said the same thing that the platforms are saying today. “Hey, this is our platform. You want to send messages about strikes, you’re not going to be allowed on the platform. If you are the Associated Press, you can send your stories because you give us good coverage. If you’re United Press International, we don’t like your stories. We’re not going to carry your stories over the telegraph.”

  • 00:15:03

    And the outrage from people in the benighted 19th century was loud and strong, and because they had a monopoly, a natural monopoly, Western Union was turned into a common carrier, which meant that they had to have nondiscriminatory rules about what they were going to do. That was their responsibility. They had to make it clear what their rules were. They had to carry everybody subject to nondiscriminatory rules that could be enforced by oversight and regulators. We are not in that world now.

    John Donvan:

    Michael?

    Michael Chertoff:

    So here’s the problem with that. Stewart’s confusing two things. There are common carriers, like the telephone company and Western Union, and they do have an obligation. They have to be — they don’t even have visibility, actually, into the content of what they transmit necessarily. They have to open it to all comers, good, bad, or indifferent. But that’s not what the platforms are. The platforms are content providers. In fact, their business model involves providing content that will stimulate people to buy things so they can get advertising revenue.

  • 00:16:05

    In that sense, they’re like a newspaper or like a television show. People do not get to simply say, I want to be on your television show, therefore you have to let me say whatever I want to say. So I think in this case, responsibility means — you know, there are more than four platforms to post things on. There are four big ones, but there are a lot of other ones. The key is to give them the power they have under the Constitution, the First Amendment, to determine what they want to put out there, and particularly, to take down things that might be violent, that might cause harm, like drink Clorox to deal with COVID, or that might be simply grossly defamatory.

    Stewart Baker:

    But didn’t you say that they had a First Amendment right to take down any damn thing they wanted to?

    Michael Chertoff:

    Yeah. That’s called the First Amendment.

    Stewart Baker:

    So, it’s not misinformation, not special stuff. It’s whatever they choose.

  • 00:16:57

    John Donvan:

    Okay, I’d like to go to questions. Does anybody have a question they would like to jump into the conversation?

    John O’Connor:
    My name is John O’Connor. We’ve mentioned the word responsibility, but the word liability hasn’t come up. Nina rightfully points out some degree of inconsistency, some might say whimsy, with respect to the application of rules. Do any of you believe there should be more liability, in a bright-line sense, than just this amorphous responsibility, question mark, question mark?

    John Donvan:

    That was a model question. I congratulate you. Thank you. I’m going to record that and play it at all of our debates. That was terrific. Who would like to take that question on?

    Nina Jankowicz:

    Well, this is a whole can of worms, right, because what we’re getting at here is Section 230 of the Communications Decency Act, and we can have an entire debate about that. I come down somewhere in the middle. I think Section 230, which says that the platforms are not liable for the content that their users post, is not adequate for the internet age. However, in places like Germany, where there is a law that makes the companies liable if they do not remove illegal content within a certain period of time, we’ve seen the over-removal of speech.

  • 00:18:08

    And so, my answer would be we need something in between, right? We don’t want to encourage the companies to just be flatly removing speech because, as I said before, that often affects women and minorities more than it affects the majorities of countries. But as I also said, this is not just about removing speech. There are in-betweens, adding friction, reducing amplification. You can put that out there. You have the right to make that Facebook post, to put out that tweet, but you don’t necessarily have the right to go viral or to have, you know, millions and millions of people feeding off of that, or to make money off of it, which frankly, a lot of disinformants do, make money off of the lies that they are putting out there, as we saw with the Alex Jones defamation trial recently.

    John Donvan:

    Stewart, I’m giving you special space again as the three against one position.

    Stewart Baker:

    Why don’t we try liability for the people who are sending the hate speech, right? If they’re sending threats to you, Nina, they can be prosecuted.

  • 00:19:00

    Nina Jankowicz:

    Want to be my defamation lawyer, Stewart?

    Stewart Baker:

    It’s a deal.

    Nina Jankowicz:

    All right.

    [applause]

    John Donvan:

    And that concludes debate on our first question.

    [applause]

    So let’s move on to the second question. The question is does America need a governing body to regulate disinformation? Let’s start with you, Stewart. Are you a yes or a no on that?

    Stewart Baker:

    This one’s easy. No.

    John Donvan:

    And Charles?

    Charles Carithers:

    I’m a no.

    John Donvan:

    And Nina?

    Nina Jankowicz:

    I’m a yes.

    John Donvan:

    Okay, and finally Michael?

    Michael Chertoff:

    No.

    John Donvan:

    All right, so, Nina, you’re in the three against one position. But our first debater on this one will be Stewart. Again, Stewart, you are a no on the question, does America need a governing body to regulate disinformation? You have 90 seconds to tell us why.

  • 00:19:55

    Stewart Baker:

    The problem with having a government body that tells us what is misinformation and what is not is that governments cannot make those decisions without letting politics interfere in the decision. Government is about politics, and when they decide something is misinformation, it will be informed by their political interests. And, you know, I would look at, we talked a little bit about the CDC. When they started offering us advice, we all wanted to believe them, and we wanted to believe that everything they said was true. And practically the first words out of their mouths were, “You don’t need a mask,” followed six or eight weeks later by, “Oh, yeah, maybe you should get a mask.” And the reason they told us we didn’t need a mask is because they thought that health workers ought to have the masks, but they didn’t tell us that. They didn’t say they’d work, but you shouldn’t get them. They said you don’t need them. That repeated itself time and again, even with the CDC, even with something like medical information. I think it’s really dangerous to ask the government to make these calls across the board.

  • 00:21:00

    John Donvan:

    All right, thank you. I’m going to skip to the yes for a little back and forth. So, Nina, your 90 seconds.

    Nina Jankowicz:

    I am shocked, and I ask myself every day, how is it that an industry with such power, like social media, with such an impact on our everyday personal and professional lives, is one whose workings we really don’t understand? What we know about it comes from researchers like me, based on the data that the social media platforms give us access to, and from whistleblowers. I believe what we need is a Federal Internet Commission, kind of like we have the FCC and the FAA. If a plane’s crashing over and over, we’re going to go investigate that airline, right? I think we’ve had a lot of plane crashes with social media. And I think we need to make effective policy based on, frankly, you know, information that is unbiased.

  • 00:21:46

    Right now, we have so much polarization over what’s going on with social media that we need to kind of pull back the lid and understand it a little bit more. And again, this wouldn’t be about removing speech. It would be about understanding the algorithms that Secretary Chertoff mentioned before. It would be about understanding the business practices and the content moderation decisions that are made. I believe all Americans need to know that, and it would increase the trust that Charles was talking about before. So that is what that regulatory body would do, in my opinion, and then eventually enforce some rules that would come later down the pike.

    John Donvan:

    Charles, it is your turn. You are a no.

    Charles Carithers:

    I can’t believe I’m saying this. I agree with Stewart Baker. I think Stewart is absolutely right. He’s spot-on. Do we really want to add another layer of bureaucracy onto the already, you know, bureaucratic United States government? That’s number one. Second, you know, Nina mentions polarization. So who are going to be the arbiters of truth here, right? How are we going to honestly determine what’s true or what’s not, especially in 2022? I think you honestly would have an easier time mopping the ocean than getting individuals to decide together what’s true or what’s not.

  • 00:23:03

    John Donvan:

    Michael Chertoff, you are a no.

    Michael Chertoff:

    Sure. So, full disclosure, I co-chaired a committee of the Homeland Security Advisory Council that actually recommended against having a governance board at DHS for disinformation.

    John Donvan:

    Thank you. We appreciate the disclosure. Thanks.

    Michael Chertoff:

    And I agree with the fact that we don’t want to have the government telling us what is true or not true, or what is disinformation or not disinformation. Here’s a thought experiment. Imagine Donald Trump is appointing the members of that board, and every time someone puts something online saying that he lost the election in 2020, it has to be taken down. I mean, we could live under Putin if we wanted to live that kind of experience. We don’t need to live it in the U.S. Now, that doesn’t mean there can’t be any rules. I think you could have Congress create neutral rules, for example, saying things like you have to disclose who actually posted something, or you have to disclose your algorithms, or you have to disclose what your terms of service are.

  • 00:24:03

    But those would be rules of general application. They would not be rules that are designed to govern the content of the information you put on.

    John Donvan:

    But who would enforce the rules?

    Michael Chertoff:

    Well, the answer is, if you violated the rules, if it was a statute, then presumably the Department of Justice would sue you, get an injunction, or something of that sort. I mean, there are several ways you can do that, but the point is, it wouldn’t be about content. It would be about disclosure and transparency.

    John Donvan:

    Okay, well let’s mix it up. Let’s start the conversation. Nina?

    Nina Jankowicz:

    I’ve got to clear something up here, Secretary Chertoff, and I’m hesitant to even open this can of worms. But full disclosure, I was the executive director of the Disinformation Governance Board. And the Disinformation Governance Board was never going to tell people what was true or false online. It was an internal coordination body meant to, as you all know, DHS is a huge, sprawling organization, herd some government cats. That’s all it was meant to do. And the fact that we’re even talking today about whether it might have decided what was true or false online is extremely sad to me.

  • 00:25:07

    Michael Chertoff:

    No, and you’re absolutely right. And the reason we decided to actually suggest it not be used is because the title of it, Governance Board, conveyed a misimpression.

    Nina Jankowicz:

    Sure.

    Michael Chertoff:

    Having the government coordinate to play by the rules is appropriate. But having something that suggests, even erroneously, that the government is going to govern speech is a very bad idea.

    Nina Jankowicz:

    Yeah, yeah. The name was bad. I did not come up with it. But the fact that we then said, “All right, we’re not going to put this body forward to address this issue within the Department of Homeland Security,” which has so many equities that touch on, you know, disinformation, at the border, related to natural disasters, with our elections, that was a very sad moment.

  • 00:25:55

    John Donvan:

    All right, I got to get Stewart and Charles into this. So, Stewart, you made a sort of philosophical argument. Well, it’s also a practical argument. You said the government can’t make these decisions without politics getting involved. And Charles, you also made a practical argument that we just don’t need one more layer of bureaucracy. So let’s dig a little bit into what you’re talking about, and again, let Nina respond. So why don’t you take that first, Charles?

    Charles Carithers:

    I just — this would be a hard sell to the American people in this day and age, given how politically divisive our nation is, right? No matter how you define it, how you spin it, at the end of the day, you know, there are going to be certain segments of the American populace that will call this the Truth Police. That’s going to be the perception. Then on top of that, you’re going to formally establish this within the United States government. So, then we have to ask ourselves, okay, how is oversight going to be conducted on this? How can we possibly be unbiased in conducting oversight and trying to regulate disinformation? You’re going to open up a can of worms here by establishing this. And then, you know, it’s going to be a real sticky situation because I don’t see how you reconcile this with protecting our First Amendment.

  • 00:27:07

    John Donvan:

    All right, let’s stack up some no’s before we come back to you, Nina. So, Stewart.

    Stewart Baker:

    So, Nina, with deepest sympathy, I want to say welcome to Washington.

    [laughter]

    You didn’t deserve what happened to you, but the reason it happened is not because coordination of rumor control is a bad thing. It’s because, in the last 10 years, it’s not just about what the government is saying. We’ve all learned to listen to the government, to take it with a grain of salt, and to ask, what’s their motivation in telling us this in particular? And we’re all comfortable with that. But we’re not comfortable with the idea that no one can say anything that contradicts it, which, unfortunately, under the rule of the four companies, is pretty much how they administer their misinformation practices, and that’s why it provoked such a reaction.

  • 00:28:03

    And we shouldn’t be in the business of saying this is what can’t be said. And I’m afraid people, understandably, read it that way.

    Nina Jankowicz:

    And I understand that people are rightfully skeptical of government intervention in this area. Again, that’s not what I think should happen, could happen. Wouldn’t have taken the job if that were the description, but I want to talk a little bit more about what Charles was saying that this isn’t about — you know, you don’t want another layer of bureaucracy here. I think we actually do need a little bit more bureaucracy in coordinating the efforts of our government related to counter disinformation activities. Right now, they are happening in parallel across the government, at the DOJ, within the IC, at the Department of State.

    Michael Chertoff:

    Well, where we left it at DHS was to encourage coordination among the existing institutions, including the general counsel and the privacy office.

  • 00:28:53

    But, you know, one thing about creating new organizations, as Charles has suggested, is that they wind up looking for a mission. And the area of free speech is an area that’s really fraught in this respect. One thing I will also observe is that this is not a new problem. Go back to the founding of the nation. The Burr-Hamilton duel was fought over scurrilous newspaper articles accusing one of the combatants of fathering an illegitimate child. This kind of news has always been out there, and we’ve relied upon the First Amendment as the principal defender. Now again, I’ll emphasize where that has changed is the use of algorithms, because algorithms allow you to turbocharge misinformation by using data that has been taken, or purchased, or even stolen from you in order to figure out what your particular hot buttons are. And that’s an area where I do think there is more room for some kind of rule or regulation, as opposed to the content itself.

    John Donvan:

    So do you have something, sort of a concept for that?

    Michael Chertoff:

    Yeah. The concept should be to require the disclosure of data that is taken from private individuals that is being used to communicate information to them, and also disclosure so researchers can look at algorithms.

  • 00:30:06

    So there could be a way of people understanding, “I’m getting this message because someone has looked at all my job searches or my online searches for the last year, and they figured I’m looking for a particular job.”

    John Donvan:

    Nina, would that be enough?

    Nina Jankowicz:

    No. I mean, what we’re talking about there is essentially micro targeting, right? And that’s part of the problem. That’s part of the problem, but it’s not the entirety of the problem. A lot of times, what we see is networked disinformation where there is coordination and amplification of the most enraging material. People are knowingly lying because they know the most engaging content online is the most enraging content, right? That gets more at the algorithmic question there. Should the companies be promoting things that are inciting violence and hatred that are based in lies?

  • 00:30:55

    But it’s a broader question than either one of those two things. And again, I think when it comes to questions of public safety, public health, and the functioning of our democracy, there is that responsibility that we were talking about before. And someone needs to conduct the oversight about the content moderation decisions that are being made not only by algorithms, but by humans. Then we won’t have this polarization, this Ministry of Truth conversation, because we’ll be able to pull back the lid and see what’s going on.

    [music playing]

    John Donvan:

    More from Intelligence Squared U.S. when we return.

    [music playing]

    Welcome back to Intelligence Squared U.S. I’m John Donvan. Let’s get back to our debate.

    John Donvan:

    We’re going to go to a question in a moment. Is anybody ready to jump in?

    Suzanne Spaulding:

    Suzanne Spaulding from the Center for Strategic and International Studies. I wanted to ask if the government is being told to be hands-off, are we unilaterally disarming in the face of a serious adversary threat that will amplify domestic voices?

  • 00:31:55

    John Donvan:

    With the implication being, should the government be doing something? So the answer to the question, does America need a governing body to —

    Michael Chertoff:

    Sure. Sure

    John Donvan:

    — regulate disinformation, would be yes. You’re —

    Michael Chertoff:

    The answer is, what do you mean by hands-off? If you’re asking me, should the government censor and shut things down, the answer is no. Should the government disclose publicly and loudly, this is coming from Vladimir Putin, sitting in Moscow, pretending to be John Donvan? Yes, the government can do that. Government can correct and amplify the correction. What it can’t do, except in some specific cases, is shut it down and stop it.

    Stewart Baker:

    That’s already the law, basically. If you are a media organization, owned or controlled by Vladimir Putin, you need to register and disclose that when you send out your news articles.

    Charles Carithers:

    And to your first question, which was rescinded, absolutely yes. Russia’s doing that right now, and other state actors and non-state actors. But when it comes to foreign elements, doing just as you suggested, CISA’s actively engaging and trying to mitigate that right now.

  • 00:33:01

    Nina Jankowicz:

    So if only it were so simple, right, that we could say, “Here’s a Russian entity. They’re spreading Russian disinformation. Americans, be warned.” Right? Unfortunately, the Russians, the Iranians, the Chinese have gotten a lot smarter over the past six years while we’ve been sitting here treading water. What they are doing now is more information laundering. Rather than just putting something out through trolls or bots, or through RT or CGTN, they are finding willing individuals, either witting or unwitting, amongst us, who are happy to amplify those narratives. And that’s where we get into the domestic disinformation problem. I agree, Suzanne, if we are throwing our hands up and saying there’s very little that we can do, only if it’s clearly foreign disinformation, we are winning Vladimir Putin’s battles for him. And that is very scary to me.

    John Donvan:

    I think Suzanne’s question also implied should there be a government agency that is directly involved in response?

    Nina Jankowicz:

    Yes.

  • 00:33:55

    Michael Chertoff:

    John, it depends on what you mean by response. If you mean, should a government agency be tasked with detecting this and then announcing that, in fact, this is coming from Putin, I think that’s perfectly fine. If it’s “you have to take it down,” assuming it’s not something that’s illegal like child pornography, then I think that goes too far.

    John Donvan:

    Charles, last word for you in this round.

    Charles Carithers:

    I agree with the Secretary 100 percent.

    John Donvan:

    In that case, Stewart?

    Stewart Baker:

    I’m there too.

    John Donvan:

    That concludes conversation and debate on the second question.

    [applause]

    Now we’re going to do the third and final question. The question is, was Twitter right to ban Donald Trump? Let’s look at where you stand on this question. Charles, can you go first?

    Charles Carithers:

    Yes.

    John Donvan:

    Stewart?

    Stewart Baker:

    No.

    John Donvan:

    And Nina?

    Nina Jankowicz:

    Yes.

    John Donvan:

    And Michael?

    Michael Chertoff:

    Yes.

    John Donvan:

    So we have three yesses and one no. Stewart, you get that little extra time again.

    Stewart Baker:

    Yeah. Some people will do anything for a little extra speaking time.

  • 00:34:58

    John Donvan:

    All right, we’re going to ask Charles to go first. And the question was, was Twitter right to ban Donald Trump? You’ve got 90 seconds, Charles.

    Charles Carithers:

    Thank you. Obviously, I support the First Amendment. I do not think politicians should be removed from social media platforms unless they incite violence, unless they incite harm. And I am of the opinion that former Pres. Donald Trump did just that, more specifically with his tweet on December 19th. That’s where I am.

    John Donvan:

    Thank you. And next would be — well, let’s go with you, Stewart. You’re the no.

    Stewart Baker:

    So I think, look, he’s the first Republican candidate for president in 40 years that I haven’t voted for, twice now, and I sort of hope I don’t have a third opportunity. But when you say he can’t speak, you’re not just disrespecting him and showing contempt for him. You’re showing contempt for the people who voted for him, who believe what he says and who want to hear from him. And that’s a serious, serious thing to do.

  • 00:36:02

    And before we decide we’re just going to take him out, take somebody who has that kind of responsibility out of the public square, you’d better have a good reason. And, you know, Twitter’s reason was he’s inciting violence. That was on January 8th that they said that. That wasn’t January 6th. Whatever he had incited, and, you know, the causation chain is what it is, was over. And what they said was, “We think he’s inciting future violence. He’s glorifying future violence.” And the case for that, well, where is that violence that he incited, is really weak. And so the idea that we took him off because he was inciting violence, and then realized that he wasn’t actually doing something that directly incited violence, I think, discredits the decision.

  • 00:36:59

    Michael Chertoff:

    Yeah, so again, this was a decision of the platform. Perhaps the answer would be different if the government had required him to be taken off. I would also observe he then started his own platform. I don’t know how successful it is, but he wasn’t deprived of his ability to speak. But the bottom line is incitement to violence. As Oliver Wendell Holmes said, you can’t shout fire in a crowded theater. This takedown occurred a few days after January 6th, which, I have to say, based on the public evidence, looks an awful lot like an insurrection, which he fueled with his statements. Now, did he spell out, “Oh, I want you to go kill Mike Pence,” or “Oh, I want to overthrow the U.S. government?” No. But let me tell you, I mean, he grew up in the milieu of New York, the construction industry, when it was dominated by the mob. And I investigated and prosecuted the mob on the stuff that he was involved in.

  • 00:37:51

    And you know what mobsters do when they want someone killed? They don’t go, “Go kill so and so.” They go, “This guy’s a problem.” And then the guy winds up dead with cement shoes in the East River. So what Trump was doing was saying things that the people at Twitter, whether they were mind readers or not, legitimately had a concern were fueling another round of violence. And when Trump said in one of the tweets, “I’m not going to the inauguration,” they were concerned that was being read as, “There isn’t going to be any inauguration because there’s going to be round two of this.” We were dealing with a very fraught situation. Again, I’m not saying the government had the power to do this, but what I am saying is it’s reasonable for Twitter to say this is too close to incitement.

    John Donvan:

    Yeah, and the question is about whether Twitter was right to do it as opposed to the government’s role. But, Stewart, I’m going to let you do the three against one jump back into the conversation.

    Stewart Baker:

    Yeah, so you’re right that that’s what he said. These are the two tweets that Twitter objected to and said were glorifying violence. I’ll read them out. “The 75 million great American patriots who voted for me will have a giant voice long into the future. They will not be disrespected or treated unfairly in any way, shape or form.” And then, “To all those who have asked, I will not be going to the inauguration on January 20th.”

  • 00:39:05

    Now, Twitter says that’s a glorification of violence. You’ve been a Court of Appeals judge. If a District Court wrote an opinion saying that’s a glorification of violence, and I’m enjoining speech because of the glorification, how long would it take you on appeal to reverse that decision?

    Michael Chertoff:

    You’re absolutely right, but here’s, again, I keep coming down to this difference. For the government to do it is a different standard than for me as an individual to say, “This is my platform. My free speech right is that I don’t want to be seen, in any way, shape, or form, as endorsing or promoting violence.” Those are two dramatically different things.

    John Donvan:

    Okay, I want to call a timeout because Nina did not get her 90 seconds.

    Stewart Baker:

    Yes.

  • 00:39:49

    Nina Jankowicz:

    So, I would agree with what Charles and Secretary Chertoff have said. I would also say that, you know, we have these different standards for politicians, right, because it is in the public interest to hear what they’re saying. I actually think that politicians should be held to a higher standard of speech because of the platforms that they have. And yes, Donald Trump got his poor little Twitter account shut down, but he was still the president of the United States, the most powerful man in the world, from January 8th to January 20th, and had plenty of, you know, a stage and an audience to hear him speak.

    I think even if this had happened earlier, there would have been plenty of room and kind of oxygen and amplification for the words that he said. He could, you know, call cameras to the Oval Office at any time. He was inciting violence, and maybe not in those two tweets, Stewart, but certainly we could look back earlier, before January 6th, in December, as well as during the summer of 2020, during the George Floyd protests and the president’s response to them, which polarized us, which led extremists to, frankly, commit violent acts not only at the Capitol, but in several other instances before that. So again, I believe that politicians should be held to a higher standard than the rest of us, and that those terms of service should be applied equally to everybody.

  • 00:41:05

    John Donvan:

    Charles, you and your fellow yes votes are making the argument primarily that his behavior on Twitter was an incitement to violence and that justifies Twitter’s decision to take him off. And while Stewart has some pushback on that, he also made a second argument that none of you have responded to, which is that it’s an insult to the people who voted for him. It’s a slight to the people who voted for Donald Trump to have him removed. I’m curious what your take is on that argument.

    Charles Carithers:

    My take is this: the individual whom you voted for and wanted to be president of the United States did a disservice to the country. When a politician uses a platform that can reach millions, if not billions, of individuals, and says that the election is fraudulent, that the results should not be trusted, how do you not think that those remarks are going to galvanize individuals into thinking that democracy is now in question, that democracy is being challenged?

  • 00:42:08

    And then in that same tweet he says, “Come to D.C. It’s going to be wild.” And if I’m one of his supporters, you know, and to Secretary Chertoff’s point, I’m taking that as, you know, a call to action. And that’s exactly what happened. And the January 6th Select Committee methodically spells this out. There’s testimony and depositions from individuals who said that they were directly galvanized to come to D.C. to storm the Capitol because Pres. Trump told them to do so. You can’t ignore that.

    John Donvan:

    So yeah, while Stewart is saying it’s an insult to his followers, you’re saying the fact that his followers would believe and trust him is the issue. But I want to take it to Michael. Also, Stewart’s point that Twitter’s decision to take Trump offline was a slight to the people who voted for him.

  • 00:43:02

    Michael Chertoff:

    I don’t agree with that. You’re not taking the followers offline. You’re not even taking them off if they retweet something. But if someone, for example, decides they want to tweet pictures of child pornography, among other things, the fact that people follow that individual doesn’t mean that if you take him down for violating the law, that you’re insulting the followers. The followers are totally free to follow Trump on whatever it is, Truth Social, whatever he’s got. They’re not flocking there, but that’s their decision.

    They’re not being disqualified because they followed him, so I think this is an appropriate response. And I think, as Charles pointed out, it comes against the context of a lot of dog whistles, like “it’s going to be wild,” like saying those right-wing, you know, people who ran over that woman in Charlottesville are good people. I mean, when you look in context, the people who run Twitter have to say, “Do we really want to continue to let someone propagate calls to action that involve violence and overthrowing the U.S. government?” Followers are free to follow him elsewhere, but the speaker isn’t free to use the platform that way.

  • 00:44:06

    John Donvan:

    So Stewart, you have not been persuasive with your opponents on that particular point, so I want to know where you are on that in terms of their responses to it, and how strongly you want to defend that point.

    Stewart Baker:

    So, it’s true that his followers can continue to speak, but they wanted to listen. They wanted to hear what he has to say. And this is a rebuke to their views. It is saying your views are not even acceptable in, you know, decent company. And I think that if you’re going to say that, you had better be sure. And we talked about the responsibility of the president, and he was not responsible. But everybody up here agreed that Twitter had a responsibility and that they should explain themselves, that we need more visibility into their decision processes.

  • 00:44:55

    They gave us visibility. They said their determination was that those two tweets, the one about how his followers won’t be disrespected and the one about how he’s not showing up for the inauguration, were likely to inspire others to replicate the events of January 6th, and that there were multiple indicators they were being received and understood as encouragement to do so. And yet, in fact, there has not been any such violence.

    Nina Jankowicz:

    Well, absence of evidence doesn’t mean that it would not have happened.

    Stewart Baker:

    They said, “We’re worried they’re going to show up on Inauguration Day and cause a problem.” Nobody did.

    Michael Chertoff:

    Right. It’s great that the U.S. government arrested a bunch of people.

    Nina Jankowicz:

    Yeah.

    Stewart Baker:

    This is their implausible excuse to ban him. And then, what are we going to do? Is he being punished with a time out? Or is this something where we still think he is about to incite violence? He hasn’t.

    John Donvan:

    So you just said — I think you implied they were looking for an excuse to ban him.

    Stewart Baker:

    Yeah, of course they were.

    John Donvan:

    Why didn’t they earlier?

  • 00:46:00

    Stewart Baker:

    Precisely for the reason we talked about, there are very good reasons, when somebody has that kind of a following, to leave them up. You know, frankly, I’m not even sure it’s good for him to be on Twitter. It gives us an insight into his mind that’s, you know, not the most appetizing thing. And we all should — we understand him and Elon Musk better than anybody else of similar prominence. And I think Twitter has done us a public service by letting these folks air themselves. And that’s one of the reasons why they left him up. I think they should have done it, and now they’re stuck. They don’t know when — when are they going to let him back in?

    John Donvan:

    So, I just want to ask the panel in general, whoever wants to jump in on this. I think Stewart is implying that while Twitter said he was banned for these two tweets, he was really banned for four years of tweets, and these two were the straw that broke the camel’s back. Does anybody feel that that’s actually what happened? And is it somehow disingenuous for Twitter to cite these two tweets as the reason?

  • 00:46:59

    Michael Chertoff:

    I can’t read the minds of people at Twitter, but it seems to me it’s hard to evaluate this without recognizing a big thing happened on January 6th, which both made his prior statements look much more problematic and meant that anything subsequent, against that background, was problematic. Now again, I want to keep going back to my point. I’m not saying the government should have banned it, but certainly a platform owner, or an editor, in the analogy I’m using, could say, “This person wants to use my newspaper or my television show to inspire violence and shout fire in a crowded theater, and I don’t want him to do it.” And I think that’s appropriate. And, you know, whether they’re mind readers or not, it certainly seems to be reasonable.

    John Donvan:

    Thank you. So once again, we polled you before you heard all the arguments, and we polled you again afterwards. And we’re just interested in seeing how people might have, if at all, changed their minds.

  • 00:47:56

    So on the first question before the debate, where the question was should tech companies moderate misinformation that their users post, before the debate, 75 percent said yes and 25 percent said no. After the second poll, 70 percent said yes and 30 percent said no. On the question, does America need a governing body to regulate disinformation, before the debate, 45 percent said yes, 55 percent said no. After the debate, 26 percent said yes, and 74 percent said no. And finally, on was Twitter right to ban Donald Trump, before the debate, 59 percent said yes and 41 percent no. After, on the second poll, 60 percent said yes, and 40 percent said no. Right, there’s no bow taking.

    Michael Chertoff:

    Everybody won.

  • 00:48:51

    John Donvan:

    Well, no, it’s not kids’ soccer, but —

    [laughter]

    So let’s wrap it up. We really appreciated this and enjoyed it. Again, we thank our debaters and the Homeland Security Experts Group for having us here, and our founder and chairman, Robert Rosenkranz, and Clea Conner, our CEO, and you, our audience. And I’m John Donvan for Intelligence Squared. We will see you next time.

    [applause]

    [music playing]

    Thank you, everybody, for tuning in to this episode of Intelligence Squared, made possible by a generous grant from the Laura and Gary Lauder Venture Philanthropy Foundation. As a nonprofit, our work to combat extreme polarization through civil and respectful debate is generously funded by listeners like you, by the Rosenkranz Foundation, and by friends of Intelligence Squared. Robert Rosenkranz is our chairman. Clea Conner is CEO. David Ariosto is head of editorial. Julia Melfi and Marlette Sandoval are our producers. Lia Matthow is our consulting producer. Damon Whittemore and Kristin Mueller are our radio producers. Andrew Lipson is director of production. Raven Baker is our events operations manager. And I’m your host, John Donvan. We’ll see you next time.

  • 00:50:08

    [end of transcript]

    This transcript has been lightly edited for clarity. Please excuse any errors.

Breakdown

Should tech companies moderate misinformation that their users post?
Biggest shift: Yes (+7.14% Yes, -7.14% No)

Does America need a governing body to regulate disinformation?
Biggest shift: No (+3.57% No, -3.57% Yes)

Was Twitter right to ban Donald Trump?
Biggest shift: No (0.00% No, 0.00% Yes)