
Constitutional Free Speech Principles Can Save Social Media Companies from Themselves

The Brief: Get Up To Speed

How should the world’s largest social media companies respond to a pernicious online climate, including hate speech and false content posted by users? For some, the answer is clear: take the fake and offensive content down. But for others, censorship – even by a private company – is dangerous in a time when digital platforms have become the new public square and many Americans cite Facebook and Twitter as their primary news sources. Rather than embracing European hate speech laws or developing platform-specific community standards that are sometimes seen as partisan, they argue, social media companies should voluntarily adopt the First Amendment and block content only if it violates American law. Should First Amendment doctrine govern free speech online? Or are new, more internationally focused speech policies better equipped to handle the modern challenges of regulating content and speech in the digital era?

Background

"Their positions give these young people more power over who gets heard around the globe than any politician or bureaucrat—more power, in fact, than any president or judge."

Monday, April 29, 2013
Jeffrey Rosen
"Anyone with a Twitter account can follow the president. Well, almost anyone."
Wednesday, October 11, 2017
Lincoln Caplan
"The First Amendment protects individuals from government censorship. Social media platforms are private companies, and can censor what people post on their websites as they see fit. But given their growing role in public discourse, it’s important to ask ourselves–what exactly are their censorship policies? How do they compare to each other, and to the First Amendment’s protections?"
Tuesday, January 1, 2019
Lata Nott

"Social media companies want to create a “safe” environment for users, and yet they would like to be seen as upholding the American value of free speech. When it comes to hate speech, what should they do?"

Sunday, September 23, 2018
Knowledge@Wharton
"The case could have broader implications for social media and other media outlets. In particular, a broad ruling from the high court could open the country's largest technology companies up to First Amendment lawsuits."
Tuesday, October 16, 2018
Tucker Higgins

"Yesterday, however, that decision was overturned. A federal appeals court ruled that a Facebook Like is, indeed, a form of expression that is covered by the First Amendment. Clicking a button is, per the decision, a protected form of speech."

Thursday, September 19, 2013
Megan Garber

"Ultimately, Facebook hedged. The company announced that the post would generally be removed because it originated from the neo-Nazi website Daily Stormer but that the post could still be shared if accompanied by a condemnatory caption."

For the Motion

"The great 21st-century platforms — Facebook, Twitter, YouTube, Snapchat, and the rest — have this year found themselves in the middle of the speech wars. Twitter is struggling to contain vile trolling and harassment, and Facebook has gotten scalded on the little toe it dipped into curating journalism."

Thursday, June 2, 2016
Nabiha Syed and Ben Smith
"In the battle over what limits should be imposed on online free speech, regulators worldwide are on the offensive."
Sunday, January 14, 2018
Mark Scott

"The First Amendment only limits governmental actors—federal, state, and local—but there are good reasons why this should be changed. Certain powerful private entities—particularly social networking sites such as Facebook, Twitter, and others—can limit, control, and censor speech as much or more than governmental entities."

Tuesday, January 1, 2019
David L. Hudson, Jr.

"Facebook, Google and Twitter on Tuesday sought to defend themselves against accusations from Republican lawmakers who said the tech giants censor conservative news and views during a congressional hearing that devolved into a political sniping match."

Tuesday, July 17, 2018
Tony Romm
"But the removal of the video shows that the public cannot trust social media sites like Facebook to serve as a free, open marketplace of information. It is still a walled garden."
Friday, July 8, 2016
Jack Smith IV
Against the Motion
"The justices of the U.S. Supreme Court on Monday heard arguments in a First Amendment case that experts have said could have ramifications for how the nation's largest social media companies are permitted to moderate the content on their platforms."
Monday, February 25, 2019
Tucker Higgins
"The platform has cast itself as the internet’s kindest place. But users argue harassment is rampant, and employees say efforts to stem it aren’t funded well or prioritized."
Monday, October 15, 2018
Taylor Lorenz
"Although a free and open internet creates immense social value, it can also be used to engage in hateful activities on a large scale. For example, white supremacist and other organizations that incite hate are using online platforms to organize, fund, recruit supporters for, and normalize racism, sexism, religious bigotry, as well as anti-LGBTQ and anti-immigrant animus, among other activities."
Wednesday, October 24, 2018
Southern Poverty Law Center

"For several reasons, internet companies will be in different places as they embark on addressing the issues raised in the recommended corporate policies and terms of service. First, many companies have already undertaken significant steps, whether driven by altruism, employee concerns, a commitment to human rights, or being publicly blamed for violence or other hateful activities—or a combination of all of these."

Thursday, October 25, 2018
Henry Fernandez

"Of course, none of this will matter unless the leadership of Facebook (and other tech companies) are willing to stand up to government bullies who seek a way around the First Amendment. Nothing is going to be more important in the days to come than making sure the governance of online speech is truly private."

Wednesday, March 14, 2018
John Samples
Content Moderation Policies

"Twitter CEO Jack Dorsey testified before the House Energy & Commerce Committee about how his company uses artificial intelligence and humans to manage content on its platform."

Wednesday, September 5, 2018
C-SPAN
"Moderating billions of posts a week in more than a hundred languages has become Facebook’s biggest challenge. Leaked documents and nearly two dozen interviews show how the company hopes to solve it."
Thursday, August 23, 2018
Jason Koebler and Joseph Cox

"Each company will describe their content moderation and removal operations, such as org charts, department names and job titles, headcount, who determines the policies, escalation paths, and 'best practice' tips."

Friday, February 2, 2018
Santa Clara University
Section 230
47 U.S. Code § 230 - Protection for private blocking and screening of offensive material, provided by the Legal Information Institute of Cornell University Law School.
Monday, March 11, 2019
Cornell University Law School

"Section 230 of the Communications Decency Act of 1996 (a common name for Title V of the Telecommunications Act of 1996) is a piece of Internet legislation. It provides immunity from liability for providers and users of an interactive computer service who publish information provided by others."

Tuesday, January 1, 2019
Minc

"Section 230 has a simple, sensible goal: to free internet companies from the responsibilities of traditional publishers. Sites like Facebook and Twitter host comments and commentary that they don’t produce, edit, or even screen themselves, and Section 230 of the act ensures that those companies can’t be sued for content they host for which they haven’t assumed responsibility."

Monday, February 25, 2019
Joshua A. Geltzer
"Today, this law still sits at the heart of a major question about the modern Internet: How much responsibility do online platforms have for how their users behave or get treated?"
Wednesday, March 21, 2018
Alina Selyukh

"After all, it has the bulwark of cyberlaw on its side. In each of the four lawsuits, Airbnb’s lawyers confidently buttressed their defense with a 20-year-old federal statute: Section 230 of the Communications Decency Act. Tucked into the mammoth Telecommunications Act of 1996, this landmark piece of legislation is often cited as the most important tool ever created for free speech on the internet."

Tuesday, January 3, 2017
Christopher Zara
"In the United States, 28 U.S.C. § 230—which provides that online service providers are not considered "publishers" of third-party content posted or shared through their sites—has helped turn the internet into both a robust forum for public speech and the driving economic force behind an industry that increasingly relies on user-generated content."
Thursday, October 4, 2018
Jacquelyn N. Schell, Christopher M. Proczko, and Charles D. Tobin
German Laws
"Germany is set to start enforcing a law that demands social media sites move quickly to remove hate speech, fake news and illegal material."
Monday, January 1, 2018
BBC
"Social media firms must remove hate speech or face fines up to £44m under controversial law that came into force on 1 January"
Friday, January 5, 2018
Philip Oltermann

"Essentially, the laws are a bold imperialist SuperNannyState move by the Western nation still most determined not to repeat its past. As a mother to two wildbad kids, I love a good nanny. You know how some days you want to debate free speech and the hazards of nationalizing the internet, and other days you want no Nazis on Twitter? Today’s one of those no-Nazis-on-Twitter days. For now and maybe only for now: Life is superb in fake Bad Wildbad."

Monday, February 5, 2018
Virginia Heffernan

"The new German law that compels social media companies to remove hate speech and other illegal content can lead to unaccountable, overbroad censorship and should be promptly reversed, Human Rights Watch said today. The law sets a dangerous precedent for other governments looking to restrict speech online by forcing companies to censor on the government’s behalf."

Wednesday, February 14, 2018
Human Rights Watch

"Over the past year, lawmakers from Brussels to Washington have discussed whether and how to regulate social media platforms. In Germany, a central question has been whether such platforms—which Germans call social network providers (SNPs)—should be held liable if they fail to delete or remove illegal content. In 2017, the German Bundestag provided an answer to this question when it enacted the Network Enforcement Act (NEA), which came into effect in January 2018."

Thursday, December 27, 2018
Nele Achten
European Commission

"Tech companies including Facebook, Google and Twitter remove 72 percent of illegal "hate speech" on their platforms, the European Commission said on Monday."

Monday, February 4, 2019
Elizabeth Schulze
"Facebook, Twitter and YouTube have streamlined their ability to respond and remove hate speech content on their platforms, according to EU officials. But some see the crackdown as an attempt at online censorship."
Tuesday, April 2, 2019
DW News
"After four media giants have agreed with the EU to identify and remove hateful and violent materials online, social media users discussed the rights of free speech using a hashtag that supposedly condones hate speech."
Wednesday, January 6, 2016
Cherie Chan
"European Union officials have given Facebook, Google and Twitter a rare pat on the back, for cutting the time it takes to scrub hate speech from their platforms."
Monday, February 4, 2019
David Meyer
High-Profile Incidents
"The social network has fueled ethnic cleansing of the Rohingya."
Friday, April 6, 2018
Jen Kirby
"Facebook is heeding criticism for its content takedown and account deactivation appeals process, which was initiated alongside the company’s expanding content moderation practices."
Thursday, November 15, 2018
Caroline Haskins
"The anti-Semitic online screeds tied to the man police say killed 11 people at a Pittsburgh synagogue are rekindling a debate in Congress over the role that social media companies should play in policing their platforms — and the penalties they should face if they fail."
Monday, October 29, 2018
Tony Romm

"Public health experts have pointed fingers at social media platforms, saying that false claims that vaccines cause autism and other diseases have frightened parents into refusing to vaccinate, resulting in the current measles outbreak that started in Washington state."

Tuesday, February 26, 2019
Elizabeth Cohen and John Bonifield