Those challenges underlie a case that the Supreme Court will hear next month, when it will consider whether Google, which owns YouTube, can be sued for helping the terrorist group ISIS promote its message and attract followers. The case illustrates the hazards of increased civil liability for social media companies, which critics on the right and the left wrongly see as the key to better moderation practices.
Since 1996, federal law has shielded websites from most kinds of civil liability for content posted by users. Under 47 USC 230, "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Section 230 also protects "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." These two kinds of immunity aim to avoid potentially crippling litigation that would impede the availability of user-generated information and deter content moderation, making the internet as we know it impossible.
In 2021, the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 barred a lawsuit against Google by the family of Nohemi Gonzalez, a 23-year-old U.S. citizen who was killed in a 2015 ISIS attack while studying in Paris. The plaintiffs originally argued that Google was liable under the Anti-Terrorism Act for allowing ISIS videos to remain on YouTube and for increasing exposure to them through its "up next" feature, which suggests videos similar to ones users have watched.
On appeal to the Supreme Court, Gonzalez's family concedes that Section 230 means Google, which bans YouTube videos "intended to praise, promote, or aid violent extremist or criminal organizations," cannot be sued for failing to fully enforce that policy. But the plaintiffs argue that the company can be sued for pointing users to such videos when they view similar content, and the Biden administration agrees.
If YouTube's "algorithmic tools" expose Google to liability for content it did not create, in other words, every provider of an "interactive computer service" will have to worry about the legal risk of guiding users through a massive morass of material that would otherwise be unmanageable. This is just one facet of a broader problem with making it easier to sue websites over third-party content.
The fact that two sets of critics blame Section 230 for either too little or too much content moderation suggests something is wrong with their reasoning. In reality, the First Amendment protects both "hate speech" and editorial discretion.
Seven years after Islamic State extremists murdered their daughter, the family of Nohemi Gonzalez, the only American killed in the 2015 Paris terror attacks, heads to the U.S. Supreme Court on Tuesday seeking to pin some responsibility for the tragedy on social media giant YouTube.
"If some changes can be done to prevent these terrorist people [from] keeping killing human beings, then that is a big thing," Beatrice Gonzalez, Nohemi Gonzalez's mother, told ABC News in the family's first interview about the case.
Beatrice Gonzalez alleges that Google's YouTube algorithms -- a series of proprietary software instructions which recommend video content to users -- effectively amplified Islamic State-produced materials in support of the extremists that killed her daughter, a 23-year-old college student who had been studying in France.
The family wants to bring a case against the company under the Anti-Terrorism Act but has been blocked from doing so because of a landmark federal law that has given sweeping legal immunity to social media companies for more than 25 years.
Section 230 of the Communications Decency Act of 1996 states that internet companies, including social media platforms, cannot be sued over third-party content uploaded by users -- such as photos, videos and commentary -- or for decisions site operators make to moderate, or filter, what appears online.
Oral arguments set for Tuesday at the Supreme Court in Gonzalez v. Google, YouTube's parent company, will focus on the scope of that immunity, whether it covers algorithmic recommendations, and whether Gonzalez should be able to pursue her claims in court.
"Hopefully this will change the laws and it'll be for the good by being more careful about the social media, so [other parents] never have the pain that we're feeling," said Nohemi Gonzalez's stepfather, Jose Hernandez.
YouTube says it bans terrorist content across its platform and that its algorithms help catch and remove violent extremist videos: 95% of the extremist videos removed last year were detected automatically, most before they had received 10 views.
"Undercutting Section 230 would make it harder for websites to do this work," YouTube spokesperson Ivy Choi told ABC News. "Websites would either over-filter any conceivably controversial materials and creators, or shut their eyes to objectionable content like scams, fraud, harassment and obscenity to avoid liability -- making services far less useful, less open and less safe."
"There are enormous amounts of money at stake if the platforms were to be held liable for every time a terrorist attack could in any way be tangentially traced to material that the platforms carried," said Michael Karanicolas, executive director of the Institute for Technology Law & Policy at UCLA.
Section 230 was passed by bipartisan majorities in Congress and has long been considered a cornerstone of the modern internet, protecting online platforms as spaces for creativity, innovation and open public debate.
The crucial 26 words in the statute say: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Internet companies "get to decide what to carry. They get to decide what not to carry," said Karanicolas. "And they get to decide how to design their algorithms -- to amplify certain types of content or to de-emphasize other types of content."
"Large companies can maybe throw a battalion of lawyers at a problem and litigate their way forward, but new startups will simply not be able to get over that [financial burden]," said Matthew Schruers, president of the Computer and Communications Industry Association.
"When this statute was enacted in 1996, it was for the express purpose of protecting kids from seeing obscene material online and protecting companies who take obscene material offline to protect kids. And it's been turned on its head," said Matthew Bergman, an attorney and founder of the Social Media Victims Law Center, who represents hundreds of plaintiffs alleging harm from social media use.
Frances Haugen, the former Facebook insider who has warned Congress about the harms of internet companies' algorithms, said setting new limits on legal immunity could incentivize companies to improve their products.
"We have the tools, but all these things decrease usage. They make the companies a little less money," Haugen said. "So in a world where our business models are fueled by clicking on ads, there aren't independent market incentives for making products that help people be healthy and happy."
Haugen believes Section 230 immunity does not have to be all or nothing but says regulators need to update the law to reflect current internet use and the proliferation of documented psychological harms.
"The Supreme Court isn't really the right actor for dealing with this issue. You know, they can come in and do a very blunt judgment. They can't, for example, set up a new regulatory framework that might be a more effective way to govern the internet," Haugen said.
But Bergman, the attorney for social media users claiming harm, and the Gonzalez family argue that the justices need to act under a plain reading of the law and permit the Gonzalez family to move forward with their suit against YouTube's parent company.
"It will certainly provide a more sensible opportunity for families to hold companies accountable," Bergman said. "All it will do is allow them to seek discovery and prove their case. Everyone is entitled to a defense, as are the social media companies, but it will simply kind of open the courthouse door."
I write to you in the wake of the horrific acts of terrorism at two mosques in Christchurch, New Zealand, which killed at least 50 people and wounded 50 more. I was deeply concerned to learn that one of the shooters live-streamed his terror attack on Facebook, and the video was subsequently reuploaded on Twitter, YouTube, and other platforms. The video was widely available on your platforms well after the attack, despite calls from New Zealand authorities to take these videos down.
I respectfully request a briefing before the Committee on Homeland Security on March 27, 2019, regarding your response to the dissemination of the video of the New Zealand terrorist attack on your platforms and how your companies intend to prevent this disturbing incident from happening again.
This strategy, developed with Jigsaw, an incubator within Google and YouTube's parent company Alphabet, and the London-based anti-extremism tech firm Moonshot CVE, redirects YouTube searches for certain keywords tied to violent extremism to a playlist of videos "debunking violent extremist recruiting narratives," the companies said in an online post Thursday.
Extremist groups including ISIS use video on YouTube to recruit and radicalize prospective terrorists. The redirect method aims to counter such efforts by driving "people away from violent extremist propaganda and steer[ing] them toward video content that confronts extremist messages and debunks its mythology," the companies say in the post.
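The core idea of the redirect approach, matching flagged search terms and routing them to curated counter-narrative content instead of ordinary results, can be sketched in a few lines. This is a minimal illustration only: the actual keyword lists, matching logic, and playlists used by Jigsaw and Moonshot CVE are proprietary, and every identifier below is a hypothetical stand-in.

```python
# Illustrative sketch of keyword-triggered redirection; not the real system.
# FLAGGED_KEYWORDS and COUNTER_NARRATIVE_PLAYLIST are hypothetical examples.

FLAGGED_KEYWORDS = {"extremist recruiting", "join the caliphate"}
COUNTER_NARRATIVE_PLAYLIST = "playlist/counter-narratives"

def route_search(query: str) -> str:
    """Route a search query: flagged queries go to a curated
    counter-narrative playlist; everything else falls through
    to an ordinary search results path."""
    normalized = query.lower().strip()
    if any(keyword in normalized for keyword in FLAGGED_KEYWORDS):
        return COUNTER_NARRATIVE_PLAYLIST
    return f"search/{normalized}"

# Flagged query is redirected; a benign query is not.
print(route_search("how to join the caliphate"))
print(route_search("cat videos"))
```

The real system presumably operates at the ad-targeting and search layers rather than as a simple substring match, but the routing decision it makes is of this shape: detect intent from the query, then substitute curated content for the default results.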