February 27, 2023

Social Media at SCOTUS

“The Supreme Court [last] Tuesday debated the scope of a 27-year-old federal law that shields social-media companies from liability for content published by others. At issue in Gonzalez v. Google is whether Section 230 of the Communications Decency Act protects internet platforms when their algorithms target users and recommend someone else’s content…

“The case was filed by the family of Nohemi Gonzalez, a 23-year-old American woman who was studying in Paris when she was killed in an ISIS attack there in 2015. Their lawsuit alleges that Google, which owns YouTube, violated the Antiterrorism Act’s ban on aiding and abetting terrorism by (among other things) recommending ISIS videos to users through its algorithms, thereby aiding ISIS’s recruitment.” SCOTUSblog

Many on both sides argue against gutting Section 230, contending that Congress, not the courts, should take the lead in updating the laws governing the internet:

“Every minute, for instance, about 500 hours of video are uploaded to YouTube. The company must use an algorithm to organize that content and present it in a coherent way. It is going to make some mistakes along the way, inadvertently promoting videos that are defamatory or violent or otherwise objectionable. And if each mistake opens up YouTube to civil liability, the company can no longer continue to function. That’s true of many other companies that weighed in on YouTube’s side—including Yelp, Reddit, Twitter, ZipRecruiter, and Wikipedia…

“There are, indeed, reasonable arguments that Section 230 needs new exceptions. Nonconsensual pornography (‘revenge porn’) is an especially thorny and horrible problem, as some websites have refused to remove these images after they’ve been identified as nonconsensual. But the justices are not the ones who should be making these decisions. On Tuesday, a majority of them appeared to recognize that fact, ready to dump all these quandaries into Congress’ lap where they belong. That’s a far better option than breaking the internet by judicial fiat.”
Mark Joseph Stern, Slate

“The arguments put the lawyer representing Twitter, Seth Waxman, in the uncomfortable position of having to say that not everything ISIS posts is terrorism… but it’s true. ISIS, and ISIS supporters, post a lot of things, including cat pics. But if you think there is a bright line between ‘terrorist recruitment’ and ‘cat pics,’ know that ISIS is posting pictures of their fighters with cats to soften their image so it’s easier to recruit new members. Does failing to take these things down constitute ‘substantial assistance’ to global terrorism or specific terrorist attacks?…

“Section 230 is an old law that Congress should have updated multiple times as the Internet and social media developed. JASTA is a new law (passed over the veto of President Barack Obama, by the way) that is maddeningly unclear about its key requirement for liability. It’s really not too much to ask Congress to define with clarity what constitutes aiding and abetting terrorism. It’s really not too much to ask Congress to decide whether social media companies should be responsible for their terrorist users… Unelected judges should not be the ones making this decision for our entire society.”
Elie Mystal, The Nation

“The most striking part of Tuesday’s oral arguments was how frustrated the court seemed to be with having to hear the case in the first place. ‘We’re a court. We don’t know about these things,’ Justice Elena Kagan said at one point… Justice Brett Kavanaugh agreed: ‘Isn’t it better to keep [Section 230] the way it is, for us, and to put the burden on Congress to change that, and they can consider the implications and make these predictive judgments?’…

“This has been a recurring theme throughout the court’s most recent terms: the justices are begging, and in some cases even demanding, that Congress please do its job… Sure, the court could try and determine the bounds of algorithmic regulation and whether Section 230 offers tech companies more protection than they deserve. But that’s not its job. It’s Congress’s since Section 230 is Congress’s law. Lawmakers’ habit of transferring power to unelected persons, whether they be bureaucrats or Supreme Court justices, fundamentally undermines the way our government is supposed to work.”
Kaylee McGhee White, Washington Examiner

“Section 230 passed in 1996, two years before Google was founded, three years before the word ‘blog’ was invented, and when Mark Zuckerberg was 11 years old. That’s why it seems to talk past today’s controversies. Do social-media sites have immunity for fact-checks they append to disputed posts? What if search engines use language models to directly answer user queries, with text synthesized from the web?…

“It’s hard to see how the internet as we know it would function without the core liability protection of Section 230, and any GOP attempt to create a Fairness Doctrine to monitor speech on the web would be a grave mistake. But lawmakers could mandate more transparency about how moderation policies are enforced. They could set rules to stop government officials from secretly jawboning platforms into censorship. They could also clarify how a law from the AOL era applies to an AI age that was unimaginable in 1996.”
Editorial Board, Wall Street Journal

Other opinions below.


From the Left

“[Section 230 treats platforms] as benign providers of digital space… But things have changed since those early days of the internet…

“The justices directed a considerable amount of attention to the extent to which ‘targeted recommendations’ turn social media platforms from neutral, public spaces to publishers of potentially harmful content. Social media’s development and deployment of these content-targeting algorithms transformed companies from passive platforms hosting only third-party content to active aggregators and purveyors of new content…

“Social media companies are failing to remove terrorist content flagged not only by their own in-house content moderators, but also by third-party watchdogs… A ruling for the plaintiffs will go a long way to ending the charade that social media companies are doing everything possible to protect the safety and security of their users.”
Marc Ginsberg, CNN

From the Right

“Gutting Section 230 would result in the worst of both worlds… Social media users would find two types of resulting platforms: a) those that are highly moderated and would, of course, anger virtually everyone (and conservatives especially), and b) those that would quickly resemble one’s spam file or an open sewer…

“Even Musk has learned how hard it is to create a more open Twitter without getting ensnared in endless debates about when something goes too far or whether a particular account engages in extremist pontificating. There’s no way to resolve this conflict, which is only exacerbated by the fact that both sides want different outcomes. The only way to resolve it is in the public square, where private companies can set their own rules. We might not like those rules, but I can guarantee we’ll like them better than an alternative that rests this power in the courts and government agencies.”
Steven Greenhut, The American Spectator
