The Supreme Court appears unlikely to gut tech companies’ coveted legal protections that cover how they recommend and curate content for users.
Despite widespread fear in the tech community about such a blow, a majority of the justices during oral arguments Tuesday seemed reluctant to upend almost three decades of legal precedent that has effectively immunized search engines and social media companies from liability for their decisions about policing content, including practices that amplify or hide particular posts.
The case before the high court — Gonzalez v. Google — seeks to hold Google’s YouTube liable for the death of Nohemi Gonzalez, a California college student killed in a 2015 terrorist attack in Paris blamed on ISIS. Her family claims Section 230 of the Communications Decency Act doesn’t protect YouTube’s use of algorithms to recommend ISIS recruitment content to users.
Both liberal and conservative justices suggested Congress, not the courts, is the best body to amend Section 230. Justice Elena Kagan warned that it would be best for lawmakers to take a scalpel to the law, since a judicial reinterpretation of the statute could upend years of legal precedent and lead to a deluge of lawsuits.
“We’re a court. We really don’t know about these things. These are not like the nine greatest experts on the internet,” Kagan said, prompting laughter from across the courtroom and the bench.
Even Justice Clarence Thomas — who for years had urged in separate dissents that the court take up a Section 230 case — seemed unconvinced that algorithms fall outside the liability shield. “I see these as suggestions and not really recommendations, because they don’t really comment on them,” Thomas said of YouTube’s use of algorithms to promote videos.
Thomas also said he didn’t see how YouTube’s reliance on a neutral algorithm to recommend ISIS videos amounted to “aiding and abetting” terrorism under the Anti-Terrorism Act.
“I’m trying to get you to explain to us how something that is standard on YouTube for virtually anything that you have an interest in suddenly amounts to aiding and abetting because you’re in the ISIS category,” the conservative justice said.
Kagan, an appointee of President Barack Obama, said she didn’t have to accept the tech industry’s “sky is falling” arguments to accept that “there is uncertainty about going the way [the plaintiff] would have us go, in part just because of the difficulty of drawing lines in this area.”
“Once we go with you, all of a sudden, we’re finding that Google isn’t protected, and maybe Congress should want that system. But isn’t that something for Congress to decide, not the court?” she said to Eric Schnapper, a University of Washington law professor representing the Gonzalez family.
Similarly, Justice Brett Kavanaugh brought up the concerns raised by many tech companies in their amicus briefs that a completely different interpretation could “really crash the digital economy.”
“Those are serious concerns and concerns that Congress — if it were to take a look at this and try to fashion something along the lines of what [the plaintiff] is saying could account for — we are not equipped to account for that,” the conservative justice said.
Despite expectations that the conservative justices would aggressively challenge Google’s claim for far-ranging legal immunity, the most hostile and outspoken voice Tuesday against the firm and the tech industry’s broader arguments was Justice Ketanji Brown Jackson, who’s emerging as one of the high court’s most liberal members.
Jackson, the court’s only justice appointed by President Joe Biden, repeatedly argued that tech companies’ protection from liability should be limited to the actual hosting and transmission of user-created content, with all decisions about how to organize, rank and display that content subject to potential litigation under ordinary legal standards.
Jackson said the broad protection Google was claiming “seems to bear no relationship to the text of the statute.” She insisted the primary purpose of the law was to encourage policing of “offensive” content and that the result the tech firms were seeking would have the perverse effect of immunizing companies when they deliberately amplify inflammatory videos or other posts.
“What the people who were crafting this statute were worried about was filth on the internet,” said Jackson. “That seems to me to be a very narrow scope of immunity that doesn’t cover whether you were making recommendations or promoting it. … How is that even conceptually consistent with what it looks as though this statute was about?”
Google’s lawyer, Lisa Blatt of Williams & Connolly, said the law had dual purposes and one key part was to promote robust debate in a critical field of emerging technology.
“This is about diversity of viewpoints, jumpstarting an industry having information flourishing on the internet and free speech,” Blatt said.
Even Justice Samuel Alito, who has appeared skeptical of protections for tech firms in other contexts, said he was baffled by the argument made by Schnapper that Section 230 gives immunity for hosting others’ content and for search engine activities, but not for implicit or explicit recommendations.
“I don’t know where you’re drawing the line. That’s the problem,” Alito said.
The Biden administration largely sided with the Gonzalez family on the central question at issue at the high court, arguing that the Section 230 protections should not extend beyond simple hosting of third-party content. However, Deputy Solicitor General Malcolm Stewart told the court that even without immunity for recommendations or curation of content, tech firms would only rarely be liable for such activity.
But Kagan and Kavanaugh warned that even a small opening for such litigation could have a dramatic effect on the internet ecosystem and potentially swallow up the protections Congress was trying to grant to companies hosting other people’s content.
“You can’t present this content without making choices,” Kagan said. “But still, I mean, you are creating a world of lawsuits really anytime you have content.”
How the Supreme Court rules in Gonzalez could also relate to its conclusions about a similar tech case scheduled for arguments Wednesday — Twitter v. Taamneh. That case asks whether Twitter, Google and Facebook can be held liable under the Justice Against Sponsors of Terrorism Act for allegedly aiding and abetting terrorists by sharing ISIS recruitment content.
The decision in Tuesday’s case could also tee up the justices for a potential ruling in two other cases the court punted to next term involving GOP-backed laws from Texas and Florida that bar platforms from removing content based on users’ viewpoints and from deplatforming political candidates. The companies contend that the laws violate their free speech rights.
The pair of tech-related disputes being argued this week are the first closely watched cases the justices have taken up this year, after hearing attention-grabbing cases last fall about redistricting procedures and the power of state legislatures over congressional elections. Next week, the high court is set to take up one of the cases of most intense interest to the Biden administration: the president’s controversial plan to forgive the student loan debt of many borrowers.
So far, the court has issued only one substantive opinion, a unanimous ruling in an obscure case. Rulings in all the cases argued this term are expected between March and June.