“We are a court. We really don’t know anything about these things. These are not the nine greatest experts on the internet.”
Supreme Court Justice Elena Kagan made the wryly self-deprecating comment early in oral arguments for Gonzalez v. Google, a potential landmark case covering Section 230 of the Communications Decency Act of 1996. The remark was a nod to many people’s worst fears about the case: Gonzalez could overturn core legal protections for the internet, and it’s being decided by a court with an appetite for overturning precedent and reexamining longstanding free speech doctrine.
But during a remarkably entertaining question-and-answer session today, the court took an unexpectedly measured look at Section 230. The outcome in Gonzalez is far from certain, but so far the debate suggests a reassuring awareness from the court of just how important the ruling will be – and of the potential consequences of getting it wrong.
Gonzalez v. Google covers a very specific type of online interaction with potentially huge implications. The lawsuit stems from a 2015 Islamic State shooting in Paris that killed student Nohemi Gonzalez. Her surviving family argues that YouTube’s recommendations promoted terrorist videos, violating laws against aiding and abetting foreign terrorist organizations. While Section 230 normally protects websites from liability for user-generated content, the petition argues that YouTube created its own speech with its recommendations.
“Every time someone looks at something on the internet, there is an algorithm at play.”
Today’s hearing focused heavily on “thumbnails,” a term Gonzalez family attorney Eric Schnapper defined as a combination of a user-provided image and a YouTube-generated web address for the video. Several justices seemed dubious that creating a URL and a recommendation-sorting system should strip websites of Section 230 protections, particularly since thumbnails didn’t play a major role in the original complaint. Kagan and others asked whether the thumbnail problem would go away if YouTube simply renamed videos or provided screenshots, suggesting the argument rests on a confusing technicality.
The subtle distinctions surrounding Section 230 were a recurring theme at the hearing, and with good reason. Gonzalez targets “algorithmic” recommendations like the content that autoplays after a given YouTube video, but as Kagan pointed out, pretty much everything you see on the web involves some sort of algorithm-based sorting. “This was a pre-algorithm statute and everyone is trying their best to figure out how that statute is applied,” Kagan said. “Every time someone looks at something on the internet, there is an algorithm at play.”
Introducing liability for these algorithms raises all sorts of hypothetical questions. Should Google be penalized for returning search results that point to defamatory or terrorist content, even when responding to a direct search query for a false statement or a terrorist video? And conversely, is a hypothetical website in the clear if it writes an algorithm intentionally designed to “run in cahoots with ISIS,” as Justice Sonia Sotomayor put it? Though (somewhat surprisingly) it didn’t come up in today’s arguments, at least one verdict has found that a website’s design can be actively discriminatory, regardless of whether the result contains user-submitted information.
Striking the wrong balance here could turn basic technical components of the internet – like search engines and URL generation – into a legal minefield. There were a few skeptical remarks suggesting that fears of a web apocalypse without Section 230 are exaggerated, but the court repeatedly asked how changing the legal boundaries would practically affect the internet and the companies it supports.
The court at times seemed frustrated that it had even taken up the case
As legal writer Eric Goldman notes in a report on the hearing, the justices sometimes seemed frustrated that they had taken up the Gonzalez case at all. Another hearing is scheduled for tomorrow in Twitter v. Taamneh, which also covers when companies are liable for allowing terrorists to use their platforms, and Justice Amy Coney Barrett raised the possibility of using that case to rule that they simply aren’t – which would let the court avoid touching Section 230 by rendering questions about it moot. Justice Brett Kavanaugh also considered whether Congress, rather than the court, should be responsible for making sweeping changes to Section 230.
However, that wouldn’t get Google or the rest of the internet off the hook. Gonzalez will almost certainly not be the last Section 230 case, and even if this one is dismissed, Google attorney Lisa Blatt was left fielding the question of whether Section 230 still serves one of its original purposes: encouraging websites to moderate effectively without fear of being penalized for doing so.
Blatt conjured up the specter of a world that is either “Truman Show or Horror Show” – in other words, one where web services either remove anything legally questionable or refuse to even look at what’s on their sites. But we don’t know how compelling these defenses will prove, especially in emerging areas like AI-powered search, which Justice Neil Gorsuch repeatedly raised as an indicator of the platforms’ strange future. The Washington Post spoke to prominent Section 230 critic Mary Anne Franks, who expressed faint hope that the justices seemed ready to change the rules.
Still, today’s arguments were a relief after last year’s nightmarish legal cycle. Even Justice Clarence Thomas, who has written some ominous opinions on “Big Tech” and Section 230, spent most of his time wondering why YouTube should be penalized for applying the same algorithmic recommendation system to cute cat videos as to terror videos and “pilaf from Uzbekistan.” For now, that might be the best we can expect.