YouTube has been accused of facilitating terrorist recruitment for years. This allegedly happens when a user clicks on a terrorist video hosted on the platform and then spirals down a rabbit hole of extremist content automatically queued up "next" by YouTube's recommendation engine. In 2016, the family of Nohemi Gonzalez, who was killed in a 2015 terrorist attack in Paris after extremists allegedly relied on YouTube for recruitment, sued YouTube owner Google, forcing courts to consider whether YouTube's recommendations amounted to aiding and abetting terrorists. Google has defended YouTube ever since, and last year the Supreme Court agreed to hear the case.
Now the Gonzalez family is hoping the Supreme Court will agree that the protections of Section 230, which is designed to shield websites from liability for hosting third-party content, should not be expanded to also cover platforms' recommendations of harmful content.
Google, however, argues that this is exactly how the liability shield should work. Yesterday, Google argued in a court filing that Section 230 protects YouTube's recommendation engine as a legitimate tool "meant to facilitate the communication and content of others."
"Section 230 encompasses sorting content via algorithms by defining 'interactive computer service' to include 'tools' that 'pick,' 'choose,' 'filter,' 'search, subset, organize,' or 'reorganize' content," Google argued. "Congress intended to protect those functions, not just the bare hosting of third-party content."
Google claimed that denying Section 230 protections to YouTube's recommendation engine would strip the shield from every site that uses algorithms to sort and display relevant content, from search engines to online shopping sites. This, Google warned, would trigger "devastating spillover effects" that would turn the internet "into a disorganized mess and a litigation minefield," exactly what Section 230 was intended to prevent.
According to Google, a ruling against it would turn the internet into a dystopia in which any website, and even individual users, could potentially be sued for sharing links to content deemed objectionable. In a statement, Halimah DeLaine Prado, Google's general counsel, said such liability would lead some larger sites to over-censor content out of an abundance of caution, while sites with fewer resources would likely go the other way and censor nothing.
"A decision undermining Section 230 would make websites either remove potentially controversial material or shut their eyes to objectionable content," DeLaine Prado said. "They would be forced to choose between overly curated mainstream sites or fringe sites flooded with objectionable content."
The Supreme Court is scheduled to hear the case on February 21.
Google has asked the court to uphold the 9th US Circuit Court of Appeals' ruling, which found that Section 230 does shield YouTube's recommendation engine. The Gonzalez family is seeking a ruling that Section 230 immunity does not cover YouTube's conduct in recommending terror videos posted by third parties.
Ars could not immediately reach any of the legal teams for comment.
Next up: deciding the fate of Section 230
In the court filing, Google argued that YouTube was already working to counter recruitment efforts with community guidelines that ban content promoting terrorist organizations.
Since 2017, Google has taken steps to remove such content and block its reach, including refining YouTube's algorithms to better detect extremist material. Perhaps most pertinent to this case, YouTube also implemented a "Redirect Method" at the time, using targeted advertising to divert potential ISIS recruits away from radicalization videos.
Today, Google says in the court filing, YouTube operates differently than it did in 2015, with the video-sharing platform investing more heavily in enforcing its violent extremism policy. In the final quarter of 2022, YouTube automatically detected and removed about 95 percent of videos that violated that policy, according to the filing.
According to Google, companies protected under Section 230 are already motivated to make the internet safer, and the Supreme Court needs to consider how a decision to reform the interpretation of Section 230 might upset that delicate balance.
Google argues that it shouldn't be up to the Supreme Court to make decisions that would reform Section 230; that job should fall to Congress instead. Recent legislative attempts to reform Section 230 have failed so far, but this week Joe Biden called on Congress to join him in changing how the liability shield works. If Biden has his way, platforms like YouTube could be held liable for hosting objectionable third-party content. Such a change might give the Gonzalez family peace of mind, knowing that YouTube would be legally required to proactively block all terror videos, but Google's argument suggests that such extreme Section 230 reform would inevitably "turn the internet on its head."