In recent days, dozens of tech companies have filed briefs in support of Google in a case before the Supreme Court examining the liability of online platforms for recommending content. Predictable advocates like Meta and Twitter, as well as popular platforms like Craigslist, Etsy, Wikipedia, Roblox, and Tripadvisor, urged the court to uphold Section 230 immunity in the case, lest they risk upending the avenues users rely on to connect with each other and find information online.
Of all these briefs, however, Reddit's was perhaps the most compelling. The platform argued on behalf of everyday internet users, who it claims could be buried in "frivolous" lawsuits simply for frequenting Reddit if Section 230 is weakened by the court. Unlike other companies that hire content moderators, Reddit says the content it displays is "driven primarily by humans — not by centralized algorithms." Because of this, Reddit's brief paints a picture of trolls suing not big social media companies, but individuals who receive no compensation for their work recommending content in communities. This legal threat extends both to volunteer content moderators, Reddit argued, and to more casual users who accumulate Reddit "karma" by upvoting and downvoting posts to surface the most engaging content in their communities.
“Section 230 of the Communications Decency Act is known to protect Internet platforms from liability, but what is missing from the discussion is that it provides crucial protections for Internet users – ordinary people – when they engage in moderation, such as removing unwanted content from their communities or upvoting and downvoting user posts,” a Reddit spokesperson told Ars.
Reddit argues in the brief that such frivolous lawsuits have been brought against Reddit users and the company before, and that Section 230 protections have consistently allowed Reddit users to escape litigation “quickly and cheaply.”
The Google case was brought by the family of a woman, Nohemi Gonzalez, who was killed in a Paris bistro during an ISIS terrorist attack in 2015. Because ISIS allegedly relied on YouTube for recruitment prior to the attack, the family sued to hold Google liable for allegedly aiding and abetting terrorists.
A Google spokesperson told Ars in a statement, “A decision that undermines Section 230 would result in websites either removing potentially controversial material or turning a blind eye to objectionable content to avoid the risk of liability. Users would face a choice between overly curated mainstream sites or fringe sites flooded with objectionable content.”
Eric Schnapper, an attorney representing the Gonzalez family, told Ars that the question before the Supreme Court “applies only to companies like Reddit itself, not individuals. This decision would not change anything regarding moderators.”
“The issue of recommendations arises in this case because the complaint alleges that the defendants recommended ISIS terrorist recruitment videos, which could, under certain circumstances, create liability under the Anti-Terrorism Act,” Schnapper told Ars, noting that the question of this liability is the subject of another SCOTUS case involving Twitter, Meta, and Google.