The Supreme Court of the United States, in a picture from last week. EVELYN HOCKSTEIN
Social networks are back before the United States Supreme Court. A few months ago, the justices upheld the technology companies’ immunity from liability for content published by their users. Now the platforms’ own content moderation policies are at stake. Texas and Florida have passed laws against these policies, arguing that they amount to censorship of their users. Several technology companies have challenged those laws, and the decision now rests with the Supreme Court, which has announced that it will hear the cases in the new term that begins on Monday.
Texas and Florida, Republican-led states, passed their social media laws in protest at what they saw as the silencing of conservative voices: those alleging voter fraud and claiming the election was stolen from Donald Trump, or those spreading disinformation about vaccines and the origins of the coronavirus in the middle of a pandemic. They argued that banning these messages and some users (including Trump himself) violated the First Amendment’s guarantee of free speech.
The companies believe exactly the opposite: that the Florida and Texas laws restricting their right to moderate content violate the First Amendment, because that amendment gives them the right to decide what is published on their platforms. The appeals were filed by industry associations (NetChoice and the Computer and Communications Industry Association), whose members include Google (which controls YouTube), Meta (Facebook and Instagram), X (formerly known as Twitter), TikTok, Yahoo, Snap and Pinterest, among others. The cases are Moody v. NetChoice and NetChoice v. Paxton.
The laws, which are similar in content but differ in their details, have already been challenged in federal court, with conflicting results: one ruling struck down the Florida law, while another upheld the Texas law, so it seemed likely that the Supreme Court would take the cases in order to unify the doctrine. The justices temporarily suspended enforcement of the Texas law last year in a five-to-four decision. The Supreme Court has a majority of six conservative justices to three progressive ones.
Essentially, the question is how laws written before the digital age should apply to the internet. The tech companies fear the state laws would prevent social networks from removing extremist and hate speech, and they have welcomed the Court’s decision to hear the cases. “We are pleased that the Supreme Court has agreed to hear our groundbreaking cases,” said Chris Marchese, director of litigation at NetChoice, in a statement. “Online services have the First Amendment right to host, preserve, and share content as they see fit. The Internet is an important platform for freedom of expression and must remain free from state censorship. We trust the court will agree,” he added.
Joe Biden’s administration has sided with the companies: “The act of selecting and curating the content that users see is inherently expressive, even if the collected speech is almost entirely provided by users,” argued Solicitor General Elizabeth B. Prelogar in the case.
In parallel, the Supreme Court must decide whether to take up a challenge to a ruling that limits the ability of the federal government and its agencies to press social networks to remove posts that may be harmful to public health and safety.