The United States Supreme Court today opened a new term filled with important cases involving affirmative action in university admissions, the environment, voting rules and discrimination against gays, among many other issues. As the term begins, with oral arguments held in person for the first time since the outbreak of the pandemic, the Supreme Court has announced it will admit new cases. Among them, two stand out that will measure the responsibility big technology companies bear for the content their users publish on social networks.
Google, Facebook, Twitter, Amazon and other companies that operate social networks thus face a major judicial review of their content moderation, which is controversial and subject to differing regulations across the states. One of the admitted cases, Gonzalez, Reynaldo and others v. Google, will examine the extent to which Google can be held responsible for the Bataclan massacre in Paris for allowing the distribution of videos inciting Islamist violence on its YouTube platform. The other involves Twitter, Google and Facebook in connection with the 2017 attack on a nightclub in Istanbul that left 39 dead.
Those suing Google over YouTube are the relatives of Nohemi Gonzalez, a 23-year-old American university student who was one of 130 people killed by Islamic State terrorists in the series of attacks that struck the Bataclan concert hall and other sites in the French capital on November 13, 2015, shocking Paris. Gonzalez was killed at a restaurant where she was eating that day. The lower courts dismissed the lawsuit, but the family appealed to the Supreme Court, which has now agreed to take the case.
US law states that internet companies are not responsible for the content posted by their users, but the issue is controversial for a variety of reasons. Several perpetrators of killings have broadcast their actions live. The content of the networks has also become the subject of political dispute: while Democrats denounce the far-right and conspiracy propaganda circulating on the networks, Republicans decry the content moderation policies practiced by some big tech companies, which they consider censorship.
Nohemi Gonzalez’s family argues that YouTube is not limited to the passive role of letting users watch what they choose, but that its algorithm recommends videos based on each user’s viewing history. As a result, those who watched videos containing Islamist propaganda received more content of this type, which favored their radicalization. They complain that the Google subsidiary, whose parent company is now Alphabet, allowed the distribution of radical propaganda videos that incited violence.
“Whether § 230 [the provision that in principle shields companies from liability for their users’ content] applies to these algorithmically generated recommendations is of enormous practical importance,” the family argues in its petition. “Interactive computer services constantly relay such recommendations, in one form or another, to virtually every adult and child in the United States who uses social media.” The victim’s family believes that Google violated anti-terrorism law by allowing the distribution of these videos.
Google counters that the only connection between the Paris attackers and YouTube is that one of the attackers was an active user of the platform and once appeared in an ISIS propaganda video. “This Court should not lightly adopt a reading of Section 230 that threatens the basic organizational choices of the modern internet,” Google argues.
In the other case, lower courts ruled that Twitter, Facebook and Google could bear some responsibility for the content circulated in connection with the massacre at an Istanbul nightclub, the Reina club, during a New Year’s Eve party at the end of 2016. That case, Twitter and Others v. Taamneh, Mehier and Others, has also been admitted by the Supreme Court.
Both cases will mark a first milestone in a battle over the immunity companies should or should not enjoy for their users’ content, and at the same time over the leeway they have in setting their moderation policies.
The court decides each year which cases to admit over the course of its term, which runs until the end of June or the beginning of July.