Elections and misinformation will collide like never before in 2024

Elections in the Russian-occupied Ukrainian territories (REUTERS/Alexander Ermochenko)

This year, billions of people will take part in major elections (roughly half of the world's population, by some estimates) in one of the largest and most consequential democratic exercises in recent memory. The results will affect how the world is governed for decades to come.

At the same time, false narratives and conspiracy theories have become an increasingly global threat.

Unfounded allegations of electoral fraud have undermined trust in democracy. Foreign influence campaigns routinely target polarizing domestic issues. Artificial intelligence has amplified disinformation efforts and distorted perceptions of reality. All of this comes as the major social media companies have scaled back their safeguards and downsized their election teams.

“Almost all democracies are under stress, regardless of technology,” said Darrell West, a senior fellow at the Brookings Institution, a think tank. “When you add misinformation to that, it just creates a lot of opportunities to cause problems.”

According to West, it is a “perfect storm of misinformation.”

The global calendar includes at least 83 elections, the largest concentration in at least 24 years, according to consulting firm Anchor Change.

Election Calendar 2024 (NYT)

These elections are taking place all over the world, including in Europe, where the 27 member states of the European Union will take part in parliamentary elections in June.

According to estimates, this corresponds to more than four billion people.

At least seven elections will take place in January alone. Taiwan, trying to protect itself from Chinese disinformation campaigns, will elect a new president on January 13.

Electoral map in 2024 (NYT)

Pakistan and Indonesia, the world's most populous Muslim-majority countries, which are struggling to balance free speech with efforts to combat misinformation, will hold elections a week apart in February.

In India, where the prime minister has warned about fake content generated with artificial intelligence, a general election is scheduled for the spring.

European Parliament elections will take place in June as the European Union continues to enforce a new law to curb corrosive online content.

A presidential election in Mexico that same month could be affected by a feedback loop of false narratives circulating elsewhere in the Americas.

The United States, already in the midst of a presidential campaign marked by a resurgence of lies about voter fraud, will go to the polls in November.

Voting map, by number of voters (NYT)

National elections are also planned in places where democracy has struggled to take hold. Russia and Ukraine, which have both called presidential elections, are advancing competing narratives about their continuing war.

One of Africa's most critical elections will take place in South Africa, which has faced xenophobic disinformation campaigns in the past.

Democracy, which spread around the world after the end of the Cold War, faces mounting challenges everywhere: from mass migration to climate disruption, from economic inequality to war. In many countries, the struggle to respond adequately to these challenges has undermined trust in liberal, pluralistic societies and opened the door to appeals from populists and authoritarian leaders.

Autocratic countries led by Russia and China have exploited currents of political discontent to advance narratives that undermine democratic governance and leadership, often supported by disinformation campaigns. If these efforts are successful, the elections could accelerate the recent rise of authoritarian-minded leaders.

Fyodor Lukyanov, an analyst who runs the Council on Foreign and Defense Policy, a Kremlin-linked think tank in Moscow, recently said that 2024 “could be the year the Western liberal elites lose control of the world order.”

According to Katie Harbath, founder of the tech policy firm Anchor Change and a former public policy director for global elections at Facebook, the traditional political class in many countries, as well as intergovernmental organizations such as the Group of 20, appears poised for upheaval. Disinformation, spread through social media but also through the press, radio, television and word of mouth, risks destabilizing the political process.

“We will reach the year 2025 and the world will look very different,” Harbath explained.

Among the biggest sources of misinformation in election campaigns are autocratic governments seeking to discredit democracy as a global model of governance.

In recent months, researchers and the U.S. government have suggested that Russia, China and Iran are likely to attempt influence operations to disrupt the electoral process in other countries, including this year's U.S. presidential election. For these countries, the coming year is “a real opportunity to embarrass us on the world stage, exploit social divisions and simply undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security company, who recently reported on possible threats to the U.S. election.

Russia will hold elections this year (REUTERS/Alexander Ermochenko)

The company also investigated a Russian influence operation called “Doppelgänger,” first identified by Meta last year, which appeared to impersonate international news organizations and create fake accounts to spread Russian propaganda in the United States and Europe. Doppelgänger reportedly used easily accessible artificial intelligence tools to create news outlets dedicated to American politics, with names like Election Watch and My Pride.

The false narratives circulating around the world are often shared by diaspora communities or orchestrated by state-backed agents. Experts expect election fraud narratives to continue to evolve and gain traction, as was the case in the United States and Brazil in 2022 and in Argentina in 2023.

An increasingly polarized and combative political environment breeds hate speech and misinformation that drive voters further into isolated echo chambers. A motivated minority of extremist voices, aided by social media algorithms that reinforce users' biases, often drowns out a moderate majority.

“We are in the process of redefining our social norms around freedom of expression and how we can hold people accountable for that expression on and off the internet,” Harbath commented. “There are a lot of different views on how to implement this in this country, let alone around the world.”

Some of the most extremist voices seek each other out on alternative social media platforms like Telegram, BitChute and Truth Social. According to Pyrra, a company that monitors threats and misinformation, calls to preemptively stop voter fraud, which historically has been statistically insignificant, have recently been trending on these platforms.

The “prevalence and acceptance of these narratives are increasing” and even directly influencing electoral policy and legislation, Pyrra noted in a case study.

“These conspiracies are rooted in the political elite, who use these narratives to please the public while compromising the transparency and control of the system they are supposed to defend,” the firm’s researchers wrote.

According to a report from the University of Chicago and Stanford University, artificial intelligence “is promising for democratic management.” Politically focused chatbots could inform voters about important issues and better connect them with elected officials.

Technology could also be a vector of misinformation. Fake AI-generated images have already been used to spread conspiracy theories, such as the baseless claim that there is a global conspiracy to replace white Europeans with non-white immigrants.

Lawrence Norden, who directs the elections and government program at the Brennan Center for Justice, a public policy institute, said artificial intelligence could be used to mimic large quantities of material from election offices and spread it widely. It could also manufacture last-minute October surprises, like the audio bearing signs of artificial intelligence manipulation that circulated during Slovakia's tight election in the fall.

“All the things that have been threatening our democracy for some time can get even worse with artificial intelligence,” Norden said during an online panel in November. (During the event, organizers presented an artificially manipulated version of Norden to highlight the technology's capabilities.)

Some experts fear that the mere presence of artificial intelligence could undermine trust in information and allow political actors to reject real content. Others said fears were overblown for now.

Artificial intelligence is “just one of many threats,” said James Lindsay, senior vice president of the Council on Foreign Relations, a think tank.

“I wouldn’t lose sight of all the old ways of spreading false information or disinformation,” he said.

In countries where parliamentary elections are scheduled for 2024, misinformation has become a key concern for a large majority of people surveyed by UNESCO, the United Nations' cultural organization. And yet efforts by social media companies to restrict toxic content, which were ramped up after the 2016 U.S. presidential election, have been scaled back or completely reversed.

According to a recent report from Free Press, an advocacy group, Meta, YouTube and other major platforms have scaled back or restructured the teams responsible for this work. Some now offer new features, such as one-way private broadcasts, which are particularly difficult to monitor.

Companies are starting the year with “low bandwidth, very little written accountability and billions of people around the world turning to these platforms for information,” a less than ideal scenario for protecting democracy, said Nora Benavidez, legal counsel at Free Press.

Newer platforms like TikTok will likely play a larger role in political content. Substack, the newsletter startup that said last month that it would not ban Nazi symbols or extremist rhetoric on its platform, wants the 2024 election season to be “the Substack election.” Politicians are planning live-streaming events on Twitch that will also feature a debate between AI-generated versions of President Joe Biden and former President Donald Trump.

Meta, the company that owns Facebook, Instagram and WhatsApp, said in a November blog post that it was in a “strong position to protect the integrity of next year’s elections on our platforms.” (Last month, a company-appointed content advisory board criticized Meta's automated tools and its handling of two videos related to the Israel-Hamas conflict.)

YouTube wrote last month that its “election-focused teams have been working around the clock to ensure the right policies and systems are in place.” The platform signaled this summer that it would stop removing false narratives about voter fraud. (YouTube said it wanted voters to hear all sides of a debate, but noted that “this is not a license to spread harmful misinformation or promote hateful rhetoric.”)

This type of content has continued to spread. “Many social media companies rely heavily on unreliable AI-based content moderation tools, leaving human teams bare-bones and constantly on alert to put out fires,” explained Popken, who later moved to the content moderation company WebPurify.

“Election integrity is such a gigantic task that you really need a proactive strategy, lots of people, minds and war rooms,” Popken said.

© The New York Times 2024