Meta’s Oversight Board is unprepared for a historic 2024 election cycle

Darrell M. West
Senior Fellow - Center for Technology Innovation, Douglas Dillon Chair in Governmental Studies

Natasha White
Research Intern - The Brookings Institution

July 3, 2024

  • The record number of global elections taking place in 2024 will test how Meta deploys its election-related content moderation policies and language capabilities across its 3 billion daily users.
  • Meta’s 22-person Oversight Board is responsible for binding decisions regarding the implementation of Meta’s content moderation policies, but the board appears underprepared to moderate misinformation for elections across 64 countries.
  • Non-English speaking communities are at increased misinformation risk; Meta’s Oversight Board should revise its counter-misinformation spending to allocate additional funds to non-English language and local content moderation efforts.
The European Commission informed Meta - owner of Facebook, Instagram, and WhatsApp - that its policy of asking users to either accept personalized advertising across its services or pay for an ad-free experience may violate the European Union's digital "gatekeeper" rules. Source: Deutsche Presse-Agentur GmbH/REUTERS

It’s time for Meta to bolster and enforce its election-related content moderation policies and language capabilities.

The 2024 election cycle presents a unique challenge for social media giants like Meta, which bear responsibility for mediating the spread of election-related misinformation on their platforms. With over 3 billion daily users across the Meta ecosystem of Facebook, Instagram, and WhatsApp, and a record number of national elections taking place globally this year, the stakes surrounding Meta and its Oversight Board’s power to curb false and misleading election information are at an all-time high.

Early flexes of AI’s disinformation muscle

In January, an AI-generated robocall imitating the voice of President Joe Biden went viral, encouraging New Hampshire voters to abstain from voting in the 2024 New Hampshire presidential primary election. The voice recording was later linked to a consultant working for a rival candidate, whose commissioning of the call violated federal law and the state’s laws against voter suppression in elections. The robocall is proving to have been a mere preview of coming attractions: the ready availability of AI sound, image, and video tools has made the proliferation of deepfake election content considerably more accessible.

But the threat to elections isn’t limited to deepfakes and misleading AI on social media. Many worry that campaigns will use these tools to spread false information and that tech companies won’t adequately address these issues as their platforms continue to host democracy-compromising content ahead of elections.

The problem is not a new one, but it faces increased scrutiny due to the leaps in technological progress made since the last major round of elections in 2020 and the correspondingly increased complexity of threats. After investigations found that social media played a key role in election interference during 2016, leading social media firm Meta established an oversight board ahead of the 2020 U.S. election, with the goal of “[exercising] independent judgment over some of the most difficult and significant content decisions” on its platforms.

Meta’s Oversight Board: Champions of accountability or mere spectators?

The Oversight Board, composed of 22 multi-national, cross-industry experts, exists as an autonomous body that judges a selection of user appeals to content decisions and hands Meta binding rulings on whether posts should be reinstated or removed according to Meta’s content policy. The board also plays a key role in shaping the future of Meta’s content policy through its policy recommendations, which Meta must respond to within 60 days. According to the Oversight Board’s Transparency Report from the second half of 2023, however, an astounding share of its ‘binding’ recommendations were declined, still awaiting implementation, or otherwise not acted upon by Meta.

Despite the significant benefit provided by the Oversight Board, the 22-person body appears woefully underprepared to provide the much-needed moderation of election-charged misinformation in a year when 64 countries, which collectively hold nearly half of the world’s population, will head to the polls. In the last year and a half, like many other tech firms, Meta made the decision to significantly downsize its civic integrity and Trust and Safety teams. Following the 2020 elections, the firm disbanded its civic integrity team, and in October 2023, it laid off over 180 content moderators based in Kenya. On April 29th, chairperson of the Oversight Board Trust Stephen Neal announced that the board would be laying off members of its team in order “to further optimize [] operations by prioritizing the most impactful aspects of [their] work.” Most crucially, these layoffs will impact staff who support the board through administrative, research, and translation tasks.

In the meantime, legislative processes in the United States are struggling to keep up and set guidelines for tech firms. In the past year, bills that look to regulate AI’s election-related disinformation potential have been introduced or passed in 39 state legislatures. The Supreme Court recently heard challenges to a pair of disputed laws from Texas and Florida that allow the government to determine what political content social media companies must allow to remain online, protecting content from ‘selective’ moderation by companies like Meta. In response, tech proponents argue that they have the right to curate what their users see. But what degree of moderation is realistic and possible for Meta’s Oversight Board?

Lost in translation: Non-English speaking communities at increased misinformation risk

The global implications of Meta and similar companies’ de-emphasis of manipulated media and misinformation policies have already become visible, especially in countries where English is not the first language. During Brazil’s 2022 election campaign, political violence reached staggering heights, with social media posts on Meta’s platforms adding to calls for violence. In June 2023, Meta’s Oversight Board ruled that a clip of a Brazilian general urging people to “hit the streets” promoted political violence on its platforms and directed the social media firm to remove the post. In this case and many others, moderation came too late.

In the United States, where only 38% of first-generation Latino immigrants report English proficiency, election misinformation leaves many communities especially vulnerable. As home to large shares of the nation’s broader Hispanic population and of its population with limited English proficiency, California will be a key state in which election information must be moderated in both English and Spanish. For immigrants and communities with low English proficiency in America, language barriers are often compounded by distrust in democratic systems and an overreliance on social media sites like Facebook for news. With fewer resources to debunk election-related deepfake and misinformation content, these communities are more frequently targeted with misinformation in their native languages, which are fact-checked far less frequently.

In its Transparency Report for the second half of 2023, Meta’s Oversight Board framed its role in election-related content in relation to the tech company’s need to update its lists of banned language for countries holding elections in 2024. Considering that the social media-based threats to election integrity already seen in 2024 far outstrip such quick fixes, the Oversight Board should revise its counter-misinformation spending, allocating a greater portion of funds to non-English language and local content moderation efforts in countries holding national elections. On Meta’s Facebook platform alone, 87% of counter-misinformation funds cover English-language cases, although English speakers account for just 9% of global Facebook users. When it comes to voting and election information, the persistence of such trends across Meta’s other platforms, Instagram and WhatsApp, suggests a disturbing reality about the neglect of non-English speakers in Meta’s content moderation efforts.

No room for oversight: The time for Meta to safeguard democratic institutions is now

To ensure that election-related content moderation in languages other than English does not fall by the wayside, Meta should consider crafting special guidelines for countries holding elections in 2024. In particular, Meta might consider expediting election-specific policies related to the equity of cases the board hears from regions outside the U.S. and Canada and increasing its responsiveness and enforcement rate for the Oversight Board’s rulings and recommendations. The Oversight Board itself has flagged these issues, among others, as problem areas. So why, as elections draw closer, does Meta still not act to protect its most vulnerable users?


  • Acknowledgements and disclosures

    Meta is a general, unrestricted donor to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the authors and are not influenced by any donation.