Facebook owner Meta’s policy on manipulated media has been criticized as incoherent and confusing by its oversight board. The board, which was established to make judgments on online freedom of expression for Meta, called on the company to revise the policy to address the growing problem of online disinformation targeting elections worldwide. The board’s review revealed gaps in the policy, particularly in relation to an altered video of President Joe Biden that spread on Facebook.
According to the oversight board, Meta should expand its policy to cover all types of manipulated media, not just videos created using artificial intelligence (AI). This includes fake audio recordings, which have already been used to convincingly impersonate political candidates in the United States and other countries. The board also recommended that Meta clarify the harms it aims to prevent and label manipulated images, videos, and audio clips rather than removing them entirely.
The feedback from the oversight board comes at a time when tech companies are facing intense scrutiny for their handling of election falsehoods. As generative AI deepfakes and lower-quality cheap fakes on social media continue to mislead voters, platforms are attempting to respond to false posts while protecting freedom of speech. The oversight board’s co-chair, Michael McConnell, described Meta’s current policy as making little sense and called for the company to close the gaps while ensuring that political speech remains protected.
In response to the recommendations, Meta said it is reviewing the oversight board’s guidance and will respond publicly within 60 days. The company emphasized that although its current manipulated media policy does not specifically mention audio deepfakes, such content is eligible to be fact-checked and will be labeled or down-ranked if fact-checkers rate it as false or altered. Meta also takes action against any content that violates its Community Standards.
Facebook, which celebrated its 20th anniversary this week, remains the most popular social media platform for Americans to consume news. However, deceptive media can also spread and mislead voters on Meta’s other platforms, Instagram, WhatsApp, and Threads, as well as on rival platforms such as YouTube and TikTok.
Meta established its oversight board in 2020 to act as a referee for content on its platforms. The recent recommendations provided by the board were made after reviewing an altered clip of President Biden and his adult granddaughter. Even though the clip was misleading, it did not violate Meta’s existing manipulated media policy, and consequently, the board upheld the company’s decision to leave it on Facebook.
The oversight board advised Meta to update its policy and label similar videos as manipulated in the future. It argued that the focus should not be on how the content was created but rather on the harm that manipulated posts can cause, such as disrupting the election process. Meta welcomed the board’s ruling and stated that it will update the post in question based on the recommendations.
While Meta is required to adhere to the oversight board’s rulings on specific content decisions, it is not obligated to follow the board’s broader recommendations. Still, the board has prompted Meta to make some changes over the years, including giving more specific messages to users who violate its policies.
Jen Golbeck, a professor at the University of Maryland’s College of Information Studies, believes that Meta has the opportunity to lead the way in labeling manipulated content. However, she emphasizes that enforcement of these changes is crucial, especially in the face of political pressure from those who intend to spread misinformation. Golbeck points out that failing to implement and enforce policy changes can further erode trust, which is already undermined by the spread of misinformation.
As tech companies grapple with the challenges posed by manipulated media and election falsehoods, Meta’s policy review by the oversight board serves as a reminder of the need for consistent and effective strategies to combat disinformation. With elections taking place in over 50 countries this year, it is essential to ensure that platforms prioritize the protection of users and the integrity of democratic processes.