Meta, the parent company of Facebook and Instagram, has announced new rules for political ads on its platforms: political advertisers will now be required to disclose when they have digitally manipulated images or videos in their ads. However, the rules apply only to paid advertising and do not extend to misleading deepfake videos posted by ordinary users, which fall under separate, existing policies.
The rules take effect at the start of 2024 and apply to any advertisement classified as political, electoral, or related to a social issue. They specifically target photorealistic images and videos, as well as realistic-sounding audio; illustrations and cartoons are exempt.
Though Meta did not explicitly state the reason for the timing, the changes likely respond to concerns about misleading information spreading online ahead of the series of major elections scheduled for 2024.
Under the new rules, advertisers must disclose when they have digitally created or altered an image, for example through the use of artificial intelligence (AI). Facebook and Instagram will then attach a note to the ad indicating that it has been altered. The same information will appear in Meta's Ad Library, the public archive of political ads that aims to provide transparency about their source and targeting.
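Readers who want to inspect political ads themselves can query the Ad Library programmatically via Meta's Graph API. The snippet below is a minimal sketch, not a definitive integration: the endpoint, API version, and field names follow Meta's public Ad Library API documentation but may change, the access token is a placeholder, and whether the new "digitally altered" label is exposed as a dedicated API field is not confirmed by Meta's announcement.

```python
import requests

# Sketch: query Meta's Ad Library API for political/issue ads.
# Assumes a valid Graph API access token with Ad Library access.
# Endpoint version and field names follow Meta's public docs and may change.
AD_ARCHIVE_URL = "https://graph.facebook.com/v18.0/ads_archive"


def fetch_political_ads(access_token: str, country: str = "US",
                        query: str = "election") -> list[dict]:
    """Return one page of political/issue ads matching `query` in `country`."""
    params = {
        "access_token": access_token,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",   # restrict results to political/issue ads
        "ad_reached_countries": f'["{country}"]',
        "search_terms": query,
        "fields": "id,page_name,ad_creative_bodies,ad_delivery_start_time,ad_snapshot_url",
        "limit": 25,
    }
    resp = requests.get(AD_ARCHIVE_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("data", [])


if __name__ == "__main__":
    # Placeholder token; obtain a real one through Meta's developer tools.
    for ad in fetch_political_ads(access_token="YOUR_ACCESS_TOKEN"):
        print(ad.get("page_name"), ad.get("ad_snapshot_url"))
```

The `ad_snapshot_url` field links to the rendered ad in the Ad Library, which is where any alteration note attached under the new rules would be visible.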
Failure to comply with these disclosure requirements may result in the ad being blocked by Meta. Advertisers who repeatedly fail to disclose alterations may face additional penalties.
Notably, these rules apply only to advertising content, not to misleading material posted by regular users. Existing rules on Facebook and Instagram already prohibit digitally manipulated videos that could mislead viewers into believing a person in the video made statements they never actually made.
There has been ongoing debate regarding the effectiveness of these rules. Some argue that simply labeling digitally altered and AI-generated multimedia is not enough, and that Meta should consider banning such ads altogether. Critics also raise concerns about bad actors finding ways to circumvent the rules.
In short, Meta's new rules require political advertisers on Facebook and Instagram to disclose digitally manipulated images or videos, with the aim of promoting transparency and curbing the spread of misleading information. Questions remain, however, about whether labelling alone is sufficient and whether bad actors will find ways around the regulations.