Google will soon require political ads on its platforms, including YouTube, to disclose whether AI was used to alter images or sounds in the ad.
The rule requires a disclosure if AI was used to meaningfully alter the message or claims of an ad by changing any of its visual or audio content.
The disclaimer must be displayed prominently enough that viewers will notice it.
The new rules will go into effect in November, about a year before the presidential election in the U.S. The timing of the changes may also affect election advertising in places like the EU, India and South Africa.
The change comes as some political campaigns, such as that of Florida's Republican Gov. Ron DeSantis, are already using AI tools in their messaging. A June ad included AI-generated images of Donald Trump hugging Dr. Anthony Fauci, the former director of the National Institute of Allergy and Infectious Diseases.
The Republican National Committee aired an ad in April that used AI-generated images of boarded-up stores and military deployment.
Meanwhile, the Federal Election Commission is taking steps to potentially regulate the use of AI in political ads.
Some states have advanced legislation that governs the use of deepfakes in political content. U.S. Senators have also sponsored legislation that would require disclaimers in the case of deceptive AI use in political ads.