YouTube Requires Creators to Disclose AI-Generated Content

By Ekaterina Petrovna Zolotova

YouTube announced a new tool requiring creators to disclose when content made with altered or synthetic media, including generative AI, could be mistaken for a real person or event. The goal is to prevent deception as generative AI tools make it increasingly difficult to distinguish real content from fake. The initiative addresses risks posed by AI and deepfakes, particularly ahead of the upcoming U.S. presidential election. While the policy does not apply to clearly unrealistic or animated content, creators must disclose digitally altered faces, synthetically generated voices, and altered footage of real events or places. The labels will roll out across all YouTube formats, with enforcement measures for creators who fail to comply.
