After announcing the initiative late last year, YouTube is now rolling out its requirement that creators disclose the use of "altered or synthetic" content, including content made with AI.
In a blog post today, YouTube announced that creators are now required to disclose whether their video includes "altered or synthetic content," described as content that looks realistic but is made with altered or synthetic media. The main aim of the new disclosure is to make viewers aware of videos created using generative AI.
YouTube’s examples of altered and/or synthetic content include:
- Using the likeness of a realistic person: Digitally altering content to replace the face of one individual with another’s or synthetically generating a person’s voice to narrate a video.
- Altering footage of real events or places: Such as making it appear as if a real building caught fire, or altering a real cityscape to make it appear different than in reality.
- Generating realistic scenes: Showing a realistic depiction of fictional major events, like a tornado moving toward a real town.
But exactly where the cutoff lies isn't entirely clear. Are video effects considered "altered" content? Generally, no: YouTube excludes "clearly unrealistic content," color or lighting adjustments, "special effects like background blur," and "beauty filters or other visual enhancements," even when AI is used to produce them.
A support page dives further into examples of what does and does not need disclosure. One area where YouTube firmly requires disclosure is deepfakes, explained as "digitally generating or altering content to replace the face of one individual with another's."
YouTube says that not disclosing the use of AI could open a channel up to penalties, including suspension from the Partner Program used to monetize videos on the platform.
> When content is undisclosed, in some cases YouTube may take action to reduce the risk of harm to viewers by proactively applying a label that creators will not have the option to remove. Additionally, creators who consistently choose not to disclose this information may be subject to penalties from YouTube, including removal of content or suspension from the YouTube Partner Program.
YouTube says the viewer-facing labels for "altered or synthetic" content will appear on mobile, desktop, and TV "in the weeks ahead," starting with the YouTube app on mobile. The disclosure checkbox for creators, meanwhile, starts rolling out today.