AI Disclosure Laws Are Coming: What Brands and Influencers Need to Know
By Brooke Watson, Dynamis LLP
Image Generated by ChatGPT
Artificial intelligence is now woven into nearly every layer of the fashion industry. From image generation to fabric design to influencer marketing, AI is changing the face of design. But as adoption accelerates, regulators are drawing new boundaries around one simple but powerful word: disclosure.
Recent backlash against AI-generated campaigns and lawsuits over automated copying have made one thing clear: transparency is no longer optional. Between the EU AI Act, FTC crackdowns on deceptive AI claims, and emerging state laws, brands must decide how to disclose AI involvement—before regulators decide for them.
The Regulatory Landscape: Global Rules, Real Impact
The EU AI Act: Article 50
Europe’s AI Act, now in phased enforcement, sets the world’s most explicit transparency obligations for synthetic media.
Under Article 50, providers and users of AI systems that generate or substantially manipulate images, audio, or video must ensure such content is clearly identifiable as artificial, unless the use involves only standard or assistive editing that does not materially change the meaning of the original.
For fashion and design brands operating or advertising in the EU, this means:
AI-generated or heavily AI-edited campaign imagery must include visible or metadata-level disclosure (e.g., “Created with AI assistance”).
Product visuals that alter fit, color, texture, or setting in a way that changes reality must likewise be labeled as AI-generated or synthetic.
Minor corrections—lighting, dust removal, tone balancing—generally fall outside Article 50’s scope.
Violations of the transparency provisions can bring penalties of up to €15 million or 3 percent of global annual turnover, whichever is higher.
Although Article 50 primarily regulates AI providers, the disclosure duty also applies to deployers—brands, agencies, or studios publishing synthetic content.
For creative industries like fashion, the takeaway is simple:
If AI changes reality, disclose it. If it only refines what’s already real, document it.
United States: FTC and State Action
The Federal Trade Commission (FTC) has confirmed that AI-generated content falls squarely within existing consumer-protection statutes.
Key points from recent FTC guidance:
Material misrepresentation: If AI use would influence a purchasing decision, nondisclosure may be deceptive under Section 5 of the FTC Act.
Endorsements: AI-generated “influencers” or testimonials that appear human can violate endorsement rules.
Native advertising: AI-generated content must be clearly distinguishable from human-created material when that distinction could mislead.
Meanwhile, several U.S. states are developing complementary disclosure laws:
California AB 2655 (2024), originally aimed at election misinformation, requires large online platforms to label or remove deceptive AI content and is expected to influence broader ad-labeling standards.
California AB 1836 (2024) addresses digital replicas and simulated likenesses, mandating disclosure and consent when AI recreates a person’s image or voice for commercial use—directly relevant to campaigns featuring virtual models.
New York’s proposed Synthetic Performer Disclosure Bill (A8887-B / S1228-C) would, once enacted, require conspicuous labeling of ads that use AI-simulated human performances.
United Kingdom: ASA Clarifies AI Ad Standards
In May 2025, the Advertising Standards Authority (ASA) issued guidance clarifying that AI use in advertising must be disclosed when it could mislead consumers about authenticity or performance.
The ASA encourages “clear contextual language”—captions or taglines explicitly identifying AI-generated elements.
Together, these developments mark a global shift toward mandatory transparency in creative marketing.
When Fashion Brands Should Disclose AI Use
Brands should disclose whenever AI materially changes what a consumer sees or understands.
Disclose when:
Campaign images or videos are generated or substantially altered by AI.
Virtual models or AI influencers appear as real people.
Product images change fit, color, or texture beyond minor retouching.
Generative design concepts are presented as hand-made, custom, or exclusive.
Disclosure is generally not required when:
AI performs assistive edits—lighting correction, background cleanup—that don’t alter meaning or perception.
Best Practices for Disclosure
Sample Language:
“Images created with AI assistance by [studio/creator].”
“This campaign features AI-generated imagery.”
“Product photography AI-enhanced for styling visualization.”
Technical Tools:
Embed provenance metadata using the Coalition for Content Provenance and Authenticity (C2PA) standard.
Record AI usage via IPTC metadata fields or internal asset systems (see the sketch after this list).
Add AI labels or disclosure tags to social media posts.
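To make the metadata step concrete, here is a minimal sketch of how a creative-operations team might automate it. It assumes the exiftool CLI is installed; the c2patool invocation, its flags, and the manifest shape are assumptions to verify against your installed version, and the file names and pipeline identifier are hypothetical.

```python
"""Illustrative sketch only: marks an image as AI-generated using the IPTC
Digital Source Type vocabulary, then attaches a C2PA-style manifest.
Assumes the `exiftool` CLI is installed; the `c2patool` step and its flags
are assumptions -- check `c2patool --help` for your version."""

import json
import subprocess
from pathlib import Path

# IPTC's controlled-vocabulary term for fully AI-generated media.
TRAINED_ALGORITHMIC_MEDIA = (
    "http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"
)


def tag_iptc_ai(image: Path, credit: str) -> None:
    """Write the IPTC Digital Source Type plus a human-readable notice."""
    subprocess.run(
        [
            "exiftool",
            f"-XMP-iptcExt:DigitalSourceType={TRAINED_ALGORITHMIC_MEDIA}",
            f"-XMP-dc:Description=Images created with AI assistance by {credit}",
            "-overwrite_original",
            str(image),
        ],
        check=True,
    )


def attach_c2pa_manifest(image: Path, signed_out: Path) -> None:
    """Attach a minimal C2PA 'created' action recording AI involvement.
    The manifest shape loosely follows c2patool's JSON examples; treat it
    as a starting point, not a definitive schema."""
    manifest = {
        "claim_generator": "brand-asset-pipeline/0.1",  # hypothetical name
        "assertions": [
            {
                "label": "c2pa.actions",
                "data": {
                    "actions": [
                        {
                            "action": "c2pa.created",
                            "digitalSourceType": TRAINED_ALGORITHMIC_MEDIA,
                        }
                    ]
                },
            }
        ],
    }
    manifest_path = image.with_suffix(".manifest.json")
    manifest_path.write_text(json.dumps(manifest, indent=2))
    # Flag names are assumptions; verify against your c2patool version.
    subprocess.run(
        ["c2patool", str(image), "-m", str(manifest_path), "-o", str(signed_out)],
        check=True,
    )


if __name__ == "__main__":
    asset = Path("campaign_hero.jpg")  # hypothetical campaign asset
    tag_iptc_ai(asset, credit="Studio X")
    attach_c2pa_manifest(asset, Path("campaign_hero_signed.jpg"))
```

Embedding the disclosure at the file level this way means the label travels with the asset across agencies, platforms, and re-edits, rather than living only in a caption that can be stripped.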
Workflow Tips:
Train marketing and creative teams to flag AI-generated assets.
Require disclosure sign-off during campaign approvals.
Revisit and update AI policies quarterly as regulations evolve.
The Authenticity Advantage
Consumers don’t necessarily reject AI: they reject deception.
A Yahoo and Publicis Media survey found that 72% of consumers believe AI makes it difficult to determine what content is truly authentic, and although 61% already assume AI is used in ads, they can't tell where. Transparency is fast becoming a competitive asset.
Aerie, for example, has gained traction by pledging not to use AI-generated bodies in its campaigns; the brand's October 2025 Instagram post was its most-liked in a year.
Whether your brand embraces hybrid creativity (AI as a tool), radical transparency (openly documenting AI use), or certified human (hand-made authenticity), the key is clarity and consistency.
The Road Ahead
Disclosure laws are not future tense: they’re already here.
For fashion brands, authenticity is compliance.
Those that build AI transparency into their creative workflow today will lead tomorrow’s market in trust, reputation, and legal resilience.
In an era of synthetic everything, honesty about process is the new luxury.
Disclaimer:
This article is provided for informational and educational purposes only and does not constitute legal advice. Reading or relying on this content does not create an attorney–client relationship with Dynamis LLP or any of its attorneys. The information contained herein is general in nature and may not reflect current legal developments or apply to your specific circumstances. For advice regarding a particular situation, please contact Brooke or another qualified attorney at Dynamis LLP.