What Happens When AI Impersonates You Online — and How to Fight Back

AI-generated content is being posted under real artists' names, passed off as their work, and used to sell products they never endorsed. Here are your options.

Humartz Editorial (Verified Human)

It is happening with increasing frequency. An artist discovers that AI-generated images in their style are being sold under their name on print-on-demand platforms. A musician finds that a synthetic version of their voice is being used in advertisements they never agreed to. A writer sees blog posts circulating under their byline that they didn't write.

This is AI impersonation — distinct from the style appropriation problem, and in many ways more immediately actionable. Here is what's happening and what you can do about it.

The Different Forms of AI Impersonation

Commercial impersonation. AI-generated work is being sold using a real artist's name, brand, or identity — on Etsy, Redbubble, Amazon, and similar platforms. The buyer thinks they're purchasing something connected to the real artist. The seller is using the artist's reputation to generate revenue.

Voice and likeness impersonation. Synthetic voices, generated in the style of or explicitly modeled on a real person, are being used in videos, advertisements, and social content without consent. In some cases, the impersonation is explicit — the voice is attributed to the real person. In others, it's implied through context.

Content impersonation. Articles, social media posts, and other content are being generated and attributed to real people — using their writing style, their stated views, or simply their name — and distributed in contexts where readers assume it's authentic.

Portfolio theft. Real human-made work is being reposted with false attribution to AI accounts, or AI-generated work is being posted claiming to be by a real artist, diluting their authentic portfolio with fake content.

The Legal Frameworks That Apply

Unlike the more ambiguous question of style appropriation in training, direct impersonation has clearer legal grounding:

Right of publicity. The unauthorized commercial use of your name, voice, or likeness is a right-of-publicity violation in most US states. AI-generated content that uses your name to sell products or services without your consent falls squarely within this.

The Lanham Act. Federal trademark law prohibits false designations of origin — representing that goods or services come from a source they don't. Selling AI-generated content under a real artist's name is potentially a Lanham Act violation.

Defamation. If AI-generated content attributed to you makes false statements or creates false impressions about your views, beliefs, or actions, defamation law may apply.

Platform terms of service. Every major platform explicitly prohibits impersonation. This gives you a path to reporting and removal that doesn't require a lawsuit.

Emerging legislation. The NO FAKES Act in the US, if passed, would create explicit federal protection against AI replicas of voice and likeness. Several states have already passed their own versions.

Immediate Steps When You Find Impersonation

Document everything before acting. Before filing any reports or takedowns, screenshot or record the content, the URL, the account name, and any commercial context. Once you file a report, platforms may remove the content quickly — and you need documentation for any follow-up action.
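A simple way to make that documentation hold up later is to record each capture with a UTC timestamp and a content hash. The sketch below (Python standard library only; the URL, file bytes, and note are illustrative placeholders, not a prescribed workflow) builds a dated record whose SHA-256 digest lets you show that a saved screenshot or page copy has not been altered since capture:

```python
import hashlib
import json
from datetime import datetime, timezone

def evidence_record(url: str, content: bytes, note: str = "") -> dict:
    """Build a dated, hash-stamped record of captured evidence.

    The SHA-256 digest proves the saved copy is unaltered later;
    the UTC timestamp documents when you found the content.
    """
    return {
        "url": url,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(content).hexdigest(),
        "size_bytes": len(content),
        "note": note,
    }

# Example: record a saved copy of an infringing listing
# (replace with the real URL and the bytes of your screenshot/page capture)
record = evidence_record(
    "https://example.com/listing/123",
    b"<html>...captured page...</html>",
    note="AI-generated prints sold under my name",
)
print(json.dumps(record, indent=2))
```

Keeping these records in a folder alongside the raw screenshots gives you a clean evidence trail to hand to a platform's trust and safety team or an attorney.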

Report to the platform. Use the platform's impersonation reporting tool. On most major platforms, this is faster than a DMCA takedown because impersonation policies are typically handled separately from copyright claims, often with a quicker response window.

File a DMCA notice if your actual work is reproduced. If the impersonation includes unauthorized reproduction of specific copyrighted works you own, a DMCA takedown notice is appropriate; platforms must act on valid notices to preserve their safe-harbor protection.

Contact the platform's trust and safety or legal team directly. For significant or ongoing impersonation, standard reporting channels are a start, but direct contact with the platform's trust and safety team often produces faster results in serious cases.

Consult an IP attorney. If the impersonation is commercial and causing material harm, a cease-and-desist letter from an attorney — often without any litigation — can produce results quickly.

Building Your Defense

Impersonation is harder to combat when you don't have clear documentation of your authentic work and identity:

Maintain a public archive of your genuine work. A well-documented portfolio with timestamps, process notes, and provenance creates a baseline against which impersonation can be clearly identified.

Make your identity and process visible. The more clearly your authentic creative voice and process are documented and public, the easier it is to show that impersonated content doesn't match them.

Certify your work. A certified record of your genuine creative output creates a verifiable counterpoint to impersonated content. When you can show a court or a platform that this certified work is yours — with a tamper-evident process trail — the impersonated content has much less ambiguity to hide in.
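One simple illustration of a tamper-evident process trail is an append-only hash chain, where each log entry commits to the previous one. This is a minimal sketch of the general technique, not the method of any particular certification service; the filenames and byte strings are hypothetical:

```python
import hashlib
import json
from datetime import datetime, timezone

def chain_entry(prev_hash: str, filename: str, file_bytes: bytes) -> dict:
    """Append-only log entry: each entry hashes over the previous
    entry's hash, so altering any earlier record breaks every
    subsequent link in the chain."""
    payload = {
        "prev": prev_hash,
        "file": filename,
        "file_sha256": hashlib.sha256(file_bytes).hexdigest(),
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
    payload["entry_hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return payload

# Build a two-entry trail for successive drafts of a work
genesis = "0" * 64
e1 = chain_entry(genesis, "sketch_v1.png", b"...first draft bytes...")
e2 = chain_entry(e1["entry_hash"], "final.png", b"...final draft bytes...")
```

Publishing each entry hash as you go (or having a third party countersign it) is what makes the trail verifiable: anyone can recompute the chain from your files and confirm nothing was inserted or altered after the fact.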

Alert your community. Your audience, collaborators, and professional network are the fastest early warning system for impersonation. Making them aware that impersonation happens, and how to recognize your genuine work, creates distributed vigilance that no monitoring tool can replicate.

Impersonation is one of the cleaner cases legally. The tools for fighting it are better than people often realize. The key is acting quickly, documenting thoroughly, and having a baseline of verified authentic work to point to.

Protect your creative legacy

Don't let your work disappear into the noise. Get a verified human badge that holds up legally and commercially.