- AI impersonation scams are becoming more frequent, and the US Federal Trade Commission is looking to amend an existing rule that prohibits impersonating businesses or government agencies.
- As deepfake-generating tools become more sophisticated, we will see these laws amended to cover a wider range of deepfakes and more state-level laws passed.
Amendment of Regulations
Spurred by the growing threat of deepfakes, the Federal Trade Commission is seeking to amend an existing rule that prohibits impersonating businesses or government agencies, extending it to cover impersonation of all consumers.
The revised rule – depending on the final wording and the public comments the FTC receives – could also make it illegal for GenAI platforms to provide goods or services that they know, or have reason to know, are being used to harm consumers through impersonation.
“Fraudsters are using artificial intelligence tools to impersonate individuals with amazing precision and on a much larger scale,” FTC Chair Lina Khan said in a press release. “With the rise of voice cloning and other AI-driven scams, protecting Americans from impersonation fraud is more important than ever. Our proposed expansion of the final impersonation rule will do just that, strengthening the FTC’s toolkit to address scams that use AI to impersonate individuals.”
Life beset by AI
It’s not just people like Taylor Swift who have to worry about deepfakes. Online relationship scams involving deepfakes are on the rise.
In a recent YouGov poll, 85% of Americans said they were very or somewhat concerned about the spread of misleading videos and audio. Another survey, by the Associated Press-NORC Center for Public Affairs Research, found that nearly 60% of adults believe AI tools will increase the spread of false and misleading information in the 2024 U.S. election cycle.
Also read: Taylor Swift AI images are ‘shocking and scary’, says Satya Nadella
No federal law explicitly prohibits deepfakes. High-profile victims such as celebrities could theoretically turn to existing legal remedies to fight back, including copyright law, the right of publicity, and tort claims (e.g., invasion of privacy, intentional infliction of emotional distress). Pursuing civil litigation through that patchwork of laws, however, is both time-consuming and inefficient.
In the absence of congressional action, 10 states across the country have enacted statutes criminalizing deepfakes – although most target nonconsensual pornography. No doubt, as deepfake-generating tools become more sophisticated, we will see these laws amended to cover a wider range of deepfakes and more state-level laws passed. (Minnesota’s law, for example, already targets deepfakes used in political campaigns.)