Welcome to Consumer Reports Advocacy

For 85 years, CR has worked for laws and policies that put consumers first. Learn more about CR’s work with policymakers, companies, and consumers to help build a fair and just marketplace at TrustCR.org.

Consumer Reports submits comments on FTC individual impersonation rulemaking and artificial intelligence

Consumer Reports submitted comments in support of the Federal Trade Commission’s proposed rule on individual impersonation.


The rule, in part, would create liability for companies that know or have reason to know that their products or services will be used to impersonate individuals in or affecting commerce, such as in scams.


Imposter scams are prevalent and costly to consumers – in 2023 alone, 853,935 imposter scams were reported to the Commission’s Consumer Sentinel Network.


Artificial intelligence programs capable of accurately mimicking specific voices can supercharge impersonation scams. AI voice cloning software is cheap and readily accessible, and some tools require only a brief snippet of a person’s voice – potentially scraped from publicly available social media videos – to create a convincing clone. Some families have reportedly sent as much as $15,000 after thinking they’ve spoken to a loved one in need, only to realize later they were conversing with a synthetic voice.


Consumer Reports urged the Commission to take action, writing that the proposed rule is well within the agency’s Section 5 authority. We believe companies should be required to take reasonable steps to prevent consumer harms that could arise from the use of their platforms, products, and services. This is consistent with nineteen years of Commission holdings that companies have data security obligations to protect consumers from malicious behavior by third-party actors.


Even outside of the data security context, the Commission has used its authority to take action against platforms that fail to address fraudulent practices conducted by third parties. The FTC should similarly hold that companies have a duty to take reasonable steps to prevent the use of their services or products for impersonation scams.


We also shared consumer experiences with the Commission. When we asked our members about individual impersonation scams, they shared stories of scammers impersonating family members and leaving them feeling “vulnerable,” “shaken by the experience,” and “really weirded out.” 


Some members shared experiences in which fraudsters seemingly mimicked a loved one’s voice, potentially by using generative artificial intelligence: 


  • “My Grandpa got a call from someone claiming to be me. Supposedly, I was traveling, and my car broke down and I needed to have him send money so I could complete my travels. Grandpa said there was no doubt in his mind that I was the caller and was preparing to do as asked…. Scary that the tools they use could imitate my voice that closely as to fool a close relative,” – member from Minnesota
  • “The initial caller’s voice sounded very much like my nephew’s. He knew family details, pleaded with me not to call his father and promised to pay me back as soon as he got home – all very convincing. I should add that I spent more than 60 years in law-enforcement and intelligence work. This scam was so carefully arranged and executed that I fell for it nevertheless,” – member from Massachusetts
  • “The voice on the other end sounded just like my grandson and it said ‘Gramie, I’ve been in an accident,’” – member from Florida
  • “I was skeptical, and told [the scammer] I had heard of scams such as this. So, he said, ‘I’ll let Nate say a few words to you.’ It sounded exactly like my Nate!! He has a rather unusual voice, so I was then almost convinced,” – member from Indiana