Klobuchar, Collins Call on FTC, FCC to Address Rise in AI Voice Cloning Scams

Press Release

Date: Nov. 3, 2023
Location: Washington, D.C.

Dear Chair Khan and Chair Rosenworcel:

We write to express our grave concerns about the rise of voice cloning technology that can be
used to defraud and scam Americans by impersonating familiar voices. These scams prey on our
best instincts to help our loved ones in need, cause great distress, and raise security concerns.

Across the country, fraudsters are using artificial intelligence (AI) voice cloning technology to
trick people into giving up their personal information or money. Recently in Minnesota, a father
received a call from someone who sounded just like his son, crying "Mom, Dad, can you hear
me?" He and his wife were terrified; they had not heard from their son, a Marine deployed
abroad, in weeks. Thankfully, the father was able to determine that the call was not from his son,
but an impersonation of his son's voice generated by AI technology. One can only imagine
the emotional toll on any parent thinking their child, particularly one abroad serving our country,
was in peril.

As generative AI advances, these stories are becoming far too common. Scammers only need a
short sample of an individual's voice to generate an authentic-sounding imitation. They can pull
the sample and backstory from public sources like social media. This summer in Iowa, one
couple received a panicked call from a voice that sounded like their son, who was on his way to
basic training, saying he was in jail and needed $7,000 for bail.[1] Luckily, this couple was able to
reach their son, who was safe at home, before wiring the money. In Utah, another man got a call
from a voice clone that sounded like his grandson, claiming to need $5,000 in bail after a car
crash.[2] These scams subject too many parents and grandparents to this emotional ordeal,
and as technology improves, voice clones will only become more convincing.

[1] CBS 2 Iowa, "A.I. scammers are cloning voices to take advantage of unsuspecting victims," Nada Shamah, June 7, 2023. https://cbs2iowa.com/news/local/ai-scammers-are-cloning-voices-to-take-advantage-of-unsuspecting-victims
[2] KSLTV, "Expert: Scams more convincing with artificial intelligence," Andrew Adams, August 4, 2023. https://ksltv.com/574505/expert-scams-more-convincing-with-artificial-intelligence/

While we appreciate the informational notices on this topic that your agencies recently issued,
more can be done to educate Americans about these frauds and to help prevent them from
happening. Towards that end, we look forward to partnering with your agencies to prevent
exploitative scams that use voice cloning technology. Please respond to the following questions
by November 17.

1. What steps are your agencies taking to educate the American people, prevent these
frauds, and enforce current laws to crack down on fraudulent uses of voice cloning
technology?
2. Based on the consumer alerts and investigations you have already done, what have you
learned about this issue?
3. What resources are you devoting to combat this type of fraud, and are additional
resources necessary?
4. Are current laws sufficient to prevent these frauds and punish perpetrators, or do you
believe additional authority is needed?

Thank you very much for your efforts to combat this distressing type of scam. We look forward
to working with you to protect all Americans from this growing fraud.

Sincerely,

