The FCC’s unanimous decision on February 2 gives state attorneys general “new tools” to crack down on robocall scammers who use voice-cloning technology, Rosenworcel said.
Robocall scams using AI-generated voices were already considered illegal, according to the FCC, but Thursday’s ruling makes explicit that using AI to generate audio for robocalls is itself illegal.
AI voice-generation technology has grown increasingly sophisticated and can now produce strikingly realistic voices, making telephone fraud easier and cheaper to carry out.
The technology’s growing reach was demonstrated before January’s New Hampshire primary, when voters received calls from a voice impersonating President Biden. The voice called the election a “disaster” and urged voters to “save your vote for November’s election.” Biden was not on the ballot in that primary, but a group of Democrats had organized a write-in campaign to show support for the president.
New Hampshire Attorney General John Formella (R) announced this week a criminal investigation into a Texas-based company suspected of involvement in thousands of phone calls to voters in the state, and issued a warning to others who might try to use the technology to interfere with elections.
“Don’t try it,” he said. “If you do, we will work with our partners across the country to investigate, locate you, and take every enforcement action available under the law. The consequences of your actions will be severe.”