Lingo Telecom has agreed to pay $1 million in a settlement with the Federal Communications Commission over AI-generated robocalls designed to imitate President Joe Biden’s voice and disrupt the 2024 New Hampshire presidential primary.
The calls were orchestrated by political consultant Steve Kramer, who was subsequently indicted on 13 felony counts of voter suppression and 13 misdemeanor counts of impersonating a candidate, and who was fined $6 million by the FCC in May of this year.
This is the latest in a series of attempts to use AI to sway voters ahead of the US presidential election in November.
In July, tech billionaire Elon Musk circulated a fake AI-generated video online falsely portraying Vice President Kamala Harris saying things she never said. Musk later clarified that the video was intended as satire and shouldn’t be taken at face value.
Although Michigan-based Lingo Telecom did not create the deepfake material, the FCC pursued the company for failing to comply with its “know your customer” and “know your upstream provider” obligations, according to a statement released Wednesday.
In addition to the fine, Lingo Telecom has agreed to a number of measures to prevent its services from being used this way again, including:
- Applying A-level attestation, the highest level of trust that can be given to a phone number, only to calls where Lingo Telecom itself has provided the caller with the caller ID number (a simplified illustration of these attestation levels follows this list).
- Obtaining independent supporting records to verify the identity and business lines of each customer and upstream provider.
- Putting robust robocall-mitigation mechanisms in place so that it only accepts traffic from upstream providers that respond to traceback requests.
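For readers unfamiliar with the STIR/SHAKEN framework these measures refer to, the sketch below shows, in simplified form, how a carrier might decide which attestation level to sign an outbound call with. The function and field names are hypothetical illustrations, not taken from Lingo Telecom’s systems or the FCC consent decree.

```python
# Simplified, hypothetical illustration of STIR/SHAKEN attestation levels.
# Under the framework, "A" (full) attestation means the originating provider
# both knows the customer and verified their right to use the caller ID number;
# "B" means it knows the customer but not the number; "C" means neither.

from dataclasses import dataclass

@dataclass
class Call:
    customer_verified: bool      # provider has authenticated the customer (know your customer)
    number_assigned_by_us: bool  # provider itself supplied the caller ID number

def attestation_level(call: Call) -> str:
    """Return the STIR/SHAKEN attestation level for an outbound call."""
    if call.customer_verified and call.number_assigned_by_us:
        return "A"  # full attestation: highest level of trust
    if call.customer_verified:
        return "B"  # partial attestation: customer known, number not verified
    return "C"      # gateway attestation: origin unknown

# Example: a verified customer calling from a number the carrier did not assign
print(attestation_level(Call(customer_verified=True, number_assigned_by_us=False)))  # -> "B"
```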
Decrypt reached out to the FCC and Lingo Telecom for comment but had not received a response at the time of writing.
In a statement, FCC Enforcement Bureau Chief Loyaan A. Egal said the settlement is expected to make communications service providers a first line of defense against the threat of deepfakes and sends a “strong message” that the FCC will hold them accountable.
The potential for deepfakes to mislead voters has emerged as a major concern during the current election cycle. Earlier this week it was reported that Donald Trump was using AI-generated deepfakes of Taylor Swift, Elon Musk, and political rival Kamala Harris in his re-election campaign.
Editor: Sebastian Sinclair