The Phillips campaign paid Kramer about $250,000 to get Phillips, a three-term congressman from Minnesota running against Biden, on the ballot in New York and Pennsylvania, according to federal campaign filings. Kramer said the Federal Communications Commission has issued a subpoena over his involvement in the robocall.
After the robocall, the FCC adopted a ruling clarifying that it is illegal to use AI-generated audio in robocalls, noting that the New Hampshire calls had used AI-generated audio to make illegal spoofed robocalls. The agency also issued a cease-and-desist order to Kramer and a public notice to U.S.-based voice providers about blocking traffic related to the calls.
FCC spokesman Will Wiquist said in a statement that the agency is “working diligently to ensure that harmful misuse of AI technology does not compromise the integrity of our communications networks, including by utilizing all available tools in our investigations.”
Kramer also provided details about how he created the robocall, corroborating several points that had previously been the subject of speculation: He used software from the artificial intelligence voice-cloning company ElevenLabs to create a deepfake of Biden’s voice in less than 30 minutes.
He said the calls were delivered by Voice Broadcasting, an entity associated with Life Corp., which was at the center of the criminal investigation that New Hampshire Attorney General John Formella opened into the Biden AI robocalls in early February. Kramer said he created the robocall to raise awareness about the dangers AI poses to political campaigns.
“If anyone could do it, what would people with real money or entities with real money do?” he said.
Kramer’s case highlights the ease with which AI-generated content can be injected into the 2024 election cycle, allowing nearly anyone to use a wide range of tools to sow confusion in the voting process.
It also portends new challenges for state regulators, because increasingly sophisticated AI tools are creating new opportunities to interfere in elections around the world by generating fake audio recordings, photos, and even videos of candidates that muddy the waters of reality.
The New Hampshire Attorney General’s investigation into robocalls “remains active and ongoing,” said Michael Garrity, a spokesman for the office.
Phillips and his campaign condemned the robocall. Katie Dolan, a spokeswoman for the Phillips campaign, said the campaign’s contract with Kramer had ended before it became aware of his involvement in the robocalls.
“We are outraged to learn that Mr. Kramer was behind this call and categorically condemn his actions,” she said. Kramer’s involvement was first reported by NBC News.
Investigators say the robocalls, which used an AI-generated voice resembling Biden’s, targeted thousands of New Hampshire voters the weekend before the state’s Democratic presidential primary and discouraged them from voting.
The call began with a Biden catchphrase, dismissing the election as “malarkey,” and urged voters to “save your vote for the November election.” It appeared to come from the number of former New Hampshire Democratic Party chair Kathy Sullivan, who was helping an effort to get voters to write in Biden’s name to show their support, even though he was not on the ballot. Sullivan and others reported the call to the state attorney general.
Formella announced in early February that his office was opening a criminal investigation into the matter and sent a cease-and-desist letter to the communications company Life Corp., ordering it to immediately stop violating state laws that prohibit voter suppression in elections.
A multistate task force is also preparing for possible civil litigation against the company, and the Federal Communications Commission ordered Lingo Telecom to stop permitting illegal robocall traffic.
“Don’t try it,” Formella said at a news conference in February. “If you do, we will work together to investigate, we will work with partners across the country to find you, and we will take any enforcement action available to us under the law. The consequences for your actions will be severe.”
AI experts say the robocall incident is one of several that highlight the need for better policies within technology companies to ensure their AI services are not used to distort elections.
In late January, OpenAI, the maker of ChatGPT, banned a developer from using its tools after the developer built a bot that imitated Phillips. The bot was backed by Phillips supporters, but after The Post reported on it, OpenAI deemed it a violation of rules against using its technology in political campaigns.
Paul Barrett, deputy director of New York University’s Stern Center for Business and Human Rights, said in an email that the incident makes clear how powerful AI deepfakes can be in disrupting elections. “New technology makes it much easier for non-experts to create deceptive and persuasive content that can mislead people about when, how, and where to vote,” he said.
This isn’t the first time Kramer has used AI to mimic a politician’s voice. Last year, he created an AI-generated robocall imitating Sen. Lindsey O. Graham (R-S.C.) that asked voters whom they would support if he were not on the ballot.
Kramer said he plans to support Biden if he wins the Democratic nomination and hopes his actions will prompt regulators to take notice of AI’s potential impact on elections.
“It’s here now,” he said, referring to AI, “and I did something about it.”
Clara Ence Morse, Eva Dou and Razzan Nakhlawi contributed to this report.