Biden AI Robocalls Were Made By Democratic Strategist: 'With A Mere $500 Investment, Anyone Could Replicate'

In a shocking revelation, veteran Democratic strategist Steve Kramer has confessed to orchestrating fraudulent robocalls that impersonated President Joe Biden and targeted voters in New Hampshire last month.

What Happened: Kramer admitted to dispatching an automated call to approximately 5,000 potential Democratic voters on the night of January 20th, just two days prior to the New Hampshire primary. The call featured an AI-generated imitation of President Biden’s voice, produced using readily available online technology, reported The Hill.

According to an NBC News report last Friday, Kramer allegedly hired magician Paul Carpenter to create the robocalls using AI technology. The report cited text messages, call records, and Venmo transactions.

Kramer, in a statement, underscored the urgency for more stringent regulations to prevent similar occurrences.

“With a mere $500 investment, anyone could replicate my intentional call,” said Kramer in the statement, urging immediate intervention from all regulatory authorities and platforms.

Kramer denied being directed to make the robocalls by his then-client, Rep. Dean Phillips (D-Minn.), who is challenging Biden for the Democratic nomination. Phillips’ campaign has disavowed Kramer’s alleged involvement, asserting that his actions were not connected to the campaign.

Carpenter, the magician purportedly employed by Kramer, confirmed to NBC that he created the call but did not distribute it. He stated that he was paid to perform a task and had no ill intent.

The New Hampshire attorney general’s office is currently probing the robocalls. The Federal Communications Commission banned the use of AI-generated voices in robocalls weeks after the Biden calls went out.


Why It Matters: The robocall incident has sparked renewed concerns about the potential misuse of artificial intelligence in spreading election misinformation.

The New Hampshire attorney general’s office had confirmed the existence of a robocall that mimicked President Biden’s voice and urged voters to sit out the primary election. The incident was flagged as an illegal attempt to disrupt the primary, according to a prior report.

There is growing concern over AI-generated deepfakes threatening the integrity of U.S. elections. The White House expressed alarm over the circulation of false images and voice alterations of public figures, including President Biden.




Engineered by Benzinga Neuro, Edited by Shivdeep Dhaliwal

