AI-Driven Deceptive Political Calls: Company Agrees to $1M Fine

A company that sent AI-generated robocalls mimicking President Biden's voice to New Hampshire voters has agreed to a $1 million FCC fine. The deceptive calls falsely claimed that voting in the primary would preclude voting in the general election. The case raises concerns about AI's influence on democracy.


Devdiscourse News Desk | Meredith | Updated: 22-08-2024 02:57 IST | Created: 22-08-2024 02:57 IST

A company responsible for sending deceptive AI-generated calls to New Hampshire voters that mimicked President Joe Biden's voice agreed on Wednesday to pay a $1 million fine, according to federal regulators.

Lingo Telecom, the voice service provider that carried the robocalls, accepted the settlement to resolve an enforcement action by the Federal Communications Commission (FCC), which had initially sought a $2 million fine. The case is viewed as an unsettling early example of AI being used to influence groups of voters and, more broadly, democracy.

Political consultant Steve Kramer, who orchestrated the calls, faces a proposed $6 million fine from the FCC as well as state criminal charges. The messages, sent to thousands of voters on January 21, falsely suggested that voting in the primary would preclude them from voting in the general election. The FCC emphasized the need for clear caller ID authentication and for transparency about the use of AI.

(With inputs from agencies.)
