FCC bans AI-generated ‘deepfake’ robocalls ahead of presidential election
The Federal Communications Commission announced an immediate ban on robocalls made with AI-generated voices — a move designed to send a clear message that exploiting the technology to scam people or mislead voters won’t be tolerated ahead of November’s critical presidential election.
The unanimous ruling by the FCC targets robocalls made with AI voice-cloning tools — or “deepfakes” — under the Telephone Consumer Protection Act, a 1991 law restricting junk calls that use artificial and prerecorded voice messages.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters,” said FCC Chairwoman Jessica Rosenworcel in a statement.
“We’re putting the fraudsters behind these robocalls on notice.”
The move comes after authorities earlier this week warned two Texas companies that were allegedly behind robocalls that used AI to mimic President Biden’s voice and discourage people from voting in New Hampshire’s primary last month.
The FCC will now be empowered to fine companies that use AI voices in their calls. The agency will also have the authority to block the service providers that carry them.
Those who receive the robocalls can also file lawsuits against the companies that are behind them.
Under the consumer protection law, telemarketers generally cannot use automated dialers or artificial or prerecorded voice messages to call cellphones, and they cannot make such calls to landlines without prior written consent from the call recipient.
The new ruling classifies AI-generated voices in robocalls as “artificial,” making them subject to the same standards, the FCC said.
Those who break the law can face steep fines, maxing out at more than $23,000 per call, the FCC said.
The agency has previously used the consumer law to clamp down on robocallers interfering in elections, including imposing a $5 million fine on two conservative hoaxers for falsely warning people in predominantly Black areas that voting by mail could heighten their risk of arrest, debt collection and forced vaccination.
The law also gives call recipients the right to take legal action and potentially recover up to $1,500 in damages for each unwanted call.
Last year, as the presidential race got underway, several campaign advertisements used AI-generated audio or imagery, and some candidates experimented with using AI chatbots to communicate with voters.
Bipartisan efforts in Congress have sought to regulate AI in political campaigns, but no federal legislation has passed, with the general election nine months away.
The AI-generated robocalls that sought to influence New Hampshire’s Jan. 23 primary election used a voice similar to Biden’s, employed his often-used phrase, “What a bunch of malarkey,” and falsely suggested that voting in the primary would preclude voters from casting a ballot in November.
New Hampshire Attorney General John Formella said Tuesday that investigators had identified the Texas-based Life Corp. and its owner, Walter Monk, as the source of the calls, which went to thousands of state residents, mostly registered Democrats. He said the calls were transmitted by another Texas-based company, Lingo Telecom.
New Hampshire issued cease-and-desist orders and subpoenas to both companies, while the FCC issued a cease-and-desist letter to the telecommunications company, Formella said.
A task force of attorneys general in all 50 states and Washington, D.C., sent a letter to Life Corp. warning it to stop originating illegal calls immediately.
According to the FCC, both Lingo Telecom and Life Corp. have been investigated for illegal robocalls in the past.
In 2003, the FCC issued a citation to Life Corp. for delivering illegal prerecorded and unsolicited advertisements to residential lines.
More recently, the task force of attorneys general has accused Lingo of being the gateway provider for 61 suspected illegal calls from overseas.
The Federal Trade Commission issued a cease-and-desist order in 2022 against Lingo under its prior corporate name, Matrix Telecom. The next year, the task force demanded that the company take steps to protect its network.
Lingo Telecom said in a statement Tuesday that it “acted immediately” to help with the investigation into the robocalls impersonating Biden and quickly identified and suspended Life Corporation when contacted by the task force. A man who answered the business line for Life Corp. declined to comment on Thursday.
Earlier this week, YouMail, a free robocall-blocking app and call protection service for mobile phones, released data showing that nearly 4.3 billion robocalls were made in January — a 13% increase compared to the previous month, but a 5.2% decline from the 4.5 billion robocalls made in January of last year.
With Post wires