UK opposition leader targeted by AI-generated fake audio smear

An audio clip posted to social media on Sunday, purporting to show Britain’s opposition leader Keir Starmer verbally abusing his staff, has been debunked as being AI-generated by private-sector and British government analysis.

The audio of Keir Starmer was posted on X (formerly Twitter) by a pseudonymous account on Sunday morning, the opening day of the Labour Party conference in Liverpool. The account asserted that the clip, which has now been viewed more than 1.4 million times, was genuine, and that its authenticity had been corroborated by a sound engineer.

Ben Colman, the co-founder and CEO of Reality Defender — a deepfake detection business — disputed this assessment when contacted by Recorded Future News: “We found the audio to be 75% likely manipulated based on a copy of a copy that’s been going around (a transcoding).

“As we don’t have the ground truth, we give a probability score (in this case 75%) and never a definitive score (‘this is fake’ or ‘this is real’), leaning much more towards ‘this is likely manipulated’ than not,” said Colman.

“It is also our opinion that the creator of this file added background noise to attempt evasion of detection, but our system accounts for this as well,” he said.

The audio was condemned across party lines, despite the highly contested political environment in the United Kingdom — with polls generally showing the Labour Party 17 points ahead of the incumbent Conservatives.

Simon Clarke, a Conservative Party MP, warned on social media: “There is a deep fake audio circulating this morning of Keir Starmer – ignore it.” Security minister Tom Tugendhat, a fellow Conservative MP, also warned of the “fake audio recording” and implored X users not to “forward to amplify it.”

“Deepfakes threaten our freedom. That’s why the Defending Democracy Taskforce and the work the PM is doing on AI are critical for protecting us all,” added Tugendhat. The word “deepfake” is used colloquially to refer to any kind of synthetic media generated by AI technologies.

The Defending Democracy Taskforce was established in November 2022 with the mission of reducing “the risk of foreign interference to the U.K.’s democratic processes, institutions, and society, and ensure that these are secure and resilient to threats of foreign interference,” according to a parliamentary question previously answered by Tugendhat.

Recorded Future News understands an analysis of the audio file by the British government confirmed it to be fake.

Screenshot of the social media post featuring the audio file.

Authorities in the U.K. are bracing for this kind of interference ahead of the country’s general election next year, in the wake of similar attempts to influence the recent elections in Slovakia.

Two days before the polls opened there on September 30, faked audio clips were published on social media attempting to implicate an opposition party leader and a journalist in a plot to rig the election by purchasing votes.

Publicly debunking the audio was a challenge because of the country’s election laws, which strictly ban both the media and politicians from making campaign announcements in the two days before the polls open.

As reported by Wired, as an audio post the fake also “exploited a loophole in Meta’s manipulated-media policy, which dictates only faked videos — where a person has been edited to say words they never said — go against its rules.”

It is not clear who produced the fake audio in either the Slovakian or British cases.

The account which posted the Keir Starmer smear had previously tweeted: “Let me be clear. I am unequivocally PRO smear tactics against those who engage in smear tactics themselves. People lie about Keir Starmer? Good. And I’m one of them.”

That tweet has now been deleted, although the fake audio remains available.


Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.

