The social media giant Meta announced on Thursday that it had started rolling out end-to-end encryption (E2EE) as a default “for all personal chats and calls on Messenger and Facebook.”
It follows “years of investment and testing,” according to Loredana Crisan, the head of Messenger, who claimed that the technology meant the services were “safer, more secure and private.”
The move will help protect Meta’s users from abusive legal requests from non-democratic governments. Globally, the company receives hundreds of thousands of government requests for user data annually, according to its transparency center — including thousands from Mexico in the first half of this year, of which it assessed only 56% to be legitimate.
However, the implementation of end-to-end encryption, which means Meta cannot itself access the content of messages even if it receives a legal order to do so, has provoked enormous concern among law enforcement agencies in democratic countries.
Meta answered more than 87% of the United Kingdom’s 9,710 requests for user data between January and June. A senior official at the United Kingdom’s National Crime Agency (NCA) said the move was “hugely disappointing” and undermined the agency’s role in protecting children from sexual abuse and exploitation.
James Babbage — formerly the head of Britain’s National Cyber Force and now the director general for threats at the NCA — said the social media company had “an important responsibility to keep children safe on their platform and sadly, this will no longer be possible.”
Meta currently submits thousands of reports to the U.S. National Center for Missing & Exploited Children (NCMEC) annually when it detects child predators on its platforms attempting to contact children, alongside millions of reports when users upload media containing child sexual exploitation.
NCMEC warned that 70% of these reports — and as many as 85% to 92% according to the NCA — could be lost due to the implementation of E2EE, which they argue will blind Meta’s monitoring teams to content that reveals abusive behaviors.
The company counters that it can use other signals, such as the metadata of chats and messages, to detect predators — similar to the tools it uses on WhatsApp.
But security officials in Britain have argued that these signals are insufficient, noting that the bar is much lower for Meta to ban users based on suspect signals than it is for law enforcement to prosecute offenders and safeguard children.
Officials also argue that unlike WhatsApp — where a person needs some prior knowledge of another user’s number in order to contact them — Meta’s platform is designed to help people discover other users.
According to the British government, Meta’s reports to NCMEC led to more than 2,500 arrests and almost 3,000 children being safeguarded in the United Kingdom alone in 2017.
“That is in only one country. That is in only one year. That is based on referrals from only one company. That is what we stand to lose,” said Chloe Squires, then the director for national security at the Home Office.
The NCA estimates there are between 680,000 and 830,000 adults in the United Kingdom “that pose some degree of sexual risk to children.”
Meta has introduced a number of safety features to secure the accounts of underage users, and says that alongside developing new technologies to protect children, it works with industry and non-governmental organizations to keep these protections up to date.
Babbage said: “The alternative safety measures developed by the company relying on metadata alone will rarely, if ever, produce sufficient evidence for a search warrant. This means that in practice, the volumes will be so great that they are likely to be of very little value. The onus should not be entirely on children to report abuse.”
Labyrinth Protocol
Meta has long used the Signal protocol to secure messaging for WhatsApp users. According to a white paper describing how the company implements E2EE, the new system on its social media platforms will also use something the company calls its Labyrinth Protocol to handle server-side storage of messages.
A spokesperson for Meta said the server-side storage system would not allow Meta to decrypt messages for law enforcement, for instance by adding another endpoint to conversations. They added that the protocol had been extensively reviewed by cryptographic experts, including Matthew Green at Johns Hopkins University and Cooper Quintin at the Electronic Frontier Foundation.
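Labyrinth’s actual design is specified in Meta’s white paper. As a rough illustration of the general idea behind this kind of system — the server stores only ciphertext, while the decryption key never leaves the user’s device, so the operator cannot read stored messages even under a legal order — here is a deliberately toy sketch (this is not Meta’s protocol, and the one-time-pad scheme is for illustration only):

```python
# Toy sketch of client-side-encrypted server storage (NOT Meta's Labyrinth
# protocol). The client keeps the key; the server only ever sees ciphertext.
import secrets


class Client:
    def encrypt(self, plaintext: bytes) -> tuple[bytes, bytes]:
        # One-time pad for illustration: key is random and as long as the
        # message, and is never sent to the server.
        key = secrets.token_bytes(len(plaintext))
        ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
        return key, ciphertext

    def decrypt(self, key: bytes, ciphertext: bytes) -> bytes:
        return bytes(c ^ k for c, k in zip(ciphertext, key))


class Server:
    """Stores opaque blobs; has no keys, so it cannot decrypt anything."""

    def __init__(self) -> None:
        self.store: dict[str, bytes] = {}

    def put(self, msg_id: str, ciphertext: bytes) -> None:
        self.store[msg_id] = ciphertext  # server sees ciphertext only

    def get(self, msg_id: str) -> bytes:
        return self.store[msg_id]
```

In a scheme like this, a legal order served on the storage operator can only yield the ciphertext blobs; recovering message content additionally requires the key material held on the user’s device.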
The British government has been campaigning against Meta introducing the technology for years, with Squires — now the director general for Homeland Security — writing to the U.S. Senate Judiciary Committee back in 2019.
In her written testimony, Squires attempted to build support in Congress for Britain’s technical capability notice regime, introduced in the Investigatory Powers Act (2016), which could enable the British government to require Meta to preserve its ability to deliver message content when separately served with a warrant.
As previously reported by Sky News, there are significant concerns about whether such a power could be used on a company based in the United States without congressional approval.
An amendment to the Investigatory Powers Act currently being considered by Parliament intends to extend the law’s coverage to explicitly cover a service like Meta providing “a telecommunications service to persons in the United Kingdom.”
The law also includes a legal obligation for companies not to make changes that would later make it impossible to comply with a technical capability notice, although Meta’s rollout on Thursday precedes this amendment becoming law.
Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.