Facebook Tells US Attorney General It’s Not Prepared To Get Rid Of Encryption On WhatsApp And Messenger

“People’s private messages would be less secure and the real winners would be anyone seeking to take advantage of that weakened security,” the company wrote to leaders in the US, UK, and Australia.

In a letter to US Attorney General Bill Barr and UK and Australian leaders, Facebook said it would not weaken end-to-end encryption across its messaging apps, despite pressure from world governments.

The letter, sent Monday, came in response to an October open letter from Barr, UK Home Secretary Priti Patel, Australian Minister for Home Affairs Peter Dutton, and then–acting US homeland security secretary Kevin McAleenan. That letter raised concerns that Facebook’s continued implementation of end-to-end encryption on its WhatsApp and Messenger apps would prevent law enforcement agencies from detecting illegal activity such as child sexual exploitation, terrorism, and election meddling. The US, UK, and Australian governments asked the social networking company to design a backdoor into its encryption protocols, or a separate way for law enforcement to gain access to user content.

“It is simply impossible to create such a backdoor for one purpose and not expect others to try and open it,” wrote WhatsApp head Will Cathcart and Messenger head Stan Chudnovsky in Facebook's response. “People’s private messages would be less secure and the real winners would be anyone seeking to take advantage of that weakened security. That is not something we are prepared to do.”

End-to-end encryption prevents anyone other than the sender and recipient, including governments, security agencies, hackers, and the platform itself, from reading the contents of a message, and is a key feature of popular apps such as WhatsApp and Signal. Government agencies have long sought a means of accessing message content on encrypted apps, arguing that it’s in the interest of public safety despite broader privacy concerns.
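To make the mechanics concrete, here is a minimal sketch of the end-to-end principle in Python using the PyNaCl library. It is an illustration only, not how WhatsApp or Messenger actually work; both rely on the far more elaborate Signal Protocol, and the key names and message below are invented for the example. The point it demonstrates is that keys are generated on the users’ devices, so a server relaying the message only ever handles ciphertext.

```python
# A minimal sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Illustration only: WhatsApp and Messenger's secret conversations use the
# Signal Protocol (X3DH key agreement plus the Double Ratchet), not raw boxes.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; the private keys
# never leave those devices, so no server can reconstruct them.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# A relay server sitting between the two parties sees only this ciphertext;
# without one of the private keys, there is no way to recover the plaintext.
print(ciphertext.hex())

# Only Bob, holding his private key paired with Alice's public key, can
# decrypt -- which is exactly the property a "backdoor" would have to break.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

This is why the debate is framed as all-or-nothing: any extra decryption path added for law enforcement would, by construction, be a third key that works without either party’s consent.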

Facebook’s letter came as Jay Sullivan, Messenger’s director of product management for privacy and integrity, prepared to testify on Tuesday at a Senate Judiciary Committee hearing on “encryption and lawful access” along with an executive from Apple and New York County District Attorney Cyrus Vance Jr. In his opening statement, which was made public ahead of his appearance, Sullivan is expected to discuss how Facebook and other companies can work with governments to support law enforcement without weakening encryption.

“We can be certain that if we build a backdoor for the U.S. government, other governments, including repressive and authoritarian regimes around the world, will demand access or try to gain it clandestinely, including to persecute dissidents, journalists, and their political opponents,” his statement read. “Preserving the prominence of American values online requires strong protections for privacy and security, including strong encryption.”

In March, following more than a year of public scrutiny of the company’s lax data and user privacy practices, Facebook and its CEO, Mark Zuckerberg, made a much-publicized turn toward privacy. Most recently, Zuckerberg defended the company’s move toward encryption in leaked internal comments obtained by The Verge, calling it “socially important” and saying it’s “the right thing to protect people’s privacy more.”

The letter from the US, UK, and Australian governments to Facebook in October, which was first published by BuzzFeed News, said that companies should not deliberately design systems to thwart government intervention and investigation, highlighting the possibilities for child exploitation on encrypted apps. Facebook responded firmly at the time and has since given no indication that it will weaken encryption on WhatsApp and Messenger, each of which has more than a billion users.

In the response Monday, Facebook’s leaders highlighted the company’s investment in artificial intelligence and human moderation, talking points Facebook has repeatedly touted as it has tried to separate itself from a mishap-strewn 2018. They also noted that WhatsApp detects and bans 2 million accounts every month based on “abuse patterns” and scans of unencrypted information, including profile and group information.
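Facebook has not published the rules behind those bans, but the approach it describes, scoring accounts on unencrypted signals rather than message content, can be sketched roughly as follows. Every signal name and threshold here is hypothetical, invented purely to illustrate the shape of metadata-based detection.

```python
# Hypothetical sketch of metadata-only abuse detection. The signals and
# thresholds are invented; WhatsApp has only said it bans roughly 2 million
# accounts a month using unencrypted data such as profile and group info.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    messages_per_minute: float          # sending rate, visible without content
    groups_joined_last_day: int         # group metadata is not encrypted
    unknown_recipient_fraction: float   # share of recipients never contacted before
    profile_reported: bool              # user reports against the public profile

def abuse_score(s: AccountSignals) -> float:
    """Combine unencrypted signals into a 0..1 score; no message text is read."""
    score = 0.0
    if s.messages_per_minute > 30:      # bulk-sending behavior
        score += 0.4
    if s.groups_joined_last_day > 50:   # mass group-joining
        score += 0.2
    score += 0.3 * s.unknown_recipient_fraction
    if s.profile_reported:
        score += 0.3
    return min(score, 1.0)

# Accounts above some threshold would be banned or queued for human review.
spammer = AccountSignals(120.0, 80, 0.9, True)
print(abuse_score(spammer) >= 0.7)  # True
```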

The letter also gave a nod to the company’s argument that it’s able to detect more bad content because of its sheer size. Facing multiple antitrust investigations and calls for regulators to break it up, Facebook has said that its size and portfolio of properties are an advantage in dealing with bad actors.

“Our teams are constantly developing new ways to try to detect patterns of activity, by finding bad activity upstream, and by reviewing what we know across the accounts we provide,” the letter read. “So, if we know someone is doing something bad on Facebook or Instagram we can often take action on their account on Messenger and WhatsApp, and vice versa.”

