
Signal’s newest feature shows why putting privacy first is so hard

The privacy-first messaging app recently rolled out an opt-out feature that was criticized by security experts and panned by users.

[Photos: n_patana/iStock; NeONBRAND/Unsplash]

Signal has become the privacy-focused consumer’s go-to messaging app. But a recent change to its back-end systems that was designed to make the app more accessible and competitive with other encrypted messaging services could be putting user data at risk.

At the core of Signal’s appeal is a level of digital protection and commercial disinterest in its users’ communications that is rare among messaging service providers. Signal is now used broadly not just by hackers and professional paranoids but also by activists, journalists, politicians, and any number of people who believe that their text messages and phone calls should be as private as an in-person conversation. Few other apps offer a similar level of security and privacy.

But while privacy is central to Signal, it is not immune from security challenges. In May, the company released a new PIN-based system that allows users to back up their contact lists—and eventually their messages—to the cloud. The new feature was built on a system with known security flaws, as many ardent Signal users angrily pointed out on Twitter in the ensuing weeks.

Signal’s recent stumble stemmed from its attempts to be different from other messaging services, even those with relatively good privacy, such as WhatsApp or iMessage. Most messaging services use encryption to protect messages from being spied on in transit. But even those services that do encrypt messages usually store them—and the user’s address book—on their servers in plaintext, unencrypted.

Yet building contact and message backups—something that all of Signal’s primary competitors offer—into a system whose defining feature is end-to-end encryption of all communications is a difficult engineering task.

That’s why Signal’s contact list has been tied to the address book on the user’s phone ever since the app was first released. Privacy experts have long criticized this design, because it’s easy to track somebody from just their phone number.

For the same reason, Signal also hasn’t offered an easy, default system for backing up messages or contact lists, because building those features in a way that protects user security and privacy has been nearly impossible. Essentially, there’s no good way for Signal to tie data such as profile names, contacts, and group memberships—data it cannot access—to a single user. In addition, its service does not include “trackers, ads, or analytics in our software at all,” as Signal founder Moxie Marlinspike wrote on June 5.

Currently, users who want to back up and restore Signal messages must go through a time-consuming process that requires downloading their message database and manually transferring it, protected by a 30-digit code (on Android) or a QR code (on iPhone). And your Signal contact list is entirely dependent on your phone’s address book. Those restrictions might be changing, thanks to new technology developed by Marlinspike and his team.

To get around the problem of storing the address book in plaintext, Signal developed a Secure Value Recovery system that allows users to store data on Signal’s servers without Signal knowing who they are or what the data contains. Currently, that data is limited to the user’s address book, though it could also be expanded to include message backups. The immediate goal is to create an address book that’s separate from the one on your phone and eliminate phone number-based communication altogether, making it even harder to track who’s communicating with whom on Signal.
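The general shape of such a scheme can be sketched in a few lines of Python. This is an illustrative toy, not Signal’s actual protocol—Secure Value Recovery stretches the PIN inside SGX enclaves with hardware-enforced guess limits, while this sketch uses the standard library’s PBKDF2 and a hand-rolled keystream—but it shows the core idea: the server ends up holding only a blob it cannot read without the user’s PIN.

```python
import hashlib
import hmac
import json
import os

def derive_key(pin: str, salt: bytes) -> bytes:
    # Stretch the short PIN into a 32-byte key. A real deployment would use
    # a memory-hard KDF (and, in Signal's case, rate-limiting inside SGX);
    # PBKDF2 keeps this sketch stdlib-only.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Counter-mode keystream built from HMAC-SHA256, for illustration only.
    # A production system would use an authenticated cipher such as AES-GCM.
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256)
        out += block.digest()
        counter += 1
    return out[:length]

def encrypt_contacts(pin: str, contacts: list) -> dict:
    salt, nonce = os.urandom(16), os.urandom(16)
    plaintext = json.dumps(contacts).encode()
    key = derive_key(pin, salt)
    ciphertext = bytes(
        a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext)))
    )
    # The server stores only salt, nonce, and ciphertext -- never the PIN,
    # the derived key, or the readable contact list.
    return {"salt": salt, "nonce": nonce, "ciphertext": ciphertext}

def decrypt_contacts(pin: str, blob: dict) -> list:
    key = derive_key(pin, blob["salt"])
    ct = blob["ciphertext"]
    plaintext = bytes(
        a ^ b for a, b in zip(ct, keystream(key, blob["nonce"], len(ct)))
    )
    return json.loads(plaintext)
```

The design choice worth noticing is that a four-digit PIN is far too weak to resist offline guessing on its own; that is exactly why Signal pairs the key derivation with SGX, which is supposed to cap how many guesses an attacker (including Signal itself) can make.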

To start moving its millions of users away from the phone-number-based address book, Signal began asking them to set a PIN to use the app. While some users didn’t mind the PIN, many resisted it by ignoring the prompt, uncertain of what using it meant for their data.

Worse, the system that Signal chose to build its new encrypted servers on came with its own set of security problems. Secure Value Recovery is built on Intel’s Software Guard Extensions (SGX), which has known security vulnerabilities that could place Signal users and their data at risk, says Matthew Green, a cryptography professor at the Johns Hopkins University Information Security Institute.

“Signal is trying to deploy a system that’s safer than storing data on their servers in plaintext,” as most messaging services do, Green told Fast Company. But Green worries that the PIN code Signal wants to use to protect address books could eventually be expanded to cover all of a user’s messages—and SGX hasn’t been secure enough for that.

In addition, these new features were initially mandatory or opt-out for Signal’s users, though Green believes they should be opt-in. “If it’s opt-out, you get more of your current users using it. But it’s not clear that people are making an informed choice about their security,” he says, which runs counter to what drove broad interest in Signal. “It’s obvious that once you back up some data, why not backup all data?”

Marlinspike did not respond to requests for comment, but he did defend the use of SGX and mandatory PINs on Twitter. “I think all the attention SGX is getting is great,” he tweeted on July 13. “[A] lot of people are looking at it and finding bugs that are being patched.”

Patching vulnerabilities—and ensuring that those patches are installed in a timely manner—remains a serious challenge in securing hardware against hackers. But Green notes that Signal abandoned opt-out status for the new features in its most recent beta version, released in July: users can now opt in to using a PIN. He cautions that Signal users should choose a different PIN from the one they use to unlock their phones.

All of this comes at a time when the United States and countries around the world are attempting to kneecap encryption with backdoors that would give them—and any hacker who finds the backdoor—access to the ostensibly encrypted data. Precisely because Signal appears to care about getting encryption right, the company faces significantly more scrutiny than services that don’t promise the same level of protection.

Marlinspike tweeted that he was aware of the treacherous ground Signal was treading. “[I]t’s a big change that we have to do carefully,” he said.