Apple’s Worldwide Developers Conference is just six weeks away. The event is Christmas Day for developers who make apps for Apple’s MacOS, iOS, tvOS, and watchOS platforms, as Apple shows off the latest advancements it has planned for the software that powers the iPhone, Mac, Apple Watch, and Apple TV.
At this year’s WWDC, Apple is expected to reveal and release betas of iOS 13 and MacOS 10.15. Unlike Apple’s hardware products, details of which seem to leak months before their unveiling, little is known about what Apple has in store for the next MacOS and iOS. But based on Apple’s recent marketing focus on privacy, it’s reasonable to assume the company will unveil additional privacy protections for users and their data in its next operating systems. What those protections might be is anyone’s guess, but here are my hopes.
Encrypted iCloud backups that only users, not Apple, can access
Apple takes the privacy of its users and their data seriously. The company was among the first to encrypt its devices by default, and it has built other privacy protections throughout its operating systems. Yet there is a glaring hole in the steps Apple takes to protect your data.
Yes, I’m talking about iCloud backups. These are the backups of iPhones and iPads that are made each night and stored in the cloud on Apple’s servers. They help ensure your data is recoverable should an Apple device be lost, stolen, or damaged. iCloud backups allow iPhone and iPad owners to restore their device or set up a new one with everything their old device had, from photos to emails to contacts, apps, and more.
The catch is that while iCloud backups are encrypted when they are stored on Apple’s servers, Apple holds a key to those encrypted backups. This means anyone who can get access to Apple’s keys can extract your most private and personal information from your iCloud backup, since a backup contains an exact clone of everything on your iPhone or iPad. Imagine what a hacker or stalker could do with that information.
I’m hardly the first to say it’s time for Apple to let users store encrypted iCloud backups on Apple’s servers without Apple itself holding a key to those backups. Earlier this year, the Electronic Frontier Foundation launched its “Fix It Already” campaign, pressing tech companies to close glaring privacy holes. Among the campaign’s demands: Apple must bring user-only keys to encrypted iCloud backups.
But there is some good news: Apple could already be planning to do this. As the EFF points out, Apple CEO Tim Cook alluded to Apple giving up access to keys for encrypted iCloud backups in an interview with Der Spiegel. As Cook told the publication:
[For iCloud backups], our users have a key and we have one. We do this because some users lose or forget their key and then expect help from us to get their data back. It is difficult to estimate when we will change this practice. But I think that will be regulated in the future, as with the devices. So we will not have a key for it in the future.
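User-only keys are straightforward in principle: the backup is encrypted on the device with a key that is generated locally and never shared with the server. Here’s a minimal sketch of that idea in Swift using Apple’s CryptoKit framework; the names and flow are my own illustration, not Apple’s actual backup pipeline:

```swift
import CryptoKit
import Foundation

// Illustrative only: the key is created on-device and never uploaded.
// In a real design it would live in the device keychain or be derived
// from the user's passphrase; losing it means losing the backup, which
// is exactly the trade-off Cook describes above.
let backupKey = SymmetricKey(size: .bits256)

func encryptBackup(_ backup: Data) throws -> Data {
    // AES-GCM provides confidentiality plus integrity; `combined`
    // bundles nonce, ciphertext, and authentication tag for upload.
    let sealed = try AES.GCM.seal(backup, using: backupKey)
    return sealed.combined!
}

func decryptBackup(_ blob: Data) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: backupKey)
}
```

With a scheme like this, the ciphertext stored on Apple’s servers would be useless to anyone, Apple included, who doesn’t hold the device-side key.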
A more private Safari
I’m a big fan of Safari, the default browser on MacOS and iOS. It’s fast, it has a clean interface, and it has a decent number of privacy protections built in. Still, it could learn a few things from other browsers, such as the privacy-centric Brave.
First, though Safari now alerts you when a website is not using a secure, encrypted HTTPS connection, it does not force the site to connect via HTTPS when a secure version is available. Brave offers this protection built in, and users of Chrome and Firefox can install the popular HTTPS Everywhere extension to force HTTPS connections. Since HTTPS Everywhere isn’t available for Safari, Apple should bake the technique into its browser, so users are sure to benefit from the privacy and security HTTPS provides.
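To illustrate the technique, here’s a minimal sketch of HTTPS upgrading using WebKit’s navigation delegate. This is my own illustration, not how Safari is implemented, and a real version would also need a fallback for sites that have no HTTPS endpoint:

```swift
import WebKit

// Sketch: intercept each navigation, and if the request uses plain
// HTTP, cancel it and retry the same URL over HTTPS instead.
final class HTTPSUpgradingDelegate: NSObject, WKNavigationDelegate {
    func webView(_ webView: WKWebView,
                 decidePolicyFor navigationAction: WKNavigationAction,
                 decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
        guard let url = navigationAction.request.url,
              url.scheme == "http",
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false)
        else {
            decisionHandler(.allow)
            return
        }
        // Rewrite the scheme and load the secure version instead.
        components.scheme = "https"
        if let secureURL = components.url {
            decisionHandler(.cancel)
            webView.load(URLRequest(url: secureURL))
        } else {
            decisionHandler(.allow)
        }
    }
}
```

HTTPS Everywhere does this more carefully, with per-site rulesets, but the core upgrade logic is this simple.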
Second, Apple could do a better job with its private browsing mode. While Safari supports private browsing windows, such modes provide a false sense of security: they only keep traces of your internet activity from being saved on your computer. They do not stop your ISP, or tech and data firms, from seeing what you do online, although most people wrongly assume that’s exactly what the mode does.
It’s for this reason that it would be great if Apple followed in Brave’s footsteps and added built-in Tor support to Safari’s private browsing mode. That would provide the protection most users of private browsing windows think they’re already getting.
Third, like all browsers, Safari lets you quickly clear your internet history so other users of your computer can’t see where you’ve been browsing. But bafflingly, when you clear your history in Safari on the Mac, Safari does not wipe the searches you’ve performed from the address bar. Because of this, anyone with access to your computer can see your recent searches even after you’ve cleared your Safari history (hint: you’ve got to clear the searches manually). It’s a major privacy oversight, and one that should be corrected right away.
Protections for contact notes
I’ve written about this before, as have others, but it’s worth repeating until it’s addressed. The biggest privacy flaw on iPhones and Macs (and on Windows PCs and Android phones, too) comes from the contacts and address book apps on those platforms.
Most contacts apps–including the ones in MacOS and iOS–allow you to store not only names, addresses, emails, phone numbers, and birth dates of your contacts, but also notes. Unfortunately, many people store sensitive and personal information about their contacts there. Parents might store a child’s Social Security number, for example, on the contact card for that kid.
Unfortunately, information written in these notes fields is stored in plain text, meaning anyone who gets access to your contacts can read the notes, whether they obtained that access with your approval or without it. And many people grant apps on MacOS and iOS access to their contacts without thinking. When we grant an app access to our contacts, that app gets everything on the contact card, notes included.
Apple needs to shut this down. Ideally, the company would give users a choice of which contact information to make available when they grant an app access to their contacts. Apple could create two tiers, for example: “Basic” would hand over just a contact’s name, main phone number, and primary email address, while “Advanced” would add physical addresses and any other phone numbers and email addresses.
Apple could approach this several different ways. But whatever it does, no contact’s notes should be accessible by apps.
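For a sense of how a tiered grant could work under the hood, here’s a sketch using Apple’s existing Contacts framework. The “Basic” key list is my own invention, but the key-fetching mechanism is real, and the point is simply that the notes key need never be on the list:

```swift
import Contacts

// Sketch of a hypothetical "Basic" tier: the system would only ever
// hand apps these keys, and CNContactNoteKey would never be among them.
let basicKeys: [CNKeyDescriptor] = [
    CNContactGivenNameKey as CNKeyDescriptor,
    CNContactFamilyNameKey as CNKeyDescriptor,
    CNContactPhoneNumbersKey as CNKeyDescriptor,
    CNContactEmailAddressesKey as CNKeyDescriptor
]

let store = CNContactStore()
let request = CNContactFetchRequest(keysToFetch: basicKeys)
try store.enumerateContacts(with: request) { contact, _ in
    // Only the fetched keys are populated; reading contact.note here
    // would raise an exception, because CNContactNoteKey wasn't requested.
    print(contact.givenName, contact.phoneNumbers.first?.value.stringValue ?? "")
}
```

The framework already supports fetching only selected fields; what’s missing is the system-level policy that keeps notes off-limits regardless of what an app asks for.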
Mac bug bounty program
Apple has had an iOS bug bounty program since 2016. The program pays individuals who find bugs, including security flaws, in the iPhone’s operating system and report them to Apple. It’s a win-win: people who find security flaws get paid for their work, and Apple can patch the flaws and improve the security of iOS for all users.
However, and rather bafflingly, Apple offers a bug bounty program only for iOS. It gives developers and researchers no incentive to come forward with bugs or security holes they find in MacOS. People who find MacOS vulnerabilities can still report them to Apple, of course; Apple just won’t pay them a reward the way it does for people who find iOS vulnerabilities.
This issue came to the forefront earlier this year when Linus Henze, a German freelance bug hunter, found a major vulnerability in MacOS’s keychain, the system that stores a user’s passwords, such as those for banking and social media websites. The flaw Henze found could allow malware to steal those passwords.
That’s a pretty major vulnerability, right? But Henze refused to reveal details of the flaw to Apple, specifically because the company didn’t offer a MacOS bug bounty program. Henze eventually relented and told Apple how to find the exploit, but Apple still didn’t offer him any compensation, according to a tweet of his.
I’ve decided to submit my keychain exploit to @Apple, even though they did not react, as it is very critical and because the security of macOS users is important to me. I’ve sent them the full details including a patch. For free of course.
— Linus Henze (@LinusHenze) February 28, 2019
A company that prides itself on privacy–and rightly so in most cases–shouldn’t cheap out by offering bug bounties for only its most lucrative operating system and devices. Instead, it should show developers that they will be rewarded for vulnerabilities they find in any software Apple makes. Doing so could only increase security for the rest of us.