The Bill of Rights covers only what the government can do to you. Unless you work for the government, many of your rights to free speech and freedom from search and seizure stop when you walk in, or log in, to your job. “If you’re on your employer’s communications equipment, you’ve got virtually no privacy in theory and absolutely none in practice,” says Lew Maltby, head of the National Workrights Institute.
The lack of workplace digital privacy has become a hot topic with the recent firing of four Google employees, for what Google says were violations such as unauthorized access to company documents, and what the workers say was retaliation for labor organizing or for criticizing company policies. One of them, Rebecca Rivers, recounts how her personal Android phone went blank when she learned that she’d been placed on administrative leave in early November. (Google subsequently fired Rivers.)
“At nearly the exact same time, my personal phone was either corrupted or wiped,” she said at a Google worker rally in November. The loss was especially painful for Rivers, who is transgender. “Everything on my phone that was not backed up to the cloud was gone, including four months of my transition timeline photos, and I will never get those back,” she said, her voice quavering.
How did this happen? Likely through an Android OS feature called a work profile, which allows employers to run work-related apps that they can access and manage remotely. Apple’s iOS has a similar capability, known as mobile device management (MDM), in which work apps run in a virtual “container” separate from personal apps. A number of companies make MDM software, with differing levels of monitoring capability.
“Everything on my phone that was not backed up to the cloud was gone.”
There are legitimate reasons why a company might want to use this tech: It lets the company enforce security measures protecting company data in email and other apps that run in the separate work profile or container, for instance. It also lets the company install, uninstall, and update work apps remotely, without your having to bring the device in.
But they can also spy on you, or wipe out all your data—whether deliberately or negligently. That’s why mixing work networks and personal devices is a bad idea. If a company says you have to be online, they should provide the gear to do it. Annoying as it is to lug two gadgets around, the annoyance (and danger) of surrendering your own device to corporate control is a lot worse.
My smartphone is your smartphone
All modern phones have GPS capability. With a work profile or MDM toehold in your phone, an employer could install an app to track everywhere you go, as Owen Williams at OneZero points out. He gives the example of MDM maker Hexnode, which goes into great detail on how it can track device location at all times.
Williams also notes that a company may require your phone to connect to the internet through its encrypted virtual private network. This security measure makes sense for business, but it means that all of your data, even personal data, may be passing through the company’s network. That makes the data fair game for the company to look at, since there is simply no law or legal precedent to stop it. “That’s not really different from using your company’s desktop computer to send a personal email from your cubicle,” says attorney and security expert Frederick Lane. “If you send unencrypted personal data across a network owned and controlled by your employer, then you should assume that it can be captured and stored.”
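Lane’s point can be made concrete with a toy sketch. This is not the code of any real MDM product or corporate VPN; it simply models the structural fact he describes: any hop that forwards your traffic can also keep a copy of whatever isn’t end-to-end encrypted.

```python
# Toy model of traffic passing through employer-controlled infrastructure.
# (Illustrative only -- not any vendor's actual proxy or VPN code.)

captured_log = []  # what the network owner chooses to retain


def corporate_proxy(payload: bytes) -> bytes:
    """Forward traffic unchanged, but keep a copy along the way."""
    captured_log.append(payload)  # captured and stored, as Lane warns
    return payload                # ...then delivered normally


# A "personal" email sent in plaintext across the company network:
message = b"Subject: doctor appointment\nSee you at 3 p.m."
delivered = corporate_proxy(message)

assert delivered == message      # the mail arrives as expected
assert message in captured_log   # ...but the employer has a copy
```

From the sender’s point of view nothing looks different, which is exactly why the expectation of privacy is misplaced: the capture happens silently at a hop the user never sees.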
Rivers recently tweeted a line of her employment contract that spells this out: “I have no reasonable expectation of privacy in any Google Property or in any other documents, equipment, or systems used to conduct the business of Google.” I asked Google about this policy. A spokeswoman said that it should not come as a surprise and is standard practice at large companies. A notice of the privacy policies also pops up when the phone profile is installed, she said.
What happens if you lose your data?
What Rivers hadn’t expected was losing personal data on her own device. But this is increasingly common, says Maltby, who calls it a bigger danger than being spied on. “They’re wiping your personal device with the goal of getting rid of the company data, but when you wipe the phone, you wipe everything,” he says.
Google told me that a suspended employee may lose personal data if it was stored in a work account, and that the employee can ask Google to retrieve it.
“When you wipe the phone, you wipe everything.”
It’s unclear exactly what happened to Rivers’s phone, or whether Google has a backup. But companies often completely wipe employees’ own phones without providing a way to back up personal information, Maltby says. “It’s not that they want to cause you trouble,” he says of employers. But “they would have to spend a little time and money to set up a system that would protect your privacy for the personal information that happens at work. And they don’t bother to do it.”
Worker advocates such as Maltby believe that total wiping of phones should be illegal under a law called the Computer Fraud and Abuse Act. The CFAA basically prohibits unauthorized access to a computing device, such as stealing data or planting malware. But advocates have struggled to find a legal case that can set a precedent for employee cellphones. “The courts insist on seeing tangible monetary damages, and usually there aren’t any,” Maltby says.
Of course, losing personal data, like photos documenting key moments of life, is so painful precisely because its value is intangible.
There’s also no way to put a monetary value on the hassle of carrying a second phone, or of fighting an employer that’s reluctant to pay for one. But weighed side by side, your privacy is probably worth more than the inconvenience.