
I’m a security expert and this is how robocall-blocking apps violate your privacy

By Dan Hastings

NCC Group studied the apps’ privacy policies and found that many send out information at night while you’re sleeping; share your phone number with third-party companies; look at your text messages, phone calls, and contacts; and can learn what other apps are on your phone.

Spam phone calls, or “robocalls,” have become a huge nuisance in the last decade. The FCC received 52,000 consumer complaints about caller-ID spoofing alone in 2018. Spam-blocking apps have been touted as a way to protect consumers. But these apps themselves tend to have access to your phone number, your contacts, and even your text messages and voicemails. What would happen if a third-party company gained access to this data?

Privacy policies are a nightmare. Don’t just take my word for it: A New York Times article was titled, “We Read 150 Privacy Policies. They Were an Incomprehensible Disaster.”

Yet they’re also the only way for non-technical mobile app users to know what kind of data they’re giving up, where that data is going, and how it’s being used. If people had greater transparency into what their apps are doing behind the scenes—and whether private information is being sent to third-party companies—they would have the foundation to make informed decisions about which apps to use, and how, and when.

As a security professional, I decided to take a look at the top 10-15 robocall-blocking apps in Apple’s App Store to see what data is being collected and where it’s being sent. Apple has a reputation for being a strong privacy advocate, with a strict App Store review process. For example, some Google and Facebook apps that were meant for internal use were ejected from the App Store when it emerged they were being used for market research. Apple has now made it a requirement that all apps must include a privacy policy linked to the information about that app in the App Store.

I carefully read through these policies, and it was disheartening, if not entirely surprising. Even products specifically designed to prevent spam invade user privacy. In fact, one app sends your phone number to three different analytics companies—and doesn’t say so in its privacy policy.

Also, disclosing my findings to these companies proved very difficult. Apple requires that each privacy policy must have a clause that provides a way for a user to “revoke consent and/or request deletion” of a user’s data. Most privacy policies say this, but then have a general statement like “contact us” as the only directive. If deleting your personal data is hard, imagine trying to find the appropriate place to report privacy violations in the app.

Of course, I couldn’t find the right place to report my findings and ended up filing a claim with customer support for apps violating user privacy. One company was actually nice enough to publish the email address for its data protection officer, but the message bounced back when I reached out. After several attempts to contact these companies, I got only one vague response, saying that the matter would be “looked into.”

So here’s my take on this messy field.

First, privacy policies should not only become more transparent and user-friendly, but they should also actually protect the user. Second, apps must clearly describe the level of user information being collected when the app is opened for the first time. Third, users should be able to opt out of specific provisions of the privacy policy, just as they can partially accept permissions (GPS location, access to contacts, etc.). Otherwise, privacy policies only serve to check the requirements box. They really don’t protect the user.

Privacy should be by design, not policy, and should consider the user experience. People often click their privacy away by blindly pressing accept. Imagine that right before you install an app, you see a graphic appear that explicitly describes what data is being collected about you, and where it’s being sent. This kind of transparency should be a priority for both app developers and app store reviewers. Is an app’s functionality worth having your personal information compromised? If only you had the option and the information to make an informed decision.


Dan Hastings is a senior security consultant at NCC Group who’s passionate about mobile security and privacy. He has experience in technology education, training, and event facilitation and thrives on travel, education, and using technology as a catalyst for social change.

