If you pay attention to the ads embedded in the pages you visit as you click your way through cyberspace, you may be one of the millions of people the Digital Advertising Alliance (DAA) estimates are familiar with the symbol on the right.
It stands for AdChoices, a self-regulatory program the DAA, a leading coalition of advertising groups and firms, implemented after the Federal Trade Commission recommended a set of principles for online behavioral advertising in 2009. When you click on the blue triangle in the corner of an online ad, it will take you to a page that describes why you’re being targeted with certain ads and how you can opt out of being tracked by third-party entities looking to harvest your online habits.
If you don’t opt out, anything you do online–buying shoes, researching medical marijuana dispensaries or bankruptcy hotlines, looking up symptoms of a weird rash on your inner thighs–is likely being tracked. Your data is accessible to a number of third-party trackers, who bundle it with other users’ data and sell it to advertisers and marketers, who can then tailor ads right back at you. Looking for someone to talk to about suicide? Maybe you’ll be served an ad for depression meds. But that’s not necessarily where it ends. Third-party trackers can do a lot with all the pieces of data that could be connected back to you; in a worst-case privacy nightmare scenario, your data could factor into future employment decisions, loan eligibility, or insurance rates.
Because of these implications, advertisers have taken some steps to give consumers choices about how their data is scraped online (hence the symbol above). But no industry-wide opt-out program or Do Not Track policy exists for mobile apps–at least not yet. Earlier this month, the Privacy Rights Clearinghouse (PRC), a consumer watchdog nonprofit, published a report showing just how vulnerable the data you type into mobile apps can be. In a survey of 43 health apps, the PRC rated 17 of them as highly risky and showed that the majority were passing potentially very sensitive information along to third-party trackers.
This week, however, the DAA announced it was taking on the challenge of bringing mobile apps up to its existing desktop standards. The coalition is working to put out an AdChoices mobile app of its own by late next year that would function similarly to its desktop counterpart. The group’s guidelines would also forbid third-party trackers and advertisers from using personally identifiable data to inform employment, health care, insurance, or credit eligibility decisions.
“There’s also a prohibition in the principles on the collection and use of pharmaceutical prescriptions and medical records about a specific individual for online advertisements without specific consent,” Sarah Hudgins, director of policy at the Interactive Advertising Bureau, said.
Stu Ingis, a partner at Venable and counsel to the DAA, further explained that companies operating in this space without adhering to industry standards would face sanctions from the Better Business Bureau and the Direct Marketing Association.
“The principles apply primarily to cross-app data. That’s tracking data collected across unrelated apps,” Ingis said. “So if you’re a third party that’s in the business of collecting data across apps over time, you’d sign up to be part of that interface to exercise choice. We had a lot of companies that immediately signed up [with the desktop AdChoices program], but some that didn’t. That was followed by enforcement actions.”
Still, some major consumer privacy advocates feel that the DAA’s proposal is essentially meaningless–“window dressing to allow data collection as usual,” says Jeff Chester, director of the Center for Digital Democracy in Washington, D.C. Chester argues that the AdChoices icon does little to give consumers real transparency, and that the mobile principles have significant loopholes.
“Mobile health marketing, financial marketing–none of that depends on whether they have your social security number, bank account, or medical records,” Chester explained. “Health and financial marketing in mobile apps integrates a broad amount of data to target people for credit loans, for treatments for depression.” Chester pointed out that the DAA’s definition of personally identifiable information is rather narrow–and micro-targeting firms could still potentially piece together your identity from several points of “anonymized” data, like birth date, zip code, and the type of car you own.
This week, the Center for Digital Democracy plans to petition the FTC to propose changes to the DAA code. All of this is happening in the weeks leading up to the planned introduction of Obama’s Consumer Privacy Bill of Rights in Congress. In 2012, the Obama administration released a framework for the upcoming bill, citing self-regulatory measures like the DAA’s as the “first line of enforcement,” with follow-up by government agencies. Still, the Obama white paper emphasized the need for a baseline code of conduct established in law.
“I think this is an incredibly timely and important thing to do,” Chester said. “This is a classic case of whether you can trust the fox to guard the data collection henhouse.”