The internet has given us many ways to make ourselves seen and heard, and to make our ideas, our bodies, ourselves, visible to many. But this visibility comes with an ironic bargain: it offers new platforms to expose injustices while also enabling new kinds of abuse, public shaming, and coordinated attacks like doxing and trolling.
As we reckon with the unfolding of the #metoo movement, some have suggested that long-standing issues of sexual abuse and harassment at work are coming to light now, in part, because the megaphone of social media has helped rally the public conscience to the side of women struggling to speak about their experiences. Others have pointed out the importance of another type of visibility, calling for the collection of more and better data about the otherwise invisible experiences of the U.S.'s increasingly fragmented and isolated workforce. In particular, attention is being turned to women working in pink-collar and other low-wage industries, where abuse is hard to see and even harder to measure.
In a sense, bigger and better data seems like one solution to the #metoo movement’s focus on individual and celebrity-driven narratives rather than larger systemic problems. Pointing to the lack of data on sexual harassment in the freelancing sector, Nathan Heller of the New Yorker writes, “We can’t fix what we don’t see, and we can’t protect what we do not see whole.” Without these types of visibility, he and others have argued, addressing these systemic issues may be impossible.
But simply adding more data has its limitations, particularly for workers who rely on digital platforms to find work. Platforms already collect and leverage large amounts of data about workers in ways that have complicated consequences for their daily experiences of work. Importantly, this data doesn’t always tell the whole story.
For the past year, we’ve been interviewing nannies, babysitters, elder care workers, and housecleaners across the U.S. who use platforms like Handy, TaskRabbit, and the in-home care provider platform Care.com to do care and cleaning work, in an effort to better understand how platforms are shaping domestic work. Along the way, we have found that, in many cases, the aggregation of individual data leads not to more accountability and justice, but rather forces workers to make trade-offs between visibility and vulnerability.
Dora is a young black woman who has been regularly finding house-cleaning gigs on home services app Handy, in between finishing her degree in New York City. While most of her clients have been pleasant, others were far less so. “I had this one creepy guy client that kind of just watched me the whole time I was cleaning to the point I was just, ‘I think I’m just gonna leave.'”
But making the decision that is best for one’s own safety can have consequences down the line; Dora (whose name has been changed for anonymity) described receiving bad ratings on numerous occasions in retaliation from clients who made her feel uncomfortable. She added: “And I think that’s what really upsets me, because I realize how sensitive I am because of this app. It’s almost a lot of pressure to keep up a [good] review cause that’s how you would get more gigs. And that determines your pay.” (A Handy Pro’s rate is based on payment tiers determined, in part, by average rating.)
Other workers we spoke to described having their accounts suspended or deactivated after clients reported them for acting in their own best interests. Care workers, for example, sometimes receive bad ratings for turning down a job offer; like jilted men on dating sites, prospective clients sometimes lash out after hearing a “no.”
Domestic workers are excluded from most federal workplace legislation protecting against discrimination and assault, and only four states (New York, Hawaii, California, and Massachusetts) offer protection from sexual harassment through a Domestic Workers' Bill of Rights. However, while these bills protect nannies and other employees of households, they don't protect workers like Dora, who work through platforms for many different clients.
Even as they provide a form of mediation between customers and workers, these gig platforms can also amplify abuse in ways that remain invisible in the data the platforms generate. This insight echoes trends in other contexts where data-intensive technologies are used, in effect, to manage vulnerable populations. As Virginia Eubanks details, states have combined data sets collected in a number of different contexts to disqualify poor people from the public benefits they need. Sarah Brayne likewise writes about how police departments use data, sometimes acquired from private companies like Pizza Hut, to create wide nets of surveillance that can track individuals across institutions.
These data technologies have consequences for those hoping to seek help and speak up. Imagine making the difficult calculation to report abuse in these domestic work situations. In the wake of increasingly aggressive ICE tactics, deportations, and anti-immigrant rhetoric, a 2017 survey reported a sudden drop in immigrant women seeking out social and legal service providers for victims of domestic abuse. Presumably, many have made the impossible choice of avoiding official scrutiny at the expense of their personal safety. As workers use digital platforms to find work that has traditionally been less visible, there is a potential to make vulnerable people more visible and trackable by powerful institutions.
Domestic workers are often considered “invisible” workers because they largely work in an unregulated “gray” economy, face many labor market challenges, and work behind closed doors in people’s private homes. However, many of these workers’ lives are made hyper-visible through the websites and apps they use to look for work. These services promise to shine a light into this otherwise informal economy, bringing trusted strangers into clients’ homes to care for their children and elderly, clean their bathrooms, and hang their TVs.
In digital marketplaces, trust is often established through visibility: creating lines of sight between strangers on either side of a transaction, so that they might get to know one another by viewing ratings and reviews from past clients, response rates to messages, profile pictures, and biographical sales pitches. Platform companies offer these features, in addition to background checks, to assuage the nervous doubts of clients who want to see what appear to be important things about the short-term contractors they may hire to perform intimate tasks for themselves and their loved ones. Workers may also benefit from this kind of visibility, but they often don't have much choice about the spotlight cast on them.
Domestic workers cultivate their profiles on platforms like Care.com or TaskRabbit in much the same way upwardly mobile professionals are expected to cultivate their resumes or LinkedIn profiles (though on gig platforms, their last names are shortened to an initial). Domestic work platforms create systems that reward workers for investing in their profiles, making them more visible to clients and placing them higher on recommended lists.
For instance, a worker with a compelling and detailed profile, a fast message-response rate, and five-star ratings may receive a notification that she's in the top percentile of sitters in her area. Keeping up with these metrics can also earn her a "CarePro" badge that is visible to clients on her profile. Those who choose not to disclose their age or location, or not to post photos of themselves, are deprioritized. This information is readily viewable by anyone using Google search, not just those with accounts on the platforms.
This visibility, however, mostly works in one direction. Workers are rendered transparent for the benefit of potential clients, while clients' lives aren't held up to similar scrutiny. Although domestic workers face some of the highest rates of assault and abuse, many popular platforms, including Care.com, the largest marketplace for care workers, don't allow workers to post reviews of clients; instead, workers can only privately notify platform companies should they see a message, job posting, or profile containing "inappropriate" content, or if they have any "inappropriate interactions."
And while clients flagged for inappropriate content or interactions can, in theory, lose their accounts, the process is opaque. Workers, for example, can't see when other workers flag a client's account or posting. Based on our research, this review process doesn't consistently result in problematic clients losing access to the site. Care.com outlines a single termination policy for its members, both care workers and clients, reserving the right to "terminate a member's Care.com membership for any reason or no reason, with or without notice," providing a partial list of reasons for account termination, ranging from "[s]uspicion of fraudulent activity" to abuse or harassment.
In other parts of the service economy, the ability to vet clients has been one of the primary appeals of digital marketplaces for people working independently and with few protections. For sex workers–a rarely acknowledged type of gig worker–this ability can be crucial for safety. When Rentboy.com, one of the largest websites for gay male escorts, was raided and shut down by the Department of Homeland Security in 2015 (the agency labeled it an “internet brothel”), many escorts who had relied on the site argued that the shutdown was a huge setback for their ability to protect themselves. Similar concerns are now being raised over Craigslist’s recent removal of its Personal Ads section in response to the Senate’s passing of the SESTA legislation.
Gig platforms are often critiqued for their supposedly novel ability to connect strangers with each other. But working for strangers is hardly a new phenomenon, as domestic workers, day laborers, sex workers, freelancers, and other independent workers have long experienced. What matters are the power dynamics embedded in these platforms.
In many cases, the power dynamics are unequal. In domestic work, this can have consequences for hiring, where establishing trust is more complicated than in other kinds of gig work. Work involving the care of loved ones can be emotionally fraught, and clients' anxieties are further stoked by news coverage of identity-stealing housecleaners and child-neglecting babysitters. One bad review, deserved or not, can make a care worker de facto unhireable on a platform, which is the practical equivalent of having her account terminated.
Despite questions about client accountability, and even as these platforms take increasingly large cuts of their pay, many workers stay on platforms like TaskRabbit or Handy largely because of the layers of protection they provide, however meager those may be. Workers we've spoken to expressed uneasiness with taking these transactions off-platform, not only because platform companies often penalize them for it, but also because the platforms provide at least some possibility of accountability in case things go wrong. When the alternative is Craigslist or word of mouth, domestic workers make uneasy trade-offs with regard to safety.
Comparing Notes On Facebook
Some care workers have found support through Facebook groups–some with thousands of members–where they share advice, celebrate each other’s victories, and offer mutual aid in distressing and traumatic situations.
As researchers, we’ve spent months observing these groups, and have found that they offer a smaller public space where workers can more safely navigate the intimate politics of their unique work contexts. For example, Maia, a young woman who came to the U.S. to work as an au pair for an American family, told us that she has kept in touch through Facebook with many of the other women she met in her training program in Mexico, and was able to compare notes and seek out advice during her live-in employment. But these groups also extend into larger networks beyond the public spaces in which domestic workers have traditionally congregated, like public parks or connections through training programs.
Members post about a “dad boss” who makes them uncomfortable with sexual innuendo or inappropriate advances, or collectively think through the consequences of telling a charge’s parents about another parent or a coach who made them feel unsafe or threatened. They often encourage each other to seek out legal action. Threads can extend to hundreds of responses that may point the questioner in the direction of relevant state laws, workers’ bills of rights, lawyers who offer payment plans, or offers to help them find alternative employment. In these ways, they function as a community of workers who may never meet one another in person, but who nevertheless forge a culture of work across time and space. Just as in more traditional workplace settings, these interventions aren’t uniformly supportive or without contention; group members often disagree over whether to file police reports, quit jobs, or address harassment at all.
While these groups are valuable, they have limitations. Care workers are keenly aware of the illusion of privacy that even closed Facebook groups provide. While they’ve created their own codes of ethics for protecting the privacy of the families they work for (such as using code names, blurring faces in photos, and obscuring details), their own privacy is tenuous, in part because of Facebook’s real name policy, and because employers and nanny agencies occasionally infiltrate these groups to spy or dig up dirt on individual workers. As a work-around, an administrator of the group will often post on behalf of a member who wishes to remain anonymous.
These groups also don’t appear to include some populations of domestic workers. They are mostly English speaking and populated with digitally savvy workers who are already active on social media. (In its survey of nannies, caregivers, and housecleaners across the country, the National Domestic Workers Alliance found that 66% of respondents were foreign-born, and 47% of those were undocumented.) Moreover, these groups often work more like digital whisper networks, focused on immediate and ad hoc harm reduction or emotional support rather than more organized efforts to effect change for the industry. Still, these social media groups are a testament to the fact that gig workers are not as isolated or powerless as they may seem.
In a labor market where contingent and “gig” work arrangements are on the rise, domestic workers provide an instructive example of the trade-offs of an increasingly independent workforce. As Ai-jen Poo, the director of the National Domestic Workers Alliance (and Meryl Streep’s date to the Golden Globes), points out, domestic workers are the original gig economy workers, and they have long been working out some of the solutions that will allow other vulnerable workers to survive difficult conditions. The online groups that provide support for many independent domestic workers (and, as our colleague at the Data & Society Research Institute Alex Rosenblat has illustrated, for many Uber and Lyft drivers as well) may herald something we’ll only see more of, especially as higher-wage professionals also start seeing their job security weaken, become independent contractors, and need ways to cope with the conditions of atomized work.
Some of the most important open questions around sexual harassment come back to data, and our lack of it in the context of gig work. But having more statistics on sexual harassment in the gig economy, while important, would largely confirm what we already know to be true: that it is rampant, and that its frequency is directly correlated with a lack of accountability for preventing it and redressing its harms to workers. While new data sets can shed light on collective phenomena, they still direct our gaze toward the bad behavior of individuals and obscure the ways new technologies of work have transformed the very contexts of abuse and harassment.
These are perennial issues that have always faced domestic workers, but now, more than ever, we need a better understanding of the changing dynamics of how abuse is enabled. This means turning our eye toward the platform companies themselves, how their platforms are architected, what they decide to track or not to track, and how their technologies change the dynamics of user behavior and social interactions, sometimes for the worse.
Julia Ticona (@JuliaTicona1) is a sociologist who researches technologies of work, emotions, and inequality, and is a postdoctoral scholar at Data & Society; she will be joining the faculty of the Annenberg School for Communication at the University of Pennsylvania in the fall. Alexandra Mateescu (@cariatidaa) is a researcher at Data & Society who works on labor and tech issues, and has spent the last year conducting ethnographic research, in collaboration with Julia Ticona, on domestic workers and labor platforms.