Color and our emotions have an incredibly complex relationship. Recent research, for example, shows that our emotional state affects what colors we see: sad feelings actually hamper our ability to see some colors. For designers, the implications are huge: if color can change how we eat, sleep, and function, how should it be used? And can the way users express themselves with color actually reveal warning signs about their well-being?
A pair of researchers is studying just that. Harvard scientist Andrew Reece and Christopher Danforth, co-director of the Computational Story Lab at the University of Vermont, recently set out to see whether machine learning could detect depression based on a seemingly mundane source of information: Instagram.
“Social media is just another way that people communicate,” Reece explains over email. “When people feel different on the inside, they communicate differently.” It stands to reason, he continues, that communicating differently reflects changes in how we feel. And thus, “we should be able to figure out what’s changing in people’s psychology by looking at the way their communication patterns change on social media.”
The duo recruited more than 160 participants and asked them to detail any history of depression. Then, they collected and analyzed 43,950 Instagram posts for a number of details: Hue. Brightness. Saturation. Filter type. Even the number of faces, and metadata like comments and likes. They also asked humans, via Amazon's Mechanical Turk, to rate the photos for happiness and sadness. Then, using machine learning, they analyzed the photos for markers of depression based on a number of hypotheses.
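The color features at the heart of the study are straightforward to compute: convert each pixel from RGB to the HSV color space and average the hue, saturation, and value (brightness) channels. The paper's actual pipeline isn't public here, so this is a minimal sketch of that idea using Python's standard-library `colorsys` module, with pixels supplied as plain `(r, g, b)` tuples rather than loaded from a real image file:

```python
import colorsys

def hsv_features(pixels):
    """Return the mean hue, saturation, and brightness (value) of a
    list of (r, g, b) pixels with channels in 0-255. All three
    outputs are in [0, 1]; hue wraps around the color wheel, with
    blue sitting near 2/3."""
    hues, sats, vals = [], [], []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        hues.append(h)
        sats.append(s)
        vals.append(v)
    n = len(pixels)
    return sum(hues) / n, sum(sats) / n, sum(vals) / n

# A bright, saturated red pixel vs. a dark, desaturated blue-gray pixel:
print(hsv_features([(255, 0, 0)]))   # (0.0, 1.0, 1.0)
print(hsv_features([(40, 40, 60)]))  # bluish hue ~0.67, low saturation and brightness
```

In a real pipeline you would read the pixels with an imaging library and likely summarize each user's feed, not each photo, but the per-photo statistics are exactly these three averages.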
What they found was intriguing, as detailed in a preprint posted to arXiv this week. Social data, like comments, likes, and the number of posts per day, wasn't so closely correlated with depression. What was? Hue (roughly, how red or blue a color is), along with brightness and saturation. Depressed users tended to post pictures with higher hue values (meaning more blue), lower brightness, and lower saturation. Even specific filters fell along these lines: Inkwell, the black-and-white filter at the end of Instagram's filter list, showed up most often in depressed users' feeds.
“Depressed Instagram users in our sample had an outsized preference for filtering out all color from posted photos, and showed an aversion to artificially lightening photos, compared to healthy users,” Reece and Danforth write. At the other end of the spectrum, red-toned, bright filters like Valencia and Rise were more common among healthy participants. (Another interesting finding: depressed users applied filters less often and posted more photos containing faces, but those photos tended to show fewer faces apiece, implying they might be selfies.) In the end, their machine learning methods were able to predict depression in the majority of cases, even from photos posted before a user was officially diagnosed.
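The study itself used statistical machine learning models trained on many features, but the core intuition, that depressed and healthy users' photos cluster in different regions of color space, can be illustrated with a toy nearest-centroid classifier. Everything below is hypothetical data invented for illustration; only the direction of the differences (bluer, darker, grayer photos among depressed users) comes from the paper:

```python
def centroid(rows):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

def classify(x, centroids):
    """Assign x the label of the nearest centroid (Euclidean distance)."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(x, centroids[label]))

# Hypothetical per-user (hue, saturation, brightness) averages in [0, 1].
# The depressed group skews bluer (higher hue), grayer, and darker.
training = {
    "depressed": [(0.62, 0.25, 0.35), (0.58, 0.20, 0.40), (0.65, 0.30, 0.30)],
    "healthy":   [(0.10, 0.60, 0.75), (0.15, 0.55, 0.80), (0.08, 0.65, 0.70)],
}
centroids = {label: centroid(rows) for label, rows in training.items()}

print(classify((0.60, 0.22, 0.33), centroids))  # → depressed
print(classify((0.12, 0.58, 0.78), centroids))  # → healthy
```

A real model would use far more features, proper cross-validation, and a stronger classifier; this sketch only shows why low-dimensional color statistics can carry a signal at all.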
Could these tools be used to detect other psychological conditions–even, for example, something as generalized as anxiety? Reece says that with the right understanding of potential markers and the right tools for analysis, it’s entirely possible that “any kind of health condition that has an impact on the way people communicate is something we might detect on social media.”
The technology we interact with every day automatically creates a vast pool of user data. This data reveals much about how we act, think, and communicate, for better or worse. Paired with machine learning, which can run complex computational models over huge data sets, it reveals things about us that traditional science can't. Bing searches, for example, were recently shown to predict cancer before a user receives a diagnosis. Something as mundane as your Instagram filter, as Reece and Danforth show, speaks to your psychological health.
For designers, this emerging form of research could offer totally new insights into how users interact with technology–and not just in terms of color.