Opinions do not necessarily represent CUIndependent.com or any of its sponsors.
Contact CU Independent Opinion Staff Writer Kim Habicht at email@example.com.
Most of us live double lives: our “real” life and our online life. Our online lives are more deliberate, curated and carefully managed, while our real lives remain organic and at times uncontrollable. Nothing lives in a vacuum, though; these two lives inevitably bleed into each other. A recent study by Andrew Reece of Harvard University and Chris Danforth of the University of Vermont shows us how true this is — and how your carefully crafted Instagram profile reveals more about you than you realize.
The study revealed significant correlations between elements of color (hues, contrast, saturation) on a person’s Instagram profile and their mental health. The researchers evaluated a group of 500 individuals and their Instagram profiles. Each participant was also given a standard clinical depression survey.
Reece and Danforth found that depressed individuals typically post darker pictures (they favor the “Inkwell” filter) and are more likely to post pictures containing faces, though with fewer faces per photo on average.
Armed with this knowledge, Reece and Danforth created an algorithm that detects depressed individuals with 70 percent accuracy. This is better than the rate of general practitioners, who typically detect depression with 47.3 percent accuracy.
Depression is one of the most common mood disorders in America and its symptoms can be devastating. An algorithm like this could be highly beneficial to general practitioners, psychiatrists and family members hoping to help those suffering from depression.
But there’s a dark side. Diagnosing a disorder en masse via a platform that is already dehumanizing and impersonal might alienate depressed individuals rather than help them. And for the 30 percent of Instagram users whom the algorithm misdiagnoses, a self-fulfilling prophecy might develop and come to haunt them. Being told by a computer that their social media profile has depressive tendencies might lead a person to question their mental health.
Furthermore, there is still a societal stigma associated with depression. A public “outing” of those who are diagnosed might be counterproductive, leading those flagged as depressed to become even more detached or isolated.
Optimistically, it would be psychiatrists who would use the detection software to target those suffering from depression. Realistically, you have to take a look at what’s already happening. Advertisers on Instagram currently make millions by using algorithms to target users based on their profile tastes. Advertisements are then combined with genuine user content so seamlessly that ads often go undetected. Imagine scrolling through your Instagram feed and being barraged with Zoloft and Prozac advertisements, just because you posted a picture with the Inkwell filter.
The reason people seek help and diagnosis is for treatment and recovery. Alone, the algorithm can only detect depression. It can’t outline a course of action toward recovery, it can’t prescribe treatment and it can’t provide the emotional support that is so integral to recovery for mental health patients.
It’s undoubtedly fascinating that an algorithm can identify depression more accurately than general practitioners, who have invested at least a decade in their medical education. But medical experts still have the upper hand; they can sit in front of real patients, visually detect cues and sense nuances that an algorithm cannot. The algorithm has access only to the curated, filtered and primed pictures on a single social media app, and yet it still diagnoses with more accuracy.
The moment we begin to truly rely on software and algorithms to detect mental health disorders is the moment we risk losing the things that make therapy — and therapists — so valuable to mental health patients: genuine connection, compassion and the ability to relate beyond the screen.