Are We All Lab Rats? 5 Things We Learned from ‘The Social Dilemma’
When Instagram came out in 2010, I was a freshman in high school with no smartphone, just a desktop computer I shared with my brothers. As someone who didn’t get an Instagram until college, I was often tasked with helping my friends take their posed “candid” photos and look over their captions before the final post. It struck me that my peers weren’t portraying their true selves but rather a manufactured version of their lives, and I promised myself that I would never become an addicted consumer. Fast forward ten years, with three Apple iOS devices within reach and one pandemic lockdown later, I find myself at the mercy of social media algorithms keeping me mindlessly engaged. And emotionally drained.
Netflix’s “The Social Dilemma” reveals the intended and unintended consequences of social media’s addictive human-centered design. Former employees of Silicon Valley tech companies, like Twitter, Pinterest and Google, condemn the very products they helped create. Isn’t that ironic?
My work exists at the intersection of design and tech. Even before watching this film, I was well aware of all the things we lose when we introduce a new app into our lives. Here are five things everyone should know about big tech companies, social media and how they use psychology in their products, as revealed in “The Social Dilemma.”
If you’re not paying for the product, you are the product.
Surveillance capitalism describes a market in which the commodity is our personal data and the buyers are advertisers. By hastily giving companies access to our data when we download their apps for free, we give them the tools to manipulate our behaviors and make money off our existence. Utilizing methods of persuasive design, social media companies have primed us to compulsively check our phones. The goal is to get you looking at content for as long as possible because if you’re online, they’re making money.
We are lab rats.
Ever delete an app only to download it again a few days later? At the beginning of shelter-in-place, I deleted Instagram for several months, only to find myself still checking for ghost notifications. Companies purposely use our psychology against us to keep us coming back for more. By implementing what is known as intermittent reinforcement, engineers and designers keep our phones buzzing with notifications. Each time you get a new text message, a like on your latest IG post or a photo tag on Facebook, the notification sends a hit of dopamine rushing through your brain. The more often we get these rushes, the more we crave them.
Our attention can be mined.
There’s a lot to love about social media. It has given me the opportunity to connect with family and has helped me foster a creative community with people around the world. But since 2011, there has been a remarkable increase in anxiety, depression and loneliness in teenagers. There are regulations that protect children and young adults from deceptive marketing tactics because we now understand the impressionability of young minds. Though some of these regulations have been extended to advertisements on social media, there’s still a grey area. Influencers and businesses on social media have to disclose whether a post is an ad, but a “regular” user raving about a specific product or lifestyle doesn’t necessarily fall within these same regulations.
Social media is destabilizing facts.
The pandemic has exposed, and perhaps intensified, the rapid and rampant spread of fake news. Even before the coronavirus hit, an MIT study from 2018 determined that false news “was 70% more likely to be retweeted than the truth.” Misinformation on social media perpetuates fear, racism and violence. And because feeds are custom-tailored to a user’s specific interests and location, news looks different for each of us. Seeing how different the YouTube homepage can look for people with varying beliefs reinforces my greatest fear: algorithms we have no access to shape our realities in profound and life-altering ways.
Regulation for change.
There are small things you can do to reduce your screen time and improve your mental health, like turning off all notifications, setting a time limit on your phone or simply deleting apps. But our individual efforts are not enough. As it stands, a lack of regulation has allowed companies to create their own rules. Ultimately, the companies that are innovating social media must be held responsible for implementing a more ethical and humane system. Accountability can’t be left to individual users.
Technology is only going to get more integrated into our lives as companies continue to develop new devices and applications. To be conscious of its effects on us psychologically and physically is a first step in taking back some ownership of our data and selves.
In my own design practice, I too have taken advantage of human behavior to extend how long users stay on my projects. I had always learned to produce content under the guise of “human-centered design,” implementing layouts, button placement and behavioral cues because that’s what users were used to. I know now that this is just a softer way to say “manipulative design,” and in many ways, we are trained to react positively to what is presented to us. The ethical goal for designers (myself included), engineers and corporations should be to implement humane design for a more equitable future.