Kathryn Stamoulis, PhD
2020 has been a terrible year for many Americans. The relentless spread of COVID-19. Distrust of science. Extreme political polarization. A contentious presidential race. A rise in conspiracy theories. Heightened depression, anxiety, and loneliness, less sex, and on and on.
All this negativity raises the question: How did we get here?
The Social Dilemma, a documentary directed by Jeff Orlowski, purports to have the answer: Big tech is largely to blame for many of our current societal ills.
The documentary, streaming on Netflix, examines the individual and societal impact of algorithms optimized to extract users’ attention and influence their behavior. The film blends interviews with tech experts, including many former employees and founders of Silicon Valley giants like Facebook/Instagram, Google, Twitter, and YouTube.

Before getting into the relationship between social media and societal ills, The Social Dilemma explains the basic business model of online companies that are “free” for users: an ongoing cycle of tracking users’ behavior, encouraging more use, and selling ad space. Many people know the “price of admission” for using these free platforms is targeted ads based on location and interests. But the probe goes much deeper. According to the documentary, every action we take online is monitored and measured to the millisecond to build a picture of who we are as individuals: what we like, what we don’t like, when we use our phones most, and when we need more nudges via notifications to bring us back. This information feeds the algorithm, which predicts our behavior and determines what content we’ll be served later.
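The feedback loop described above can be sketched in a few lines of toy code. This is a hypothetical illustration of the general idea, not any platform’s actual system: every interaction is logged, folded into a per-topic engagement score, and the feed then serves whatever the user has engaged with most.

```python
from collections import defaultdict

class ToyFeed:
    """Hypothetical sketch of an engagement-optimizing feed.

    Each interaction is logged and folded into a per-topic score;
    the feed then recommends whichever topic scores highest.
    Real systems track far more signals than this.
    """

    def __init__(self):
        self.scores = defaultdict(float)  # topic -> engagement score

    def log_interaction(self, topic, seconds_viewed, clicked):
        # Weight dwell time, with a bonus for an explicit click.
        self.scores[topic] += seconds_viewed + (5.0 if clicked else 0.0)

    def recommend(self):
        # Serve the topic predicted to keep the user engaged longest.
        return max(self.scores, key=self.scores.get)

feed = ToyFeed()
feed.log_interaction("cooking", seconds_viewed=3, clicked=False)
feed.log_interaction("outrage_politics", seconds_viewed=40, clicked=True)
print(feed.recommend())  # prints "outrage_politics"
```

Note that the loop never asks whether the content is true or healthy; it optimizes only for engagement, which is the documentary’s central complaint.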
The more a person uses a company’s app, the more money the company makes, so the goal is simple: keep users tethered to their technology for as long as possible, by any means possible. Keeping users “hooked” is where psychology comes in. Interviewed in the documentary is Chamath Palihapitiya, who joined Facebook in 2007 as VP of growth. He spent his time at Facebook experimenting on users, testing tactics that would work below the radar of conscious awareness to increase usage and engagement. He states, “Over time, by running these constant experiments, you develop the most optimal way to get users to do what you want them to do. It’s manipulation.” Shoshana Zuboff, a Harvard Business School professor emeritus who studies surveillance capitalism, says that Facebook has discovered, “We can affect real-world behavior and emotions without ever triggering the user’s awareness.” The user, she says, is completely clueless.
Thus, algorithms can be used to incite emotions like anger to keep users engaged online. A 2019 study from Social Media and Society found that angry people are more likely to engage in debates on social media with people having both similar and opposing views. While the study cannot determine causation, it concludes “it is reasonable to envision a spiral of anger, in which angry emotions are stimulated by an angry online debate climate, which in turn makes participants even angrier.”
As the film goes on, it pivots to explaining how data mining relates to many of our current social ills. Big tech sells data not just to companies selling sweatshirts or vacation packages, but also to political campaigns and foreign special interest groups. And because the algorithm wants us engaged, it does not care about truth or political motivations. The algorithm is amoral.
According to the documentary, fake news spreads six times faster than real news on social media. So not only is there no incentive for these companies to stop it; they are financially motivated to keep it going. A study by Digital New Deal found that engagement with deceptive news on Facebook was 102% higher in the run-up to the 2020 election than it was during the run-up to the 2016 election. And if someone engages with fake news online, the algorithm sends them even more. This, the documentary speculates, is why conspiracy theories like “flat Earth” and QAnon are prospering. Users are fed a steady diet of what sparks their interest (or what enrages them), creating a personal echo chamber.
Even the search engine Google, says the documentary, gives each user search results personalized to their interests and location. So while my first search result for “climate change” is an article from NASA, others may be directed to a “climate denial” conference or videos of people sharing their anti-science views. According to Pew, 70% of liberal Democrats trust climate scientists to give full and accurate information about the causes of climate change, compared with just 15% of conservative Republicans. In an interview with OneZero, Orlowski said, “I think that this business model of doing whatever is best for engagement will always privilege giving each person their own reality.”
If we are living in our own reality, with our own individual “facts,” how can we truly understand another’s? How can we come together and work on mending so many of our nation’s problems?
Things get dark in the documentary when the experts are asked about their worst fears if the algorithms go unchecked. “Civil war,” says Tim Kendall, who worked at Facebook and Pinterest. Jaron Lanier, a virtual reality pioneer and longtime technology critic who currently works at Microsoft Research, says, “If we go down the status quo for, say, another 20 years, we probably destroy our civilization through willful ignorance.”
It’s difficult for me to form an opinion on their fears or to judge how much big tech is responsible for social ills because I don’t understand algorithms. Many of the experts in the documentary who work on the algorithms say they don’t understand the algorithms, as they are constantly evolving. This, to me, is the most worrisome part of the documentary.
It is unfair to blame all of society’s problems on social media, when depression, loneliness, and political division were with us long before Facebook and YouTube. Certainly, social media didn’t cause the current pandemic. But mask-wearing (or not) has, for some, become an expression of ideology rather than public health. And the 2020 presidential election, according to many, is the most contentious in US history. There does seem to be something unique about this time.
Facebook responded to the documentary by saying that it has more than 70 fact-checking partners, that it has removed more than 22 million pieces of hate speech, and that it will ban new political ads the week before the election. Twitter has also made moves to identify misinformation. When White House social media director Dan Scavino tweeted an altered video designed to make it look like presidential candidate Joe Biden fell asleep during a speech, Twitter marked it as “manipulated media.” Still, the video was viewed 2.4 million times.
Can we truly parse out technology’s role in our lives, or its role in human behavior, if we don’t completely understand the algorithms? How can we measure the unique “reality” that each person experiences online?
Social media isn’t going anywhere, whether because users are “hooked” or because they believe the benefits outweigh the negatives. However, many people, including members of Division 46, are working hard to unmask the impact of social media and technology use. Researchers are also responding to the need for procedures to identify and decode fake news. More encouragingly, many people don’t agree with the documentary’s doom-and-gloom thesis (see Pamela Rutledge’s review). Regardless, as we address societal ills, whether from a personal or a psychological perspective, it seems we must understand both the “reality” we are being fed and the realities fed to those we hope to understand.