What is The Social Dilemma?
First streamed in 2020, the documentary takes up the looming conversation about humans’ addiction to technology, framed as a sort of exposé. Over its hour-and-a-half span, we hear from various tech experts and former employees of these companies about their time there. By bringing the giants of Silicon Valley into the conversation, the film demands a response from the very companies that dominate our devices: Facebook, Google, Amazon, Instagram, you name it…
Meet the Interviewees
As some of the earliest employees and innovators in the technology world, the interviewees felt their work was changing the world for the better. But they no longer feel this way. In fact, most of them have left their jobs due to similar ethical concerns.
With the Information Age came many new prospects; information that had never before been digitized became accessible to the average citizen. With this, however, came a new strain of capitalism, one extracted from the private human experience.
Former director of monetization at Facebook and president of Pinterest, Tim Kendall, admits that he and his colleagues were naïve about the trajectory of these technological advancements. Others note that as the algorithms running these platforms grew more sophisticated, they increasingly ran without human direction, without any sort of moral compass: algorithms that constantly adapt, getting better every day at pinpointing our interests. While the benefits of technology are undeniable, I am going to talk about the flip side: the subtle yet detrimental effects of the ever-decreasing gap between technology and humanity.
“Social media is a marketplace that trades exclusively in human futures.” – Shoshana Zuboff
Roger McNamee, a technology investor, recalls that in Silicon Valley’s earlier days, its business was to manufacture and sell hardware to customers. While that is still true, the industry has transitioned into a marketplace of consumer data. The goal is to maximize our screen time; our attention is what these companies are fighting for, and that is the business model. By predicting the behavior of individuals, through tracking and targeting, these platforms can guarantee the success of their advertisers. And once social media platforms realized that they could also influence real human behavior and choices, it was a game-changer. Cathy O’Neil, data scientist and author of Weapons of Math Destruction, calls algorithms “opinions embedded in code”, classifying them as non-objective. Even though a person writes the algorithm, the machine is optimized toward some goal, and for social media platforms that goal is profit and maximized screen time, not the health or wellbeing of the user. Over time the machine learns and adapts its algorithm to better achieve this goal. But it does not end there: these platforms can now influence people socially, economically, and politically, all without the immediate awareness of the user.
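O’Neil’s phrase “opinions embedded in code” can be illustrated with a toy sketch (entirely hypothetical, not any platform’s actual code): the “opinion” lives in the choice of objective. Here the ranking function rewards predicted engagement and nothing else, so it has no way to value user wellbeing.

```python
# Toy illustration of "opinions embedded in code". All names and
# numbers are invented for illustration; no real platform's logic.

def rank_feed(posts):
    """Order posts purely by predicted engagement.

    The objective is a design choice (an opinion): the optimizer
    never sees a 'wellbeing' signal, so it cannot account for it.
    """
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

posts = [
    {"id": "calm_news",    "predicted_engagement": 0.20},
    {"id": "outrage_bait", "predicted_engagement": 0.90},
    {"id": "friend_photo", "predicted_engagement": 0.55},
]

feed = rank_feed(posts)
# The most provocative item surfaces first, simply because the
# chosen objective rewards engagement above all else.
```

Nothing in the code is malicious line by line; the harm, as the documentary argues, is baked into what the system is asked to maximize.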
Scholar and author of The Age of Surveillance Capitalism, Shoshana Zuboff, defines surveillance capitalism as “capitalism profiting off of the infinite tracking of everywhere everyone goes by large technology companies whose business model is to make sure that advertisers are as successful as possible”. In her Channel 4 interview, she breaks down this concept using big technology companies, specifically Facebook and Google, the companies that perfected this business model and what Zuboff calls the “surveillance dividend”.
It starts with surveillance capitalists collecting data. Lots of data. In order to have predictions good enough to guarantee success to advertisers, these platforms must have databases of scale and scope, meaning both an abundance of data and a good variety of it. And with the digital revolution, a good chunk of digitized information is made up of our personal lives: what we willingly share online, but also extracted data, information that can be inferred from the little patterns in our usage of the internet and social media platforms. Any action we take on their platforms supplies predictive signals used to model human behavioral patterns. Then comes one step further: by feeding us curated content, we are induced into certain thoughts, and our mindsets and behavior can be subtly modified to push us into a desired consumer, social, or political cycle. These are thoughts we may not have had otherwise. This has been named the “digital nudge”, and it inherently limits our self-control.
Shoshana Zuboff classifies surveillance capitalism as a “collective action problem”, or put more familiarly, a social dilemma: a dilemma that humans collectively would be better off overcoming, but for various reasons cannot. Partially because not enough of the general population views the internet as a problem at all, and partially because, right now, this is a field that lacks regulation. It is a relatively new sector of innovation, and legislation cannot keep up with its exponential growth or with the immense legal power of big tech companies. As a result, companies like Google and Facebook have no incentive to stop what they are doing. If there is no threatening resistance and they are profiting immensely off our private human experience, why would they?
The Strategy of a Surveillance Capitalist
After analyzing the business history of Google, Zuboff has identified a four-step cycle of the surveillance capitalist. It begins with what she calls incursion, where they introduce some new product, application, or function and see how people react. These are usually not overtly malicious, so typically there is no widespread public outrage. But occasionally, when there is, they move into the habituation stage. This is where they console and pacify the general public with false assurances and apologies, using friendly rhetoric to make us believe that they are in it for us. Next comes the adaptation stage, where these platforms announce that they will make some changes and protect our privacy. Lastly comes redirection. Here they come out with an addition to their platform that is pitched as new and innovative but carries the same deceitful motive with a new name slapped on it.
Pokémon Go… a Social Experiment?
Now, I am sure you have heard about the augmented reality game, Pokémon Go. But what you might not know is that this game was developed under Google to test new limits of influencing peoples’ consumer decisions. Developed by John Hanke and Niantic, it was disguised as a familiar and friendly childhood game, but it was more of an experiment. Essentially, establishments would pay Niantic Labs to guarantee “footfall” in their vicinity by placing prized Pokémon nearby to lure customers into the business. So, working backward, this is the redirection stage that Zuboff identified. Previously, John Hanke had been involved in several projects, like Google Earth and the Street View cars, which specialize in surveillance and were met with some backlash and dissatisfaction. Moving forward, Google understood that its most frictionless path to achieving its goals is the one mostly hidden from the user… hence Pokémon Go. Had it been advertised as a Google development, perhaps people would have been warier about its ability to locate its players and herd them unquestioningly around their neighborhoods. Nonetheless, it quickly became a trend and was successful. With rewards and upgrades, the user feels a rush, but these little wins are the most trivial part. The user is too busy being entertained and pursuing the next Pokémon to stop and question what Shoshana Zuboff calls the “shadow operation” of the game.
So if this is such a violation of our privacy, why aren’t people outraged? How is this business model so successful?
The first answer is the extreme asymmetry of knowledge. Never before has a handful of companies held sway over the thoughts of billions of people worldwide. It is a power imbalance in which we know nothing about them, yet they know almost everything about us. So naturally, the first step is becoming more aware. But of course, this is not easy. One of the reasons these companies are so successful is their ability to function efficiently while bypassing our awareness.
However, there are more answers to this question. A large part of the problem lies in convenience. I am aware that big technology companies know alarming amounts about me, and that this makes me vulnerable, yet I still use my iPhone. I still use it because having it gives me a distorted sense of staying “connected”. Not only that, but it also makes my life quantitatively more efficient. I don’t mind that my phone can remember things for me so that I don’t have to keep track of everything in my head. I don’t mind that my Netflix account can serve me TV shows and movies that I will most likely enjoy. It makes my life easier, and I take convenience when it’s available to me. So, in the end, I seem to choose convenience over privacy. And when I do this, it doesn’t usually cross my mind how these things help algorithms track and predict my future consumerism and my future behavior.
Having grown up in New York City, I am accustomed to being surrounded by people who hold more or less the same political, social, and economic views that I do. Of course, I speak of the majority of the population, or rather the part of it that circulates on my social media pages: intentionally, the people who share views similar to mine. What I mean is that my social media page is, at its core, a mechanism of confirmation bias. Confirmation bias is the psychological tendency to favor and focus on information that substantiates what we previously believed. So, by filtering my content to that which either confirms what I already believe or is guaranteed to pull some sort of emotion out of me, thus increasing my engagement, the platform constantly exposes me to my own way of thinking. This leaves me with the false sense that most people hold the same values that I do.
On a large scale, this has led to a sensitivity where people can no longer tolerate those who disagree with them, fed by inflated senses of self-righteousness. We each see different content and different search suggestions based on past interests or geographical location. Each of us inhabits our own little reality, our own “truth”. Therefore, it is crucial to recognize that being human makes us impressionable, and that the platforms on which we find information are the biggest confirmation bias mechanisms we have available.
A good argument lays out both sides of an issue. This documentary does not do that, sparing maybe a sentence to briefly mention the good that social media has brought into our lives. However, to do more would have taken away from the urgency of the matter. This left Facebook with a lot to add…
Firstly, it is worth noting that Facebook made its rebuttal aesthetically pleasing, with a color-coded theme, beautifully laid out overall. This could be chalked up to professionalism, but to me it felt overwhelmingly polished, giving off an innocent and friendly effect while making its claims.
In Facebook’s rebuttal to the film, they said that it was based on sensationalism. There is so much misinformation and mistrust in the world of technology that it is difficult to know what to believe. Granted, we should view anything we see with some hesitation, but it is not difficult to determine which side of this fight is for the people. We have heard from Facebook whistleblower Frances Haugen that Facebook constantly “chooses profits over safety”. We have seen instances like the Cambridge Analytica scandal at Facebook and the Google data breach in 2018, where millions of users’ data were vulnerable to outside parties. So, is our data really safe?
And finally, back to the rebuttal. Facebook disagreed with the statements made in the documentary with confidence, but given its history of misleading users and breaching our privacy, I am not convinced. Look for yourself: what do you think?
A couple of interviews and articles to check out: