
Both state and commercial mass surveillance risk transforming free democracies into surveillance states

The consequences of mass surveillance: How data collection threatens a free society

Authoritarian states use mass surveillance to control the population. Even in democratic countries, we see direct consequences of collecting absurd amounts of data. But there are also less visible effects: both state and commercial mass surveillance show signs of being able to transform free societies into the complete opposite.

Mass surveillance equals control. We find the most obvious examples of this in countries such as Iran, where the internet is censored, the inhabitants’ online behavior is controlled, and so-called smart cameras identify women who aren’t wearing a hijab.

Or in Russia, where the authorities combine mass online surveillance with a vast number of surveillance cameras using facial recognition to catch journalists and people critical of the regime.

Even worse is China, with its total surveillance of people’s online lives, the censorship tool known as the Great Firewall of China, and its persecution of people taking part in protests. And not least the country’s surveillance cameras, using technology claimed to be able to determine a person’s ethnicity. In 2018, Huawei and the Chinese Academy of Sciences applied for a patent for exactly this type of AI camera.

China’s uses of this type of surveillance technology include persecuting the Uyghur people in Xinjiang province. They are registered using technology dubbed ‘racial AI’, and Human Rights Watch has reported that during a nine-month period the state carried out 11 million searches on the phones of almost half of the 3.5 million inhabitants of Urumqi, Xinjiang’s capital city. The result of this mass surveillance? Documents obtained by CNN in 2020 showed that millions of Uyghurs were first monitored and then imprisoned in work camps on totally fabricated grounds. At the same time, it’s been reported that China tested another type of new technology on the Uyghurs, where AI cameras using ‘emotion detection’ were used to reveal emotional states. Naturally, the Chinese state denies this and in an interview responded to the BBC that in China, “People live in harmony regardless of their ethnic backgrounds and enjoy a stable and peaceful life with no restriction to personal freedom”.

Investigative journalist Liu Hu, who was denied the ability to travel on public transport because he had scored poorly in one of China’s social credit score systems, has another perspective. As he told the BBC: “There have been occasions when I have met some friends and soon after someone from the government contacts me. They warned me, ‘Don’t see that person, don’t do this and that’. With artificial intelligence we have nowhere to hide.”

Perhaps you’re wondering how China justified this new surveillance system that’s now persecuting entire ethnic groups? Well, it was introduced after five people were killed in 2016 in what the state described as a terrorist attack.

These countries have hit rock bottom. Things can always get worse for their populations, but we aren’t talking about free societies here. The question is how far the world’s democracies will follow in their footsteps.

In 2019, 70+ countries were subject to social media manipulation campaigns. The number of global democracies has been declining since social media emerged around 2010.

Center for Humane Technology

There are hundreds of terrifying examples, even in countries classified as democratic. In both Europe and other parts of the world, we’ve seen how Pegasus spyware is used to target dissenters, political activists and journalists. Mass surveillance in the USA is a chapter in itself, and Edward Snowden’s revelations showed how extreme the country’s authorities are when it comes to this activity.

This type of surveillance is reminiscent of George Orwell’s dystopian novel 1984, with its telescreens, ‘Big Brother is watching you’, thought police and lack of freedom of speech. But other elements of the old dystopian novels accurately predicted other parts of our current situation. Like the propaganda and obvious fake news in 1984. Or like Aldous Huxley’s Brave New World, where people get by on happy pills (social media and dopamine rushes, anyone?), are clearly anti-intellectual (TikTok, anyone?) and believe they live a good life despite the fact that their freedom has in fact slipped through their hands.

Large parts of the world have already sunk into some kind of cross between these two dystopias. And the countries still classified as free democracies now have a choice: either a society based on control or a society based on culture.

We are already seeing how mass surveillance comes with disastrous consequences in countries classified as democracies. But mass surveillance isn’t merely a symptom. It’s also used to control development and steer free countries in the wrong direction. There is a risk that, hand-in-hand, state and commercial mass surveillance will water down democratic societies. This is something happening right here, right now. In 2019, 70+ countries were subject to social media manipulation campaigns. The number of global democracies has been declining since social media emerged around 2010.

“We’ve created an entire global generation of people who are raised within a context where the very meaning of communication, the very meaning of culture, is manipulation.”

Meta and Google have become two of the highest-valued companies in world history thanks to income from their advertising networks, and their business concept is clear. It’s about mapping your behavior and predicting what you’re going to want in the future to tailor ads as accurately as possible. And even better if they can steer your behavior in the desired direction. As Harvard professor Shoshana Zuboff writes in her book The Age of Surveillance Capitalism:

“Automated machine processes not only know our behavior but also shape our behavior at scale. In the thousands of transactions we make, we now pay for our own domination.”

What Zuboff is talking about is, for example, Meta’s AI system, which according to leaked documents, as early as 2018 had the capacity to collect thousands of billions of data points every day to produce 6 million behavioral predictions per second.

Tristan Harris, former design ethicist at Google and later founder of the Center for Humane Technology, expresses the same thing in the documentary The Social Dilemma:

“We’re pointing these engines of AI back at ourselves to reverse-engineer what elicits responses from us. So, it really is this kind of prison experiment where we’re just, you know, roping people into the matrix, and we’re just harvesting all this money and data from all their activity to profit from. And we’re not even aware that it’s happening.”

In the same documentary, Sean Parker, Facebook’s first president, says the company was aware of what it was doing from the outset.

“I mean, it’s exactly the kind of thing that a hacker like myself would come up with. Because you’re exploiting a vulnerability in human psychology. And I think that we… you know, the inventors, creators, it’s me, it’s Mark (Zuckerberg), it’s Kevin Systrom at Instagram, all of these people… We understood this consciously, and we did it anyway.”

We were all looking for the moment when technology would overwhelm human strengths. But there's this much earlier moment. When technology exceeds and overwhelms human weaknesses. And this is checkmate on humanity.

Tristan Harris

The creators of the tech giants (at least, those who’ve left the companies) speculate that data collection and the AI engines analyzing billions of internet users could be the end of humanity. As Tristan Harris says:

“We were all looking for the moment when technology would overwhelm human strengths and intelligence. But there’s this much earlier moment… when technology exceeds and overwhelms human weaknesses. This point being crossed is at the root of addiction, polarization, radicalization, outrage-ification, vanity-ification, the entire thing. This is overpowering human nature. And this is checkmate on humanity.”

Jaron Lanier is one of the creators of virtual reality, but now he advocates for unplugging from social media for good.

In a conversation with Jordan Harbinger, he agrees that “social media can manipulate your behavior and it puts your free will under threat. It contributes to this mass production of misinformation.”

In The Social Dilemma, he says:

“We’ve created a world in which online connection has become primary, especially for younger generations. And yet, in that world, any time two people connect, the only way it’s financed is through a sneaky third person who’s paying to manipulate those two people. So, we’ve created an entire global generation of people who are raised within a context where the very meaning of communication, the very meaning of culture, is manipulation. We’ve put deceit and sneakiness at the absolute center of everything we do.”

Or as Shoshana Zuboff puts it in the documentary The Big Data Robbery:

“One of the things that Chris Wiley (the whistleblower who revealed the Cambridge Analytica scandal) said when he broke this story with the Guardian back in 2018 is that we knew so much about so many individuals that we could understand their inner demons and we could figure out how to target those demons. How to target their fear, how to target their anger, how to target their paranoia and with those targets we could trigger those emotions. And by triggering those emotions we could then manipulate them into clicking on a website, joining a group, tell them what kind of things to read, tell them what kind of people to hang out with, even tell them who to vote for”.

Absurd amounts of collected data and AI systems targeting human fears helped Trump win the election. Today, mass surveillance is used to monitor women who want an abortion.

Each of us who lives with social media in today’s digital world should think about the personal profiles that AI systems create. Are they used in a positive or a negative way? If someone is classified as depressed, does that person then see targeted content suggesting that they go out and run in the woods, or ads for one medication after another? If somebody buys unhealthy quantities of soda, do they get suggestions for an alternative lifestyle, or a discount on Coca-Cola? And does somebody who’s started reading about conspiracy theories get recommendations for university press books, or for sites about faked moon landings and how the Earth is flat?

A document leaked to The Australian revealed that Meta had offered advertisers the opportunity to target 6.4 million younger users (children) during moments of psychological vulnerability, such as when they felt ‘worthless’, ‘insecure’, ‘stressed’, ‘defeated’, ‘anxious’, and like a ‘failure’.

Let's not be naive. Our government will be tempted to annex these capabilities and use them over us and against us. When we decide to resist surveillance capitalism right now [...] we are also preserving freedom and democracy for another generation.

Shoshana Zuboff

The same tactics used to sell products and services are used to influence users in a particular political direction. Shoshana Zuboff again:

“Every aspect of Cambridge Analytica’s operations was simply mimicking a day in the life of a surveillance capitalist”. But instead of manipulating people for commercial purposes, they did it for political gain. Instead of a purchase, a vote. “Democracy is on the ropes in the UK, US, many other countries”, says Zuboff. “Not in small measure because of the operations of surveillance capitalism.”

Tristan Harris, who was formerly a design ethicist at Google but now runs the Center for Humane Technology, uses numbers to clarify how today’s data collection and the prevailing social media world affect politics: 9% of all tweets in the 2016 US presidential election were generated by bots. Ahead of the 2020 US election, Facebook’s top pages for Christian and Black Americans were run by troll farms.

And the algorithms that social media is based on are created to promote chaos: each word of moral outrage added to a tweet increases the rate of retweets by 17%, which accelerates polarization. Each negative word about political opponents increases the odds of a social media post being shared by 67%.

MIT researchers have published their own figures showing how fake news spreads faster than real news. And additional research has demonstrated that Facebook’s algorithms pushed some users into ‘rabbit holes’, which Meta knew about but did nothing to prevent.

In other words, we have a digital infrastructure that collects absolutely everything we do and which promotes radical and untrue content. It’s quite obvious that if you have such a system in place it will be exploited. Like when the company Cambridge Analytica (of which Donald Trump’s chief strategist Steve Bannon was formerly a board member) obtained access to 87 million Facebook users’ personal data (including private messages) that the company then fed into its own AI system. Out came personal profiles that Cambridge Analytica then used to tailor digital content aimed at people undecided about how they should vote in the presidential election between Donald Trump and Hillary Clinton. The sponsored posts built on the recipients’ fears, were designed in a radical way to trigger the algorithms and contained clear fake news.

In an interview, whistleblower Christopher Wylie talked about the consequences:

“You are whispering into the ear of each and every voter, and you may be whispering one thing to this voter and another thing to another voter. We risk fragmenting society in a way where we don’t have any more shared experiences and we don’t have any more shared understanding. If we don’t have any more shared understanding, how can we be a functioning society?”

In the Netflix documentary The Great Hack, Cambridge Analytica’s CEO says it wasn’t the only company involved in the election in this way. To this can be added the fact that Russian troll factories were once again causing havoc prior to the American election and the information that Cambridge Analytica is said to have been implicated in 200 elections around the world. A picture emerges of how commercial data collection has consequences far beyond targeted ads for that sweater you looked at that one time.

In other words, we’ve seen evidence of how collected data, together with algorithms and AI systems that build on people’s fears and uncertainties, was used to spread fake news so Donald Trump could win the presidential election. Once in power, Trump appointed the judges who would overturn the constitutional right to abortion, and now mass surveillance is being used to monitor women, who are facing the sudden realization that the abortion they want is illegal.

In the documentary The Big Data Robbery, Shoshana Zuboff urges citizens of democracies not to be so naive.

“Our self-determination, our privacy are destroyed for the sake of this market logic. That is unacceptable. And let’s also not be naive. You get the wrong people in charge of our government at any moment, and they look over their shoulders at the rich control possibilities offered by these new systems. And there will come a time when, even in the west, even in our democratic societies, when our governments will be tempted to annex these capabilities and use them over us and against us. Let’s not be naive about that. When we decide to resist surveillance capitalism, right now while it lives in the market dynamic, we are also preserving our democratic future and the kinds of checks and balances that we will need going forward in an information civilization if we are to preserve freedom and democracy for another generation.”

“Privacy is the fountainhead of all other rights. Freedom of speech doesn’t have a lot of meaning if you can’t have a quiet space, a space within yourself.”

What Shoshana Zuboff is talking about is resistance that must come now, before it’s too late. This is an important point. Because the infrastructure built today will be used by future governments. Because we don’t know who will come to power. And because this type of surveillance society tends to come creeping in, hidden from the masses. Function creep is rampant in this area. As we all know, the road to hell is paved with good intentions, and it’s difficult to detect the bigger picture when it’s being laid out one small jigsaw piece at a time. Every obscure small law that’s introduced may not represent a catastrophe, but together they’re taking us in the wrong direction. And the ultimate destination is crystal-clear: when a country has introduced total mass surveillance, people begin self-censoring. When they can’t be sure whether or not they’re being monitored, they hold their tongues. In a TED Talk, Glenn Greenwald, one of the journalists who met Edward Snowden in that Hong Kong hotel room and helped him get the word out, explains exactly how self-censorship is a highly developed control method that’s been used for several hundred years.

“In the 18th century, the philosopher Jeremy Bentham set out to resolve an important problem […] for the first time, prisons had become so large and centralized that they were no longer able to monitor and therefore control each one of their inmates. He called his solution the panopticon […] an enormous tower in the center of the institution where whoever controlled the institution could at any moment watch any of the inmates. They couldn’t watch all of them at all times, but the inmates couldn’t actually see into the panopticon, into the tower, and so they never knew if they were being watched or even when. This made Bentham very excited. The prisoners would have to assume that they were being watched at any given moment, which would be the ultimate enforcer for obedience and compliance. The 20th-century French philosopher Michel Foucault realized that the model could be used not just for prisons but for every institution that seeks to control human behavior: schools, hospitals, factories, workplaces. And what he said was that this mindset, this framework discovered by Bentham, was the key means of societal control for modern, Western societies, which no longer need the overt weapons of tyranny – punishing or imprisoning or killing dissidents, or legally compelling loyalty to a particular party – because mass surveillance creates a prison in the mind that is a much more subtle though much more effective means of fostering compliance with social norms or with social orthodoxy, much more effective than brute force could ever be.”

In the same TED talk, Greenwald also talked about the cooling effect that mass surveillance has on society:

“When we’re in a state where we can be monitored, where we can be watched, our behavior changes dramatically. The range of behavioral options that we consider when we think we’re being watched severely reduce. This is just a fact of human nature that has been recognized in social science and in literature and in religion and in virtually every field of discipline. There are dozens of psychological studies that prove it.”

Shoshana Zuboff:

“Privacy rights enable us to decide what is shared and what is private. These systems are a direct assault on human agency and individual sovereignty as they challenge the most elemental right to autonomous action. Without agency there is no freedom, and without freedom there can be no democracy.”

Edward Snowden:

“Privacy is what gives you the ability to share with the world who you are on your own terms for them to understand what you’re trying to be and to protect for yourself the parts of you that you’re not sure about that you’re still experimenting with. If we don’t have privacy what we’re losing is the ability to make mistakes we’re losing the ability to be ourselves. Privacy is the fountainhead of all other rights. Freedom of speech doesn’t have a lot of meaning if you can’t have a quiet space, a space within yourself, within your home to decide what it is that you actually want to say.”

It’s actually quite simple. Either we have a society where people have the right to their own thoughts, their own private conversations and space to test out their ideas. A free society, where development and change are possible. Where power can be challenged, examined and replaced. Or we have a closed society where you never know whether or not you’re being watched. Either we continue step-by-step towards undemocratic societies. Or we instead try to uphold Article 12 of the Universal Declaration of Human Rights: “No one shall be subjected to arbitrary interference with his privacy”.

Commercial mass surveillance risks transforming free democracies into controlling states. But what actually happens when the tech giants map our lives?

Today’s commercial data collection doesn’t merely risk tipping society in the wrong direction. It’s already being used in a way that has consequences for people all over the world.

Authoritarian states control their populations via surveillance. Democratic countries spy on people across the planet. Read more about global mass surveillance.

And if you’re still thinking ‘As long as you have nothing to hide, you have nothing to fear’, this article might give you cause to think again.