Digitization, Surveillance, Colonialism
As I write these words, articles are mushrooming in newspapers and magazines about how privacy is more important than ever after the Supreme Court ruling that has overturned the constitutionality of the right to have an abortion in the United States. In anti-abortion states, browsing histories, text messages, location data, payment data, and information from period-tracking apps can all be used to prosecute both women seeking an abortion and anyone aiding them. The National Right to Life Committee recently published policy recommendations for anti-abortion states that include criminal penalties for people who provide information about self-managed abortions, whether over the phone or online. Women considering an abortion are often in distress, and now they cannot even reach out to friends or family without endangering themselves and others.
So far, Texas, Oklahoma, and Idaho have passed citizen-enforced abortion bans, under which anyone can file a civil lawsuit to report an abortion and stand to win at least ten thousand dollars. This is an extraordinary incentive to turn personal data into an instrument of for-profit witch-hunting. Anyone can buy personal data from data brokers and fish for suspicious behavior. The surveillance machinery that we have built in the past two decades can now be put to use by authorities and vigilantes to criminalize pregnant women and their doctors, nurses, pharmacists, friends, and family. How productive.
It is not true, however, that the overturning of Roe v. Wade has made privacy more important than ever. Rather, it has provided yet another illustration of why privacy has always been and always will be important. That it is happening in the United States is helpful, because human beings are prone to thinking that whatever happens “over there” — say, in China now, or in East Germany during the Cold War — to those “other people,” doesn’t happen to us — until it does.
Privacy is important because it protects us from possible abuses of power. As long as human beings are human beings and organizations are organizations, abuses of power will be a constant temptation and threat. That is why it is supremely reckless to build a surveillance architecture. You never know when that data might be used against you — but you can be fairly confident that sooner or later it will be used against you. Collecting personal data might be convenient, but it is also a ticking bomb; it amounts to sensitive material waiting for the chance to turn into an instance of public shaming, extortion, persecution, discrimination, or identity theft. Do you think you have nothing to hide? So did many American women on June 24, only to realize that week that their period was late. You have plenty to hide — you just don’t know what it is yet and whom you should hide it from.
In the digital age, the challenge of protecting privacy is more formidable than most people imagine — but it is nowhere near impossible, and every bit worth putting up a fight for, if you care about democracy or freedom. The challenge is this: the dogma of our time is to turn analog into digital, and as things stand today, digitization is tantamount to surveillance.
Behind the effort to digitize the world there is a corporate imperative for growth. Big tech companies want to keep growing, because businesses are rarely stable animals — companies that are not on their way up are usually on their way down. But they have been so successful and are so gigantic that it is not easy for big tech to find room to grow. Like Alice in Wonderland, trapped in the rabbit’s house after growing too big, tech companies have their arms and legs sticking out the windows and chimney of the house of democracy. One possibility for further growth is to attract new users. But how to find fresh blood when most adults with internet access worldwide are already your users? One option, which Facebook is unscrupulously pursuing, is to focus on younger and younger children. The new target group for the tech company is children between the ages of six and nine. This option is risky. There are several investigations into Facebook and Instagram for knowingly causing harm to minors. What, then, are the other options for the expanding behemoths?
The preferred option these days is to digitize more aspects of the world. Despite the rapid advancement of digital technologies, most of our reality is still analog, even after the onset of covid. Most of our shopping is offline. Most readers prefer paper books. Much of our homes, our clothes, many of our conversations, our perceptions, our thoughts, and our loved ones are analog. That is, most of our experience has not been translated into ones and zeroes, which are the building blocks of digital technology. Experience, almost by definition, is directly lived, unmediated by a screen.
Tech giants wish to change all that. They share the desire to digitize the world because it is an easy way to gain more ground, to expand by enlarging the house. In this sense, digitization is the new colonialism. Digitization is the way to grow an empire in the twenty-first century. Everything analog is a potential resource — something that can be digitally conquered and converted into data and then traded, directly or indirectly. That is why Google keeps coming up with new products. Maps? Chrome? Android? Those were not designed for you. They are all different ways of collecting different data from you. That is why Facebook and Ray-Ban have together come out with new glasses that have microphones and a camera: more “data capture,” which in reality means the conquest of life by corporate avarice. That is why Apple is launching an augmented reality product, and why Microsoft is proposing a platform that creates three-dimensional avatars for more interactive meetings. And why Facebook — sorry, Meta — is insisting on its metaverse.
The tech titans assure us, of course, that their new inventions will respect our privacy. What they fail to mention is what I call the Iron Law of Digitization: to digitize is to surveil. There is no such thing as digitization without surveillance. The very act of turning what was not data into data is a form of surveillance. Digitizing involves creating a record, making things taggable and searchable. To digitize is to make trackable that which was beyond reach. And what is it to track if not to surveil?
A good example of the close link between tracking and surveillance is the AirTag. In 2021, Apple launched the AirTag: a small coin-like device with a speaker, a Bluetooth antenna, and a battery, designed to help people keep track of their items. You can attach an AirTag to your keys and link it to your phone; if you lose your keys, the device will ping nearby Apple products and use Bluetooth to triangulate its location, which you can see on a map on your phone. The AirTag can also beep to let you know where it is.
Keeping track of your keys seems innocent enough, but the AirTag is designed to track far more than that. You can track a wallet instead of keys, or a purse — and not necessarily your purse. Privacy and security experts warned Apple that AirTags would be used for stalking. In response, Apple said it had implemented a notification feature that alerts people with iPhones if an AirTag is traveling with them. But this measure is insufficient in several ways. First, many people don’t have iPhones, and Android users have to download an app to be notified through their phones; the vast majority have not downloaded it and likely never will. You might think that the phone notification is unnecessary, because AirTags are meant to start beeping at a random time between eight and twenty-four hours after being separated from their paired iPhones, but the beeping is so faint that people might not hear it. Moreover, eight hours is plenty of time for a stalker to follow and find his victim. Even if you have an iPhone, my own experience suggests that there is no guarantee you will be notified about an AirTag that is tracking you. A few months ago my brother and I rented a car through a peer-to-peer network. After a few hours of driving, my brother’s iPhone notified him that there was an AirTag around. The owner of the car had placed it in a locked glove compartment. My iPhone, however, never notified me of the AirTag — even after I had been near the car for more than twenty-four hours. We never heard any beeping.
The New Jersey Regional Operations & Intelligence Center issued a warning to police that AirTags posed an “inherent threat to law enforcement,” as criminals could use them to identify officers’ “sensitive locations” and personal routines. One year after their launch, there were at least 150 police reports in the United States mentioning AirTags, and recently, one murder case. That might not seem like much, but cases are likely in the thousands, given how many people might not notice they are being tracked or might not report it to the police. Not that reporting it to the police is of great help. Police often don’t know what to do about it; sometimes they don’t even take a report, which leaves vulnerable people (women, most often) unprotected.
Stalking affects an estimated 7.5 million people in the United States every year and, not surprisingly, it is on the rise. Last year a study by the security company Norton found that “the number of devices reporting stalkerware on a daily basis increased markedly by 63% between September 2020 and May 2021.” We are producing more and more technology to track — of course stalking is on the rise! To expect anything different would be to engage in self-delusion. In the pre-internet age, it was expensive, effortful, and risky to spy on someone. Today, you can buy an AirTag for $29.
What is most striking about the AirTag example is how foreseeable these issues were. It’s not that the AirTag was misused in any surprising or imaginative way. When an AirTag is used for stalking, it is being used exactly according to its design. Some dual uses of technology are surprising. Gunpowder was originally designed for medicinal purposes — who would have thought it might change war forever? But tracking technologies are designed to track — and tracking is surveillance, and surveillance amounts to control. Human beings are social beings, which means that most of the time what we are most interested in is other people. We should hardly be surprised when tracking technology is employed to track people, the most salient element of most people’s lives. AirTags are the tracking device par excellence. They are designed to track and to do nothing else. Yet smartphones, for all their many uses, are also tracking devices. Your phone can make calls and take photographs, but above all it collects information about you and others.
Too many people enthusiastic about digital technology are under the impression, as convenient as it is misguided, that if people consent to data collection, and if the data processing happens on our own phones or computers, there is no privacy problem. If only it were so simple. There are at least two reasons why privacy issues persist even when personal data is collected on our own devices.
First, there is no informed consent in data collection. The consent we give is neither genuine consent, because it is not truly voluntary, nor informed, because no one has any idea where that data may end up or what inferences may be drawn from it in the future. We are forced into “consenting” because if we do not consent we cannot be full participants in our society. There is no room for negotiation in platforms’ “terms and conditions.” It’s their way or the highway, and their way can change at any time and without warning. But we could not give informed consent even if we had the chance, because data is so abstract, and so unpredictable in the uses it may have and the inferences it may produce, that not even data scientists can give informed consent. No one knows what consequences today’s data collection will have.
Second, data creation is itself morally significant. The term “data collection” is somewhat misleading, in that it seems to suggest that to collect data is to assemble things that are already there. But data are not natural phenomena, like mushrooms that we find in the forest. We do not find data. We create data. Data collection implies data creation. And that act of creation is a morally significant decision, because data can be dangerous. Data can tell on us: whether we are thinking about changing jobs, whether we are planning to have children, whether we might be thinking of divorcing, whether we might be considering having an abortion. Data can harm people. For this reason, data creation carries with it a moral responsibility and a duty of care towards data subjects.
“What privacy problem can there be if the data is on the user’s encrypted phone?” a tech executive once asked me, assuming that users are in control of their phones and ignoring the many examples that show otherwise. Our phones have a life of their own. They send data to third parties without our even realizing it, for starters. Every phone connected to the internet is hackable. Domestic abusers can take advantage of technologies to control their partners and their children. If an abuser forces you to share your password, the data that your phone has created without your asking it to (where you have been, whom you have called, and so on) can work against you. A TSA officer can ask you to unlock your device at the border and can download your data. That can happen even if you are American, and even if it is your work phone, containing confidential professional data. The police can ask you to unlock your phone too. And who can guarantee that an insurer will not ask for access to that data in the future? If you take a commercial DNA test, even if only for fun, you may be obligated to disclose the results to your insurer. Can we be sure insurance companies will not ask for access to our smartwatches or smartphones some day? As soon as personal data has been created and stored, there is a privacy risk for the data subject, which then spills over into a risk to society.
The risks to society are significant and varied. They range from threats to national security (all that personal data can be used to extort public officials and military personnel, for instance) to threats to democracy, which will be my focus here.
Just like the old colonialism, digitization carries with it a certain ideology that it seeks to impose. It comes with ideas of what progress looks like. Old colonialism imposed a certain language, etiquette, clothing, social institutions, and ways of life. New colonialism imposes code, exposure as etiquette, a weakening of old social institutions, and ways of life that lead to societies of control.
Technology is never neutral. Tech companies find it convenient to present their products as neutral tools, but marketing bears little relation to truth. Artifacts inevitably embody values. We make artifacts so that they do something for us, and we wouldn’t bother making them if we did not value whatever it is that they do. Since technology is designed with a purpose in mind, artifacts end up having affordances. An affordance is what the artifact invites you to do. It is an implicit relationship between the designer and the user through the object designed. A chair affords sitting. We design things like buttons and handles to match our bodies, perceptive systems, and desires. A gun affords threatening, hurting, and possibly killing; it does not afford cooking. Pans and skillets afford cooking. Surveillance tools afford control; they afford the chance of keeping a close watch on something or someone. A camera allows you to watch anyone who appears within its field of view. And a camera is a tool for surveillance irrespective of whether the footage is encrypted and stored on your phone. This is not to imply that encryption is not important. It certainly is, because it adds much-needed security to sensitive data. But no amount of encryption will strip a camera of its affordance to surveil.
Contemporary surveillance tools are all too often two-way mirrors, which not only enable you to watch others but also enable others to watch you. They are also often camouflaged as some other kind of tool, like a phone or a TV. Before the age of the internet, surveillance tools were mostly one-directional. A Stasi agent monitoring a suspect in East Berlin through a wiretap could listen to her target without thereby opening the possibility of being wiretapped herself. But the internet allows for multidirectional flows of information. You might buy an Amazon Ring camera to watch whoever gets near your door, but that device allows Amazon (and your housemates) to learn things about you. It can track when you leave your home, and when you come home and with whom. It can also be used to inform the police (in some cases without your permission and even without a warrant). And anything that can go online can be hacked, so you are also signing up for the risk of criminals accessing your footage and using it, for example, to figure out when you are away so they can rob your home.
Your Ring camera is not only surveilling you — it is also watching and listening to your neighbors. After product testing showed that Ring routinely records audio conversations happening as far away as the opposite sidewalk, Senator Ed Markey asked Amazon to introduce privacy-preserving changes to its doorbell camera; Amazon recently rejected the request. Your neighbor could be recording the conversations that you have at your doorstep or in your driveway and could post them online. If you use a screen door and keep your front door open, a Ring device could be recording the conversations you have in your living room. The potential for blackmail, stalking, and public shaming is immense.
Other surveillance tools are much less obvious than a camera. Take something like Alexa. It’s a speaker that plays music. It is a timer. It can read you the news. It can allow you to order all kinds of products. It doesn’t look or feel like a surveillance machine, but it is keeping a close watch on you. Amazon wants to turn Alexa into an appliance that can predict what you want. For it to accomplish its task, it has to know you very well. Alexa collects data from what you say and shares it with as many as forty-one advertising partners. If you have not opted out, human beings might be reviewing what you tell Alexa. And, sure, you can have your data periodically deleted and opt out of human review, but your data will still be used to train Alexa, whether you like it or not.
In more than one out of ten transcripts analyzed, Alexa “woke up” accidentally and recorded something surreptitiously. The same thing happens with other digital assistants. An Apple whistleblower confessed to having “heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever.” The police might be interested in getting access to that data. Alexa recordings have already been used in all kinds of legal cases, from proving infidelity in a divorce case and identifying drug users in a household to providing evidence in murder cases. If the police can access recordings made by our devices at home, how is that different from having the police living under our roofs? We would never be at ease with the police living in our homes, so why do we invite Alexa in? Aren’t we uncomfortably close to building a police state, or at the very least building the infrastructure that could support an almost omniscient one?
In the 1990s, we owned the objects we bought. Today we still pay for our phones and doorbells, but they work for other people, and often against our own interests. And of course it’s not just AirTags, smart doorbells, smartphones, and Alexas. It’s your smartwatch, and your smart TV, and car, and electricity meter, and kettle, and laundry machine. Everything “smart” is a spy. And while every piece of data may seem uninteresting and innocuous, you would not believe how precise a picture emerges from joining the dots of all those data points.
Data creation and data collection will only increase if we continue the trend towards augmented and virtual reality. These technologies will want to collect much more data about everything, from your indoor spaces to the movements of your eyes. Eye-tracking technology will be crucial in creating a rich digital environment. It is likely that virtual reality will mimic human sight, which focuses on one thing and blurs the background. If everything is equally salient, it is harder to navigate your surroundings and easy to get motion sickness. To simulate our natural visual experience by rendering low-quality images in the periphery and high-quality images where the user is looking, the technology needs to identify what you want to pay attention to. Eye-tracking is the most important source of that information. Relatedly, eye-tracking can be used to increase the user’s ability to direct and control her experience.
Unfortunately, your gaze can be incredibly revealing. Your eye movements, iris texture, and pupil size and reactions can inform others about your identity (through iris recognition), state of mind (e.g., if you are distracted), emotions (e.g., if you are afraid), cognitive abilities (based on factors like how long you look at something before acting), your likes and dislikes (including your sexual interests), your level of fatigue (through analyzing your blinking), whether you are intoxicated, and your health status (by looking for patterns of eye movements that might be symptomatic of problems such as Alzheimer’s or schizophrenia). Even if some of these inferences might be scientifically questionable, experience suggests that companies are likely to try their luck with them anyway.
As we create and collect ever more personal data, it is becoming harder to avoid surveillance. Even if you leave your phone at home (I know, a big if), you might still get caught by dozens of cameras as you go about your day. If we plaster our cities with sensors of various kinds, there is no opting out, no escaping it. The danger lies in the long term. Surveillance is a slow-acting poison; its consequences are not immediately apparent. All of which leads to the surveillance delusion: the mistaken belief that surveillance has many advantages and no significant costs. For every individual decision, surveillance can seem like an attractive solution in the short term, when we imagine that all goes exactly as planned: it seems to keep us safer, it helps us track what we care about. But the long-term and systemic effects of surveillance are often overlooked. Under the surveillance delusion, only the benefits of surveillance are counted, and surveillance is taken to be a convenient solution to problems that could be solved through less intrusive means. But surveillance often creates weightier problems for democracy in the long run than the ones it solves.
Democracy is a complex house with many pillars sustaining it, and it can crumble so slowly that we might not know immediately when we are undermining it. Journalism, for example, “the fourth estate,” has long been considered an important pillar of democracy. Citizens have to be well informed enough about their society to be able to make autonomous democratic decisions, such as whom to vote for. When we reduce privacy, we weaken journalism. In July 2021, a leak revealed that more than fifty thousand human rights activists, academics, lawyers, and journalists around the world had been targeted by authoritarian governments using Pegasus, a hacking software sold by the Israeli surveillance company NSO Group. It is probably not a coincidence that the most represented country among the people who were targeted with spyware, Mexico, is also the deadliest country in the world for journalists, accounting for almost a third of journalists killed worldwide in 2020. When journalists do not have privacy, they cannot keep themselves or their sources safe. As a result, people stop going to journalists to tell their stories, and journalists quit their jobs before they lose their lives, or they focus on safe stories, and investigative journalism slowly dies, thereby gravely hurting democracy.
Some people think that if surveillance is done by corporations and not the government, the concern is lessened. Others think the opposite: that if surveillance is done by the government and only by the government, we will be safe. Both views are wrong: corporate surveillance is as dangerous as government surveillance and vice versa, and even peer-to-peer surveillance undermines ways of life that are supportive of freedom and democracy.
Giving too much personal data to governments grants them too much power, which can feed authoritarian tendencies. As I have argued, surveillance tools afford control, and when governments hold too much control over the population they become authoritarian. You might happen to trust your current government, but you cannot be sure that you will trust the next one. And you cannot be sure that a foreign power will not hack the data held by your government, or even invade your country. One of the first stops for the Nazis in a newly invaded city was the registry, because that is where the personal data that would lead them to Jews was held. The best predictor that something will happen in the future is that it has already happened in the past, and personal data has already been used to perpetrate genocide. A contemporary Nazi regime with access to the kind of fine-grained data we are collecting would be all but unstoppable. That alone makes surveillance reckless. China is already using its surveillance apparatus against “enemies of the state”: from minorities such as the Uyghurs and the Tibetans to the defenders of democracy in Hong Kong. We must dismantle architectures of surveillance before they are used against us.
Corporate surveillance is just as much of a problem. First, any data collected by companies can — and often does — end up in the hands of governments, whether through governments purchasing data, legitimately acquiring it (e.g., through a warrant or subpoena), or hacking it. In practical terms, corporate and government surveillance are indistinguishable. Moreover, corporations do not have our best interest at heart, and these days they are certainly not guardians of democracy or the common good. Thanks to corporate surveillance you can be unfairly discriminated against for a job, or insurance, or a loan. And personal data can be used to produce personalized propaganda, pit citizens against one another, and undermine civic friendship and democracy. Companies, after all, think of themselves as answerable only to shareholders.
Corporate surveillance is all the more worrying in the case of companies that can become more powerful than entire countries. Once again, this worry gives us reason to learn from old colonialism. At its height, the East India Company was the largest corporation in the world, and it had twice as many soldiers as the British government. Among its many sins were slave trafficking, facilitating the opium trade, exacerbating rural poverty and famine, and looting India. A senior official of the old Mughal regime in Bengal wrote in his diaries: “Indians were tortured to disclose their treasure; cities, towns and villages ransacked; jaghires and provinces purloined.” So it’s not only that powerful corporations can violate human rights. To some extent, they can also act like states when they are the protagonists of colonialism. As William Dalrymple puts it,
We still talk about the British conquering India, but that phrase disguises a more sinister reality. It was not the British government that seized India at the end of the 18th century, but a dangerously unregulated private company headquartered in one small office, five windows wide, in London, and managed in India by an unstable sociopath — Clive.
Just like at the end of the eighteenth century, corporations are leading colonialism in the twenty-first century. This time round it is big tech doing the looting (of our privacy, at the very least). They are the entities setting the agenda and imposing a culture of exposure around the world. Big tech companies benefit from our spending as much time as possible on their devices and platforms, sharing as much personal data as possible — which is why they sell the idea of exposure as a virtue: tell us what you feel, where you go, what you eat, what you think about other people, what worries you, and how we can make money off you. And if you don’t want to tell? Well, that must be because you have something to hide, which in big-tech-speak is not about protecting yourself from wrongdoers but about being a wrongdoer yourself. Big tech colonialism shames us into exposure for their own profit, and in doing so, they poison the public sphere.
Cultures of exposure are another good example of how surveillance leads to control. The pressure to overshare encourages social vices such as stalking and witch-hunting. If everyone is pressured into exposing their opinions and habits, it is only a matter of time before someone finds some of them objectionable and starts hunting people for their views. It is telling that something that used to be regarded as inappropriate — exhibitionism — has now morphed into something considered a social imperative — transparency. Some measure of transparency is certainly appropriate when it comes to institutions, but not when applied to individual citizens. Both exhibitionism and social policing breed “either-you’re-with-us-or-against-us” mentalities and thereby jeopardize civic friendship.
Liberal democracies aim to allow as much freedom to citizens as possible while ensuring that the rights of all are respected. They enforce only the necessary limits so that citizens can pursue their ideal of the good life without interfering with one another. But for a liberal order to work, it is not only governments and corporations that have to give citizens a space free from unnecessary invasions; citizens have to let one another be as well. Civility requires that citizens exercise restraint in the public sphere, especially regarding what we think of one another. To expect people to be saints is unreasonable. “Everyone is entitled to commit murder in the imagination once in a while,” as Thomas Nagel has remarked. If we push people to share more than they otherwise would, we will end up with a more toxic environment than if we encourage people to edit or curate or limit what they bring into the public sphere. A culture of exposure invites us to share our imaginary acts of murder, needlessly pitting us against each other. Sparing each other from our less palatable facets is not a vice, but a virtue. Protecting privacy — our own and that of others — is a civic duty.
Totalitarian societies tend to match institutional surveillance with peer-to-peer surveillance to achieve near-total control of the population. During China’s Cultural Revolution, people were encouraged to denounce their neighbors and even their family members. Children sent their parents to their deaths. The same thing happened in Stalin’s Soviet Union. The East German Stasi used an astonishingly high number of informants to infiltrate the general population. When we use social media for trolling, witch-hunting, and publicly shaming others, we behave more like subjects of totalitarian states than as citizens of free societies.
We resist the colonialism of digitization partly through culture. We defy digital colonialism when we value the analog, the unrecorded, the untracked. Tibetan Buddhist monks have a tradition of spending days creating beautifully intricate mandalas using colored sand. When they finish their work of art, they sweep it all away in a ceremony. The sand is collected in a jar which is wrapped in silk and taken to a river, where it is scattered. Sand mandalas are a homage to impermanence. Unlike paintings, which strive to resist the passage of time, sand mandalas are there to remind us that there is beauty in the ephemeral.
We challenge digital colonialism when we enjoy life without wanting to freeze it into a photograph. We resist totalitarianism when we decline to publicly shame someone for a mistake that anyone could have made. We preserve intimacy when we allow a conversation to go unrecorded. We stand up for democracy when we buy a paper book at a bookshop using cash.
Yet culture is not enough. We also need the right technology. Architectures of surveillance afford control over the population. Our current technology — all of it the result of engineering and corporate decisions, and none of it inevitable in its present configurations — is priming society for an authoritarian takeover. Analog technology is more respectful of citizens. We could also make digital technology less intrusive by creating and collecting less personal data, by periodically deleting data, and by improving our cybersecurity standards. In a global context in which a country such as China is exporting surveillance equipment to around one hundred and fifty countries, the job of liberal democracies is to be a counterweight to that authoritarian influence by exporting privacy through culture, technology, and legal standards.
We need the right regulation to match culture and technology, because collective action problems can only be solved through collective action responses. For starters, we should ban the sale of personal data. As long as personal data can be bought and sold, companies will not resist the double temptation of creating and collecting as much of it as possible, and then selling it to the highest bidder. The trade in personal data is jeopardizing democracy through personalized propaganda. We do not sell votes, and for many of the same reasons we should not sell personal data.
We should also limit the purview of the digital. Asking technology companies not to digitize the world is like asking builders to please refrain from paving over natural spaces. Unless society sets legal limits, profit-seeking will reign. Corporations will sell our democracies if it is lucrative enough and we let them. Governments create protected areas to restrain the impulse to build over every square inch. We need similar areas protected from surveillance. It is in the very nature of big tech to turn the analog into the digital, but turning everything into a spy is a threat to freedom and democracy. Full digitization equals total surveillance. There is some data that is better left uncreated. There is some information that is better left unstored. There are some experiences that are better left unrecorded.
Just over a decade ago, enjoying digital technology was a luxury. Increasingly, the luxury is being able to enjoy space and time away from digital technology. Spaces free of digital technology stimulate deeper connections between people, more honest conversations, free experimentation, the enjoyment of nature, groundedness in our embodiment, and an embrace of lived experience. That is why Silicon Valley elites are raising their children without screens.
We urgently need to defend the analog world for everyone. If we let virtual reality proliferate without limits, surveillance will be equally limitless. If we do not set some ground rules now on what should not be digitized and augmented, then virtual reality will steamroll privacy, and with it, healthy democracies, freedom, and well-being. It is close to midnight.