Corona or Google & Co: The data protection discussion distracts from the real problem
The Covid-19 pandemic has sparked the debate on privacy and fundamental rights. But the arguments sometimes seem downright absurd because they focus on data collection rather than data use. In doing so, they distract attention from the real danger: the already existing and rapidly growing manipulation of human beings. It is about more than just movement data.
Let us take a brief look at the year 2024. You are 31 years old, married, and the mother of two children. You are applying for a job as branch manager of a supermarket. In your first interview, the personnel manager of the supermarket chain asks whether you would get along with the employees of this store on a human level. You say yes, of course, but he follows up and confronts you with the fact that over the last few weeks you have only met people with a much higher level of education than the employees of this store, who earn correspondingly more. Furthermore, three of your contacts are drug users and five have a bad credit history. You manage to convince the personnel manager that, with your current employer, it was precisely these contacts that qualified you for the job. But when he asks about your frequent contacts with the head of the branch’s food department, including one lasting three hours in a hotel, you cannot deny the potential conflict in the working relationship, and you are out of the race for the job.
What does this have to do with coronavirus? If the Covid-19 pandemic keeps us under its spell for another two years, if the tracing apps reach the hoped-for 70% penetration rate but the end of Covid-19 is never officially declared, and the automatic collection of contact data therefore continues, precisely such a scenario is conceivable: if the agreed anonymization of the data is not adhered to, or if a shrewd data analyst finds a way to re-identify individuals by combining several data sources. Employment screening services such as GoodHire already attempt to prepare recruitment interviews thoroughly with the data available today.
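The re-identification risk alluded to here is technically well understood: two datasets that are each harmless on their own can often be linked through shared quasi-identifiers such as place and time. A minimal, purely illustrative sketch; all names, records, and timestamps are invented:

```python
# Illustrative only: linking a pseudonymized contact log to a retailer's
# customer database via shared quasi-identifiers (place and time).
# Every record below is invented.

contact_log = [  # pseudonymized tracing data: (pseudonym, place, time)
    ("user_93af", "Hotel Astoria", "2024-03-02 14:00"),
    ("user_93af", "Supermarket X", "2024-03-03 09:15"),
]

loyalty_db = [  # retailer data: (customer name, place, time of card use)
    ("Jane Doe", "Supermarket X", "2024-03-03 09:15"),
]

# Join the two datasets on (place, time): a single exact match is enough
# to attach a real name to the pseudonym.
matches = {
    pseud: name
    for pseud, p1, t1 in contact_log
    for name, p2, t2 in loyalty_db
    if (p1, t1) == (p2, t2)
}
print(matches)  # {'user_93af': 'Jane Doe'}
```

Real linkage attacks work on fuzzier matches (nearby times, coarse locations), but the principle is the same: the more sources are combined, the fewer records are needed to single a person out.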
A scenario like the one described above raises alarm among data protectionists, even if there are dissenting voices that justify employment screening, since it is aimed at establishing the truth, avoids deception, and prevents the employer from making a wrong decision. Furthermore, in the case of the contact-tracing app, the purpose of combating the pandemic justifies such means in the eyes of many citizens. It becomes even more dangerous, however, if the applicant for the post of branch manager is not invited to the interview due to an identification error and does not even find out the reason for the rejection. The media, especially social networks, repeatedly report such incidents.
Coronavirus tracing fuels the fear of digitalization
If we want to stop the spread of Covid-19 and return to a normal life, a tracing app that alerts us to contacts with infected people is an important tool. For weeks, epidemiologists and data protection experts have been debating how contact tracing can be done without putting data protection at risk and how far the interference with personal liberties can go. The uncertainty this creates will make it extremely difficult to attract the 70% of citizens needed to download and use a tracing app. It is probably not so much the fear of concrete misuse of data that stands in the way, but rather the uncertainty and fear of digitalization.
We should perhaps actually be thankful for the coronavirus pandemic if it forces us to think about an even more important issue: life in the digitalized world (information society). Irrespective of whether they are based on the European standard PEPP-PT (Pan-European Privacy-Preserving Proximity Tracing) or, for example, the Austrian Red Cross app, or whether they are country-specific apps based on Apple and Google APIs, the planned apps can make a significant contribution to ending a pandemic with sometimes devastating consequences for quality of life, while simultaneously jeopardizing our fundamental right to the protection of privacy.
Lifestyle apps are more dangerous
Healthy sleep is the prerequisite for a high quality of life, sleep deprivation makes us suffer — so much so that sleep deprivation is even used as an instrument of torture. It is therefore not surprising that numerous companies have for years been working with increasingly sophisticated techniques on so-called sleep apps, which are intended to help many people to sleep better and thus contribute to their well-being. Compared to a coronavirus tracing app, a digital sleep coach has a much greater appetite for data because it should take into account as many of the influences on sleep quality as possible. In future, a smartwatch will be able to detect not only heart rate but also blood pressure, skin conductance (a stress indicator), blood oxygen saturation, human movements, and ambient sounds. Sensors in the bedroom record the air quality, brightness, ambient noise, and electrical radiation. A sleep coach should always have access to human activities before sleeping. For example, it could tell whether a person was active on Facebook until shortly before going to sleep, watching a thriller on Netflix, or making a heated phone call in an agitated voice. Ultimately, an electronic sleep coach also needs access to the patient’s medical records in order to be able to include other factors in sleep behavior.
We surrender our privacy and autonomy voluntarily
When it comes to our own health, we quickly give up our privacy. But the benefit threshold at which we do so is in fact much lower. The use of social media, fitness trackers, games, navigation services, news and information, and ultimately business transactions such as travel booking and banking shows that we are willing to open up our privacy wide even when the benefits are not readily apparent. Just think of the value of posting and chatting on Instagram or of constantly “checking” your incoming messages.
When we agree to the terms and conditions of a digital service, and thus to the use of our data, we usually consider only the individual app, if we think about the data we volunteer at all, let alone read or even understand the privacy agreements.
The Chinese apps WeChat and Alibaba are regarded as the precursors of superapps that integrate all the services we need every day in a single app. In addition to the basic functions such as chat, payment, and shopping, they combine more and more specific functionalities, for example taxi call, event tickets, making appointments with the doctor, games, and last but not least the “coronavirus traffic light”, which displays a QR code that gives access to train stations or restaurants in China. The megaportals Facebook, Google, Apple, Amazon, and Microsoft are marching in the same direction by incorporating services of formerly isolated apps into their internet portals. They are working to ensure that their customers can get everything done without having to leave their app and without having to input data in one function that they have already entered in another.
The Internet of Things (IoT) will raise the volume, detail, accuracy, and timeliness of data to a new level, thus enabling a superapp, for example, to also use all the sensor data of the sleep coach mentioned above. From the heart rate, skin resistance, tone of voice, facial expression, etc., the superapp will be able to deduce whether you like the food in a restaurant or whether you are bored talking to certain colleagues. In his science fiction novel “Code Zero”, Marc Elsberg paints a world of “omniscient” digital assistants with almost pleasurable relish.
The superapps’ integrated databases will not only make it possible to respond to an individual in detail, but also provide the data that is required for the recognition of behavioral patterns through deep learning. For example, the digital sleep coach could learn from the data of many sleepers that heart rhythm disturbances, teleconferencing into the evening, and an early wake-up call the next day cause poor sleep, but that 15 minutes of meditation with quiet music or a short walk can prolong the deep sleep phase. “Customers who bought this book also bought these items” is a familiar recommendation based on a very simple pattern of behavior that we encounter in a similar form in many places on the internet.
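The “customers who bought this also bought” recommendation mentioned above rests on a very simple statistic: counting which items appear together in purchase histories. A minimal sketch with invented baskets; real recommenders are of course far more elaborate:

```python
from collections import Counter

# Toy purchase histories (invented): each list is one customer's basket.
baskets = [
    ["book_a", "book_b", "lamp"],
    ["book_a", "book_b"],
    ["book_a", "lamp"],
    ["book_b", "mug"],
]

def also_bought(item, baskets, top_n=2):
    """Return the items most often co-purchased with `item`."""
    counts = Counter()
    for basket in baskets:
        if item in basket:
            # Count every other item in baskets that contain `item`.
            counts.update(i for i in basket if i != item)
    return [i for i, _ in counts.most_common(top_n)]

print(also_bought("book_a", baskets))  # ['book_b', 'lamp']
```

The same co-occurrence idea, scaled up to billions of events and enriched with behavioral and contextual data, is what lets a superapp move from recommending books to recommending how you spend your evening.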
The creativity of millions of software and hardware developers and the motivation through money and power provide ever new functions for our digital companions. Two years ago, I installed Netatmo to monitor the air quality in my apartment (temperature, CO2 content, humidity, noise level); I am currently evaluating a smartwatch to log my physical activity; I am documenting and analyzing my mountain tours with Maps 3D; I am looking for a successor for my 13-year-old car that can recognize pedestrians and initiate emergency stops; I am now using Apple Pay, and have installed the Austrian Red Cross app, to mention just a few frequently used apps.
From the success of the megaportals it is clear that
- the market continues to be dominated by just a small number of megaportals,
- the superapps of these portals invade all areas of our lives,
- we are voluntarily surrendering our privacy to an ever-greater extent, and that
- the megaportals integrate personal and factual data and thus develop a comprehensive picture of people and their environment.
Megaportals are becoming the infrastructure of our private lives — much more than electricity, water and roads, and even public administration ever were. With their rules and procedures and their power, the megaportals are even competing with our legal systems and our nation states. Perhaps in a few years’ time we will ask ourselves why we in Europe have been arguing about handing over national powers to the EU instead of dealing with the issue of handing over decisions affecting our personal lives to the megaportals.
Quality of life is crucial; privacy and autonomy are part of it
Does this form of digitalization benefit or harm people? Will the superapps contribute more to people’s well-being or more to their woes? Our ultimate goal is happiness, or to put it more modestly, a satisfying quality of life. The fulfillment of our needs decides between joy and sorrow. In a simple quality of life model, privacy and autonomy are not identified as independent needs, but rather as prerequisites for needs such as health, security, power, status, and capital, and are therefore of great importance to us.
Megaportals increase consumption and capital
Facebook, Amazon, Microsoft, Apple, Netflix, and Google are so attractive that billions of people use them voluntarily and intensively. Their stock market valuation is equal to that of the largest 100 European companies combined because investors believe in their continued dynamic growth. If customer needs and capital fit together in this way, the world must be in perfect order — or is that not the case?
Facebook provides its users with information and messages they want to hear and see. Instagram shows us, among other things, what clothes our idols wear. YouTube offers millions of videos, some of which educate us, but most of which entertain us. Netflix offers us films whose heroes we like to identify with and whose values and behavior we adopt. Amazon offers an almost unlimited range of items that we didn’t even know we needed. In other words, the megaportals first create our values and our world view, and then satisfy the resulting needs with their own offers or offers they promote. The megaportals’ knowledge about us and our behavior enables them to influence us in a way that is individualized down to a single person and to offer the appropriate information, which we then find particularly useful.
Countless publications report on the history, motives, and business practices of these financially outstanding megaportals. Their success is ultimately measured by a single parameter: stock market value. Initiatives by their employees, their customers, their suppliers, non-profit organizations, trade unions, and consumer protection groups as well as politicians create certain boundaries for profit generation.
The profit maximization of the megaportals is based on “customer benefit”. This is the benefit that customers expect from a product: pleasure from food or sex, pleasure through an improved appearance, satisfaction through power and income. These are the feelings that arise from the short-term satisfaction of needs (hedonia), but they are usually very fleeting.
Advertising has always tried to persuade us to consume products and services, but digitalization is taking its possibilities to a new level. The quantified self of consumers, the growing understanding of our behavior, and the increasing knowledge about the functioning of the brain are the basis for a threatening manipulation. Our economic and social system uses our strengths and weaknesses almost without hesitation. This manipulation leads to environmental damage through nonsensical consumption, to consumer indebtedness, and puts us on a stressful treadmill where we have to pedal faster and faster to meet consumer expectations. On social networks such as Instagram, people of all ages present themselves as attractively as possible, thus fueling the race for prestige and status within their community.
Even if it sounds different in the vision and mission statements or in the advertising of companies, the lasting satisfaction of human beings (eudaimonia) is not the goal of companies, indeed it cannot be because satisfaction generates little consumption need. The very serious question to ask is therefore whether the megaportals with their absolute consumer orientation contribute more to people’s happiness or more to their misfortune. The only thing that is certain is that they know us to an ever-greater extent and that they help, support, guide, or manage us in ever more areas of our lives in order to generate turnover and profit. We are not merely leaving our data to the megaportals, but increasingly the design of our lives.
Social scoring can secure power and reduce conflicts
China’s social scoring system is regarded by Western media as the horror vision of a totalitarian surveillance state. Yet Western scientists are now beginning to examine it as a source of inspiration for our own social systems. Every society functions only on the basis of rules of conduct, from road traffic to business dealings to the use of violence. Education and training convey these rules, and the police and courts ensure that they are enforced.
The data collections of digital services provide new, additional ways of assessing people’s behavior. With their existing data collections, Google and Apple could already prepare a recruitment interview as described above, i.e. without any tracing app. We have become accustomed to the Flensburg Central Register of Traffic Offenders and to credit reports from agencies such as Creditreform and Schufa. Airbnb evaluates landlords, and Uber reports on the quality of drivers. Scientists can be rated via ResearchGate, Mendeley, Google Scholar, etc., and professorial appointments draw on the largely automatically derived h-index. Doctors and other professionals pay attention to their ratings on portals such as Healthgrades and Jameda. What is probably new about Chinese social scoring is that it draws on additional, machine-recorded data, forms a more comprehensive system, and publicly discusses its goals and criteria. It may surprise many that the vast majority of the Chinese population has welcomed social scoring in the pilot applications to date.
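Of the scoring mechanisms listed above, the h-index at least has a precise, public definition: a researcher has index h if h of his or her publications have each been cited at least h times. A minimal sketch of the computation (the citation counts are invented):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    # Walk the citation counts from highest to lowest; as long as the
    # i-th best paper has at least i citations, the index can be i.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have >= 4 citations each
```

That such a mechanically derived number can shape professorial careers illustrates the essay’s broader point: the score is transparent and objective, yet it quietly delegates a human judgment to a formula.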
If one assumes in a strongly idealizing way that the clients and developers of social scoring pursue the aforementioned goal of formulating and enforcing social rules, it can be an extremely valuable addition to education, training, and enforcement. It is transparent, based on automatic and therefore objective data collection, and uses reward and punishment instead of merely relying on sanctioning misconduct, as our penal legislation does. It works with small doses and does not merely become active when gross violations occur.
It stands to reason that the Chinese Communist Party sees social scoring as an instrument for maintaining and expanding its power, just as the politicians of Western-style multi-party democracies instrumentalize the media to get their message across and sometimes do not even shy away from spreading gross false news. Some states are rightly accused of allowing the power elite to use the judiciary for their own interests. We Europeans in particular have a great deal of experience of spying, repression, and abuse of power by the state, the church, associations, and clubs. We are aware that state security services can use virtually all digital personal data to ensure the safety of our fellow citizens or to detect threats that may be contrary to the democratically agreed social system.
Ultimately, social scoring, like the megaportals, is about influencing and guiding citizens. In contrast to the megaportals, its goal is not to increase capital, but to create a society that functions as well as possible. History shows, however, that even well-intentioned organizations with noble goals are easily misused to secure or expand the power of the ruling elite if no functioning control mechanisms such as an independent judiciary or free media are built in. Digitalization in particular offers an opportunity to significantly increase transparency while harboring the risk of spreading “alternative facts”.
Do we renounce privacy and freedom?
Digitalization has brought us countless services and products that help us in all areas of life. For example, who wants to do without navigation, internet searches, or weather forecasts? We have thus delegated decisions to the machines, because it is not you, but Google Maps that decides which route you take; not you, but the search engine that decides what information you find; and not you, but the machine that decides how the air pressure and other weather readings are interpreted. But we have not only delegated the decisions, we are also losing our skills: Young people in particular find it hard to reach a destination with only road maps and signs. We are certainly no longer aware of the fact that people used to be able to search for information manually (e.g. a library catalogue). And the ability to derive a weather forecast from air pressure, cloud formations and other phenomena is also lost to us.
There is no stopping the development of new digital services, and we will use them if they promise us sufficient benefit. Even if we reject certain services and can prevent them within our own sphere of influence, others will still work on them. If, for example, we do not develop a sleep coach because it wants or needs to access too much of our data, we will probably still use one if we suffer from insomnia and the depressive moods it causes.
It is time to accept further digitalization and mechanization in general, even if we are too overwhelmed to fully understand the possibilities and consequences. When complexity becomes too great for rational decisions, we tend to listen to our gut feelings and argue on the basis of our emotions. Behind lofty ethical claims such as the protection of privacy, autonomy, equality, and fairness, which regularly lead to demands for additional regulation, there may be a fear of living in a technicized world that we do not yet understand and do not want to acknowledge.
In the agrarian society, people at least felt that they understood the consequences of their decisions, such as fertilizing, and that they were the masters of those decisions. In the information society, we are regularly overwhelmed by its demands and have to call in legions of experts. With the current example of Covid-19, we are painfully experiencing that these experts, though they understand the complexity better than laypeople, still do not understand it well enough to arrive at consistent recommendations. The complexity seems unmanageable, and populists tap into people’s gut feeling of being overtaxed with their simplistic slogans.
If we want to influence development for our own good, we cannot try to stop technological innovation or leave it to others, but must control it ourselves. To do this, we need a better understanding of what constitutes quality of life in the first place, and we need competitiveness at least in individual technologies, in accordance with the motto “Lead or be led!”
A Life Engineering discipline must provide the basis for the organization of the information society. It must bring together the findings of psychology, neuroscience, computer science, philosophy, economics, management theory, statistics (machine learning), etc. A single scientist such as the writer of this article is overtaxed! I therefore invite readers to participate in the discussion on Life Engineering. Use the channels LinkedIn, Medium, Facebook, Twitter, and the blog on http://www.lifeengineering.ch/. Or do you think the topic is insufficiently relevant, too complex, and too abstract?
The measures aimed at combating the Covid-19 pandemic seem to be curtailing our fundamental rights. But compared to the impact of digitalization, these interventions are almost meaningless in the medium term. Since we are unwilling or unable to stop digitalization, we must face up to the changes. Whether we like it or not, we have to recognize that the megaportals, the innovation leaders, the best-capitalized start-ups, and the companies subject to the least regulation will determine the services we use. Technological and entrepreneurial backwardness means not only a loss of income and power, but also a loss of sovereignty. Andreas Göldi summarized the situation that led to the partial loss of our digital independence in his essay “A blind spot for the dark side: the monopolies we didn’t see coming”. It is to be hoped that a rational analysis of the possibilities and framework conditions, freed from wishful thinking and emotions, will prevail.