Is human behavior becoming increasingly predictable through the combination of intelligent algorithms and our digital footprints? What are the consequences of this development? These questions were at the center of the discussions at the 9th SURPRISE FACTORS SYMPOSIUM, where three internationally renowned experts were invited to share their insights with ACADEMIA SUPERIOR and to develop ideas and recommendations.
The discussions were moderated by the journalist Dr. Melinda Crane and the scientific director of ACADEMIA SUPERIOR, Univ.-Prof. Dr. Markus Hengstschläger. Members of the Scientific Advisory Board and the Young Academia also took part. "The technological developments, like the thesis 'Privacy is gone', are here, and we have to deal with them. Turning back the clock is not possible. So the big question is: how can we integrate these new technologies into our society so that it remains appreciative, benevolent, open, capable of criticism and responsible," said LH-Stv. Mag. Christine Haberlander, describing the aim of the discussions.
Privacy is an illusion
As a 2016 article titled "I just showed that the bomb is there" described, Prof. Michal Kosinski, PhD, of Stanford University has shown that with today's data analysis capabilities, people's behavior can not only be predicted but also influenced.
Kosinski underscored his provocative thesis that privacy is an obsolete model with clear figures and facts: "We all leave a digital footprint with our activities on the Internet. Already in 2012, a person generated a data volume of 500 megabytes per day. According to forecasts, this will rise to 62 gigabytes per day by 2025." This digital footprint results from the use of smartphones, social media, the Internet, voice assistants, surveillance cameras, sensors, credit cards and the like, because all "smart" devices record data.
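For a sense of scale, these two figures imply a compound annual growth rate of roughly 45 percent. A quick back-of-the-envelope check (assuming the binary convention of 1 GB = 1024 MB):

```python
# Back-of-the-envelope check of the implied growth rate in Kosinski's figures:
# 500 MB/day in 2012 growing to 62 GB/day in 2025.
mb_2012 = 500
mb_2025 = 62 * 1024           # 62 GB expressed in MB (binary convention assumed)
years = 2025 - 2012

ratio = mb_2025 / mb_2012     # roughly 127x more data per day
cagr = ratio ** (1 / years) - 1
print(f"{ratio:.0f}x growth, ~{cagr:.0%} compound annual growth")
```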
Facebook knows us better than our partners
In a study with 60,000 participants, Kosinski's team showed that only about 250 Facebook Likes are enough for an algorithm to assess a person in a personality test as accurately as his or her partner can. Such profiling can be used for individually tailored advertising or marketing messages.
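Conceptually, such models can be quite simple. The sketch below is purely illustrative, not the study's actual pipeline: it assumes a binary user-by-Like matrix and personality scores (both synthetic here), reduces the sparse Like space with SVD and fits a linear regression, broadly in the spirit of the dimensionality-reduction-plus-regression approach described in Kosinski's published work.

```python
# Illustrative sketch: predicting a personality score from Facebook Likes.
# X is a binary user-by-Like matrix (1 = user liked the page); y is a
# personality score (e.g., openness). All data here is synthetic.
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_users, n_likes = 5000, 2000                               # hypothetical sizes
X = (rng.random((n_users, n_likes)) < 0.02).astype(float)   # sparse Like matrix
true_w = rng.normal(size=n_likes)
signal = X @ true_w
y = signal + rng.normal(scale=signal.std(), size=n_users)   # add noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Reduce the huge, sparse Like space to a few latent dimensions, then regress.
model = make_pipeline(TruncatedSVD(n_components=50, random_state=0),
                      LinearRegression())
model.fit(X_train, y_train)

r, _ = pearsonr(model.predict(X_test), y_test)
print(f"predicted vs. self-reported score: r = {r:.2f}")
```

The reported finding can be read in these terms: with enough Likes per user, the correlation between predicted and self-reported scores matches or exceeds that achieved by a human partner's judgment.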
The desire to protect privacy is increasingly reaching its limits. "Companies may still be forced to respect it, but criminal organizations or individuals will always get access to sensitive data, and people's own convenience will do the rest," said the Stanford psychologist and data analyst. Moreover, the vast majority of people "share" their data voluntarily in order to use online services that make their lives easier and better; such services would not work without this data. The expert's conclusion is thought-provoking, because one could almost say: whoever does not release their data behaves antisocially and benefits from the fact that many others do.
But the development has darker sides: in his latest study, Kosinski showed that an artificial intelligence needs only five profile pictures of a person to classify his or her sexual orientation with 80–90 percent accuracy. "A circumstance that is rather secondary in our liberal societies, but in states where homosexuality is punished by death, it is a matter of life and death," Kosinski pointed out.
"Privacy is an illusion. The sooner you accept this reality, the sooner you can reasonably talk about the necessary policies." – Michal Kosinski
"I am also worried about data misuse, but I am convinced that 99.9% of algorithms are used positively, to help people. I am therefore in favor of accepting that privacy is a thing of the past, and I think we should focus on minimizing the risks and maximizing the benefits," Kosinski emphasized, adding: "Only if we accept this reality can we start to discuss the necessary policies and get the most out of the new technology."
Ethical rules for robots
Robot researcher and designer Univ.-Prof. Dr. Nadia Thalmann from Nanyang Technological University described her experiences with the use of social robots. Technological advances enable machines to recognize human speech, gestures and emotions from movement, sound and images, to remember people, to answer questions and to speak multiple languages.
The robot Nadine, created by Prof. Thalmann, is currently working on a trial basis in the customer service department of an insurance company. Preliminary results show that the robot not only responds to inquiries faster than its human colleagues but is also well accepted by customers.
The NTU Singapore researcher currently sees the future of her robots above all in the care of the elderly, an area facing a huge labor shortage within a few years. "Social robots could act as companions and support for people," Thalmann said.
"Politics and society should now decide what we want to allow robots to do. The time to discuss this is now." – Nadia Thalmann
To ensure that robots can really be put to good use, suitable framework conditions are needed. There must be rules for robots that govern their behavior: "Even if robots never really feel and only simulate emotions, we still have to anchor limits on their behavior in their software," argued Thalmann, who has programmed her robot Nadine to be honest.
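What "anchoring limits in software" might look like is an engineering question. The following minimal sketch is purely hypothetical (it does not describe Nadine's actual architecture) and shows one common pattern: every candidate action is vetted against hard-coded behavioral rules before the robot executes it.

```python
# Hypothetical sketch of behavioral limits "anchored in software":
# every candidate action must pass a set of hard-coded rules before execution.
# All names and rules here are illustrative, not Nadine's actual design.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Action:
    kind: str          # e.g. "speak", "move"
    content: str       # utterance or movement target
    truthful: bool     # flagged by the dialogue system

Rule = Callable[[Action], bool]

RULES: list[Rule] = [
    lambda a: a.truthful or a.kind != "speak",    # honesty: never utter known falsehoods
    lambda a: "medical_advice" not in a.content,  # stay out of regulated domains
    lambda a: a.kind != "move" or a.content != "restricted_area",
]

def vet(action: Action) -> bool:
    """Return True only if the action satisfies every behavioral rule."""
    return all(rule(action) for rule in RULES)

# Usage: the control loop drops any action the rule layer rejects.
candidate = Action(kind="speak", content="Your claim was approved.", truthful=False)
if vet(candidate):
    print("executing:", candidate.content)
else:
    print("blocked by behavior rules")
```

The design point is that such limits sit outside the learned or scripted behavior: even a robot that only simulates emotions cannot act on an output that violates the rule layer.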
Robot workforce
Will robots replace human workers? Robot researcher Thalmann sees no evidence for this in the near future, even though her tests and projects are extremely successful: "Nadine can handle some narrowly defined areas well, but she will not take on a full-fledged human job in the foreseeable future, only partial tasks," said Thalmann. She pointed out that, despite centuries of research, we understand only a fraction of human psychology and physiology. "But we know everything about our robots. They are not nearly as complex as humans," said Thalmann.
In general, the native Swiss, who now lives in Singapore, identified significant cultural differences in the use of robots between the US, Europe and East Asia. While Asians tend to be more open to new technologies, Europeans are more skeptical and cautious at first. "In Asia, for example, if you find out that a robot can do a job as well as or better than a human, then you will use a robot. Above all, efficiency is what counts there," said Thalmann.
More technology assessment needed
The journalist Susanne Gaschke warned against "digital dumbing down" and called for more intensive technology assessment in order to predict and minimize the risks of digitization: "In many cases, we use digital possibilities out of sheer convenience, without sufficiently considering their negative effects: online retail, for example, leaves inner cities desolate and adds to the traffic problem. And the huge amounts of data require ever greater storage capacity, with corresponding energy demands; it is a highly unecological system," said Gaschke.
Digitization also poses new educational challenges: "Adults and children need to learn how to find meaningful and accurate information online." News communication on the Internet also carries risks, said the journalist: "Extreme opinions are rated highly on social platforms and are accordingly widespread. Today, journalists deliberately exploit this logic and thus contribute to strengthening these ideas."
"You have to know a lot in order to get a lot out of the Internet." – Susanne Gaschke
Especially in education, the use of digital technologies should be carefully considered, because new studies show that children who come into contact with "digital distraction machines" too early may practice essential (human) skills too little. Nevertheless, digitization also has many positive sides. Regarding the shortage of workers in the care sector, Gaschke said that robots as nurses are still better than no care at all.
She pleaded for a broader discussion of both the negative and the positive consequences of technological development: "Technological progress is almost considered sacrosanct today. It can hardly be discussed publicly. But that should be possible."
Don’t do everything you can do
In the discussions, Markus Hengstschläger argued that humans should not implement everything that is technically or scientifically possible. In certain areas, where the risks cannot be predicted, caution is required. As an example of something that was possible but internationally rejected, the geneticist referred to the recently published cases of genetically modified embryos in China. Within a very short time, politics and science around the world, and even in China itself, decided not to allow such experiments.
"It is also up to politics to slow down technological development so that people can still keep up with it," emphasized Hengstschläger, underlining that more effort is needed to reach global agreements on the major challenges facing humanity. All of these new technologies can and should be used for the benefit of people, "but we need permanent ethical support in discussing technological innovations," said Hengstschläger. The moderator of the symposium, Melinda Crane, summarized: "We cannot stop these new technologies. But we can and must shape them, to minimize the risks and maximize the benefits."
SURPRISE FACTORS PLENUM
First results and insights from the symposium were presented at the PLENUM, attended by more than 700 visitors, at the Toscana congress center.
All videos and content from the PLENUM are available online.

SURPRISE FACTORS REPORT
The results and ideas of the symposium will be summarized in the SURPRISE FACTORS REPORT.