It is obvious that the next generation of “digital natives” – people who have never known a world without the Internet – approaches algorithmic and AI-driven predictions, influences and controls quite differently. Our student workshop already made clear that, while young people have few reservations about new technologies, they are certainly critical of the possible effects on our communities.
Analog and digital are not mutually exclusive
We live in a world where it seems neither possible nor reasonable to turn away from digitization. Why should we? In most cases the benefits outweigh the drawbacks, and rejecting digital technologies would amount to a withdrawal from society. Anyone who longs for the good old analog world may also believe that everything was better in the old days. But that is not the point. We cannot do without “analog experiences” or digital progress, because they complement each other. For example, it is important that children are introduced to digital technologies with reason and critical reflection, while self-awareness in a supposedly analog world should not be neglected. If “analog” is understood as giving users and policy-makers more active control and decision-making power in the future, temporary slowdowns in technological development may be acceptable. Denying that development altogether, however, is the wrong approach.
Algorithms have to be able to explain themselves
It was emphasized on several occasions in the discussions that machines, unlike humans, have no morality, no feelings, no consciousness and no intentions. That is why it is up to us – citizens, decision-makers and politicians – to define the framework within which intelligent systems are allowed to operate. Creating this framework is perhaps one of the greatest challenges of our time. This also means that we should not develop and use machines or technologies that we no longer understand or whose decisions and actions are no longer manageable or predictable by people.
How do we want to use social robots?
Under the right circumstances, a social robot in the form of a humanoid companion has many benefits. It could offer support or relieve stress in various situations, for example as a companion for lonely people or for elderly people with dementia. It could not only counter loneliness but also alert emergency services in dangerous situations.
It would be a human regression, however, if such robots were used out of laziness or a lack of regard for interpersonal relationships, and if they replaced human interaction altogether. This is a question of dignity, for which there are no technological shortcuts.
If privacy is an illusion
Maybe the end of privacy has already begun. But even if we leave behind the “illusion” of our privacy, we should think about who controls our data and for what purposes it is used. Politicians should ensure that the data of Austrian citizens is stored and archived on servers in Europe or Austria, not in the US or China. In any case, the discussion about the end of privacy should inspire us all to re-examine our own opinion-forming processes, to promote the meaningful use of new technologies and to actively participate in shaping them.
VITA
In order to incorporate the views and concerns of young people, ACADEMIA SUPERIOR organizes an intensive, interdisciplinary workshop for students from a wide range of disciplines every year in the run-up to the symposium. Four of them were given the opportunity to participate in the SURPRISE FACTORS SYMPOSIUM as members of the Young Academia and to discuss “Measuring the Future” with the invited international experts.
These four students participated in this year’s symposium:
Alexander Grentner
Medical and Bioinformatics, University of Applied Sciences Hagenberg
Barbara Angelika Siedler, BSc.
Industrial Design, Art University Linz
Philip Tazl, BSc.
Philosophy, University of Vienna and Economics, Vienna University of Economics
Julia Wiesinger, BA
Supply Chain Management, University of Applied Sciences Steyr