Sirion Individual Discussion
- Hubert Österle
- Jun 24
- 7 min read
Hubert
Welcome, Bernhard, to our discussion about machine intelligence. Our video “Sirion Capital” shows a scenario of life in the year 2040. A digital personal coach could then help us in our daily lives. As a 75-year-old, I’m curious to hear your perspective as an 18-year-old.
Bernhard
And I’m curious to see what you want to achieve with the video. What do you conclude from the scenario with Sirion and Laura?
Hubert
Bernhard, you are currently studying management at the MCI in Innsbruck. For a year now, you have been managing the social media channels for Life Engineering, with great success, I might add. You were also involved in the creation of our “Sirion Capital” video. I’m sure you’ve developed your own ideas of what your life might be like in 2040. Are you looking forward to it? Or are you afraid of it, like so many others?
Bernhard
Well, I already enjoy using the possibilities that machine intelligence offers today. But I hear all around me that artificial intelligence could turn against humans. Many would even prefer to ban the development.
Hubert
You see, that’s exactly the crux of the matter. It’s hard for us to imagine everyday life with machine intelligence. The “Sirion Capital” video aims to develop a realistic picture in collaboration with young people like you and to find fellow advocates for a human-centric future.
The video is aimed at STEM-oriented 20- to 40-year-olds, in other words students, researchers, entrepreneurs and politicians between the ages of 20 and 40 who are not afraid of computer science and technology, that is, those who can actually make a difference.
Bernhard
… may I interrupt you? Why only STEM-savvy people? Do you have something against other disciplines?
Hubert
Of course not, but rational doers change the world, not emotional aesthetes.
But back to your question about typical reactions. I can give you some examples:
· A successful high-tech entrepreneur wrote that Sirion frightened him.
· In contrast, a pharmacist states: “I will use Sirion for sure and even pay extra for it.”
· A doctor wonders whether the benefits of machine intelligence outweigh the potential harm.
· A venture capitalist says he would be shocked if Sirion only became a reality in 2040. He expects such technologies to be available within the next five years.
I’m actually a little disappointed. The central message of the video is that machine intelligence is creating an increasingly complex world. The individual MUST use machine intelligence in order to survive in this future world. This message is not being disputed, but rather denied.
But now to you: What does Sirion trigger in you?
Bernhard
To be honest, I’m in two minds. On the one hand, I think it would be great to have an all-knowing assistant at my side who would be an expert in so many areas and act as my advisor. On the other hand, it makes me feel a bit useless and incompetent. After all, I don’t want to be restricted in my independent thinking and actions.
Hubert
What do your fellow students think?
Bernhard
We often talk about issues such as data misuse, surveillance and manipulation. These are, of course, issues that worry us about machine intelligence. If Sirion gets to know us better than we know ourselves, we will ultimately be easier to influence.
Hubert
Why will Laura use the Sirion coach?
Bernhard
Well, Sirion helps her satisfy her needs in the long term. Sirion supports and advises her on important decisions and issues. This pays off, as it fulfills her need for security and self-esteem. I guess you look at Laura and Sirion with the network of human needs in mind.
Hubert
You are right. We revised Maslow’s hundred-year-old pyramid of needs. We have the basic needs of self-preservation and species preservation, namely food, energy, safety, health, sex and reproduction.
And we have the needs of selection, that is, community, appearance, power and knowledge, which constitute status and self-esteem. We need capital to fulfill these needs.
Bernhard
So how does Sirion meet these needs of Laura?
Hubert
Perhaps you could simply say that Laura feels overwhelmed by the complicated world. She would have to spend a long time looking at the options for financing a condominium. That would be exhausting. Laura’s decision to use Sirion could simply be called convenience, or better, the need to use her own energy sparingly. Laziness is in our genes.
Laura would also be afraid that her idea wouldn’t work and that she would embarrass herself in front of her friends. That would run counter to her need for status and self-esteem. The starting point is the need for capital, the wish to buy a condominium.
Do you think you could find a better solution than Sirion after your business studies?
Bernhard
No, I don’t think so. Firstly, the financial planning would also cost me a lot of time. And secondly, Sirion certainly knows the market and the legal situation better than I do, including relevant taxes and subsidies. I would also rely on Sirion and perhaps check the results again with my economic knowledge.
I know that there are many arguments in favor of Sirion. However, are there any reasons for Laura not to use Sirion?
Hubert
Yes, very serious ones. Laura will forfeit her skills in all areas in which she relies on Sirion. In doing so, she surrenders her power to Sirion and the rulers of Sirion. Her self-esteem suffers as she becomes aware of her dependence on Sirion. The subscription to Sirion increases in price from year to year, which drains Laura’s capital. But her capital suffers even more when Sirion’s investment advisor maximizes its commission rather than Laura’s capital.
Bernhard
That’s one thing. But even greater concerns arise when Sirion takes care of Laura’s and Hugo’s political opinions. Sirion, or more specifically the rulers of Sirion, gain a powerful tool for manipulating people. Their needs for capital and power could be more important to them than Laura’s well-being.
And what if Sirion pursues these kinds of goals undetected? For example, if Sirion prioritizes the needs of Apple, the ruler of Sirion? Or if Sirion promotes an ideology of racial superiority? Machine intelligence multiplies the potential for ideological manipulation.
If we foresee such dangers, are those who want to prevent the development of AI right?
Hubert
Can we really stop the development? No! The technopolists, thousands of start-ups and even governments are doing everything they can to win the race for machine superintelligence. Those who don’t play the game will lose and will later adopt the solutions of the winners.
Once Sirion is very highly developed, neither Laura nor the Human Life Board, nor even the rulers of Sirion, may be able to notice the manipulation.
Bernhard
How is the Human Life Board or Sirion supposed to know what is good for Laura? And will Sirion really only think and act for Laura’s benefit? At the end of the day, is a world with AI that benefits humanity just a delusion?
Hubert
Not quite. Perhaps there are ways to ensure that artificial intelligence is used for the benefit of people. Every intelligence, including artificial intelligence, pursues goals with its decisions. In the case of humans, we are talking about needs that are inherent in our genes. With machine intelligence, we are talking about the objective function, an algorithm. Perhaps it is possible to agree on an international objective function for artificial intelligence. If we could dream for a moment, we could imagine the following steps:
First, the derivation of an Open Behavior Model. Apple and other technopolists have access to the personal data of billions of users. From this data, they can recognize patterns of which actions are beneficial or detrimental to people’s well-being. Today’s recommender systems are very rudimentary, proprietary versions of a behavior model. However, we need an open behavior model that is transparent for all people and that all service providers can access.
Second, an objective function for the AI systems can be derived from the Open Behavior Model. Compliance with the objective function must be monitored by a Human Life Board. Such a board could be based at the UN or at a consortium of technopolists and large countries.
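To make the idea a little more concrete, here is a deliberately simplified sketch in Python. Everything in it is hypothetical: the need names, the weights, the recommendations and the audit check are illustrative placeholders, not a description of any real Sirion system. It only shows the principle that an objective function scores a coach’s recommendations against an open model of the user’s needs, and that an independent check can flag cases where the provider’s commission, rather than the user’s needs, would decide.

```python
# Purely illustrative sketch of an "objective function" derived from an
# open behavior model of human needs. All names, weights and numbers are
# hypothetical placeholders invented for this example.

from dataclasses import dataclass

# A minimal open behavior model: which needs exist and how strongly a
# given person currently weights them (real values would be derived
# from behavioral data, not hard-coded like this).
OPEN_BEHAVIOR_MODEL = {
    "safety": 0.9,
    "health": 0.8,
    "self_esteem": 0.7,
    "capital": 0.6,
    "community": 0.5,
}


@dataclass
class Recommendation:
    """An action a digital coach proposes, with its estimated effect
    (-1.0 to +1.0) on each need and the provider's own commission."""
    name: str
    need_effects: dict          # e.g. {"capital": +0.4, "safety": -0.1}
    provider_commission: float  # what the coach's operator would earn


def objective(rec: Recommendation, model: dict) -> float:
    """Score a recommendation purely by its expected effect on the
    user's needs, ignoring the provider's commission entirely."""
    return sum(model.get(need, 0.0) * effect
               for need, effect in rec.need_effects.items())


def audit(rec_a: Recommendation, rec_b: Recommendation, model: dict) -> str:
    """A toy version of a Human Life Board check: flag cases where the
    option with the higher commission beats the option that the
    user-centred objective actually prefers."""
    better_for_user = rec_a if objective(rec_a, model) >= objective(rec_b, model) else rec_b
    better_for_provider = rec_a if rec_a.provider_commission >= rec_b.provider_commission else rec_b
    if better_for_user is better_for_provider:
        return f"OK: '{better_for_user.name}' serves both user and provider."
    return (f"FLAG: user-centred objective prefers '{better_for_user.name}', "
            f"but '{better_for_provider.name}' pays the provider more.")


if __name__ == "__main__":
    fixed_rate_loan = Recommendation(
        "fixed-rate mortgage", {"capital": 0.3, "safety": 0.5}, provider_commission=0.01)
    leveraged_fund = Recommendation(
        "leveraged investment fund", {"capital": 0.6, "safety": -0.6}, provider_commission=0.05)
    print(audit(fixed_rate_loan, leveraged_fund, OPEN_BEHAVIOR_MODEL))
```

Of course, the hard part is not this arithmetic but deriving such a model and its weights credibly from the data of billions of people and auditing it, and that is exactly what a Human Life Board would have to accomplish.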
Bernhard
That sounds good, but also utopian. Do you yourself truly believe that this is feasible?
Hubert
Your skepticism is justified. But what are the alternatives? Containment, as Mustafa Suleyman, co-founder of DeepMind, proposed in his book The Coming Wave? That is even more unrealistic. Or ethical principles? On the one hand, these reflect the values of the old world and of a small elite; on the other hand, they can hardly be checked by a machine. Even so, we will have to rely on ethical values for the near future.
A transparent objective function for machine intelligence is ultimately also in the interests of the most powerful players, the technopolists and the political elites, because they are at the mercy of a superintelligence just like everyone else.
Bernhard
May I get back to my initial question: Will you use a digital coach like Sirion in 2040?
Hubert
Let me give you an example of the adoption of new technologies: 40 years ago, I introduced e-mail at the University of St. Gallen. How do you think my colleagues reacted back then? “I don’t need that and I don’t want it.” Today, life without e-mail and other forms of electronic communication at a university is inconceivable. ChatGPT has shown that a digital helper that takes tasks off our hands is embraced and adopted incredibly quickly.
So: I will definitely use a coach like Sirion. I won’t have any other choice. The world will be exponentially more complex than it is today. I won’t be able to fully comprehend and navigate all the necessary digital services myself. And I will be glad if a digital assistant takes over tasks such as tax returns, travel bookings and the admin of all my devices.
Bernhard
Does this ultimately mean that you welcome machine intelligence and, consequently, superintelligence?
Hubert
If Sirion’s goal is to optimize Laura’s quality of life, then Sirion will be a huge asset to all the Lauras of the world, because Sirion knows more than Laura. If Sirion’s goal is to maximize the capital and power of the technopolists, then the Lauras of this world will become consumer and worker slaves. That’s exactly why I want to mobilize STEM-oriented 20- to 40-year-olds through videos like the one with Sirion. We still have the chance to make the decision for all humans.
Bernhard
Dear viewers, what are you doing to ensure that machine intelligence serves the good of humanity? Why don’t you submit your thoughts in the comments below? If the topic of Life Engineering sparks your interest, why not join the conversation? Just get in touch with us at the emails below: bernhard.walter@gmx.at or hubert.oesterle@unisg.ch.
Thanks for watching our channel. You can find all of our videos via our linktree. linktr.ee/lifeengineering