
Chatbots as a cigarette of the 21st century?

More and more young people who interact with AI chatbots are at risk of becoming addicted to media. That is the finding of a study by DAK-Gesundheit. Particularly striking: 33 percent of those surveyed said they felt better understood by AI chatbots than by real people. Chatbots could thus become the cigarette of the 21st century. A commentary.

AI chatbots put children at risk of addiction

  • AI models such as ChatGPT, Gemini and others are now an integral part of the everyday lives of many young people. The problem: AI is not only used for homework help; many children and adolescents increasingly interact with chatbots in their free time as well. According to an addiction study by DAK-Gesundheit and the University Medical Center Hamburg-Eppendorf (UKE), loneliness is one of the main reasons for this.
  • According to the study, about 7.8 percent of respondents said they use chatbots because they feel lonely. One in three said they feel better understood by AI than by humans and confide things to it that would otherwise be reserved for close friends. Children and adolescents with higher psychosocial strain, such as stress, anxiety or depression, showed a stronger attachment to AI chatbots.
  • The study examined not only children's and adolescents' use of AI, but also their use of social networks such as Instagram and TikTok, streaming services, messengers such as WhatsApp, and online games. The result: millions of adolescents have problems due to heavy media consumption. Around 1.5 million children and young people are already at risk of addiction or even affected by it. Meanwhile, politicians are discussing a social media ban for children, although the effectiveness of such a measure is controversial.

AI simulates human emotions

The AI consumption of many adolescents is not a minor footnote of digitalization, but a quiet paradigm shift with a cuddly voice. Chatbots are no longer just a tool: they simulate human traits such as patience, understanding and emotion, and they are always available but never annoyed.

Against this background, it is hardly surprising that many children and young people feel understood by AI. The problem: chatbots are often yes-men rather than critics. It is therefore highly questionable whether such simulated human characteristics are even necessary. They probably do more harm than good.

The providers' logic is as simple as it is effective: anything that always listens, never contradicts and delivers confirmation on a continuous loop keeps users inside its own ecosystem. This makes AI an echo chamber with addictive potential. Put differently: if social media is the cigarette of the 21st century, AI is a subtler, more personalized and harder-to-regulate e-cigarette.

What is striking is how strongly emotional needs become the point of entry. Chatbots are not the cause of loneliness, stress or excessive demands, but they can certainly exacerbate such phenomena. Those who become too accustomed to machine closeness may unlearn the friction of real relationships.

Voices

  • DAK CEO Andreas Storm takes politicians to task: "The persistently high level of media addiction shows that there is a great need for action. Sensible age regulation requires swift legislation by the summer break, so that the first measures take effect in the coming school year; we should act independently of an EU-wide solution. The growing trend of chatbot use shows that we are dealing with a new quality of digital media. This makes it all the more important to teach media literacy at school at an early stage."
  • Kerstin Paschke, study director and medical director of the German Center for Addiction Issues in Children and Adolescents (DZSKJ) at the UKE, in a statement: "By imitating human communication and through their often affirmative reactions, these systems encourage intensive usage patterns. Young people can thereby develop an emotional bond with the chatbot as part of a so-called parasocial relationship, which is associated with greater psychological distress and can promote problematic usage patterns. From the perspective of child and youth protection, stronger regulation, independent supervision and an age-appropriate design of these systems are therefore needed."
  • Janosch Dahmen, doctor and member of the Bundestag for the Green Party, told the German Press Agency: "We are currently experiencing what happens when an entire generation grows up with products that are specifically designed for maximum bonding and dependence. Especially in children and young people whose brains are still developing, this leads to loss of control, dependence and increased psychological distress, from anxiety disorders to depression."

Confirmation loops: AI puts children at risk

The most obvious answer to the problematic AI consumption of many children and young people begins where many debates end: in the classroom and within the family. Media literacy needs to get out of the PowerPoint presentation and into real life, not as a warning or an election poster, but as a practical understanding of how AI works, why it feels so pleasant, and where its blind spots lie.

Only those who understand how confirmation loops work are less likely to fall victim to them. At the same time, responsibility shifts back to the analogue world, to legal guardians, caregivers and teachers, not merely as a control authority with a screen-time stopwatch, but as a counter-offer to the machine.

Chatbots smooth conversations over and do not simulate friction or contradiction. Real communication is thus increasingly becoming a key competence and a protective factor. Yet without pressure on the providers, education and human interaction remain only a partial answer.

Companies that build systems that feel like friends cannot shirk responsibility when dependencies arise for precisely that reason. Against this background, age controls and transparency would be the absolute minimum standard. Here, too, AI reveals a problem similar to that of e-cigarettes.
