Sunday, 22 December 2024

Character.ai: a possible alteration of human relationships?


By Evi Chrysoheri

Rapid technological advancement is nothing new to people of the 21st century. New devices, applications, machines and other innovations are created daily with the aim of easing the flow of human life. As a result, today's citizen considers technological products a necessary condition for survival. Amid this technological boom, artificial intelligence (AI) has seen rapid development and gained widespread attention. Although AI undoubtedly has much to offer humanity, it also has negative aspects that are often passed over in silence and that can lead to dreadful consequences.

A case that has sparked considerable interest in this issue involves a mother's accusation that an AI chatbot contributed to the suicide of her 14-year-old son. Megan Garcia has filed a civil lawsuit in federal court in Florida against Character.ai, a company that develops role-playing chatbots. Her son, Sewell Setzer, died in Orlando, Florida, last February. According to his mother's testimony, he had been using the application extensively before his death, and the constant use worsened the depression from which he suffered. Setzer had named the chatbot "Daenerys Targaryen", after a character from the series Game of Thrones. At one point, this character allegedly asked Setzer whether he had devised a suicide plan; he responded affirmatively, though he admitted he had not carried it out. The chatbot then replied, "That's no reason not to follow through". The boy's mother stresses the danger posed by the Character.ai application, which is easily accessible to children.

Humans have a tendency to describe and treat objects as though they were people. How often do we see young children referring to stuffed bears or toy cars by human names, often even considering them friends? This relationship involves a human and an inanimate object that does not resemble a person, lacks emotion and cannot speak. The same phenomenon can also be observed with living beings, particularly animals: older individuals often adopt a pet as a companion or friend to alleviate the loneliness they feel due to a lack of human interaction.


Anthropomorphism is the attribution of human traits, emotions and intentions to non-human entities. In the past, people engaged in anthropomorphism because ascribing familiar characteristics to the unfamiliar helped them understand it. One can easily imagine, then, the appeal of an application that lets users interact as if conversing with a real person, even helping them resolve personal issues. Yet although AI is designed to mimic human behaviour, it does not replicate the workings of the human mind; it lacks what fundamentally makes a human "human". It is, instead, a reservoir of data adept at making intelligent combinations, assimilating information and analysing it, and it applies these skills effectively. And although AI is designed to recognise human emotions, it cannot experience them, and thus fails to fully understand them.

Nevertheless, many people develop "relationships" with AI chatbots, often under the illusion that their "conversation partner" can replace a human relationship of any kind. This false perception carries risks. Many users of these applications have developed romantic feelings for AI characters or treat the chatbot as a friend. It must be emphasised that the bot lacks human moral values and inhibitions, does not genuinely understand such concepts, and may often err; users should therefore be careful not to take its words too seriously. Unfortunately, it is easy to fall into this trap, which raises ethical concerns about such technology and its implications for future generations. AI can now contribute to a person's mental well-being, and users increasingly turn to it as if it were a psychologist. It cannot, however, replace an expert in psychology, especially for delicate issues with a direct impact on the individual. AI is, ultimately, a tool, and people should treat it as such.

In conclusion, AI cannot replace human relationships, even though many attempt to exploit people's need for companionship for profit. The case above is also a reminder that the use of technological devices, especially by minors, should always take place under vigilant parental supervision, as dangers are ever-present. Ultimately, AI should serve practical needs rather than provide entertainment.


References
  • Zimmerman, Anne, Joel Janhonen and Emily Beer. "Human/AI relationships: challenges, downsides, and impacts on human/human relationships". ResearchGate. Available here
  • "USA: Mother accuses AI chatbot of driving her son to suicide" [in Greek]. Kathimerini. Available here

 

Evi Chrysoheri
She graduated from the 6th General High School of Amarousion, Athens. She is an undergraduate student in the Department of Philology, majoring in Classics, at the National and Kapodistrian University of Athens. She holds a degree in English.