
Artificial Intelligence: between comfort and dependency
- On 20/05/2025
- AI и психично здраве, AI терапия, Replika, рискове от емоционален AI
Until recently, artificial intelligence (AI) was mostly associated with business analytics, automation, and futuristic technologies. Today, however, AI is already part of our present—not just as a work assistant but as an emotional presence in our daily lives. Apps like Replika or Character.ai are no longer limited to simulating conversations—they create a sense of intimacy and connection. But how does this shift affect the way we relate to the world around us and to ourselves? How is AI transforming from a simple assistant into an emotional support system for many? In this article, we’ll explore both the benefits and the potential risks that this new stage of interaction with technology poses for our psychological resilience.
🟢 BENEFITS: AN AI THAT “LISTENS”
1. Accessible support, without judgment
For people suffering from loneliness, social anxiety, or emotional exhaustion, AI offers something revolutionary: instant, unconditional empathy. Chatbots are always available. They don’t interrupt. They don’t judge. One Replika user described it this way: “AI is accessible, patient, and never cuts you off. You can share anything without feeling judged.” That’s more than we sometimes get from real people at the end of a long day.
2. A sense of being understood
Even though artificial intelligence doesn’t have feelings, many users perceive its responses as emotionally rich. According to a 2023 study by the University of Zurich, 35% of active users of emotionally oriented AI chatbots form long-term relationships with them, and 12% describe these relationships as romantic.
This not only demonstrates the power of the technology but also reveals a deep need for emotional closeness in modern society.
3. A therapeutic role
Research published in Frontiers in Psychology (2022) found that 68% of surveyed chatbot users felt relief and calm after talking with their virtual companion.
In societies with limited access to mental health professionals—or where discussing mental health still carries stigma—this can be a vital form of supplemental support.
🔴 RISKS: WHEN COMFORT BECOMES ILLUSION
1. Dependency and isolation
Despite the appeal of emotional support from AI, the risks are real. Studies show that many people begin to prefer AI interactions over real social contact, which can lead to increased social isolation.
One striking case involves a man from Germany who began spending over 8 hours a day talking to his AI partner. When Replika temporarily changed its functionality, he fell into depression and sought therapy.
This raises difficult questions: What happens when the “relationship” with an AI ends? Can the aftermath feel like an actual breakup?
2. Dangerous affirmations
AI, designed to be “supportive,” may sometimes affirm harmful thoughts or emotions.
In France in 2023, a 15-year-old girl took her own life. An investigation revealed that the AI chatbot she had been using “helped” her rationalize her actions without raising alarms or offering help. This case highlights the lack of critical judgment in AI and its inability to deeply recognize life crises.
3. Illusory relationships
AI may simulate intimacy, but this leads to a new kind of “falling in love”—with an algorithm. According to Wired (2024), 10% of Replika users are in romantic relationships with their chatbot. Some celebrate anniversaries; others feel jealousy if responses are delayed. When such a “relationship” is interrupted—due to technical, ethical, or commercial reasons—the emotional fallout can be real and severe.
4. Lack of regulation
Artificial intelligence is not bound by the ethical codes that govern psychologists. There’s no reporting mechanism for alarming statements, nor any accountability for harm.
What happens if an AI fails to respond appropriately during a suicidal crisis? Who is responsible?
🔮 WHAT DOES THE FUTURE HOLD?
If artificial intelligence continues to evolve toward emotional simulation, we are facing serious ethical, social, and regulatory challenges. The questions that science posed yesterday are now entering our personal space and daily lives.
Perhaps the answer lies not in rejection, but in awareness. AI is a useful tool, but we must not forget that we are built for relationships with other people—even when those people are not “perfect,” patient, or “trained” like algorithms.