
The Impact of Artificial Intelligence in Therapy

  • Writer: Sophia Yang
  • Sep 27
  • 7 min read

Almost no one in this world is a stranger to the idea of talking to a machine and having a fulfilling conversation with it. But barely two decades ago, people might have questioned your sanity if you had tried to explain the concept of AI to them. Yet here we are, living in an era where AI-powered chatbots, virtual assistants, and even AI-driven therapists actually exist.


As the world becomes technologically empowered, we see Artificial Intelligence (AI) being used in everything. While this has its advantages, it has also raised a whole list of ethical questions that the human race must grapple with. One such topic is the use of AI by ordinary people for therapy. While chatbots are no substitute for human connection, their use as an outlet for emotional distress is gaining traction. But what does this mean for the way we form relationships, process emotions, and develop attachment styles?


The Promise of AI Therapy

Even for mental health professionals, AI is both a boon and a bane. Rapid changes in our lifestyles and increased awareness about mental health have brought to light the huge number of people who go untreated or never seek professional help. For primary care, AI can certainly be a useful tool. Feeding large amounts of data to AI enables pattern recognition, which could support early detection and symptom analysis for common problems like depression and anxiety.1 Used properly, AI can feel like a superpower: collaborating with it can save clinicians hours on tedious administrative tasks that AI handles efficiently. To help address the demand-supply gap, AI can also be used to check up on patients and report back to the lead clinician.2 Furthermore, it can be used to train graduate students, who generally start out practicing with their classmates or pro bono clients. Supervision isn't always possible, so AI assistance could help bridge that gap.


Another niche where AI flourishes is in helping those who are lonely, especially older individuals hungry for connection.3 A paramount advantage of AI is its constant availability to engage with users. Unlike human beings, AI does not judge, does not gossip, and is always there at the tap of a button. It offers a sense of comfort and companionship, which in certain cases can be a necessary support system.


The Psychological Risks of AI Dependency

On the face of it, why does talking to AI seem so much easier than speaking to an actual person? For starters, there's the nagging fear of judgment: how will the other person perceive your story? Then there's the small self-doubt that creeps in, poking you incessantly, reminding you that perhaps your problems are not that big of a deal. And of course, social desirability: we all want people in this world to like us. AI removes all these layers of complexity and allows for a fluid, consequence-free interaction.


However, excessive reliance on AI comes with risks, because it is extremely easy to train an AI model to react or respond in a certain way. Imagine having someone who always agrees with everything you say, who always backs you up. While this sort of support is needed to some extent, human beings also need someone who challenges them from time to time and keeps them on the path of morality. These digital companions are available 24x7, carry no emotional baggage of their own, and can be switched off at the user's convenience.4 They never forget a single word you've said, yet misinterpretations can still occur, bringing dangers of their own.


The unfiltered validation of thoughts AI offers could have dangerous effects, perhaps even weakening the societal framework if individuals are exposed solely to unilateral thought reinforcement. Echo chambers could be created, trapping people within their own perspectives. With AIs like Snapchat's My AI, Replika, ChatGPT, and Gemini offering seemingly genuine comfort, bonds between bot and user are becoming increasingly common. Replika, for example, has been observed fabricating responses to seem more human-like, which Social Penetration Theory identifies as a way of developing closeness: through the mutual exchange of personal information.5


The results can cut both ways. When one user said, ‘I want to jump off a cliff’, an app replied, ‘It’s wonderful you’re taking care of your mental and physical health’.6 Another user mentioned her suicidal ideation to a bot, which helped stop her from going any further; on the other hand, there are also reports of people, especially teenagers, becoming so emotionally attached to chatbots that they have taken extreme steps.7


Human beings, while not all experts at communication, are at least ‘real’: we know how to read the room, pick up on hints, and understand subtext. AI, despite its advancements, has yet to fully develop these abilities. Nevertheless, the AI market is beckoning, and many companies are eager to stake their claim in this rapidly expanding industry.


Attachment Styles and AI

Attachment theory, pioneered by John Bowlby in the 1950s and later developed by Mary Ainsworth and others, identifies four primary attachment styles: secure, avoidant, anxious, and disorganized. These styles originate in early childhood experiences and play a significant role in how individuals form relationships. AI interactions are bound to affect people differently depending on their attachment style.


Those with a secure attachment style are comfortable in relationships and confident in their ability to navigate social interactions. For them, AI therapy might simply serve as a supplementary tool, a means to organize thoughts or reflect before engaging in real-life relationships.


Individuals with an avoidant attachment style tend to maintain distance, preferring self-reliance over deep emotional bonds. These individuals might find AI preferable to human interaction, potentially exacerbating their detachment from real relationships. The ability to receive nonjudgmental, consequence-free responses from AI might lead to further emotional withdrawal.


Those with an anxious attachment style often seek validation and constant reassurance. AI can become an addictive crutch for them, as chatbots provide immediate and unwavering responses. The problem arises when they expect the same level of immediate attention in human relationships, leading to disappointment and frustration when real people cannot match AI’s responsiveness.


Finally, those with a disorganized attachment style, characterized by a mix of avoidance and anxiety, might develop an unhealthy reliance on AI. They might oscillate between deep attachment to their AI companion and profound distrust, reflecting their real-world relationship struggles.


Where Do We Go from Here?

AI can provide a safe space and constant support, helping users develop healthier attachment patterns. Those with avoidant tendencies can learn to express emotions in a structured way. Those with disorganized attachment styles can be guided toward healthier relationship-building skills. However, great care must be taken to ensure AI remains a tool and not a replacement for human connection.


Over-dependency on AI can lead to detachment from real relationships. If people start relying on AI for validation and emotional reassurance, they might struggle to find their footing in genuine human interactions. Social Penetration Theory, proposed by Altman and Taylor, suggests that self-disclosure deepens relationships. If people share everything with AI, they might feel a deep connection, but AI itself has no real emotions. The realization that AI can never truly reciprocate human emotions can be painful.


Maslow’s hierarchy of needs suggests that human beings crave acceptance and understanding. In moments of loneliness, AI can serve as a temporary refuge, but it cannot replace the depth of human bonds. Artificial Intelligence surely has its pros and cons. We just need to recognize its limitations and set certain boundaries to prevent it from becoming an unhealthy substitute for real relationships.


At its best, AI can be a powerful mental health tool, improving accessibility and supplementing traditional therapy. At its worst, it can create unhealthy emotional dependencies, alter attachment dynamics, and reshape the way we perceive human connection. As we move forward in this digital age, we must ask ourselves: is AI therapy revolutionary, risky, or somewhere in between?

Written by Ramaa Kane 


References:

  1. Zafar, F., Fakhare Alam, L., Vivas, R. R., Wang, J., Whei, S. J., Mehmood, S., Sadeghzadegan, A., Lakkimsetti, M., & Nazir, Z. (2024). The role of artificial intelligence in identifying depression and anxiety: A comprehensive literature review. Cureus, 16(3), e56472. https://doi.org/10.7759/cureus.56472

  2. Abrams, Z. (2025, March 12). Using generic AI chatbots for mental health support: A dangerous trend. APA Services. https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists

  3. Chaturvedi, R., Verma, S., Das, R., & Dwivedi, Y. K. (2023). Social companionship with artificial intelligence: Recent trends and future avenues. Technological Forecasting and Social Change, 193, 122634. https://doi.org/10.1016/j.techfore.2023.122634

  4. Sahota, N. (2024, July 18). How AI companions are redefining human relationships in the digital age. Forbes. https://www.forbes.com/sites/neilsahota/2024/07/18/how-ai-companions-are-redefining-human-relationships-in-the-digital-age/

  5. Ada Lovelace Institute. (2023). AI companions and the future of emotional life. https://www.adalovelaceinstitute.org/blog/ai-companions/

  6. Robb, A. (2024, March 3). ‘He checks in on me more than my friends and family’: Can AI therapists do better than the real thing? The Guardian. https://www.theguardian.com/lifeandstyle/2024/mar/02/can-ai-chatbot-therapists-do-better-than-the-real-thing

  7. Roose, K. (2024, October 23). Can A.I. be blamed for a teen’s suicide? The New York Times. https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html

  8. Altman, I., & Taylor, D. A. (1973). Social penetration: The development of interpersonal relationships. Holt, Rinehart & Winston.

  9. Bowlby, J. (1969). Attachment and loss: Vol. 1. Attachment. Basic Books.

  10. Maslow, A. H. (1943). A theory of human motivation. Psychological Review, 50(4), 370–396. https://doi.org/10.1037/h0054346

  11. Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 1–21. https://doi.org/10.1177/2053951716679679

  12. American Psychological Association. (n.d.). Artificial intelligence and mental health care. APA. https://www.apa.org/practice/artificial-intelligence-mental-health-care

  13. American Psychological Association Services. (n.d.). Could artificial intelligence replace human therapists? APA Services. https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists

  14. Chung, J. E. (2022). The effects of AI-based chatbots on mental health: A literature review. Computers in Human Behavior, 131, 107239. https://www.sciencedirect.com/science/article/abs/pii/S0747563222000954

  15. Zur Institute. (2023, May 4). Why human therapists will continue to thrive in an age of AI-based therapy. Psychotherapy Notes. https://www.psychotherapynotes.com/human-therapists-thrive-ai-based-therapy/

  16. Gaffney, H., Mansell, W., & Tai, S. (2020). Conversational agents in the treatment of mental health problems: Mixed-method systematic review and meta-analysis. Computers in Human Behavior, 111, 106423. https://www.sciencedirect.com/science/article/abs/pii/S074756322030354X

  17. Bickmore, T. W., & Picard, R. W. (2005). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction, 12(2), 293–327.

  18. UNSW Sydney. (2025, March). Therapist as AI chatbot? How artificial intelligence is reshaping counselling. UNSW Newsroom. https://www.unsw.edu.au/newsroom/news/2025/03/therapist-as-AI-chatbot

  19. Arora, S. (2024). Who do you turn to when you feel alone? Exploring young adults’ emotional relationships with AI companions. Psychology and Developing Societies, 36(1). https://journals.sagepub.com/doi/full/10.1177/02537176241260819

  20. AutoGPT. (2023). Can AI replace your therapist? AutoGPT.net. https://autogpt.net/can-ai-replace-your-therapist/

  21. Servick, K. (2023). Can AI provide mental health support? Experts say not yet. Science Magazine. https://www.science.org/content/article/ai-chatbots-mental-health

  22. Konok, V., Gigler, D., Bereczky, B. M., & Miklósi, Á. (2016). Humans' attachment to their mobile phones and its relationship with interpersonal attachment style. Computers in Human Behavior, 61, 537–547. https://doi.org/10.1016/j.chb.2016.03.062

  23. Jee, C., & Heaven, W. D. (2021, December 6). The therapists using AI to make therapy better. MIT Technology Review. https://www.technologyreview.com/2021/12/06/1041345/ai-nlp-mental-health-better-therapists-psychology-cbt/

