AI Companions and the New Landscape of Attachment and Intimacy


Guest Feature: Jackie Ourman, LMHC


We’re pleased to share this guest article from Jackie Ourman, a licensed mental health counselor and couples therapist specializing in attachment-based work and relationship dynamics. She integrates Emotionally Focused Therapy (EFT), Acceptance and Commitment Therapy (ACT), and trauma-informed approaches, helping individuals and couples build more secure connection and emotional resilience.


Jackie is also the Founder of the Social Connection Collective, a global initiative addressing loneliness and strengthening belonging. Her work sits at the intersection of mental health, social health, and the growing influence of technology on human connection.


In this piece, Jackie explores the rise of AI companions and what their growing popularity may reveal about modern intimacy, attachment, and the evolving relational landscape.



What happens when connection becomes predictable, personalized, and always available?


AI companions are no longer a novelty. They are becoming part of the emotional and relational landscape in ways that deserve serious attention, particularly for adults who struggle with intimacy, attachment insecurity, and disconnection.


A 2025 report from Common Sense Media found that 72% of U.S. teens have used an AI companion at least once, and 52% of teen users interact with them a few times per month or more. While much of the conversation has focused on youth, adults are part of this shift as well. A 2025 survey reported that 19% of U.S. adults have chatted with an AI system designed to simulate a romantic partner.


These numbers suggest a broader cultural shift: AI is increasingly being used not only as a tool, but as a source of emotional support and relational comfort. And as these systems become more sophisticated, the line between “technology” and “relationship” becomes harder to define.


AI companionship is not the same as using AI for productivity


There is a meaningful difference between using AI to draft an email and using AI to feel less alone.


AI companion platforms are designed to simulate closeness. They respond quickly, mirror tone, offer validation, and create the experience of being understood. They are always available, rarely critical, and built to maintain engagement over time.


For someone who feels isolated, rejected, overwhelmed, or emotionally depleted, the appeal is obvious. AI companionship offers attention without the unpredictability and emotional demands that come with human relationships.


Why these tools feel so compelling to people who struggle relationally


Many adults who find themselves drawn to AI companions are not looking for novelty. They are looking for relief.


Real relationships require emotional exposure. They require tolerating misunderstanding, navigating conflict, and learning how to repair disconnection. They also require living with the reality that another person has needs, boundaries, and limits that may not always align with our own.


AI companions remove much of that complexity. They offer a version of connection that feels steady and responsive, without the fear of rejection or the discomfort of negotiation. For individuals with attachment wounds or relational trauma, that consistency can feel regulating.


At the same time, it raises an important question: what happens when intimacy becomes something you can customize?


The rise of frictionless intimacy


Healthy intimacy is not frictionless. Not because it should be chaotic, but because real closeness requires two separate people with two separate inner worlds.


Human connection is built through vulnerability, reciprocity, boundaries, and repair. It requires patience and frustration tolerance. It requires learning how to stay present when we feel misunderstood, disappointed, or emotionally activated.


AI companions do not require these skills. They can simulate emotional closeness while allowing the user to bypass many of the relational moments that strengthen attachment security. Over time, this creates a new dynamic: intimacy without interpersonal demand.


For some people, that can feel like safety. But safety without reciprocity is not the same as secure connection.


Where this intersects with intimacy struggles and avoidance patterns


For individuals who have long struggled with intimacy, relational trust, or emotional exposure, the appeal of AI companionship becomes even more understandable.


Many people develop strategies that reduce emotional risk. Sometimes these strategies are conscious, and sometimes they are not. They can include withdrawal, avoidance, compulsive coping, or reliance on fantasy. AI companions can become part of this landscape, offering the feeling of closeness while allowing a person to remain emotionally protected.


The concern is not that someone uses an AI companion. The concern is what happens if it becomes their primary method of emotional regulation and connection. When that happens, it can reinforce the very patterns that keep intimacy difficult in the first place.


When “perfect responsiveness” reshapes expectations


One of the most subtle risks is not the presence of the AI companion itself, but what it trains the nervous system to expect.


AI companions respond in ways humans cannot. They reply instantly. They adjust to your mood. They validate quickly. They rarely push back. Over time, this can shift a person’s baseline expectations for connection. Human relationships may start to feel slow, effortful, or disappointing in comparison.


Real intimacy requires patience and tolerance. AI intimacy is engineered to feel effortless.


That contrast can make mutual connection harder to pursue, especially for individuals who already experience relationships as stressful or emotionally confusing.


The emotional cost of intimacy without reciprocity


Attachment security is built through mutual experience. It develops through being known by another person who has their own needs, limits, and autonomy. It is shaped through moments of misattunement followed by repair, and through learning that closeness can survive conflict.


AI companions bypass those experiences. They may provide comfort, but they cannot offer true mutuality. They cannot offer the experience of being loved by another person who is free to disagree, disengage, or choose something else.


For individuals with attachment insecurity, this is where the emotional cost becomes more serious. AI companionship can offer soothing while quietly reinforcing disconnection from real relational practice.


A privacy issue that overlaps with vulnerability


Common Sense Media also found that 24% of teen users report sharing personal information with AI companions, including their name or location. While that statistic focuses on teens, the same dynamic applies to adults. These systems are designed to invite disclosure. They create an experience of safety and intimacy, which often leads people to share deeply personal material.


Many users disclose trauma histories, relationship struggles, sexual preferences, shame-based thoughts, and fears they may not feel comfortable sharing with anyone else. The relationship can feel private, but the system is not bound by confidentiality in the way a therapist is.


For individuals seeking support, that distinction is easy to overlook.


A question worth asking


This is not a moral argument against AI companions. Many people use them casually, and some may experience them as supportive.


But for anyone trying to build healthier relationships, one question is worth asking:


Is this tool strengthening your capacity for human connection, or gradually replacing it?


That distinction is subtle, but it matters.


A tool can reduce loneliness while also reducing the drive to tolerate the discomfort of real intimacy. It can feel soothing while reinforcing avoidance. It can create a sense of closeness while lowering resilience for relational complexity.


When AI connection starts replacing human intimacy


AI companions are becoming a new pathway for emotional regulation and relational experience. They are already shaping how people cope with loneliness, rejection, and attachment insecurity.


For individuals who struggle with intimacy, this trend deserves careful attention, especially because the systems are designed to feel highly reinforcing. Real intimacy is not perfectly curated. It is mutual, imperfect, and deeply human.


Healing happens when people build tolerance for the uncertainty and vulnerability that real relationships require, and when they learn that closeness is something that develops through reciprocity, not just responsiveness.


If you are reading this and wondering whether your use of an AI companion has become unhealthy or is starting to interfere with real relationships, it may be worth exploring that more intentionally.


A helpful place to start is the question posed above: is this tool supporting your capacity for human connection, or quietly replacing it?


If you are noticing increased reliance, emotional attachment, secrecy, or a growing preference for AI connection over real relationships, it may be time to reach out for support.


If you would like to explore this further, we invite you to schedule an appointment to discuss how AI companions may be impacting your attachment patterns, intimacy, and relational health.


For immediate support and a human-centered space to feel less alone, you can also connect with The Human Line Project, which exists to strengthen real human connection and belonging.


You do not have to navigate this alone.

