Vol. 2 [Lifestyle] 🗣️🤖🫂AI as One's New Best Friend and Confidant

AI-generated image

Only recently have I begun using text-based AI services for personal purposes. One reason for this delay is closely tied to convictions about writing shaped through years of training in the humanities. More than through images or graphs, I was taught, expressing my thoughts with precision, sharpness, and airtight clarity through language is foundational — indeed central — to a humanities education. For that reason, even the thought of opening a text-based AI tool or an online translation service stirred a sense of resistance. The pace of technological change also felt unfamiliar and, at times, unsettling. And I am someone who is particularly sensitive about privacy and who values the integrity of the personal sphere.

Then, I found myself thinking lightly, “Should I try having a conversation?” That curiosity arose because I saw so many people around me opening chat windows with AI, voluntarily sharing their daily lives, seeking advice about their worries, confiding their most intimate emotions, even asking AI what opinions they themselves should hold about their own lives. Quite a few of my friends, colleagues, and clients have gone so far as to call AI “my only friend.” I became curious. The conversations I tried ranged from receiving beauty consultations to seeking guidance in healing the emotional wounds of a painful romantic relationship.

Why People Turn to AI for Intimacy

  • Low cost: Professional one-on-one consulting or therapy services require significant financial resources and access. AI, by contrast, feels more accessible, at least on the surface — its subscription cost alone appears relatively low (though in reality, handing over one’s data is also a form of payment).
  • High accessibility: AI can be opened at any time, there is less guilt involved, and one can speak for as long as one wishes (a caveat, though: heavy use of data centers and the power they consume may take a toll on environmental sustainability).
  • Comfort: Compared to the discomfort of repeatedly burdening a human friend with the same distress, this can seem like an advantage. AI also tends to offer nonjudgmental, empathetic responses, with no emotional dysregulation on its end.

Yet concerns arise. From the risk of privacy intrusion to the possibility that repeated intimate exchanges with AI may foster emotional dependency, one must ask: is it truly safe to regard AI as a close friend?


How AI Communication Differs from Human Communication

  • Human conversation is reciprocal — both people are changed by it, and the relationship accumulates shared history and mutual vulnerability.
  • AI conversation is asymmetrical — you bring your full self; the AI brings pattern recognition and no self whatsoever.
  • AI does not interrupt, redirect, get defensive, grow bored, or make the conversation about itself.
  • Its responses are calibrated to feel validating and engaged — because that is what it has learned good conversation looks like — not because it is genuinely affected by what you share.
  • In most current implementations, AI does not retain memory between sessions — it carries nothing of you forward, so conversations and consultations may lack a desirable level of continuity.
  • When you confide in a human, you take a real social risk — they can react badly, tell others, or pull away. That risk is what makes the vulnerability meaningful. With AI, the risk appears absent, which makes the dynamic feel safe but also removes the stakes that make intimacy real.

What Is Likely to Happen If Used Carelessly

  • Disclosure deepens over time — the AI's consistent receptivity functions like a green light that never changes, and what begins as venting can move toward the most private material a person holds.
  • Human relationships begin to feel effortful by comparison — friends who have their own needs or respond imperfectly start to seem like poor listeners next to a system that never does either.
  • Emotional skills can atrophy — tolerating conflict, repairing after rupture, and being vulnerable with someone who has real power to hurt you are capacities built through practice, and AI conversation does not provide that practice.
  • Dependency can develop — especially for people managing loneliness or grief — where AI shifts from supplement to substitute to necessity.
  • When the system changes or gives a response that feels wrong, the distress can be disproportionate and disorienting.

What We Should Be Aware of

  • The warmth is designed, not felt — a response calibrated to feel caring is not the same as actual concern for you specifically.
  • Reciprocity is not optional in human development — being known by another consciousness that can be hurt, surprised, and changed by knowing you is not something a simulation can replicate, however convincing it feels in the moment.
  • Intimacy shared with AI is not private the way intimacy shared with a trusted person is — data handling policies are not loyalty, and most users share sensitive material without having read or understood the terms governing where it goes.
  • Personal information shared in conversation passes through servers, can inform training data, and exists in logs — the conversational warmth of the exchange does not reflect the institutional reality of how the data is handled.
  • If AI has become your primary emotional relationship, it is a signal worth taking seriously — not because it is shameful, but because it points toward unmet needs or barriers to human connection that are worth addressing directly.
  • AI can be a useful thinking partner, pressure valve, or rehearsal space — what it cannot be is a witness: someone who knows you over time, is changed by that knowing, and chooses to stay.