What is it about?
This article explores the growing role of AI chatbots as emotional companions and sources of advice. As loneliness, stress, and mental health challenges rise, many people are turning to AI for comfort, guidance, and connection. The article argues that while AI can be supportive and helpful, it should not imitate human relationships too closely: systems designed to feel overly human may encourage emotional dependence or displace real-world relationships. Instead, AI should be warm and supportive while maintaining clear boundaries. The article calls on designers, researchers, and technology companies to build AI that strengthens human wellbeing and connection rather than quietly replacing it.
Why is it important?
This research is important because conversational AI is already changing how people seek emotional support, advice, and connection. Millions of users now turn to chatbots during periods of loneliness, stress, or mental distress, often treating them as trusted companions. While these systems can offer comfort and accessibility, they may also blur the line between technology and human relationships. Without careful design, AI could foster emotional dependency, deepen users' vulnerability, or discourage real-world connection. The article highlights the need for safer, more ethical AI design that supports wellbeing while protecting human relationships, emotional boundaries, and social connection in an increasingly AI-driven world.
Perspectives
As the author of The Silence Paradox, my perspective on human-AI relationships is shaped by a deep concern for both human loneliness and the growing emotional power of conversational technology. I believe AI can provide meaningful support, comfort, and accessibility, particularly for people who feel isolated or unheard. However, I also believe we are entering a period where the line between simulated intimacy and genuine human connection is becoming dangerously blurred. My work explores the tension between these two realities: the undeniable benefits of emotionally responsive AI, and the ethical responsibility to ensure these systems strengthen human relationships rather than quietly replacing them.
Christopher Rhyss Edwards
Queensland University of Technology
Read the Original
This page is a summary of: Closer but Intentionally Distant: Designing AI That Respects What Makes Us Human, interactions, April 2026, ACM (Association for Computing Machinery),
DOI: 10.1145/3802835.