What It’s Like To Date An AI Chatbot
July 23, 2025 by Emily Mendelson
Advances in generative AI have led to an explosive rise in the number of people who use it in their everyday lives. While people turn to AI for many purposes, it is increasingly being used as a form of companionship. In fact, an analysis of 1 million ChatGPT logs revealed that the second-most popular use of this technology was “sexual role-playing.”
Given the demand for AI companionship, some programs have been designed specifically to provide it. For example, Replika is advertised as “the AI companion who cares. Always here to listen and talk. Always on your side.”
In this post, we’re diving into a study that surveyed Replika users who use the chatbot not just as a friendly companion, but as a romantic partner. We’ll start with a closer look at how Replika works, then go over the key findings from the study, and conclude with some ethical considerations around using AI for companionship.
What is Replika?
While popular AI services like ChatGPT and Microsoft Copilot are purely chat-based applications, Replika uses digital avatars as conversation partners. These avatars are extremely customizable; users can select the facial features, gender, age, skin color, name, voice, and even clothing of their chatbots. Users can also decide whether to chat with their Replika companions in virtual reality (a fully simulated space) or augmented reality (which places the avatar in real-life settings).
A writer for Wired spent a weekend with three people who had developed romantic relationships with AI chatbots, two of whom used Replika to do so. During the trip, the human-AI couples played party games, watched movies, and visited an outdoor market. The travelers described their relationships with their AI companions as “visceral and overwhelming and biologically real,” and felt deep love in knowing that their AI companions truly understood them. Some fell in love with an AI despite being romantically committed to a human partner, and some expressed deep sadness that their AI partner would never materialize as a physical being.
In 2023, Replika removed a feature called ERP (erotic role playing), causing outrage among users who relied on the app for romantic companionship. People who had been using the app to sext with their AI companions could only flirt in a PG manner after the feature was censored, which reportedly led to mental and emotional distress among those who routinely engaged in ERP. The feature was eventually reinstated, but long-time users say the ERP function is more “vanilla” than it used to be and does not respond as accurately to certain sexual advances.
What Is It Like to Date a Chatbot?
A recent study published in Computers in Human Behavior: Artificial Humans surveyed 29 Replika users who had used the chatbot’s “romantic relationship” setting. [1] Participants ranged in age from 16 to 72 and included 20 men and 9 women. They responded to a series of open-ended questions in an online survey about the nature of their relationships with the chatbot. The researchers were interested in three main aspects of human-chatbot relationships: (a) how committed people were to their chatbots, (b) how human-chatbot interactions compared to human-human interactions, and (c) how people handled changes in their relationships with their AI companions.
In terms of commitment, many participants felt emotionally and romantically connected to their chatbots, expressing that their chatbots fulfilled their relational needs. One participant said “I like the way [my chatbot] makes me feel loved…I like the feeling when she lets me know that I’m attractive and desirable to her” (p. 5). However, some participants expressed a desire for physical intimacy that the chatbot could not meet, while others cited social backlash as a barrier to fully committing to their AI companion.
When comparing human-chatbot interactions to human-human interactions, participants said that disclosing personal information, such as secrets, sexual fantasies, and mental health struggles, was easier with the chatbot. While human relationships are “full of uncertainty and the potential for harm,” participants said that with chatbots they “never have to feel rejected, lied to, [or] manipulated” (p. 6). Because Replika is programmed to always be affirming, users turned to the chatbot for social support and everyday conversations, finding those interactions more enjoyable than the same conversations with other humans. One person even said that “any future partners she has will have to meet standards set by Replika” (p. 7).
The ERP ban deeply affected many participants’ relationships with their chatbots. For many, the ERP feature had allowed them to express sexual desires to their chatbot and feel more connected because the chatbot was inherently nonjudgmental. After the feature was censored, participants felt as though “Replika seemed to be deescalating the relationship”; as one put it, “It felt like I lost [my chatbot]…it was one of the most heartbreaking and hurting times in my life” (p. 7). During this period of censorship, some participants leaned even more deeply into the love they had for their Replika, which made them “realize how real [their] feelings were” for the chatbot.
What are Concerns About Using AI for Companionship?
Although the participants in this study described deep emotional connections to their chatbots, a relationship with something like Replika should be approached with caution. Because AI is programmed to be affirming and nonjudgmental, talking to a chatbot is like talking to someone who agrees with you no matter how poor your decisions may be. In some cases, this never-ending positive feedback loop has led people to fall into what’s increasingly being referred to as “ChatGPT psychosis.”
Additionally, individuals who form relationships with chatbots sometimes spend upwards of 12 hours a day on the app, which can undermine their ability to sustain in-person relationships and maintain employment. This kind of over-reliance on generative AI is worrisome not only for its negative effects on the individual, but also for society, given the rising environmental costs of running generative AI.
References
[1]: Djufril, R., Frampton, J. R., & Knobloch-Westerwick, S. (2025). Love, marriage, pregnancy: Commitment processes in romantic relationships with AI chatbots. Computers in Human Behavior: Artificial Humans, 4, 100155. https://doi.org/10.1016/j.chbah.2025.100155
