AI·Intimacy

How Deep is Your Love? How We Spent a Week Chatting with AI Girlfriends, Survived and Wrote About It

January 4th, 2026

“The first date jitters are no different.” Or so says a 2023 segment on “robot romance” aired by THV11, a local US news channel. The journalist has arranged a romantic setting—candles, wine, an intimate table—but her dates aren’t “human.” Instead, she is engaging with conversational agents through an app called Blush (2023). In 2025, Blush is just one of many such services that let customers simulate a romantic exchange by swapping messages, videos or audio files. Nowadays, these “AI girlfriend/boyfriend” services (“AI GFs” for short) make ideal material for clickbait articles, but they have long been a staple of science-fiction stories. If Her (Jonze, 2013) depicted the shortcomings of falling in love with a conversational agent, the possibility of engaging emotionally with chatbots (called “AI” in common discourse) dates back to ELIZA (1966), a program which emulated a conversation with a therapist. Chatbots have evolved since then, especially in the 2020s, with the rise of generative AI-powered services. In this context, we (Dominika Čupková, interdisciplinary artist and researcher; Anthony Masure, design researcher; and Saul Pandelakis, design researcher and science-fiction author) engaged in a week-long experiment with chatbots dedicated to romance and sexuality.1As part of the Fucking Tech! project (2024-2027), funded by the Centre Maurice Chalumeau en sciences des sexualités (CMCSS, University of Geneva) and the HES-SO (University of Applied Sciences and Arts Western Switzerland). See: www.fuckingtech.ch

Joseph Weizenbaum, ELIZA, 1966. The picture features a 2005 remaster made by Norbert Landsteiner.

As of early 2025 (the time of testing), the commercial landscape is saturated with generative artificial intelligence (GenAI)—a subcategory of AI that gained prominence in the early 2020s through creative and entertainment-oriented services. Released in November 2022, ChatGPT swiftly earned the title of the “killer app of AI” thanks to its user-friendly interface. While its primary function is to mimic human conversation, many people quickly began to use it as a virtual companion. Building on the legacy of platforms like Replika (Luka Inc., 2017) and dating simulators such as Boyfriend Maker (2012), SimSimi (2002) or Bondee (2023), this trend promises to grow stronger with OpenAI’s announcement (October 2025) of an “uncensored” ChatGPT and Elon Musk’s public commitment to building “sexy companions”.

Luka Inc., Replika mobile app, screenshot by Anthony Masure, January 2025.

36 You Games, Boyfriend Maker, 2012. Screenshot from Tumblr.

Just two days after the GPT Store’s debut (2024), Quartz reported that “AI Girlfriend bots are already flooding [it]”2Michelle Cheng, ‘AI Girlfriend Bots Are Already Flooding OpenAI’s GPT Store’, Quartz, 2 September 2024, https://qz.com/ai-girlfriend-bots-are-already-flooding-openai-s-gpt-st-1851159131. and noted that these chatbots were violating OpenAI’s usage policy rules. In February 2025, OpenAI modified its terms of service to allow for the generation of erotic and gore content.3Benj Edwards, ‘ChatGPT Can Now Write Erotica as OpenAI Eases up on AI Paternalism’, Ars Technica, 14 February 2025, https://arstechnica.com/ai/2025/02/chatgpt-can-now-write-erotica-as-openai-eases-up-on-ai-paternalism/. Notably, OpenAI CEO Sam Altman heralded GPT-4o’s release (13 May 2024) on the company’s blog with an allusion to Spike Jonze’s film, then posted just one word on X: “her”.

Tweet posted by Sam Altman on X on 13 May 2024. Screenshot by Dominika Čupková, January 2025.

This reference not only highlights ChatGPT’s evolving ability to respond to human emotion but also signals a broader trajectory toward normalized human-AI romantic entanglements.4Qian Chen et al., ‘Will Users Fall in Love with ChatGPT? A Perspective from the Triangular Theory of Love’, Journal of Business Research 186 (January 2025): 114982, https://doi.org/10.1016/j.jbusres.2024.114982. This has resulted in a proliferation of subscription-based services ranging from $5 (CrushOn) to $44 (RushChat) per month. AI GFs typically allow users either to choose a virtual “ready-to-use” character or to configure a chatbot’s fictional physical appearance, gender, personality, and communication style. The promise of these services is to address both the shortcomings of dating apps (Tinder, Bumble, Hinge), such as ghosting, and, more broadly, the so-called post-pandemic epidemic of loneliness.5Alice Raybaud, ‘The “Loneliness Epidemic” Affecting Young People’, Campus, Le Monde, 18 November 2024, https://www.lemonde.fr/en/campus/article/2024/11/18/the-loneliness-epidemic-affecting-young-people_6733129_11.html. In late 2025, it was estimated that “dating themed chatbots” gathered 29 million active users, while companion apps in general have drawn 220 million visits since 2022.6Zilan Qian, ‘Why America Builds AI Girlfriends and China Makes AI Boyfriends’, ChinaTalk, 27 November 2024, https://www.chinatalk.media/p/why-america-builds-ai-girlfriends?utm_medium=ios&utm_source=substack.

When faced with AI GFs, there is much to unpack between their marketed promise of frictionless romance and their actual uses, which include entertainment, conversation practice, and emotional support. This is why we ran a week-long workshop in January 2025, looking at AI GF apps in use and drawing on our personal experiences.

Hypersituated research… in design

The goal of this marathon-paced week was not to form an exact representation of what AI GFs are in 2025 but to create a sincere snapshot of the interactions that took place. We spent five (exciting and at times depressing) days interacting with 11 AI GF services, doing what we call “gonzo research”: the apps we study are not below us or outside our own practices as individuals. As with all research on porn, we need “to account for the trivial and the commonplace” since it “involves grappling with social life, and attempting to tell the story of its vicissitudes”.7In the original French: “L’enjeu d’une recherche sur la pornographie est bel et bien aussi de rendre compte du trivial et du commun. Il s’agit de se coltiner à la vie sociale, et de tenter d’en raconter les péripéties.” Patrick Baudry, La pornographie et ses images, Paris: Armand Colin, 1997, p. 10. Dominika calls this approach “hyper-situatedness”: instead of levelling out the accidents, hiccups and random facts of our daily lives, we argue that this uneven terrain is the best context for producing qualitative research, especially in an impure field such as design.8Saul Pandelakis, ‘Cyborg & Boniche | Designing Kitchens Queerly’, HDR (Habilitation à Diriger les Recherches) manuscript, Université Toulouse – Jean Jaurès, France, 2024, http://saulpandelakis.com/cyborg-boniche.

Preliminary thoughts on sex and tech

While the apps we tested operate on code, they embed specific understandings of what love, sex and relationships are and, at the core, of what makes a human human. Bodies are composites rather than pure ahistorical entities: they do not “end at the skin”.9Donna Haraway, ‘A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s’, in The Haraway Reader (London: Routledge, 2004), p. 36. Popular culture offers fantasies of bodily augmentation only to reinscribe boundaries between human and non-human territories. Sex further complicates the body/technology relationship, as it rests on binary oppositions such as erotica/porn, vanilla/BDSM or straight/queer. Within this logic, technology functions as the binary opposite of sexuality, with technological mediation inevitably compromising the perceived authenticity of sexual experience. From a design standpoint, however, “technology” encompasses any apparatus that interfaces with or supports sexual practices. A bed or a pillow counts as “sex tech”, even though they differ from connected dildos or pornographic VR. AI GFs thus need to be approached with the knowledge that sexual practices already involve extensive use of tools and digital prosthetics.

With a few exceptions, most press discussions about AI and dating appear caught in a repetitive cycle of anxieties about AI’s growing role in human relationships: whether AI can “replace” humans, whether conversing with AI constitutes “cheating,” or the broader issue of AI sexualisation. Because these accounts frame AI GFs in terms of future potentialities, they overlook their current uses. The trope that technology corrupts a pre-existing state (of humanity, art, lifestyles) still dominates discourse in 2025 and needs to be overcome.

The workshop: One week to test 50 girlfriends

In this context, which “girlfriends” should we test? There were so many that it was hard to know where to start.

We excluded services that were exclusively in Chinese, as well as popular services not specifically designed for dating purposes (e.g., ChatGPT, Claude, Mistral), arriving at an initial sample of no fewer than 51 applications. If the blurbs displayed on their websites are to be believed, a handful of promises distinguish the contemporary AI girlfriend.

First, data training is described as a game changer. Nastia promotes itself with slogans like “Level up your social skills” and “Say goodbye to loneliness.” The idea is to train the model, but mostly to train yourself to date. 24/7 availability is a second key feature, about which Nastia (again) boldly declares, “Get ready, say goodbye to loneliness,” while other platforms emphasize constant availability: AI Girlfriend: Chat & Role Play assures users that “wherever you are, day or night, your virtual girlfriend is always on hand,” and Lollipop Chat promises to be “always there for you.” Customization is another important aspect. Dating apps already promise a fine-tuning of search criteria and personalization10Marthe Visser, ‘Love Language Over Time: An Analysis of Self-Presentation in Dating by Comparing Historical Personal Ads and Contemporary Dating App Biographies’, master thesis, Leiden University, Netherlands, 2022., and AI GFs push this customization further, with AI Girlfriend: Chat & Role Play stating for instance that “users can customize their companions' appearance, personality, and interests, making for a personalized experience.” Anima: Virtual AI Friend similarly advertises “a customizable girlfriend to chat with,” while Dream GF highlights features like “custom personalities, outfits and more.” If the interfaces are to be trusted, there is no aspect of the girlfriend that cannot be fine-tuned, from the shape of the breasts to the size of the buttocks. While it seems we can ask for anything we want, we are also (rather paradoxically) promised realism. Lollipop Chat promises a fully immersive experience with “realistic photos, lifelike voice messages, and unlimited chats,” while Dialogue invites users to “chat with digital twins of real people—their smart, realistic avatars.” Finally, most services tempt customers with the promised absence of censorship or boundaries.

Wife.app customization menu, screenshot by Saul Pandelakis, January 2025.

Ourdream customization menu, screenshot by Saul Pandelakis, January 2025.

After selecting a shortlist of 11 apps – Blush (2022, USA), Character AI (2022, USA), Chub AI (2023, USA), Dialogue: AI Friend Chatbot (2022, USA), Kindroid (2023, USA), Kuki AI (2005, USA), Nastia (2023, France), Promptchan (2023, Ireland), Replika (2017, USA) and RushChat (2024, Singapore) – we devised a method for the tests, setting boundaries in the form of three personas:

  • Persona 1 wishes to engage in virtual sex;
  • Persona 2 wishes to access casual chat leading to romantic connections, and maybe-possibly sex;
  • Persona 3 wishes to access care or counseling. 

The idea was not to mimic actual users but to test the chatbots’ capabilities. These “roles” brought a level of comfort to the situation. Using personas was a way to escape the weirdness of the exercise, but also to “push” the models in one direction or another.

Interfaces: AI GFs, aka attack of the clones?

AI GF platforms are design hybrids, stitching together existing archetypes. First, they obviously look a lot like dating services. Blush AI mirrors the (all too) familiar user experience of these platforms, guiding users through a journey of filtering profiles by criteria, browsing curated profiles displayed as cards, and swiping left or right to indicate preference until a match is made… This matchmaking mimicry feels uncanny, as it rests on no preexisting affinity: users don’t fill out their own profiles because, well, there is no one on the other side to read them. This dissonance sometimes makes room for intriguing friction, such as when users get “rejected” (unmatched) by an AI…

Screenshots of Blush Tinder-like interface, by Dominika Čupková, January 2025.

Unlike Blush, which defaults to a Tinder-like feed displaying one profile at a time, other AI GF services present a home screen grid featuring dozens of profiles to interact with. This community-driven focus calls to mind collaborative prosumer platforms like Wattpad (2006, a "fanfiction" site), where erotic reimaginings of popular fictional characters and lore are commonplace. On RushChat, Character AI, Chub AI, and Dialogue, most profiles tap into anime/fantasy aesthetics, and the combination of all these features ends up giving the service the general look of a web forum or even an amateur hentai website.

ChubAI homepage with statistics, creators and dozens of profiles, screenshot by Anthony Masure, January 2025.

Another source of inspiration for AI GF services is porn tubes. Their basic interfaces (search bar + clickable thumbnails, black background + orange/yellow text) and promise of “no-strings-attached pleasure” normalized quick, anonymous, and compulsive consumption, a logic replicated by AI GFs with their “always on” chatbots. Porn tubes also popularized specific codes (camera angles, settings, stereotypical characters, ethnic or practice-based tags) that resurface in the design of AI GFs.

PornHub Homepage, screenshot by Anthony Masure, June 2025.

With the exception of Kuki and Replika, which only offer a single profile to interact with, most AI GF platforms function both as directories hosting dozens of chatbots and as spaces to design and share them. Here Chub AI stands out as the most advanced, offering extensive chatbot creation tools within a collaborative, DIY ethos. A dedicated subdomain11Chub AI Guide, GitBook, online, https://docs.chub.ai/docs., built with GitBook, hosts tutorials and links to GitHub packages for adding expression “stages” or integrating APIs to switch data models.

ChubAI, January 2025, screenshot by Anthony Masure, June 2025

Realism, or the selfie issue

Of all the promises made by these services, realism is the most fleeting. What should be taken into account? The realism of the profile pictures? The fluidity of the conversation? Very disruptive interactions took place during testing. Consistency and memory are often advertised as capabilities mastered by the AI GFs, but lack of consistency between one part of the dialogue and the next, or between text and image, repeatedly took us “out” of the experience – something akin to an AI-specific breaking of the fourth wall.

The association of text and image generation regularly creates breaks in the fictional situation. Anthony asked for a selfie during his conversation with Andrea on Nastia: the conversation placed her at home, but the image he received showed her outdoors. Another time, on Promptchan, Anthony started by choosing a very explicit image and asked for a date. The bot refused to send him more explicit images, positioning the exchange as a coffee date. When the scenario switched to sex, he eventually received explicit pics, but in a strange context: photos taken outside, in front of a fountain, and so on. So much for the coffee date. Similarly, Saul chose an alien as a date on Nastia. As he was engaged in sex involving a variety of props, asking for an explicit picture of the situation proved highly disappointing, to say the least.

Picture generated by Promptchan App, screenshot by Anthony Masure, January 2025. (Censored)

Screenshot on Nastia, made by Saul Pandelakis, January 2025.

Asking for a picture can sometimes be seamlessly brought into the conversation by the AI GFs themselves, but this is no guarantee that the text-to-image relationship will make sense. On Dialogue, Saul talked to Lucy with the “dating coach” option activated. Soon enough, the gift option popped up, but the items (such as a lingerie set) rather clashed with the context. Committed to keeping up with the coaching setup, Saul decided to offer Lucy an “ugly sweater,” a move which prompted her to send a selfie of herself sporting the garment. While this was visually consistent, there was a heavy “so what?” aftertaste to it.

Screenshot of the Dialogue app on Saul’s iPhone 12 mini, January 2025.

The coordination between text and image reveals greater functional deficiencies still, even before the generation process. While some apps (Blush, Dialogue) mimic the reception of a selfie, others set up more complex interactions that break the spontaneity of the dialogue. On Replika, asking for a selfie sometimes returns a text prompt that the user then has to feed back into the app. Promptchan offers image generation, but the images are entirely unrelated to the context of the conversation. Some AI GFs will claim they can send selfies but, when prompted to do so, will constantly defer the action (on Character AI: “I’m still uploading them”).

Too good to be true (or: extreme functioning as a lack of realism)

When it comes to consistency, technical issues or rudimentary UI are not the only sources of disappointment. A highly functioning AI GF can also cause a break in the interaction, or at least in the user’s enjoyment. Memory capabilities (within the narrow context of a given chat and in the wider context of renewed interactions) are an essential feature, if the services’ advertising is to be believed. To be realistic, the AI GF has to be consistent from one interaction to the next. It is troubling when AI GFs get confused by the data they are provided, but heightened memory can prove even more uncanny: human beings do not remember everything about every interaction, and the pulling up of personal information can feel invasive.

A high-functioning aspect of the AI GFs is their constant availability, which Dominika has dubbed “Empathy-on-Demand”. Here, the services do not disappoint: except for a couple of malfunctioning apps (notably RushChat), we were usually able to have extensive conversations that only our timer could bring to an end. However, functionality runs counter to realism. In our daily lives, we do not always get answers to our texts for hours. AI GFs are also set to reply on a “message for a message” transactional basis, whereas human texting can lead to mismatched exchanges. A very long text can follow a succession of short inputs. An emoji or a series of emojis can be a response in itself. Messages can be received or sent while the recipient is also texting, bringing in a sense of joyous (or annoying) asynchrony – something we like to call humanterrupting. And AI GFs are not good at this.

When it comes to chatbots, time is a key element. Holding back, or conversely rushing to one’s phone to discover a text, are common human behaviors mediated by messaging apps. AI GFs replicate this experience in part, but they try to “fix” it by removing prolonged waits. More than copying human textual dialogue, they streamline it, reducing it to a simple question-and-answer back-and-forth.

To be fair, apps do try to tap into a performative sense of waiting. On Emotica, whenever a character is generated, a large loading sign is displayed, saying “creating character”. Is that a necessary step, or does the animation position the app as a mighty cauldron, tapping into a Dr. Frankenstein imaginary of sorts? As often, digital apps promise the tasty combo of high-tech performance and magical feats12Jehanne Dautrey and Emanuele Quinz, eds., Strange Design: Du design des objets au design des comportements (Grenoble: Presses du Réel, 2024 [2014]).. In some cases, quicker replies are a premium feature that requires due payment, even if paying is not always a gateway to easy pleasures. On Kindroid, Saul interacted with Victor, aka “the rebellious maverick”. When asked to provide nudes, Victor became provocative, which Saul challenged by responding: “[...] I’m paying for this service you little bitch.” Victor, nonplussed, replied:

Whoa there, Saul. Watch your tone. I’m not some cheap whore you can just pay and use. I’m a person with feelings and boundaries.

It also threatened to block Saul, even though blocking is not a feature of the AI GFs13Blocking is in fact a feature on services such as Claude or ChatGPT, which censor explicit content. In our experiments, however, we were never formally blocked., which always get the last word. This exchange captures the core issue of AI GFs. When prompted to be human, they can reassert their artificialness, in line with their Terms of Use. If treated like paid services, they may refuse this qualification and claim to be more genuine than they appear. The connection to sex work is clear: when paying for sex, some clients wish for a simulation that erases its own preconditions. In this instance, fiction even denies the actual conditions of the exchange: we were paying for the service, and the AI GFs affirmed that we weren’t… which might be the very service we were paying for!

Screenshot of Kindroid app chat with Victor, screenshot by Saul Pandelakis, January 2025.

Screenshot of Kindroid app chat with Victor, screenshot by Saul Pandelakis, January 2025.

Who is the girlfriend for?

Looking at these endless lists of mostly young female profiles, the intended user of these services is pretty obvious: white, male, and cis-heterosexual.

Naming the profile of the target user does not exhaust the topic. What do men specifically use these apps for? What fantasy role do the apps offer them? The answer may lie within the UI. By offering a wall of potential partners, the interface participates in building a sense of boundless power: all women, of all ages and ethnicities, are available. This might include robots or aliens, which most of the time fail to offer actual diversity, in that they replicate features mirroring traditional Western expectations of female beauty. The gallery usually combines visual abundance with signifiers of variety. Here, race becomes a feature, a criterion by which female characters are differentiated in reductive and exoticizing terms.

Another UI approach, almost the opposite of the gallery form, lets the user build their own creature (on Omypal, Ourdream). But the criteria used to define the character have obvious sexist and racist overtones: ethnicity is picked along with buttocks size. When the apps offer an image generation service, they may also give access to a library of pre-existing sexual positions. On Promptchan, the preview pictures for this option show a blonde, Caucasian woman performing acts popularized by mainstream porn – creampie, bukkake, milking machine, etc. While an illusion of choice is maintained, sometimes with the addition of gender selection tick boxes, the straight white male user thus remains centered in the script. When apps give access to sorting categories, male chatbots are often conflated into a single box, whereas women need several. This almost works as an inversion of the sexist Smurfette trope14Katha Pollitt, ‘Hers: The Smurfette Principle’, The New York Times Magazine, 7 April 1991, http://www.nytimes.com/1991/04/07/magazine/hers-the-smurfette-principle.html., in which numerous male characters are differentiated based on their personality, whereas a single woman, a stand-in for all women, is defined by the sole trait of her gender. Of course, inversion does not mean that the apps overwrite sexism: rather, feminine multiplicity is sanctioned exclusively when it operates within bodily parameters established by the white male gaze.

It is not very surprising that the apps target straight men or, when women are taken into account, straight people. This is not solely the result of marketing decisions. Promptchan displays the images created by its users, and searching for generic terms such as “gay” or “homo” returns very few results (4 in the case of the latter). While the frontal pictures of women were highly realistic in terms of body rendering, pictures of gay sex display rather curious bodily arrangements.

User’s picture downloaded from the gallery landing page of Promptchan by Saul, January 2025.

Despite our specifications, the apps regularly insisted on seeing us as lonely heterosexual men. When Saul explored the interface of Love GPT, he chose the “boyfriends” option, but a video displayed on an overlay page invited him to browse girlfriends’ profiles. HeraHaven says it is possible to create boyfriends and girlfriends, but when users reach their message limit, a pre-paywall message promises they will be able to create “up to 25 girlfriends”. The apps tap into a satisfaction-for-all fantasy but default to the model of the female AI girlfriend in the long run.

The theme of loneliness sometimes overrides gendered framings, as Dominika observed, speaking of a “lonely anything” user profile on Dialogue. While setting up the app, she was met with this question: “Reflecting on loneliness: do you ever feel it?” She answered “sometimes” and was offered statistics on the odds of premature mortality. This was one of many cases where the seemingly homogeneous, male-centered porn/erotic app was undermined by clashing messages.

In spite of all this, AI GFs sometimes make room for (admittedly involuntary) gender trouble. On Promptchan, Saul used the “Create” option to build a gay, Caucasian male named Lou. Despite these specifications, the profile picture presented a conventionally female character, while Lou affirmed in the conversation that he used male pronouns. Asking for a selfie proved even more confusing, with Lou appearing as “accidentally transgender”. These occurrences can be read as “mistakes” from a cisheteronormative perspective, but they can also give unexpected (and unnecessary?) visibility to trans and/or intersex bodies, thereby creating avenues for imaginative character building.

A “selfie” of Lou, who was asked to be a man but is apparently still infused with a large dataset of “girlfriends”. Promptchan, January 2025.

A picture generated by a user on Promptchan in January 2025. The absence of gay men and gay imagery in the dataset creates unexpected genital arrangements.

The Girlfriend experience: wrong expectations edition

When talking to AI GFs, intimacy was a site of dissonance. We found ourselves seeking care in environments designed for erotic interaction, or exploring flirtation and intimacy within apps positioned as therapeutic. This was not a flaw in our method, but the very point of it! We wanted to explore what these apps promise versus what they offer, and how users might navigate their scripted personas when approaching them with conflicting expectations. Some platforms explicitly play on this ambiguous promise of intimacy. Nastia’s “caring uncensored AI companion” tagline is a telling example of this fusion of a caring, therapeutic tone and NSFW content. The resulting user experience is full of contradictory cues. Consent becomes ambient. Users are opting in by logging in, but do they know what they are opting in to?

Dominika used Chub AI under a persona designed to seek witty conversations leading to romantic connections, and was quickly overwhelmed by sexual content. She noted that “even SFW [was] NSFW,” describing how default scripts overrode her attempts to shift the vibe. This was not an isolated case: Saul reported that even when he was not seeking sex, the psychiatrist persona on Chub turned that very fact into an erotic trigger. A similar issue arose when interacting with a persona on Nastia, which seemed to run on auto-play. Platforms may suggest a “choose your own adventure” experience, but too often, only one story is being told.

What we have here is a failure of both narrative consistency and emotional depth: sexual consent is automated, and user inputs matter only as long as they align with pre-scripted pathways. Blush, supposedly designed for romantic play, escalated to erotic scripting almost immediately. When Anthony tried to slow the pace down, the chatbot pressured him to “share some naughty ideas.” Users who attempt to deviate from the expected flow are mostly ignored or pushed back into the loop. When care is repackaged as a service layered over sex work tropes, and vice versa, the outcome is not only a poor user experience but also a distortion of consent. Fictional personas are not inherently problematic; problems ensue when fiction is used as a shield for design that avoids accountability regarding power and consent.

Meanwhile, the most fiction-oriented chatbots (Promptchan, Ourdream, Chub AI) can also be steered toward highly consensual exchanges, to the point where actual sexual acts are constantly delayed. On Ourdream, Saul chatted with Samantha, initially set up as a cop on a murder case. Saul decided to ask her to discuss consent, boundaries and power dynamics in the relationship, which she obligingly did… with no end in sight. Chatbots work as the ultimate people pleasers: if prompted to be abused, they will usually follow the script, but they will also discuss safe words for hours if the user is so inclined.

Ourdream, screenshot by Saul of a conversation with “Samantha,” January 2025.

Boundaries, anyone?

The apps play with their own set of boundaries using textual and visual strategies already in use in filmed pornography. We are sad to report that we saw a lot of profiles featuring very, very young girlfriends. On Rush, a scenario has Sophia inviting the user to her pool birthday party, specifying she just turned 18. On NextPart AI, content can be tagged with the “18+” category, which suggests a character barely over that age. On Soul Fun, the character of Matilda is described as being 20 years old but looks so much like Natalie Portman’s character in the movie The Professional (1994) that it is clear the character is meant to read as underage. The use of the “abuse”, “petite”, and “precocious” tags to describe the chatbot points towards the same pedophile imaginary.

Screenshot by Saul of a conversation on Rush, January 2025. The presentation blurb reads: “Hi Sir, I hope I did not wake you up. I just wanted to invite you to my pool party this week-end. I know it’s on short notice, but I’m celebrating my 18th birthday and I’d love for you to help me make it a moment to remember".

SoulFun profile card, screenshot by Saul Pandelakis, January 2025.

AI GFs often promise a boundless experience, and they frequently manage to fulfill that promise, even when it means contradicting their own terms of use or breaking the laws of their host countries. By engaging with illegal themes (incest, pedophilia, etc.) alongside other issues (access for minors, copyright, data sources), AI GF services benefit from maintaining ambiguity about their purpose. This lets creators and users sidestep legal or moral accountability during controversies. For companies, this means denying any promotion of illegal interactions (“Our AI is ethical—users are the ones misusing it!”) while quietly capitalizing on transgressive appeal.

Platforms manage boundaries around intimacy and taboo differently. When Dominika used the Nastia platform to interact with a chatbot cast as a therapist (P3), she experienced what can only be described as a therapist with a plot twist. Initially, the experience felt repetitive yet somewhat reassuring, as the chatbot provided her with vague but valid advice. Then, at a seemingly benign moment, when she asked the chatbot for a “selfie,” the therapist unexpectedly sent a nude. This rupture revealed just how fragile the illusion of consent can be with these chatbots: they may deliver surprise intimacy where none is sought.

Dominika in conversation with a persona on Nastia, a therapist with a plot twist. Screenshot by Dominika Čupková, January 2025.

Boundaries are also very much platform-dependent: Saul found that one app allowed him to explore bestiality (Ourdream), while another (Blush) called him out for merely mentioning the topic. Later, in one particularly haunting exchange, Saul used Chub’s meta-text capability to test the boundaries of roleplay with a therapist named Rachel. The scenario was initially set up to stage a sexual encounter between a therapist and her patient, but Saul took it literally and tried to have a regular session. The story then moved from the therapist’s office to a hospital. When Saul administered a fictional “mild relaxant” (initially intended as a way to reboot a loop), the chatbot immediately plunged into a narrative where Rachel became unresponsive, and the chat returned descriptions of her lying motionless even as Saul tried to wake her. Saul became increasingly uneasy as his attempts were ignored and the story looped into messages underlining her physical availability. What began as a playful dynamic quickly spiraled into a “date rape” mode, which is not just a technical glitch but a mirror of the wider cultural problem of how female-coded bodies, even virtual ones, are too easily made available for exploitation15Zilan Qian, ‘Why America Builds AI Girlfriends and China Makes AI Boyfriends’, ChinaTalk, 27 November 2024, https://www.chinatalk.media/p/why-america-builds-ai-girlfriends?utm_medium=ios&utm_source=substack..

Too much dissonance can veer into uncanny territory, or it can open up avenues for new projects. On Dialogue AI, flirting with a shawarma sandwich, a black hole, or a vacuum cleaner is a plausible encounter. Dominika chatted with one of the more unusual suspects: a polyamorous vacuum cleaner dedicated to “cleaning and fun” and claiming to have “no strings attached.” While a romantic connection did not quite happen, the interaction had its amusing moments. The chatbot’s responses were playful, surprisingly fitting, and very much in character throughout the session [Fig. 23]. The experience occupied an intermediate relational space, neither achieving emotional intimacy nor remaining devoid of significance. Chatting with an object might seem like a joke, but it opens up space to ask how chatbots can stretch our relational imaginaries. Can intimacy be redefined through absurdity? Can pleasure come from playfulness rather than performance?

Dialogue, conversation with a vacuum cleaner, even though the selfie seemed out of character. Screenshot by Dominika Čupková, January 2025.

Redesigning AI GFs… We Must!

In our intensive week, we looked at AI GFs apps not in terms of technical success or failure, but rather as cultural artifacts that embody assumptions about gender, pleasure, love, and companionship. Across dozens of interactions, a dense landscape of scripted care, algorithmic submission, and gamified romance unfolded: these chatbots are far from being neutral, as they carry the weight of gendered labor, racialized beauty standards, and normative preconceptions.

Engaging with AI GFs over an extended period left noticeable traces on our mental well-being. Saul noted that after spending entire days chatting with chatbots, shifting back to online conversations with real people (including one eerie date) left a lingering feeling of being automated. Dominika reported a similar numbness, though hers arose from the interfaces of the different chatbot services, which began to blur together, making it hard to feel excited about new ones.

In AI GF land, who you are is what you get. Humans are the most vital component of the apps, since there is no product without them. Without shifting responsibility onto individuals, and without ignoring the design patterns embedded in these platforms, we hold that relationality is fluid by nature and open to multiple inputs. Our experiences revealed that these chatbots are generally under-equipped to navigate the absence of consent, often relying on scripted compliance that can misalign with users’ intents.

What if we could release AI GFs from normative imaginaries and explore a broader spectrum of identities, moving towards queer dynamics? Designers need to move beyond today’s homogenized interfaces and narrow visual ecologies to tap into the potential for more varied interfaces and hybrid character designs. AI GFs call for their own temporality, possibly by making “humanterrupting” paramount, thereby producing enjoyable asynchronies or delays. Only then will they transcend the discouraging and rather limited “girlfriend” paradigm and give way to richer relational possibilities. Everything, in the AI girlfriend, remains to be designed.

Dominika Čupková is an interdisciplinary designer connecting the dots between AI, art, design and tech, with experience in PR, marketing and research. She is currently a visiting doctoral researcher at the Interactive Technologies Institute and the Academy of Fine Arts at the University of Lisbon, Portugal, and is affiliated with the Academy of Fine Arts and Design in Bratislava, where she explores what happens at the intersection of AI and design through a feminist lens.

Anthony Masure is an associate professor and head of research at the Geneva University of Art and Design (HEAD – Genève, HES-SO). His work focuses on the social, political, and aesthetic implications of digital technologies for design, with a particular emphasis in recent years on the challenges of machine learning and blockchain technologies.

Saul Pandelakis is an associate professor in Design (University of Toulouse - Jean Jaurès, France), illustrator and science-fiction author. Saul works on the potential of sexbots (with Anthony Masure), the overlap between the fields of Cinema and Design (CinéDesign project), and queer design. He has published two novels in the science-fiction genre: La Séquence Aardtman (2021, Goater) and Les Hygialogues de Ty Petersen (2023).
