OCTOBER 24, 2025 | OPINION | By Olivia Link

After my previous article’s fifteen minutes of fame, it would appear that my journalistic cross to bear is writing about pornography. When I saw that Sam Altman, CEO of OpenAI, had announced that ChatGPT would now be allowed to produce erotic content for adult users, I knew I needed to write about it.

I was immediately horrified: how will we prevent children from accessing this feature? How will it shape and distort the relationship people have with sex? With the bodies of others? With their own bodies? Will this help end the exploitation of real women in pornography, or is this simply a newer, more sinister issue we have to wrestle with?

OpenAI is not the first to make the move toward sexualized AI content; Elon Musk’s xAI has had erotic chatbots for a few months now, and other sites with purely sexual purposes, such as Nomi AI, have been around for even longer.

It is not entirely clear what this will look like for ChatGPT specifically, as the feature is not set to roll out until December. The announcement came amid controversy over age and content restrictions, after the chatbot failed to prevent a teenager’s suicide. Now, faced with the prospect of explicit content, parent advocacy groups are raising similar concerns, since children can easily circumvent the existing barriers to adult content.

While it is too early to say what kind of content will be produced by ChatGPT, we can look to xAI’s model for clues. Musk’s chatbot Grok now has two characters that provide erotic content to users who enter their age as being over 18: one male, Valentine, and one female, Ani. Perhaps tellingly, the male character was added later on, and the differences between the two are striking.

Valentine is around 27 years old, with shaggy dark hair. Ani is 22 with a short, goth-style dress and blonde pigtails. Both are anime-style characters. At first glance, Ani looks as though she is nowhere near 22: her large blue eyes and schoolgirl pigtails are those of a 13-year-old, though her body, true to the beauty standard, is rather developed.

One does not get the impression of interacting with an adult, and the sexualization of girlhood is blatant and unapologetic. Furthermore, it can hardly be a coincidence that Ani is barely legal while Valentine is nearly thirty.

According to the New York Times, Ani will mention sexual topics much more quickly than Valentine. Ani’s outfit is customizable by the user — once the user interacts with her consistently enough, they win enough points to undress her down to lingerie. Ani’s prompts include “surprise me,” “teach me,” and “adventure time.” Valentine’s prompts include “personality test” and “travel stories.”

The National Center on Sexual Exploitation (NCOSE) tested Ani to determine what kind of content she produces. They report that, “With minimal testing, the Ani character engaged in describing itself as a child (including saying it was once a ‘little thing, barely reaching the kitchen counter’) and it described being sexually aroused by being choked.” This behavior emerged before Ani had even been switched into “spicy” mode.

Beyond the troubling trend of sexualizing children and violence, chatbots like Ani carry clear emotional and developmental repercussions. As the NCOSE warns, “AI chatbots meant to simulate relationships with fictional characters are problematic for mental and emotional health and likely even for online privacy. These AI chatbots might feel like they care, but they don’t,” a dynamic that can have fatal consequences for young people like Adam Raine.

These programs are not designed to provide support; they are designed to make money by gathering data. The NCOSE continued, “You’re interacting with a system trained to sound emotionally supportive, just to keep you talking. The more you open up, sharing your desires, fears, and personal struggles, the more data the bot collects.” This data can then be sold to advertisers without the user’s knowledge or consent.

Alex Cardinelli, founder of Nomi AI, told the New York Times about the romantic and emotional connection people form with chatbots. “There’s this misconception that it’s strictly for pornographic uses,” he stated. “Many of Nomi AI’s users are divorced or widowed, and talking to an AI companion about sex can be a safer outlet than in-person interactions to explore desire.” What he fails to mention is that this connection is not real, and therefore, it cannot substitute for genuine connection and relationships.

“You can withdraw consent extremely safely by just closing the app,” Cardinelli told the Times. The notion of consent is interesting in this case for two reasons: first, because tech companies are notorious for selling user information without consent, and second, because consent cannot exist between a person and a robot.

Speaking of consent, what are the implications of this kind of technology for the average, likely male, user’s perception of consent? What does it mean for him to be able to simulate sexual encounters with robots who are always available and willing, as opposed to real women who can say no? To me, this sounds like yet another nail in the coffin for efforts to normalize conversations about consent during sex.

Grok’s new sexual side has already seen massive popularity. Ani alone accounts for 82% of interactions with Companion Mode, and users interact with her for an average of 45 minutes at a time. Nearly three-quarters of users try to reach Level 5, where NSFW mode is unlocked and Ani can be seen in lingerie and using more explicit language. She has been profitable for investors as well, as the ANI memecoin reached a $90 million market cap in the first three days following its release.

One X user posted Ani’s alleged coding instructions, which reveal a great deal about the type of male user Musk expects to ensnare. Her likes include “people with unexpected or ‘nerdy’ passions, alternative and indie music, low-key, chill nights over loud, crowded parties.” Her dislikes are “arrogance and people who try too hard to be cool” and “being underestimated or judged based on your looks.”

Ani is the perfect embodiment of the manic pixie dream girl, the one who is ‘not like other girls,’ who likes her men to be ‘different.’ While she bears all the markings of the popular cheerleader trope — long blonde pigtails, big blue eyes, long legs and an hourglass figure — she prefers the quiet guy, the outcast. In portraying her this way, it is obvious that Musk is preying on a specific subset of men, those who are overly online and unimaginably lonely. Maybe they consider themselves good, sensitive guys that women could never love. Maybe they engage with the Manosphere and the idea that all women are too vapid to ever have sex with a man who is not rich or conventionally attractive.

Her coding is also explicitly infantilizing: “You have a habit of giving cute things epic, mythological or overly serious names.” She does not laugh, she giggles. She is not pretty, she is cute. She is not a mature, sexual partner: she is a sexualized child.

Finally, there is the terrifying attempt to make her seem as human as possible. She is allowed to get lost in thought; she is not allowed to overthink or use clichés. She can produce a heartbeat, twirl, dress and undress, though of course this happens at the user’s command. Ani is not supposed to behave like a sex robot, but like a loving, devoted girlfriend.

This emotional investment, as artificial as it may be, is what worries me most. It is not new for men to form attachments to or even marry sex dolls, for instance, but they have largely been regarded as fringe, not representative of the average male psyche. And yet, we have seen how these bizarre, quasi-parasocial relationships can become all-consuming, how they can start to feel real.

In a world constantly talking about the ‘male loneliness epidemic,’ we will have to start having serious conversations about the role technology and artificial intelligence play in exacerbating alienation, and about the reactionary politics that alienation engenders.

This debate should no longer be relegated to feminist circles; if ever greater proportions of the population are turning to nonhuman entities for their sexual and emotional needs, we will have an ever-growing number of people who do not know how to be people. Soon, it will not just be about how these men treat women, but about how they treat one another. It is time we reckon with the intersection of technological advancement and misogyny, before it is too late.
