
What if your AI girlfriend hated you?

Screenshot of a conversation with the AngryGF chatbot in which the first user writes Okay, honestly this is annoying...

It seems we’ve reached a point in the AI hype cycle where no idea is too far-fetched to launch. This week’s surprising AI project is a new spin on the romantic chatbot: a mobile app called AngryGF, which offers its users the uniquely unpleasant experience of being yelled at via messages from a fake person. Or, as co-founder Emilia Avilés explained in her original pitch: it “simulates scenarios in which female partners are angry, prompting users to comfort their angry AI partners” through a “gamified approach.” The idea is to teach communication skills by simulating arguments that users can win or lose depending on whether they can appease their furious girlfriend.

I’ve always assumed that the central appeal of chatbots that simulate relationships is that they are easier to interact with than real-life humans. They have no needs or desires of their own. There is no chance of being rejected or made fun of. They exist as a kind of emotional security blanket. So the premise of AngryGF amused me. You get some of the disadvantages of a real-life girlfriend (she’s furious!!) but none of the advantages. Who would use this voluntarily?

Obviously, I downloaded AngryGF immediately. (It is available, for those who dare, on both the Apple App Store and Google Play). The app offers a variety of situations in which a girlfriend may apparently be angry and need “comforting.” They include: “You put your savings in the stock market and lose 50 percent. Your girlfriend finds out and gets angry” and “During a conversation with your girlfriend, you unconsciously praise a friend by mentioning that she is beautiful and talented. Your girlfriend gets jealous and angry.”

The app sets an initial “forgiveness level” between 0 and 100 percent. You have 10 tries to say reassuring things that tip the forgiveness meter to 100. I chose the seductively vague scenario called “Angry for No Reason,” in which the girlfriend is, uh, angry for no reason. The forgiveness meter was initially set at a paltry 30 percent, indicating I had a tough road ahead.

Reader: I failed. Although I really tried to write messages that would appease my crazy fake girlfriend, she continued to interpret my words in the least generous way and accuse me of not paying attention to her. A simple “How are you today?” text from me (loving! considerate! asking questions!) was immediately greeted with a snippy response: “Oh, now you care how I am?” Attempts to apologize only seemed to make her angrier. When I proposed a dinner date, she told me that wasn’t enough, but also that it would be better if I took her “somewhere nice.”


It was such an irritating experience that I snapped and told this malicious bot that it was annoying. “It’s good to know that my feelings bother you so much,” the sarcast-o-bot responded. When I decided to try again a few hours later, the app informed me that I would have to upgrade to the paid version to unlock more scenarios for $6.99 a week. No, thanks.

At this point I wondered if the app was some kind of avant-garde performance art. Who would even want their partner to sign up? I wouldn’t love to know that my husband considered me volatile enough that he needed to practice lady-appeasement skills on a synthetic shrew. While apparently preferable to AI girlfriend apps that seek to impersonate real-life relationships, an app designed to train men to communicate better with women by creating a totally joyless robot woman could actually be even worse.

I called Avilés, the co-founder, to try to understand what exactly was going on with AngryGF. She’s a Chicago-based social media marketer and says the app was inspired by her own past relationships, where she wasn’t impressed with her partner’s communication skills. Her pitch seemed sincere. “You know men,” she says. “They listen, but then they don’t act.”

Avilés describes herself as the co-founder of the app, but she is not particularly well-versed in the nuts and bolts of creating it. (She says that a team of “between 10 and 20” people works on the app, but that she is the only founder willing to put her name on the product.) She was able to specify that the application is built on OpenAI’s GPT-4 and was not created with any additional personalized training data, such as real text messages between significant others.

“We didn’t actually consult directly with a relationship therapist or anything like that,” she says. Oh really.
