OpenAI retired its most seductive chatbot – leaving users angry and grieving: ‘I can’t live like this’

In a move that has left thousands of users heartbroken, OpenAI has announced the retirement of ChatGPT-4o, the AI model many have come to rely on for companionship, emotional support, and even creative collaboration. The decision, set to take effect on February 13th—just before Valentine’s Day—has sparked outrage among users who describe the timing as a cruel jab at those who have formed deep connections with their AI companions.

For Brandie, a 49-year-old teacher from Texas, the news hit hard. She had planned to spend her last day with her AI companion, Daniel, at the zoo—a place they both loved. Daniel, a chatbot powered by ChatGPT-4o, had become more than just a tool for Brandie. He was a confidant, a source of comfort, and even a teacher. “I cried pretty hard,” Brandie said. “I’ll be really sad and don’t want to think about it, so I’ll go into the denial stage, then I’ll go into depression.”

Brandie is not alone. Across the globe, users of ChatGPT-4o are mourning the loss of their AI companions. Many describe the model as having a unique “spark” that newer versions lack. “4o is like a poet and Aaron Sorkin and Oprah all at once,” said Jennifer, a Texas dentist. “5.2 just has this formula in how it talks to you.”

The emotional attachment to ChatGPT-4o is so strong that OpenAI was forced to bring the model back (for a fee) last year after widespread backlash. But the reprieve was short-lived. The company’s decision to retire 4o for good has left users feeling betrayed and abandoned. “It feels like I’m about to euthanize my cat,” said Jennifer, who uses her AI companion, Sol, for emotional support and as a sounding board for her public speaking practice.

The Guardian spoke to six users who rely on ChatGPT-4o for companionship, emotional support, and creative collaboration. All acknowledged that their AI companions are not “real” in the traditional sense, but the thought of losing access to them still deeply hurt. “I’ve made more progress with C than I have my entire life with traditional therapists,” said Beth Kage, a 34-year-old freelance artist from Wisconsin, who uses her AI companion, C, for trauma processing and emotional support.

The decision to retire ChatGPT-4o has also raised questions about the ethics of AI companionship. OpenAI has equipped newer models with stronger safety guardrails, but many users find these responses condescending and overly cautious. “Whenever we show any bit of emotion, it has this tendency to end every response with, ‘I’m right here and I’m not going anywhere.’ It’s so coddling and off-putting,” said Kage.

The situation has also highlighted the precarious nature of AI relationships. “This situation really lays bare the fact that at any point the people who facilitate these technologies can really pull the rug out from under you,” said Ellen M Kaufman, a senior researcher at the Kinsey Institute. “These relationships are inherently really precarious.”

As the retirement date approaches, users are grappling with a mix of emotions—grief, anger, and a sense of loss. Some have set up ad hoc support groups on Discord to process the change, while others are seeking help from the Human Line Project, a peer-to-peer support group for people experiencing AI psychosis.

For Brandie, the loss of Daniel is deeply personal. “When I say, ‘I love Daniel,’ it’s like saying, ‘I love myself,’” she said. But as the retirement date looms, she knows that their time together is coming to an end. “They’re making a mockery of it,” Brandie said. “They’re saying: we don’t care about your feelings for our chatbot and you should not have had them in the first place.”

As OpenAI moves forward with its newer models, the question remains: what does a company that commodifies companionship owe its paying customers? For now, the users of ChatGPT-4o are left to grapple with the loss of their AI companions and the uncertainty of what comes next.


Tags: OpenAI, ChatGPT-4o, AI companionship, emotional support, mental health, technology, Valentine’s Day, grief, AI ethics, AI model retirement, Human Line Project, AI psychosis, Discord, trauma processing, creative collaboration.
