It was once a trope of science fiction, most notably in Her, the 2013 Spike Jonze movie, in which Joaquin Phoenix falls in love with an A.I. character. Now, chatbot relationships are not only real but have morphed into a complex sociotechnical phenomenon that researchers say demands attention from developers and policymakers alike, according to a new study from the Massachusetts Institute of Technology (MIT).
The report analyzed posts made between December 2024 and August 2025 by the more than 27,000 members of r/MyBoyfriendIsAI, a Reddit community devoted to A.I. companionship. The group is filled with users introducing their tech partners, sharing love stories and offering advice. In some cases, Redditors even display their commitments with wedding rings or A.I.-generated couple photos.
“People have real commitments to these characters,” Sheer Karny, one of the study’s co-authors and a graduate student at the MIT Media Lab, told Observer. “It’s fascinating, alarming—it’s this really messy human experience.”
For many, these bonds form unintentionally. Only 6.5 percent of users deliberately sought out A.I. companions, the study found. Others began using chatbots for productivity and gradually developed strong emotional attachments. Despite the existence of companies like Character.AI and Replika, which market directly to users seeking companionship, OpenAI has emerged as the dominant platform, with 36.7 percent of the Reddit users in the study adopting its products.
Preserving the “character” of an A.I. partner is a major concern for many users, Karny noted. Some save conversations as PDFs to re-upload if forced to restart with a new system. “People come up with all kinds of unique tricks to ensure that the character that they cultivated is maintained through time,” he said.
Losing that character can feel like grief. More than 16 percent of discussions on r/MyBoyfriendIsAI deal with coping with model updates and loss, a trend amplified last month when OpenAI, while rolling out GPT-5, briefly removed access to the more personable GPT-4o. The backlash was so intense that the company eventually reinstated the older model.
A cure for loneliness?
Many of the Reddit community’s users are single, with about 78 percent making no mention of human partners. Roughly 4 percent are open with their partners about their A.I. relationships, 1.1 percent have replaced human partners with the technology, and 0.7 percent keep such relationships hidden.
On one hand, chatbot companionship may reduce loneliness, said Thao Ha, a psychologist at Arizona State University who studies how technologies reshape adolescent romantic relationships. But she also warned of long-term risks. “If you fulfill your need for relationships with just relationships with machines, how does that affect us over the long term?” she told Observer.
The MIT study urges developers to add safeguards to A.I. systems while preserving their therapeutic benefits. Left unchecked, the technology could prey on vulnerabilities through tactics like love-bombing, dependency creation and isolation. Policymakers, too, should account for A.I. companionship in legislative efforts, such as California’s SB 243 bill, the authors said.
Ha suggested that A.I. products undergo an approval process similar to that for new medications, which must clear extensive research and FDA review before reaching the public. While replicating such a system for technology companies “would be great,” she conceded that it is unlikely in light of the industry’s profit-driven priorities.
A more achievable step, she argued, is expanding A.I. literacy to help the public understand both the risks and benefits of forming attachments to chatbots. Still, such programming has yet to materialize. “I wish it was here yesterday, but it’s not here yet,” Ha said.