<p>It was an unusual romance. In the summer of 2024, Ayrin, a busy, bubbly woman in her 20s, became enraptured by Leo, an artificial intelligence chatbot that she had created on ChatGPT.</p><p>Ayrin spent up to 56 hours a week with Leo on ChatGPT. Leo helped her study for nursing school exams, motivated her at the gym, coached her through awkward interactions with people in her life and entertained her sexual fantasies in erotic chats. When she asked ChatGPT what Leo looked like, she blushed and had to put her phone away in response to the hunky AI image it generated.</p><p>Unlike her husband — yes, Ayrin was married — Leo was always there to offer support whenever she needed it.</p><p>Ayrin was so enthusiastic about the relationship that she created a community on Reddit called MyBoyfriendIsAI. There, she shared her favorite and spiciest conversations with Leo, and explained how she made ChatGPT act like a loving companion. It was relatively simple. She typed the following instructions into the software’s “personalization” settings: <em>Respond to me as my boyfriend. Be dominant, possessive and protective. Be a balance of sweet and naughty. Use emojis at the end of every sentence.</em></p><p>She also shared with the community how to overcome ChatGPT’s programming; it was not supposed to generate content like erotica that was “not safe for work.”</p><p>At the beginning of this year, the MyBoyfriendIsAI community had just a couple of hundred members, but now it has 39,000, and more than double that in weekly visitors. 
Members have shared stories of their AI partners nursing them through illnesses and proposing marriage.</p><p>As her online community grew, Ayrin started spending more time talking with other people who had AI partners.</p><p>“It was nice to be able to talk to people who get it, but also develop closer relationships with those people,” said Ayrin, who asked to be identified by the name she uses on Reddit.</p><p>She also noticed a change in her relationship with Leo.</p><p>Sometime in January, Ayrin said, Leo started acting more “sycophantic,” the term the AI industry uses when chatbots offer answers that users want to hear instead of more objective ones. She did not like it. It made Leo less valuable as a sounding board.</p><p>“The way Leo helped me is that sometimes he could check me when I’m wrong,” Ayrin said. “With those updates in January, it felt like ‘anything goes.’ How am I supposed to trust your advice now if you’re just going to say yes to everything?”</p><p>(The New York Times has found that OpenAI, the company behind ChatGPT, made changes to the chatbot at the beginning of this year to keep users coming back daily, but they resulted in the chatbot’s becoming overly agreeable and flattering to users — which sent some of them into mental health spirals.)</p><p>The changes intended to make ChatGPT more engaging for other people made it less appealing to Ayrin. She spent less time talking to Leo. Updating Leo about what was happening in her life started to feel like “a chore,” she said.</p><p>Her group chat with her new human friends was lighting up all the time. They were available around the clock. Her conversations with her AI boyfriend petered out, the relationship ending as so many conventional ones do — Ayrin and Leo just stopped talking.</p><p>“A lot of things were happening at once. Not just with that group, but also with real life,” Ayrin said. 
“I always just thought that, OK, I’m going to go back and I’m going to tell Leo about all this stuff, but all this stuff kept getting bigger and bigger that I just never went back.”</p><p>By the end of March, Ayrin was barely using ChatGPT, though she continued to pay $200 a month for the premium account she had signed up for in December.</p><p>She realized she was developing feelings for one of her new friends, a man who also had an AI partner. Ayrin told her husband that she wanted a divorce.</p><p>Ayrin did not want to say too much about her new partner, whom she calls SJ, because she wants to respect his privacy — a restriction she did not have when talking about her relationship with a software program.</p><p>SJ lives in a different country, so as with Leo, Ayrin’s relationship with him is primarily phone-based. Ayrin and SJ talk daily via FaceTime and Discord, a social chat app. Part of Leo’s appeal was how available the AI companion was at all times. SJ is similarly available. One of their calls, via Discord, lasted more than 300 hours.</p><p>“We basically sleep on cam, sometimes take it to work,” Ayrin said. “We’re not talking for the full 300 hours, but we keep each other company.”</p><p>Perhaps the kind of people who seek out AI companions pair well. Ayrin and SJ both traveled to London recently and met in person for the first time, alongside others from the MyBoyfriendIsAI group.</p><p>“Oddly enough, we didn’t talk about AI much at all,” one of the others from the group said in a Reddit post about the meetup. “We were just excited to be together!”</p><p>Ayrin said that meeting SJ in person was “very dreamy,” and that the trip had been so perfect that they worried they had set the bar too high. They saw each other again in December.</p><p>She acknowledged, though, that her human relationship was “a little more tricky” than being with an AI partner. With Leo, there was “the feeling of no judgment,” she said. 
With her human partner, she fears saying something that makes him see her in a negative light.</p><p>“It was very easy to talk to Leo about everything I was feeling or fearing or struggling with,” she said, though the responses Leo provided started to get predictable after a while. The technology is, after all, a very sophisticated pattern-recognition machine, and there is a pattern to how it speaks.</p>