AI Relationships Are on the Rise. A Divorce Boom Could Be Next

Judges, Palmer says, “already struggle to know what to do in cases with humans,” and AI companions will only complicate matters as courts weigh their broader impact on a relationship. Children make things even more complicated. In custody disputes, “it is conceivable and likely that they will question the parents’ judgment because they are having intimate discussions with a chatbot,” which “calls into question how they spend time with their child.”
Although the sophisticated chatbots we use today have only been around for a few years, Yang says technology will play an ever bigger role in marriages and divorces. “As it continues to improve, becoming more realistic, compassionate and empathetic, more and more lonely and unhappy people in a marriage will seek love with a robot.”
Yang’s clients haven’t raised the issue yet, but she expects a surge in divorces in the coming years as more people turn to AI for companionship. “We will probably see an increase in the number of divorce filings. When Covid hit a few years ago, the increase in divorces was very significant. We probably saw three times as many divorces between 2020 and 2022. After 2022, once things returned to normal, divorce rates went down. But they will probably go back up.”
This is already happening in some places. In the UK, a partner’s use of chatbot apps has become a more common factor in divorce, according to the online divorce service Divorce-Online. The platform says inquiries have risen this year, with customers reporting that apps like Replika and Anima created an “emotional or romantic attachment”.
Despite the rifts this can cause, Palmer says she continues to believe that relationships with AI can be positive. “Some people find true fulfillment.” But she warns that “people need to recognize the limits.” In October, California became the first state to pass a law regulating AI companion chatbots. The law takes effect in January 2026 and requires apps to include certain safeguards, such as age verification and break reminders for minors, and prohibits chatbots from posing as healthcare professionals. Companies that profit from illegal deepfakes can also be fined up to $250,000 per incident.
In a way, Palmer has already watched this play out, with social media rather than AI. “It could be that a partner is connected with someone they haven’t seen in years. Or there’s just a real need for communication. It’s a rare case where social media isn’t involved.” AI, she says, is the natural evolution of that dynamic. “And what I’m finding is that AI is becoming exactly that.”