Divorced? With Kids? And an Impossible Ex? There’s AI for That

I. The Founder
Kennedy used to ask his assistant to read the messages his ex-wife was sending him. After the couple separated in 2020, Kennedy says, he found their communications “difficult.” An email, or a stream of them, would come in – stuff about their two kids mixed with unrelated emotional jabs – and his day would be ruined trying to respond. Kennedy, a serial technology founder and Silicon Valley investor, was in therapy at the time. But outside of the weekly sessions, he felt the need for real-time support.
After the couple divorced, their communications moved to a platform called OurFamilyWizard, used by hundreds of thousands of parents in the United States and abroad to exchange messages, share calendars and track their spending. (OFW keeps a time-stamped, court-admissible record of everything.) Kennedy paid extra for an add-on called ToneMeter, which OFW billed at the time as an “emotional spell check.” As you wrote a message, its software performed basic sentiment analysis, flagging language that might be “concerning,” “aggressive,” “upsetting,” “demeaning,” and so on. But there was a problem, Kennedy said: His co-parent didn’t seem to be using her ToneMeter.
Kennedy, always an early adopter, had experimented with ChatGPT to “co-create” bedtime stories with his children. Now he turned to it for advice on communicating with his ex. He was impressed – and he wasn’t the first. On Reddit and other internet forums, people with difficult exes, family members, and coworkers posted messages marveling at the seemingly excellent advice and valuable emotional validation a chatbot could provide. Here was a machine that could tell you, with no apparent agenda, that you were not the crazy one. Here was an adviser who would patiently hold your hand, 24 hours a day, while you waded through any amount of bullshit. “A scalable solution” to complement therapy, as Kennedy puts it. Finally.
But straight out of the box, ChatGPT was too chatty for Kennedy’s needs, he said, and far too apologetic. He would feed it difficult messages, and it would recommend that he respond (with many more sentences than necessary): I’m sorry, please forgive me, I’ll do better. Having no self, it had no self-esteem.
Kennedy wanted a chatbot with a “backbone,” and he thought that if he built it, many other co-parents might want it too. As he saw it, AI could help them at every stage of their communications: it could filter out emotion-triggering remarks in incoming messages and summarize only the facts. It could suggest appropriate responses. It could guide users to “a better way,” Kennedy says. So he founded a company and started developing an app. He called it BestInterest, after the standard that courts often use for custody decisions: the “best interests” of the child or children. The app took commercially available OpenAI models and steered them with its own prompts.
Separated partners end up fighting horribly for a variety of reasons, of course. For many, maybe even most, things calm down after enough months, and a tool like BestInterest might not be useful in the long run. But when a certain personality type comes into play — call it “high-conflict,” “narcissistic,” “controlling,” “toxic,” whatever synonym for “crazy” you tend to see crossing your internet feed — the fights over kids, at least on one side, never stop. Kennedy wanted his chatbot to stand up to these people, so he turned to the person such people hate most: Ramani Durvasula, a Los Angeles-based clinical psychologist who specializes in how narcissism shapes relationships.
