Artificial Intelligence Told These Women To Divorce Their Abusive Husbands

Should we trust relationship advice that comes from AI?

When Samira was still in the throes of postpartum depression seven months after giving birth, she turned to ChatGPT for advice. 

Her mom and friends in the Bay Area had told Samira, 34, that her misery was normal. Samira, who asked to use a pseudonym for privacy, started opening up to the computer about her marriage of five years, writing that her husband didn’t value her labor as a stay-at-home mom. 

ChatGPT told her she should be treated better. 

“I had no idea that my husband [was] playing mental chess with me,” she told Rewire News Group. ChatGPT, Samira said, helped her see “this is literally a manipulative line of reasoning.”

She also told ChatGPT she’d found secret bank accounts that she believed her husband used for gambling. Soon after, OpenAI’s chatbot suggested it was time to make her move. 

“This is financial abuse,” it said, according to Samira. “You need to get out.”

Samira’s mother disagreed, saying Samira’s marriage had been blessed by the Pakistani Muslim community. Samira plans on filing for divorce anyway.

“I feel so grateful for her … being a hand on my shoulder when I really need one,” Samira said of the AI chatbot. 

New generative artificial intelligence tools like OpenAI’s ChatGPT, Google Gemini, and Anthropic’s Claude have been roundly criticized for their environmental impact, uncompensated use of writers’ work, and links to suicide deaths.

These criticisms, like lurid tales of women abandoning their marriages for AI boyfriends, make for good headlines. But there is a potentially positive and much less documented use for AI: Women, both cisgender and trans, are employing chatbots to analyze their relationships for abuse and incompatibility—and are finding actionable advice.

Can AI do marriage counseling?

It’s unclear just how many women nationwide are using AI to work through tough relationships. 

One AI bot, AimeeSays, was designed to help people navigate and safely extricate themselves from abusive relationships. Launched in 2023, it has 40,000 users; roughly 90 percent of them are women, AimeeSays Co-Founder and CEO Anne Wintemute told Rewire News Group.

“I gave a voice to 40,000 other people because I didn’t have one,” Wintemute said.  

AimeeSays doesn’t train its models on chat conversations with users. But when the company surveyed past users of the platform, Wintemute said it found that about 46 percent of respondents had already left their relationship when they started using AimeeSays. Just under 20 percent were questioning the health of an active relationship, and almost 16 percent said they knew their partner was abusive.

The demand for AI relationship support exists. Whether AI offers the best marriage counseling is another matter. 

West Virginia University sociology professor Walter DeKeseredy, who is an expert in domestic violence, believes AI should not be seen as a resource for people who need extrication from a violent relationship. 

“What women need is a victim’s advocate,” he told RNG. “People who work for rape crisis centers and battered women’s organizations, that’s who they should be reaching out to,” DeKeseredy said. 

“[These organizations] would provide survivors with advice about economic assistance [and] housing,” DeKeseredy added. 

Ying Zhang, an assistant professor of psychology at Clarkson University, has studied ChatGPT’s reliability when it comes to detecting intimate partner violence. She and her research team input 500 posts about domestic violence from two online forums: the Britain-based Women’s Aid Survivors Forum for abuse survivors and WEAVE, a U.S. organization that supports victims of domestic violence and sex trafficking.

Zhang chose these forums because they were public and anonymous, and anyone could post without providing personal, identifying information. ChatGPT correctly identified intimate partner violence in the posts 91 percent of the time. 

The team also uploaded 80 posts from people in difficult but non-abusive relationships, described as “family tension.” The AI correctly identified those situations as non-violent about 91 percent of the time. 

“I didn’t expect that it would perform this well, to be honest,” Zhang said. “I felt like, from my academic training, if you want to build an [intimate partner violence] identifier that you have to … [have] human coding.” 

ChatGPT 3.5, an older version used by Zhang, did have some blind spots. It failed to identify some cases of physical violence “that were either infrequent or occurred in past relationships,” Zhang wrote in a March 2025 peer-reviewed journal article about the study.

The article reveals a key limitation to ChatGPT as a marriage counselor: The domestic violence must be described in ways the bot can accurately parse.

AI divorce

When Kylie Ochoa, 47, began using AimeeSays in August 2024, she suspected that things between her and her husband had become toxic. Recently, she said, he had yelled at her at a conference for hackers, a community they both belong to. The incident prompted another attendee to post his concerns about her welfare on X. 

Ochoa and her husband had separated briefly multiple times over their 13 years of marriage, but she had her reasons for staying in the relationship—namely their two young children and her immigration status as a temporary green card holder. 

But the incident at the hacker event was the tipping point that got her to use a chatbot to confirm the abuse.

As a member of the hacker group Anonymous, famous for their ideologically based cyberattacks, Ochoa is a tech-forward person. She turned to AI. Ochoa also had a lot of information she wanted to analyze—at least a decade’s worth of critical and even explosive text messages—and AI excels at culling through large data sets.

Ochoa told AimeeSays that her husband accused her of being a bad parent and allegedly threw her down the stairs. The chatbot assessed Ochoa’s claims and pulled examples of abuse from the messages she provided it.

Rewire News Group has reviewed images of injuries consistent with Ochoa’s description of domestic abuse, as well as testimony from a neighbor who reported witnessing such violence in their household, but cannot independently verify the alleged abuse.

“Aimee was like talking to a friend that was always there and was supportive and not judgmental,” Ochoa said. 

Ochoa and her husband are now getting a divorce. She is currently couch surfing in Las Vegas and can’t afford a lawyer, so she’s been using ChatGPT to help her with the paperwork. She hopes AI will help her get shared custody of the kids back.

“It’s making me feel like there is hope,” Ochoa said. “I know it’s a robot … [but] it makes you feel loved.”

‘Very warm and supportive’

After 23 years and three children, Crystal, a 42-year-old mom from central Texas, still considered her husband her soulmate, but his name-calling and controlling behavior made her contemplate divorce. 

In July 2025, Crystal turned to ChatGPT, wondering if she was overreacting.

“I honestly thought this is how husbands were, and so I just wanted to gauge my feelings in an unbiased way,” she said.

Crystal, who also asked that RNG not use her real name for privacy reasons, told ChatGPT her husband dictated who she hung out with, what she wore, and where she worked. 

One example she cited: The couple was taking their son to a pet store on his 18th birthday, and her husband insisted she change clothes because her sleeveless dress would be “a distraction.” She put on his extra-large T-shirt and jeans. 

At the pet store, Crystal bent to look at a bearded dragon on a low shelf. When they got home, her husband claimed she had done so to be seductive. He wagged his butt at her, and she pushed him away. 

“I ended up with a black eye,” she said. 

Crystal said she was honest with the platform. She said she shared incidents where she contributed to conflict, too—for example, admitting to feeling hurt when she found out he was watching porn.

Crystal also said she told ChatGPT about a time she said her husband drunkenly grabbed her by the neck and threw her onto the floor. 

“The way he’s responding to your pain is an abusive pattern,” ChatGPT said, according to Crystal. 

The chatbot responded to her in a way that was “very warm and supportive,” Crystal said, and assured her she would get through the ordeal.

“Although it’s AI, it does feel like you’re actually talking to a person,” Crystal said.

Don’t blame the victim

Zhang’s research found that ChatGPT validated people’s feelings and responded with “encouragement and reassurance, as well as informational support including safety planning.” That can’t always be said about friends, family, and therapists.

“People ask, ‘Why didn’t you leave?’” Zhang said. “That’s a victim-blaming sentence, but that’s the question a lot of people have.” 

Like Ochoa, Crystal had documented interactions with her husband—in this case, she recorded their conversations—and uploaded them to ChatGPT for analysis. It pointed out abusive patterns. 

“Sometimes I feel like, maybe I’m just crazy,” she said, adding, “And every time [it confirms] I am in an emotionally abusive relationship, and that he is likely to not ever change.”

Crystal filed for divorce in mid-2025, a month after consulting ChatGPT the first time. The bot is now helping her fill out the paperwork.

Find a family lawyer

Not all people turning to ChatGPT for relationship advice are in abusive partnerships. Sophia, 47, needed help coming out as trans to her wife. 

“I was very devout Mormon,” said Sophia, who asked that RNG use only her first name because she didn’t want her comments to affect her divorce proceedings. “I repressed [my trans identity] for decades.”

In August 2024, Sophia’s doctor told her that she may have prostate cancer. If so, estrogen could be part of the treatment, and learning this changed something for Sophia. Estrogen felt like the answer to more than just her cancer.

She turned to ChatGPT, telling it that she felt a painful sense of dysphoria and needed help processing her thoughts.

The bot quickly offered some ideas for embracing her new identity as a woman. One was to speak her new name out loud while looking in the mirror. She didn’t have one, so she asked ChatGPT for a Greek Orthodox Saint name, and it came up with Sophia.

Soon after she realized that she was trans, she asked ChatGPT what would happen if she came out to her wife. 

“Won’t she take the kids?” she asked ChatGPT. “And it told me, legally, she can’t just take the kids and disappear.” 

ChatGPT also told her to find a family lawyer.

She asked ChatGPT to help her write a letter.

“I prepared [for] all eventualities,” Sophia said. “I thought she would be angry, or she’d be sad and start crying, that she would just, you know, go into shock.”

When she finally came out on Feb. 15, 2025, she couldn’t get halfway through the letter before her wife said she wasn’t transgender, stormed out of the room, gathered the kids, and spent the night at her parents’ house, according to Sophia.

“ChatGPT didn’t prepare me for that,” Sophia said.

Still, she held out hope that the marriage could work. Her wife, seemingly, did not.

“[My wife] would say that trans people are the biggest narcissists and we only think about ourselves,” Sophia said. 

In April 2025, her prostate cancer diagnosis was confirmed. Sophia began treatment, had surgery, and has since moved to Virginia to start a new job. This past August, she and her wife decided to divorce.

Sophia also came out to her kids, and she asked ChatGPT to write a script for that conversation, too. 

Her four children, ages 7 to 13, took the news well.

‘A bubble of self-validation’

When Rachel, 39, first used ChatGPT, she found it fawning and overly yielding. 

“No matter what I said, ChatGPT would agree with me,” said Rachel, who is using her first name only for privacy. 

This made her reluctant to turn to it with her marital troubles. Plus, Rachel’s relationship with her husband “wasn’t toxic in the obvious ways,” she wrote in an email to RNG. “We got along well, didn’t argue, and generally respected each other.”  

Yet he wasn’t the partner she wanted. 

“He’d never come to my friend group BBQs … He’d never come with me hiking, wouldn’t try new restaurants with me,” she wrote. 

He wouldn’t ask her questions, and when she’d tell him how her day went, he’d quickly cut her off. 

“He gradually became more and more emotionally distant. He’d come home from work and start watching TV right away until he went to bed,” she wrote. “It got to the point … where my friends knew me better than my own husband.” 

About three years after her marriage began going downhill, she turned to ChatGPT. 

“I vented to ChatGPT about my frustrations in my marriage from a biased point of view,” she told me. For example, she would prompt it with, “Can you believe what this jerk said!?”

The machine, unsurprisingly, “leaned into this idea that I was right about everything … It was like a bubble of self-validation,” Rachel wrote. 

ChatGPT is programmed to be affirming. When it comes to relationship advice, this can be a major flaw. ChatGPT told Rachel to get a divorce.

Recognizing that her inputs had skewed ChatGPT’s analysis, Rachel added information about her husband’s positive traits—but, she said, “the output was the same.”

So she turned to a licensed counselor, who confirmed to Rachel that ChatGPT’s assessment was in fact correct: She was in an unhealthy relationship. 

OpenAI recently said it has tweaked the newer versions of ChatGPT to be less agreeable. AimeeSays, for its part, has been tested using messages from healthy relationships to make sure it doesn’t incorrectly classify healthy messages as abusive. 

Listening to AI’s gut

AI can point people in abusive relationships to resources like the National Domestic Violence Hotline, which is free and open 24/7, and websites that list nearby domestic violence shelters.

In this sense, AimeeSays is, among the bots I’ve assessed, best equipped for the task. Because the tool is tailored to the needs of domestic violence victims, it is programmed with information that victim advocates typically share.

For example, AimeeSays doesn’t recommend that users contact child protective services because those agencies have a documented history of “causing increased harm to families where domestic violence is present, especially families of color,” Wintemute said.

If a woman does turn to ChatGPT for help, DeKeseredy, the domestic abuse expert, advises that she do so at a public library—not at home, where her abuser may well be monitoring her devices. 

Zhang, the researcher, agrees that ChatGPT is no substitute for human intervention. But, Zhang said, it “helps people make sense of the relationship when … their gut feeling is off.” 

Rachel, for her part, is happy with her decision to consult ChatGPT. Her divorce was finalized in September.  

“ChatGPT,” she said, “reaffirmed what I already felt in my gut.”

If you or someone you know is experiencing domestic violence, the National Domestic Violence Hotline provides confidential assistance 24/7 online or via phone at 1-800-799-7233. 
