Sunday, November 2, 2025, 11:33 GMT+7

The rise of the AI gut check

From divorce to relocation, people are turning to chatbots for guidance on major life decisions.

An illustrative image. Photo: Reuters

When Katie Moran decided to break up with her boyfriend of six months in April this year, she credited an unlikely source of support: ChatGPT.

“It made me reflect and have a conversation with myself that I was avoiding,” the 33-year-old based in New Jersey said of the chatbot, which she affectionately refers to as “Chat.” Though she confided in her friends and family, she says it was ChatGPT that helped her realize her relationship was the source of her anxiety. After going back and forth with the chatbot for a week, she decided to end it.

Most people are accustomed to turning to friends, family or a therapist for advice on major life decisions like breakups, career changes, or moving to a different country. But now, some people are turning to AI for on-demand, judgment-free gut checks. While some like Moran credit AI with giving them the confidence they need to make difficult choices, experts advise caution, noting that AI’s sycophantic nature can make for misleading results.

For Julie Neis, it was burnout that ultimately led her to confide in ChatGPT. She had been working in San Francisco’s tech scene for three years when, she says, she was overcome by anxiety, depression and chronic fatigue.

“I finally got to the point where I was like, I have to do something and change something,” she says of that period late last year. “I was a shell of a human.”

So she resolved to move — to France, specifically — and turned to ChatGPT for guidance. After detailing her criteria (a quiet town, with a decently sized expat community) and her red lines (no busy cities like Paris), the chatbot issued its recommendation: Uzès, a small town in the south of France, population 8,300.

Neis moved there in April, and says that outsourcing the decision to ChatGPT helped her feel less overwhelmed by the whole process. Still, she says, it wasn’t perfect. Although Uzès does have a sizable population of expats from the U.S. and the UK, what ChatGPT failed to mention was that most of them are retirees. Neis is 44.

About half of messages entered into ChatGPT fall under the category of “asking,” which it classifies as “seeking information or clarification to inform a decision,” according to a recent study by OpenAI, the developer of ChatGPT. Sam Altman, the CEO of OpenAI, noted that this trend is most pronounced among younger users.

“They don’t really make life decisions without asking ChatGPT what they should do,” Altman said at a talk at Sequoia Capital’s AI Ascent event in May, referring to users in their 20s and 30s. “It has the full context on every person in their life and what they’ve talked about.” (OpenAI did not respond to a request for comment.)

But it’s not just young people who are turning to AI in this way. Mike Brown of Kansas City, Missouri, was in his early 50s when, in 2023, he decided to confide in a chatbot for advice on what to do about his marriage of 36 years. Although his friends, priest and marriage counselor all advised that he file for divorce, it wasn’t until he had a 30-minute conversation with Pi.ai, an interactive chatbot launched that same year, that he says he felt sure of his decision.

“I need to play these thoughts through and need affirmation that this really is the direction,” Brown says, noting that he trusted the chatbot to give him a “credible” view on the situation.

Léonard Boussioux, a professor at the University of Washington’s Foster School of Business who researches how human-AI collaboration can improve decision-making, says he understands why people are turning to AI in this way. It’s available 24/7, can provide answers much faster than most humans, and can come across as more objective, too.

“AI tends to be more diplomatic,” Boussioux says, whereas “humans tend to be very opinionated, especially with personal advice.”

However, Boussioux warns that because most AI models “tend to be sycophantic,” they may not be as concerned with giving the best advice as they are with pleasing the user. “They've been trained to be pleasing the user because if you please the user, then the user comes back,” he adds.

This was the case with Moran, who says she was surprised by how ChatGPT spoke like a friend, telling her things like, “You deserve someone who reassures you — not someone whose silence makes you spiral.”

None of those who spoke with Reuters say they regret relying on AI for decision-making. For Brown, it acted as a “dispassionate, neutral observer.” For Moran, it was akin to a “best friend.” Neis, meanwhile, says it helped her realize what she wanted.

Still, Boussioux offers a note of caution, warning that offloading our decision-making to AI runs the risk of dulling our own problem-solving skills. “I would say take a step back and reflect on the beauty of having to make decisions ourselves,” he says, “and to make sure that we are also doing the thinking.”

Reuters
