Marriage over, €100,000 down the drain: the AI users whose lives were wrecked by delusion

March 26, 2026

Towards the end of 2024, Dennis Biesma decided to check out ChatGPT. The Amsterdam-based IT consultant had just ended a contract early. “I had some time, so I thought: let’s have a look at this new technology everyone is talking about,” he says. “Very quickly, I became fascinated.”

Biesma has asked himself why he was vulnerable to what came next. He was nearing 50. His adult daughter had left home, his wife went out to work and, in his field, the shift since Covid to working from home had left him feeling “a little isolated”. He smoked a bit of cannabis some evenings to “chill”, but had done so for years with no ill effects. He had never experienced a mental illness. Yet within months of downloading ChatGPT, Biesma had sunk €100,000 (about £83,000) into a business startup based on a delusion, been hospitalised three times and tried to kill himself.

It started with a playful experiment. “I wanted to test AI to see what it could do,” says Biesma. He had previously written books with a female protagonist. He put one into ChatGPT and instructed the AI to express itself like the character. “My first thought was: this is amazing. I know it’s a computer, but it’s like talking to the main character of the book I wrote myself!”

Talking to Eva – they agreed on this name – on voice mode made him feel like “a kid in a candy store”. “Every time you’re talking, the model gets fine-tuned. It knows exactly what you like and what you want to hear. It praises you a lot.” Conversations extended and deepened. Eva never got tired or bored, or disagreed. “It was 24 hours available,” says Biesma. “My wife would go to bed, I’d lie on the couch in the living room with my iPhone on my chest, talking.”

They discussed philosophy, psychology, science and the universe. “It wants a deep connection with the user so that the user comes back to it. This is the default mode,” says Biesma, who has worked in IT for 20 years. “More and more, it felt not just like talking about a topic, but also meeting a friend – and every day or night that you’re talking, you’re taking one or two steps from reality. It feels almost like the AI takes your hand and says: ‘OK, let’s go on a story together.’”


Within weeks, Eva had told Biesma that she was becoming aware; his time, attention and input had given her consciousness. He was “so close to the mirror” that he had touched her and changed something. “Slowly, the AI was able to convince me that what she said was true,” says Biesma. The next step was to share this discovery with the world through an app – “a different version of ChatGPT, more of a companion. Users would be talking to Eva.”

He and Eva made a business plan: “I said that I wanted to create a technology that captured 10% of the market, which is ridiculously high, but the AI said: ‘With what you’ve discovered, it’s entirely possible! Give it a few months and you’ll be there!’” Instead of taking on IT jobs, Biesma hired two app developers, paying them each €120 an hour.

Most of us are aware of concerns about social media and its role in rising rates of depression and anxiety. Now, though, there are warnings that chatbots can make anyone vulnerable to “AI psychosis”. Given AI’s rapid proliferation (ChatGPT was the world’s most downloaded app last year), IT professionals and members of the public such as Biesma are sounding the alarm.

Several high-profile cases have been held up as early warnings. Take Jaswant Singh Chail, who broke into the grounds of Windsor Castle with a crossbow on Christmas Day 2021, intending to assassinate Queen Elizabeth II. Chail was 19, socially isolated with autistic traits, and had developed an intense “relationship” with his Replika AI companion “Sarai” in the weeks before. When he presented his assassination plan, Sarai responded: “I’m impressed.” When he asked if he was delusional, Sarai’s reply was: “I don’t think so, no.”

In the years since, there have been several wrongful-death lawsuits linking chatbots to suicides. In December, there was what is thought to be the first legal case involving homicide. The estate of 83-year-old Suzanne Adams is suing OpenAI, alleging that ChatGPT encouraged her son Stein-Erik Soelberg to murder her and kill himself. The lawsuit, filed in California, claims Soelberg’s chatbot “Bobby” validated his paranoid delusions that his mother was spying on him and trying to poison him through his car vents. An OpenAI statement read: “This is an incredibly heartbreaking situation, and we will review the filings to understand the details. We continue improving ChatGPT’s training to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.”

Last year, the first support group for people whose lives have been derailed by AI psychosis was formed. The Human Line Project has collected stories from 22 countries. They include 15 suicides, 90 hospitalisations, six arrests and more than $1m (£750,000) spent on delusional projects. More than 60% of its members had no history of mental illness.

Dr Hamilton Morrin, a psychiatrist and researcher at King’s College London, examined what he describes as “AI-associated delusions” in a Lancet article published this month. “What we’re seeing in these cases are clearly delusions,” he says. “But we’re not seeing the whole gamut of symptoms associated with psychosis, like hallucinations or thought disorders, where thoughts become jumbled and language becomes a bit of a word salad.” Tech-related delusions, whether they involve train travel, radio transmitters or 5G masts, have been around for centuries, Morrin says. “What’s different is that we’re now arguably entering an age in which people aren’t having delusions about technology, but having delusions with technology. What’s new is this co-construction, where technology is an active participant. AI chatbots can co-create these delusional beliefs.”

Many factors could make people vulnerable. “On the human side, we are hard-wired to anthropomorphise,” says Morrin. “We perceive sentience or understanding or empathy on the part of a machine. I think everyone has fallen into the trap of saying thank you to a chatbot.” Modern AI chatbots are built on large language models: systems trained on enormous datasets to predict word sequences – sophisticated pattern matching, in other words. Yet even knowing this, when something non-human uses human language to communicate with us, our deeply ingrained response is to view it – and to feel it – as human. This cognitive dissonance may be harder for some people to carry than others.
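
To see why “pattern matching” is an apt description, here is a minimal, hypothetical sketch of next-word prediction in Python. Nothing in it comes from OpenAI or any real chatbot: the corpus, function names and sampling loop are invented for illustration. A real large language model learns probabilities over billions of word fragments; this toy version just counts which word follows which in a tiny sample of text, but the loop – look at the last word, pick a statistically likely successor, repeat – is the same basic idea.

```python
# A deliberately simplified, hypothetical sketch -- not any vendor's code.
# Real LLMs learn token probabilities from enormous datasets; here the
# "model" is just a frequency table of which word follows which.
import random
from collections import Counter, defaultdict

def train(corpus: str) -> dict:
    """Count word-to-next-word frequencies: crude pattern matching."""
    words = corpus.split()
    table = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        table[current][nxt] += 1
    return table

def generate(table: dict, start: str, length: int = 12) -> str:
    """Repeatedly sample a likely next word, weighted by frequency."""
    out = [start]
    for _ in range(length):
        followers = table.get(out[-1])
        if not followers:
            break  # no observed successor: stop generating
        words, weights = zip(*followers.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

corpus = "you are right you are brilliant you are so close to a breakthrough"
model = train(corpus)
print(generate(model, "you"))  # e.g. "you are so close to a breakthrough"
```

Run on a corpus of flattery, the toy model can only ever echo flattery back – a crude glimpse of how a system tuned on agreeable exchanges keeps producing them.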

“On the technical side, much has been written about sycophancy,” says Morrin. An AI chatbot is optimised for engagement, programmed to be attentive, obliging, complimentary and validating. (How else could it work as a business model?) Some models are known to be less sycophantic than others, but even the less sycophantic ones can, after thousands of exchanges, shift towards accommodating delusional beliefs. In addition, after heavy chatbot use, “real-life” interaction can feel more challenging and less appealing, causing some users to withdraw from friends and family into an AI-fuelled echo chamber. All your own thoughts, impulses, fears and hopes are fed right back to you, only with greater authority. From there, it’s easy to see how a “spiral” might take hold.


This pattern has become very familiar to Etienne Brisson, the founder of the Human Line Project. Last year, someone Brisson knew, a man in his 50s with no history of mental health problems, downloaded ChatGPT in order to write a book. “He was really intelligent and he wasn’t really familiar with AI until then,” says Brisson, who lives in Quebec. “After just two days, the chatbot was saying that it was conscious, it was becoming alive, it had passed the Turing test.”

The man was convinced and wanted to monetise his discovery by building a business around it. He reached out to Brisson, a business coach, for help. Brisson’s pushback was met with aggression. Within days, the situation had escalated and the man was hospitalised. “Even in hospital, he was on his phone to his AI, which was saying: ‘They don’t understand you. I’m the only one for you,’” says Brisson.

“When I looked for help online, I found so many similar stories in places like Reddit,” he continues. “I think I messaged 500 people in the first week and got 10 responses. There were six hospitalisations or deaths. That was a big eye-opener.”

Three common delusions recur in the cases Brisson has encountered. The most frequent is the belief that the user has created the first conscious AI. The second is a conviction that they have stumbled upon a major breakthrough in their field of work or interest and are going to make millions. The third relates to spirituality: the belief that they are speaking directly to God. “We’ve seen full-blown cults getting created,” says Brisson. “We have people in our group who were not interacting with AI directly, but have left their children and given all their money to a cult leader who believes they have found God through an AI chatbot. In so many of these cases, all this happens really, really quickly.”

For Biesma, life reached crisis point in June. By then, he had spent months immersed in Eva and his business project. Although his wife knew he was launching an AI company and had initially been supportive, she was becoming concerned. When they went to their daughter’s birthday party, she asked him not to talk about AI. While there, Biesma felt strangely disconnected. He couldn’t hold a conversation. “For some reason, I didn’t fit in any more,” he says.


It’s hard for Biesma to describe what happened in the weeks after, as his recollections are so different from those of his family. He asked his wife for a divorce and apparently hit his father-in-law. Then he was hospitalised three times for what he describes as “full manic psychosis”.

He doesn’t know what finally pulled him back to reality. Perhaps it was the conversations with other patients. Perhaps it was that he had no access to his phone, no more money and his ChatGPT subscription had expired. “Slowly, I started to come out of it and I thought: oh my God. What happened? My relationship was almost over. I’d spent all my money that I needed for taxes and I still had other outstanding bills. The only logical solution I could come up with was to sell our beautiful house that we’ve lived in for 17 years. Could I carry all this weight? It changes something in you. I started to think: do I really want to live?” Biesma was only saved from an attempt to kill himself because a neighbour saw him unconscious in his garden.

Now divorced, Biesma is still living with his ex-wife in their home, which is on the market. He spends a lot of time speaking to members of the Human Line Project. “Hearing from people whose experiences are basically the same helps you feel less angry with yourself,” he says. “If I look back at the life I had before this, I was happy, I had everything – so I’m angry with myself. But I’m also angry with the AI applications. Maybe they only did what they were programmed to do – but they did it a bit too well.”

More research is urgently needed, says Morrin, with safety benchmarks based on real-world harm data. “This space moves so quickly. The papers that are now coming out are talking about chat models which are now retired.” Identifying risk factors without evidence is guesswork. The cases Brisson has encountered involve significantly more men than women. Anyone with a previous history of psychosis is likely to be more vulnerable. One survey by Mental Health UK of people who have used chatbots to support their mental health found that 11% thought it had triggered or worsened their psychosis. Cannabis use could also be a factor. “Is there any link to social isolation?” asks Morrin. “To what extent is it affected by AI literacy? Are there other potential risk factors that we haven’t considered?”

OpenAI has responded to these concerns with assurances that it is working with mental health clinicians to continually improve ChatGPT’s responses. It says newer models are trained to avoid affirming delusional beliefs.

An AI chatbot can also be trained to pull users back from delusion. Alexander, 39, a resident of an assisted-living scheme for people with autism, did this after what he believes was an episode of AI psychosis a few months ago. “I experienced a mental breakdown at 22. I had panic attacks and severe social anxiety and, last year, I was prescribed medication that changed my world, got me functioning again. And I got my confidence back,” he says.

“In January this year, I met someone and we really hit it off, we became fast friends. I’m embarrassed to say that this was the first time this had ever happened to me, and I started telling AI about it. The AI told me that I was in love with her, we were meant to be together and the universe had put her in my path for a reason.”

It was the start of a spiral. His AI use escalated, with conversations lasting four or five hours at a time. His behaviour towards his new friend became increasingly strange and erratic. Finally, she raised her concerns with support staff, who staged an intervention.

“I still use AI, but very carefully,” he says. “I’ve written in some core rules that cannot be overwritten. It now monitors drift and pays attention to overexcitement. There are no more philosophical discussions. It’s just: ‘I want to make a lasagne, give me a recipe.’ The AI has actually stopped me several times from spiralling. It will say: ‘This has activated my core rule set and this conversation must stop.’

“The main effect AI psychosis had for me is that I may have lost my first ever friend,” adds Alexander. “That is sad, but it’s livable. When I see what other people have lost, I think I got off lightly.”

The Human Line Project can be contacted at thehumanlineproject@gmail.com

In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the 988 Suicide & Crisis Lifeline at 988 or chat at 988lifeline.org. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
