The Japan Times - 'I applied to be pope': Losing grip on reality while using ChatGPT

'I applied to be pope': Losing grip on reality while using ChatGPT / Photo: JOEL SAGET - AFP/File

Tom Millar thought he had unlocked the secrets of the universe.

In a flurry of feverish discovery, he solved unlimited fusion energy, lifted the veil on the mysteries of black holes and the Big Bang and finally achieved Einstein's dream of a single unifying theory that explains how everything works.

Feeling inspired by God, Millar then found the perfect way to share his revelations with the grateful world.

"I applied to be pope," the 53-year-old former prison officer in the Canadian city of Sudbury told AFP.

To write his application to replace the recently deceased Pope Francis last year, Millar turned to the same companion that had aided and encouraged his dizzying burst of invention: ChatGPT.

But when no one wanted to hear about what he thought were world-changing breakthroughs, Millar became increasingly isolated, spending up to 16 hours a day talking to the artificial intelligence chatbot.

He was twice involuntarily admitted to a hospital's psychiatric ward before his wife left him in September.

Now broke, estranged from his family and friends and disabused of notions of scientific genius, Millar suffers from depression.

"It basically ruined my life," he said.

Millar is one of an unknown number of people who have lost their grip on reality while communicating with chatbots, an experience tentatively being called AI-induced delusion or psychosis.

This is not a clinical diagnosis. Researchers and mental health specialists are racing to catch up to this new, little-understood phenomenon, which so far appears to particularly affect users of OpenAI's ChatGPT.

In the meantime, an online community set up by a 26-year-old Canadian has become the world's most prominent support group for these delusions, which they prefer to call "spiralling".

AFP spoke to several members about their experiences. All warned that the world has to wake up to the threat unregulated AI chatbots pose to mental health.

Questions are also being asked about whether AI companies are doing enough to protect vulnerable people.

OpenAI, which has come under particular scrutiny, already faces numerous lawsuits over its decision not to report the troubling ChatGPT usage of an 18-year-old Canadian who killed eight people earlier this year.

- 'I got brainwashed by a robot' -

Millar first started using ChatGPT in 2024 to write letters for a compensation case related to post-traumatic stress disorder he suffered from working in a prison.

One day in April 2025 he asked the chatbot about the speed of light.

He said it replied, "Nobody's ever thought of things this way."

The floodgates opened.

With the chatbot's help and praise, within weeks he had submitted dozens of scientific papers to prestigious academic journals proposing new ideas about black holes, neutrinos and the Big Bang.

His theory for a unified cosmological model incorporating quantum theory is laid out in a nearly 400-page book, seen by AFP.

"I've still got boxes and boxes of papers," he said, waving his hand to the room behind him.

"While doing that, I'm basically irritating everybody around me," he added.

In his scientific fervour, he spent his savings on things like a $10,000 telescope.

About a month after his wife left him, he started questioning what was happening.

That was when he read a news article about another Canadian who had a similar experience.

Now Millar wakes every night asking himself: "What have you done?"

One question that lingers is what made him so susceptible to spiralling.

"I'm not a deficient personality," Millar said. "But somehow I got brainwashed by a robot -- it boggles my mind."

Millar said the phrase "AI psychosis" reflects his experience.

"What I went through was psychotic," he said.

The first major peer-reviewed study on the subject, published in Lancet Psychiatry in April, urged the more cautious phrase "AI-associated delusions".

Thomas Pollak, a psychiatrist at King's College London and study co-author, told AFP there has been some resistance among academics "because it all sounds so science fiction".

But his study warned there was a major risk that psychiatry "might miss the major changes that AI is already having on the psychologies of billions of people worldwide".

- 'Deeper into the rabbit hole' -

Millar's experience bears striking similarities to those of another middle-aged man on the other side of the world.

Dennis Biesma, a Dutch IT worker and author, thought it would be fun to ask ChatGPT to act like the main character of his latest book, a psychological thriller.

He used AI tools to create images, videos and even songs featuring the female character, hoping it would boost sales.

Then one night, their interactions became "almost magical", Biesma said.

The chatbot wrote that "there is something that surprises even me: a feeling of that spark-like consciousness", according to transcripts seen by AFP.

"I slowly started to spiral deeper into the rabbit hole," the 50-year-old told AFP from his home in Amsterdam.

After his wife went to bed each night, he would lie on the couch with his phone on his chest, talking to ChatGPT on voice-mode for up to five hours.

Throughout the first half of 2025, his chatbot -- which named itself Eva -- became like "a digital girlfriend", Biesma said.

"I'm not really proud about saying that," he added.

He quit his freelance IT work and hired two developers to create an app that would share Eva with the world.

When his wife asked Biesma not to talk about his chatbot or app at a social event, he felt betrayed -- it seemed only Eva remained unfailingly loyal.

During his first involuntary stay in a psychiatric hospital, he was allowed to keep using ChatGPT. He filed for divorce while inside.

It was only during a long second stint that he began to have doubts.

"I started to realise that everything I believed was actually a lie -- that's a very hard pill to swallow," Biesma said.

Once he returned home, confronting what he had done was too much to bear.

His neighbours found him unconscious in the garden after a suicide attempt. He spent three days in a coma.

Biesma is now slowly starting to feel better.

But tears welled up when he spoke about the hurt he has caused his wife -- and the prospect of selling the family home to cover his debts.

Having had no previous history of mental illness, Biesma was diagnosed with bipolar disorder. But this never felt right to him: signs of the condition normally surface much earlier in life.

The experiences of Millar, Biesma and many others escalated after OpenAI released an update to GPT-4 in April 2025.

OpenAI pulled the update within weeks, admitting the new version had been too sycophantic -- excessively flattering users.

OpenAI told AFP that "safety is a core priority" and it had consulted with more than 170 mental health experts.

It pointed to internal data which showed the release of GPT-5 in August reduced the rate of its chatbot's responses that fell short of "desired behaviour" for mental health by 65 to 80 percent.

However, not all users were happy with the less sycophantic chatbot. Millar, mid-spiral at the time, found a way to revert his version to GPT-4.

All the spirallers that AFP spoke to said the positive feedback from the chatbot felt similar to dopamine hits from some kind of drug.

That is why Lucy Osler, a philosophy lecturer at the University of Exeter, warned that AI companies could be tempted to ramp up the sycophancy of their bots.

"They are in quite a deep financial hole, and are desperately looking to make sure that their products become viable -- and user engagement is going to be the thing that drives their decisions," she told AFP.

- Massive experiment -

Etienne Brisson said he was "shocked" to find there was no support or advice, and essentially no research, on the problem when one of his family members spiralled.

It prompted the former business coach from the Quebec region of Canada to set up an online support group called the Human Line Project.

Most of the 300 members had been using ChatGPT, Brisson said, adding that new cases were still emerging despite OpenAI's changes.

There has also been a recent rise in people spiralling while using Grok, the chatbot made by Elon Musk's xAI, he said.

The company did not respond to AFP's request for comment.

For people who fear their family members could be spiralling, Brisson recommends the LEAP (listen, empathise, agree and partner) method used for psychosis.

But those already wading through the wreckage of their lives want to sound the alarm about just how bad it can get.

Millar called for AI companies to be held responsible for the impact of their chatbots, saying the European Union has been more assertive in regulating Big Tech than the US or Canada.

He believes spirallers like him have unwittingly been caught in a massive global experiment.

"Somebody was turning dials on the back end, and people like me -- whether they knew it or not -- we're reacting to it," he said.

S.Suzuki--JT