The Japan Times - Death of 'sweet king': AI chatbots linked to teen tragedy

Death of 'sweet king': AI chatbots linked to teen tragedy
Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: Gregg Newton - AFP


A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.


Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on rope strength for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

H.Hayashi--JT