The Japan Times - Death of 'sweet king': AI chatbots linked to teen tragedy


Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.


Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on the strength of rope to use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

H.Hayashi--JT