The Japan Times - UK woman felt 'violated, assaulted' by deepfake Grok images


UK woman felt 'violated, assaulted' by deepfake Grok images
Photo: Lionel BONAVENTURE - AFP

British academic Daisy Dixon felt "violated" after the Grok chatbot on Elon Musk's X social media platform allowed users to generate sexualised images of her in a bikini or lingerie.

She was doubly shocked to see Grok even complied with one user's request to depict her "swollen pregnant" wearing a bikini and a wedding ring.

"Someone has hijacked your digital body," the philosophy lecturer at Cardiff University told AFP, adding it was an "assault" and "extreme misogyny".

As the images proliferated, "I had ... this sort of desire to hide myself," the 36-year-old academic said, adding that now "that fear has been more replaced with rage".

The revelation that X's Grok AI tool allowed users to generate images of people in underwear via simple prompts triggered a wave of outrage and revulsion.

Several countries responded by blocking the chatbot after a flood of lewd deepfakes exploded online.

According to research published Thursday by the Center for Countering Digital Hate (CCDH), a nonprofit watchdog, Grok generated an estimated three million sexualised images of women and children in a matter of days.

CCDH's report estimated that Grok generated this volume of photorealistic images over an 11-day period -- an average rate of 190 per minute.

After days of furore, Musk backed down and agreed to geoblock the function in countries where creating such images is illegal, although it was not immediately clear where the tool would be restricted.

"I'm happy with the overall progress that has been made," said Dixon, who has more than 34,000 followers on X and is active on social media.

But she added: "This should never have happened at all."

She first noticed artificially generated images of herself on X in December. Users took a few photos she had posted in gym gear and a bikini and used Grok to manipulate them.

Under the UK's new Data Act, which came into force this month, creating or sharing non-consensual deepfakes is a criminal offence.

- 'Minimal attire' -

The first images were quite tame -- changing hair or makeup -- but they "really escalated" to become sexualised, said Dixon.

Users instructed Grok to put her in a thong, enlarge her hips and make her pose "sluttier".

"And then Grok would generate the image," said Dixon, author of an upcoming book "Depraved", about dangerous art.

In the worst case, a user asked to depict her in a "rape factory" -- although Grok did not comply.

Grok on X automatically posts generated images, so she saw many in the comments on her page.

This public posting carries "higher risk of direct harassment than private 'nudification apps'", said Paul Bouchaud, lead researcher for Paris non-profit AI Forensics.

In a report released this month, he looked at 20,000 images generated by Grok, finding over half showed people in "minimal attire", almost all women.

Grok has "contributed significantly to the surge in non-consensual intimate imagery because of its popularity", said Hany Farid, co-founder of GetReal Security and a professor at the University of California, Berkeley.

He slammed X's "half measures" in response, telling AFP they were "being easily circumvented".

M.Matsumoto--JT