The Japan Times - AI-enhanced images of real events distort view of Mideast war


AI-enhanced images of real events distort view of Mideast war / Photo: AHMAD AL-RUBAYE - AFP

The Middle East war has unleashed a torrent of AI-driven disinformation. Beyond entirely fabricated visuals, another kind of content is spreading: authentic images "enhanced" in ways that experts say are subtly distorting perceptions of what's happening on the ground.


In one striking photo, a kneeling US pilot is confronted by a Kuwaiti local, moments after parachuting from his jet. The high-quality image was widely shared online and even published by media outlets. Yet the pilot appears to have only four fingers on each hand.

AFP fact-checkers ran the photo through AI detection tools and found it carried a SynthID watermark, an invisible marker meant to identify images made with Google AI. But that's not the whole story.

The situation itself appears to be genuine. A video showing the same scene began circulating on social media on March 2, and satellite imagery verified the location. It also corresponded with reports that day that Kuwait had mistakenly shot down three US warplanes.

AFP was also able to locate an earlier version of the photo on Telegram that matched the high-resolution photo exactly, except that it was blurry.

AI verification tools determined this image, which had none of the same detail in the pilot's face, was real. This suggests it may have served as the starting point for the image that returned the Google AI result.
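The article does not say which technique AFP used to match the blurry Telegram photo to the high-resolution version. One common approach in image verification is perceptual hashing, for example a "difference hash" (dHash), which keys on coarse brightness gradients and therefore tends to survive blurring, recompression, and AI upscaling. The sketch below is illustrative only, not AFP's actual method:

```python
# Illustrative sketch: a minimal difference hash (dHash) over a grayscale
# grid. Real pipelines would first decode and downscale the image (e.g. to
# 9x8 pixels with an image library); here we work on plain lists of numbers.

def dhash(pixels):
    """Compute a dHash from rows of grayscale values (0-255).

    Each bit encodes whether a pixel is brighter than its right neighbour,
    so the hash captures gradient structure rather than exact pixel values.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same scene."""
    return bin(a ^ b).count("1")

# Toy example: a sharp gradient grid and a uniformly brightened "blurry" copy
# keep the same left-to-right gradients, so their hashes match exactly.
sharp = [[col * 30 for col in range(9)] for _ in range(8)]
blurred = [[min(255, v + 5) for v in row] for row in sharp]

assert hamming(dhash(sharp), dhash(blurred)) == 0
```

Because only the *ordering* of neighbouring brightness values matters, a blurry original and a sharpened AI-enhanced derivative typically land within a few bits of each other, while unrelated photos differ in many bits.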

"AI-enhancement may subtly alter textures, faces, lighting, or background details, creating an image that looks more 'real' than the original," said Evangelos Kanoulas, a professor of AI at the University of Amsterdam.

This can "strengthen a particular narrative about an event -- for example, making a protest appear more violent, making a crowd appear larger, making facial expressions more intense."

In another case, social media users shared a dramatic image of a huge blaze near Erbil airport in Iraq, after the area was targeted by Iranian strikes on March 1.

Although SynthID detection recognised the use of Google AI in the picture, it was not a total fabrication. The original version of the image shows the same scene but with a far smaller fire and smoke column, and less vivid colours.

- 'Very different story' -

Experts warned that the line between enhancement and content generation, accidentally or intentionally, was a thin one.

"Even little changes can end up telling a very different story," said James O'Brien, a professor of computer science at the University of California, Berkeley, and "could change the perception of events".

Generative artificial intelligence is also still prone to error and may "hallucinate" elements that were not in the original image, Kanoulas added.

This happened following the shooting of Alex Pretti by federal immigration agents in the US city of Minneapolis in January, when an AI-enhanced image of the incident went viral.

The image was based on a frame taken from a genuine video of the shooting, showing Pretti falling to his knees with officers beside him, one of them holding a gun to his head.

In the grainy, low-quality frame, Pretti holds an object that in reality was a phone. In the AI-treated image, some social media users wrongly saw a weapon in his hand.

As the war triggered by the US-Israeli attacks on Iran rages on, experts warned that, without proper labelling, AI-enhanced images were further eroding public trust.

This kind of content was already having "a huge impact on people and their ability to trust the truth," said O'Brien.

"People start doubting authentic images as well," Kanoulas agreed.

Y.Watanabe--JT