The Japan Times - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears / Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, given the proliferation of voice cloning tools that are cheap, easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create videos and text so seemingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into tools as possible protections as well as regulation that makes them available only for verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

S.Ogawa--JT