The Japan Times - Biden robocall: Audio deepfake fuels election disinformation fears

Biden robocall: Audio deepfake fuels election disinformation fears
Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year thanks to proliferating voice cloning tools, which are cheap and easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers worry about the impact of AI tools that create videos and text so convincingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into such tools as possible protections, as well as regulation that makes them available only to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

S.Ogawa--JT