"How Data-Fueled Neurotargeting Could Kill Democracy"
From the MIT Press Reader:
Left unchecked, the technique, which weaponizes emotional data for
political gain, could erode the foundations of a fair and informed
society.
One of the foundational concepts in modern democracies is what’s usually referred to as the marketplace of ideas,
a concept commonly associated with the political philosopher John Stuart Mill and his 1859 treatise On Liberty, though
its roots stretch back at least another two centuries. The basic idea
is simple: In a democratic society, everyone should share their ideas in
the public sphere, and then, through reasoned debate, the people of a
country may decide which ideas are best and how to put them into action,
such as by passing new laws. This premise is a large part of the reason
that constitutional democracies are built around freedom of speech and a
free press — principles enshrined, for instance, in the First Amendment
to the U.S. Constitution.
Like so many other political ideals, the marketplace of ideas has
proven harder to realize in practice than in theory. For one thing, there
has never been a public sphere that was actually representative of its
general populace. Enfranchisement for women and racial minorities in the
United States took centuries to codify, and these citizens are still
disproportionately excluded from participating in elections by a variety of political mechanisms. Media ownership and employment also skew disproportionately male and white,
meaning that the voices of women and people of color are less likely to
be heard. And, even for people who overcome the many obstacles to
entering the public sphere, that doesn’t guarantee equal participation;
as a quick scroll through your social media feed may remind you, not all
voices are valued equally.
Above and beyond the challenges of
entrenched racism and sexism, the marketplace of ideas has another major
problem: Most political speech isn’t exactly what you’d call reasoned
debate. There’s nothing new about this observation; 2,400 years ago, the
Greek philosopher Aristotle argued that logos (reasoned argumentation) is only one element of political rhetoric, matched in importance by ethos (trustworthiness) and pathos (emotional resonance). But in the 21st century, thanks to the secret life of data, pathos
has become datafied, and therefore weaponized, at a hitherto
unimaginable scale. And this doesn’t leave us much room for logos,
spelling even more trouble for democracy.
An excellent — and alarming — example of the weaponization of emotional data is a relatively new technique called neurotargeting.
You may have heard this term in connection with the firm Cambridge
Analytica (CA), which briefly dominated headlines in 2018 after its role
in the 2016 U.S. presidential election and the UK’s Brexit vote came to
light. To better understand neurotargeting and its ongoing threats to
democracy, we spoke with one of the foremost experts on the subject:
Emma Briant, a journalism professor at Monash University and a leading
scholar of propaganda studies.
Neurotargeting, in its
simplest form, is the strategic use of large datasets to craft and
deliver a message intended to sideline the recipient’s focus on logos
and ethos and appeal directly to the pathos at their emotional core.
Neurotargeting is prized by political campaigns, marketers, and others
in the business of persuasion because they understand, from centuries of
experience, that provoking strong emotional responses is one of the
most reliable ways to get people to change their behavior. As Briant
explained, modern neurotargeting techniques can be traced back to
experiments undertaken by U.S. intelligence agencies in the early years
of the 21st century that used functional magnetic resonance imaging
(fMRI) machines to examine the brains of subjects as they watched both
terrorist propaganda and American counterpropaganda. One of the
commercial contractors working on these government experiments was
Strategic Communication Laboratories, or the SCL Group, the parent
company of CA.
A decade later, building on these insights, CA was
the leader in a burgeoning field of political campaign consultancies
that used neurotargeting to identify emotionally vulnerable voters in
democracies around the globe and influence their political participation
through specially crafted messaging. While the company was specifically
aligned with right-wing political movements in the United States and
the United Kingdom, it had a more mercenary approach elsewhere, selling
its services to the highest bidder seeking to win an election. Its
efforts to help Trump win the 2016 U.S. presidential election offer an
illuminating glimpse into how this process worked....