
How Targeted Dark Ads Are Manipulating Your Opinions

© Michaela Pointon/Culture Trip

In the digital battle for your heart, mind and vote, dark ads are on a mission to manipulate.

The controversial practice is known as psychographic targeting – directing political or advertising campaigns at individual social media users based on their personality, publicly posted interests and demographic details such as age and location.

The implications and use of the tactic were widely discussed throughout the US presidential election and EU referendum vote, when firms including Cambridge Analytica and AggregateIQ reportedly used psychographic targeting to influence individual voters.

This May, the UK Information Commissioner’s Office (ICO) launched an investigation into the practice, and new research from the not-for-profit Online Privacy Foundation has offered the first evidence that psychographically targeted Facebook ads are more effective than non-targeted campaigns at swaying opinions.

The UK-based study examined how effective targeted ads are, why people are so divided on certain topics and what influences their views on social media. Researchers divided Facebook users into two personality types – those with high and low ‘authoritarianism’ scores – sorted using a combination of age, gender, location and publicly shared interests (such as the Daily Mail or the Guardian).

To check that the groups were accurately sorted, users were asked to what extent they agreed with the statement ‘with regards to internet privacy: if you’ve done nothing wrong, you have nothing to fear.’

The researchers then used these psychographic profiles to craft messages specifically targeting each of the groups. For example, people in the low authoritarian group were targeted with anti-surveillance messages that read ‘Do you really have nothing to fear if you have nothing to hide? Say no to state surveillance,’ alongside an image of Anne Frank. Meanwhile, the high authoritarian group were targeted with an anti-surveillance message that read ‘They fought for your freedom. Don’t give it away!’, over an image of the D-day landings.
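
The underlying mechanic is straightforward: score a profile, then serve the message written for that score band. Below is a minimal Python sketch of that selection step; the threshold, field names and ad copy used here are illustrative assumptions, not the Online Privacy Foundation’s actual system.

```python
# Hypothetical sketch of psychographic ad selection.
# The scoring, threshold and data fields are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class UserProfile:
    age: int
    location: str
    interests: set[str]       # publicly shared pages, e.g. news outlets
    authoritarianism: float   # inferred score between 0.0 and 1.0


def pick_ad_variant(user: UserProfile, threshold: float = 0.5) -> str:
    """Return the anti-surveillance ad copy tailored to the user's group."""
    if user.authoritarianism >= threshold:
        # High-authoritarianism group: appeal framed around duty and sacrifice.
        return "They fought for your freedom. Don't give it away!"
    # Low-authoritarianism group: appeal framed around privacy and civil liberties.
    return ("Do you really have nothing to fear if you have nothing to hide? "
            "Say no to state surveillance.")


if __name__ == "__main__":
    reader = UserProfile(age=34, location="London",
                         interests={"the Guardian"}, authoritarianism=0.2)
    print(pick_ad_variant(reader))
```

The point of the sketch is how little it takes: once a platform can attach even a rough personality score to a profile, routing each person to the message most likely to move them is a one-line decision.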

‘Using psychographic targeting we reached Facebook audiences with significantly different views on surveillance and demonstrated how tailoring pro- and anti-surveillance ads based on authoritarianism affected return on marketing investment,’ wrote Chris Sumner, co-founder and research director of the Online Privacy Foundation.

The study found that the psychographically targeted ads were significantly more likely to be shared by the users they targeted, and that the results ‘illustrate the ease with which individuals’ inherent differences and biases can be exploited.’

Sumner also concluded that ‘debunking propaganda faces big challenges as biases severely limit a person’s ability to interpret evidence which runs contrary to their beliefs.’ In other words, if you don’t want to believe something, you won’t, even when there’s clear evidence to the contrary.

Psychographic ads are dangerous on two counts: first, for their implications for democracy, and second, because the practice is largely unregulated – even as Facebook touts its massive reach and pitches itself to advertisers as a platform that can ‘persuade voters’ and ‘influence online and offline outcomes’.

‘People … may well not be aware of how other data about them can be used and combined in complex analytics,’ said Elizabeth Denham, the UK Information Commissioner. ‘If a political organisation is collecting data directly from people, eg via a website, or obtains it from another source, it has to tell them what it is going to do with the data … It cannot simply choose to say nothing, and the possible complexity of the analytics is not an excuse for ignoring this requirement.’

To help address the situation, a new project from the London-based Bureau of Investigative Journalism called Who Targets Me is recruiting social media users to share information on what adverts they are seeing. The Bureau Local data journalism team will then analyse the collected data to try to cast light on an industry that’s rapidly growing but currently opaque.

‘It’s possible to target dark ads at millions of people in this country without the rest of us knowing about it,’ warned Will Moy, director of independent fact checking website Full Fact. ‘Inaccurate information could be spreading with no-one to scrutinise it. Democracy needs to be done in public.’

You can sign up for the project here.

About the author

English-American, Claire has lived and worked in the U.S., South America, Europe and the UK. As Culture Trip’s tech and entrepreneurship editor she covers the European startup scene and issues ranging from Internet privacy to the intersection of the web with civil society, journalism, public policy and art. Claire holds a master’s in international journalism from City University, London and has contributed to outlets including Monocle, NPR, Public Radio International and the BBC World Service. When not writing or travelling, she can be found searching for London's best brunch spot or playing with her cat, Diana Ross.
