Have you ever looked at a product online and then seen ads for it all over your social media feeds? Far from coincidental, these eerily accurate ads offer glimpses into the behind-the-scenes mechanisms fed by what you search for on Google, what you "like" on social media and what you encounter while browsing.
These mechanisms are increasingly being used for purposes more nefarious than aggressive advertising. The threat lies in how targeted advertising interacts with today's deeply divided political landscape. As a social media researcher, I see how people who seek to radicalize others use targeted advertising to readily move people toward extreme views.
Advertising to an audience of one
Advertising is clearly powerful. The right ad campaign can help shape or create demand for a new product, rehabilitate the image of an old product, or even of an entire company or brand. Political campaigns use similar strategies to push candidates and ideas, and nations have historically used them to wage propaganda wars.
Mass media advertising is powerful, but it has a built-in moderating force. When trying to move many people in one direction, mass media can move them only as fast as the broad middle of the audience will tolerate. Push too far or too fast, and advertisers risk alienating the people in the middle.
The detailed profiles that social media companies create for each of their users make advertising more powerful by enabling advertisers to personalize their messages to individuals. These profiles often include the size and value of your home, the year you bought your car, whether you’re expecting a child, and whether you buy a lot of beer.
Social media therefore has a far greater ability to expose people to ideas at whatever pace each individual will accept them. The same mechanisms that can recommend a niche consumer product to just the right person, or push an addictive substance at the moment someone is most vulnerable, can also deliver an extreme conspiracy theory just when a person is most willing to consider it.
It is increasingly common for friends and family to find themselves on opposite sides of highly polarized discussions about important issues. Many people recognize social media as part of the problem, but how exactly do these powerful personalized advertising techniques contribute to the contentious political landscape?
A trail of breadcrumbs to extremism
An important part of the answer is that people affiliated with foreign governments, without acknowledging who they are, take extreme positions in social media posts with the deliberate aim of sowing division and conflict. These extreme posts exploit social media algorithms, which are designed to maximize engagement, meaning they reward content that provokes a response.
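The engagement incentive described above can be illustrated with a toy ranking function. This is purely a hypothetical sketch; the field names and weights are illustrative assumptions, not any platform's actual code. The point is the structure: when replies and shares are weighted more heavily than passive likes, provocative content rises to the top.

```python
# Hypothetical sketch of an engagement-maximizing feed ranker.
# Weights and field names are illustrative assumptions only.

def engagement_score(post):
    """Score a post by the reactions it provokes: comments and shares
    (active responses) count far more than likes (passive approval)."""
    return (1.0 * post["likes"]
            + 3.0 * post["comments"]   # replies weighted heavily
            + 5.0 * post["shares"])    # provocation spreads furthest

def rank_feed(posts):
    """Order the feed so the most response-provoking posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm-news", "likes": 120, "comments": 4,  "shares": 2},
    {"id": "hot-take",  "likes": 40,  "comments": 60, "shares": 30},
]
print([p["id"] for p in rank_feed(posts)])
# The divisive post outranks the calmer, more-liked one: 370 points vs. 142.
```

Even though "calm-news" has three times the likes, the ranker favors "hot-take" because arguments in the comments and outraged shares are exactly what an engagement metric counts as success.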
Another important part of the answer is that people who seek to radicalize others lay trails of breadcrumbs leading to ever more extreme positions.
Radicalization pipelines on social media operate in much the same way whether they are recruiting jihadists or January 6 insurrectionists.
It may feel like you're "doing your own research," moving from one source to another, but you are really following a deliberate radicalization pipeline designed to move you toward ever more extreme content at whatever pace you will tolerate. After analyzing more than 72 million user comments on more than 330,000 videos posted on 349 YouTube channels, for example, researchers found that users consistently migrated from more moderate to more extreme content.
The result of these radicalization pipelines is apparent. Rather than most people holding moderate views with fewer people at the extremes, fewer and fewer people are left in the middle.
How to protect yourself
What can you do? First, I recommend a heavy dose of skepticism about social media recommendations. Most people have gone to social media looking for something in particular, then found themselves looking up from their phone an hour later with little sense of how or why they read or watched what they just did. These platforms are designed to be addictive.
I try to chart a more deliberate path to the information I want, and I actively avoid clicking on whatever is recommended to me. If I do read or watch what is suggested, I ask myself: "How might this information be in someone else's interest rather than my own?"
Second, consider supporting efforts that would require social media platforms to offer users a choice of algorithms for recommendations and feed curation, including algorithms based on easy-to-explain rules.
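An "easy-to-explain rule" might be as simple as: show only posts from accounts you follow, newest first, with no engagement weighting at all. The sketch below is a hypothetical illustration of such a rule; the data layout is an assumption for the example, not a proposal from any platform.

```python
# Hypothetical sketch of an easy-to-explain feed rule:
# "posts from accounts I follow, newest first." Field names are
# illustrative assumptions only.

def chronological_feed(posts, following):
    """Filter to followed accounts, then sort newest-first.
    No engagement signals are consulted, so nothing can game the ranking
    by provoking outrage."""
    visible = [p for p in posts if p["author"] in following]
    return sorted(visible, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"author": "alice",   "timestamp": 100, "text": "morning walk"},
    {"author": "mallory", "timestamp": 300, "text": "outrage bait"},
    {"author": "bob",     "timestamp": 200, "text": "lunch recipe"},
]
print([p["author"] for p in chronological_feed(posts, {"alice", "bob"})])
# The unfollowed provocateur never appears, however "engaging" the post is.
```

The rule is trivially auditable: a user can verify in one sentence why each post appeared and in what order, which is exactly what engagement-optimized black-box rankers do not offer.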
Third, and most important, I recommend spending more time interacting with friends and family off social media. If I find myself needing to forward a link to make a point, I treat that as a warning sign that I do not actually understand the issue well enough myself. If so, perhaps I have been following a path built to lead me toward extremist content rather than consuming material that actually helps me understand the world.