How People Get Sucked into Misinformation Rabbit Holes – and How to Get Them Out

Categories: conspiracy beliefs, radicalization, information manipulation

Authors: Emily Booth and Marian-Andrei Rizoiu, University of Technology Sydney

Published: February 24, 2024

As misinformation and radicalization rise, it’s tempting to look for something to blame: the internet, social media personalities, sensationalized political campaigns, religion, or conspiracy theories. And once we’ve settled on a cause, solutions usually follow: do more fact-checking, regulate advertising, and ban YouTubers deemed to have “gone too far.”

However, if these strategies were the whole answer, we should already be seeing fewer people drawn into fringe communities and beliefs, and less misinformation online. We're not.

In new research published in the Journal of Sociology, we, with our colleagues, found radicalization is a process of increasingly intense stages, and only a small number of people progress to the point where they commit violent acts.

Our work shows the misinformation radicalization process is a pathway driven by human emotions rather than the information itself – and this understanding may be a first step in finding solutions.

A feeling of control

We analyzed dozens of public statements from newspapers and online in which former radicalized people described their experiences. We identified different levels of intensity of involvement in misinformation and its online communities, each associated with common recurring behaviors.

In the early stages, we found people either encountered misinformation about an anxiety-inducing topic through algorithms or friends, or they went looking for an explanation for something that gave them a “bad feeling.”

Regardless, they often reported finding the same things: a new sense of certainty, a new community they could talk to, and feeling they had regained some control of their lives.

Once people reached the middle stages of our proposed radicalization pathway, we considered them invested in the new community, its goals, and its values.

Growing intensity

During these more intense stages, people began to report more negative impacts on their own lives. This could include the loss of friends and family, health issues caused by too much time spent on screens and too little sleep, and feelings of stress and paranoia. To soothe these pains, they returned to their fringe communities for support.

Most people in our dataset didn’t progress past these middle stages. However, their continued activity in these spaces kept the misinformation ecosystem alive.

[Photo: a man and woman lying in bed in the dark, facing away from each other and looking at their phones. Engagement with misinformation proceeds in stages. TimeImage / Shutterstock]

When people did move further and reach the extreme final stages in our model, they were doing active harm.

In their recounting of their experiences at these high levels of intensity, individuals spoke of choosing to break ties with loved ones, participating in public acts of disruption, and, in some cases, engaging in violence against other people in the name of their cause.

Once people reached this stage, it took strong interventions to get them out. The challenge is how to intervene safely and effectively when people are in the earlier stages of being drawn into a fringe community.

Respond with empathy, not shame

We have a few suggestions. For people still in the earlier stages, friends and trusted advisers, like doctors or nurses, can have a big impact by simply responding with empathy.

If a loved one starts voicing possible fringe views, like fear of vaccines or animosity against women or other marginalized groups, a calm response that seeks to understand the person’s underlying concern can go a long way.

The worst response is one that leaves them feeling ashamed or upset. It may drive them back to their fringe community and accelerate their radicalization.

Even if the person’s views intensify, maintaining your connection with them can turn you into a lifeline that will see them get out sooner rather than later.

Once people reached the middle stages, we found third-party online content – not produced by the government, but by regular users – could reach people without backfiring. Considering that many people in our research sample had their radicalization instigated by social media, we also suggest the private companies behind such platforms should be held responsible for the effects of their automated tools on society.

By the middle stages, arguments based on logic or fact are ineffective. It doesn’t matter whether a friend, a news anchor, or a platform-affiliated fact-checking tool delivers them.

At the most extreme final stages, we found that only heavy-handed interventions worked, such as family members forcibly hospitalizing their radicalized relative or individuals undergoing government-supported deradicalization programs.

How not to be radicalized

After all this, you might be wondering: how do you protect yourself from being radicalized?

As much of society becomes more dependent on digital technologies, we’re going to get exposed to even more misinformation, and our world will likely get smaller through online echo chambers.

One strategy is to foster your critical thinking skills by reading long-form texts from paper books.

Another is to protect yourself from the emotional manipulation of platform algorithms by limiting your social media use to small, infrequent, purposefully directed pockets of time.

And a third is to sustain connections with other humans and lead a more analog life – which has other benefits as well.

So, log off, read a book, and spend time with people you care about.

Emily Booth, Research assistant, University of Technology Sydney and Marian-Andrei Rizoiu, Associate Professor in Behavioral Data Science, University of Technology Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation

BibTeX citation:
@article{booth2024,
  author = {Booth, Emily and Rizoiu, Marian-Andrei},
  publisher = {InfoEpi Lab},
  title = {How {People} {Get} {Sucked} into {Misinformation} {Rabbit}
    {Holes} – and {How} to {Get} {Them} {Out}},
  journal = {InfoEpi Lab},
  date = {2024-02-24},
  url = {https://theconversation.com/how-people-get-sucked-into-misinformation-rabbit-holes-and-how-to-get-them-out-223717},
  langid = {en}
}
For attribution, please cite this work as:
Booth, Emily, and Marian-Andrei Rizoiu. 2024. “How People Get Sucked into Misinformation Rabbit Holes – and How to Get Them Out.” InfoEpi Lab, February. https://theconversation.com/how-people-get-sucked-into-misinformation-rabbit-holes-and-how-to-get-them-out-223717.