Fake News and Confirmation Bias

When it comes to believing "fake news," we are all at a disadvantage. As people, we suffer from many different cognitive biases (see chart).

The most prominent of these, when it comes to "fake news," is confirmation bias, defined as "the tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs" (Britannica). This is an innate bias that we are often unaware of, and even when we are aware of it, we need to keep reminding ourselves. When we interact with "fake news," we are much more likely to agree with information that matches what we want to be true, and less likely to verify information that confirms our beliefs. This is largely how "fake news" spreads as misinformation: we are not deliberately trying to share false information with our friends and family, but we may genuinely believe it is true because we want it to be.

Verywell Mind lists four examples:

1. Not seeking out objective facts

2. Interpreting information to support existing beliefs

3. Only remembering details that uphold your beliefs

4. Ignoring information that challenges your beliefs

You can read the full content of the graphic in the article "What Is the Confirmation Bias?"

 

Learn more about our brain's response to fake news. Though from several years ago, it is still relevant.

Creating Filter Bubbles - Personalized for You...

A concept closely tied to that of confirmation bias is the filter bubble. Filter bubbles are "intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption. Websites make these assumptions based on the information related to the user, such as former click behavior, browsing history, search history and location" (Techopedia). As we can tell from the definition, we are the creators of our own filter bubbles.

Algorithms on social media sites and search engines keep track of the content we choose to interact with and then begin to show us only what they have determined we want to see. When we pair this with our innate confirmation bias, we create a bubble of information that only confirms the beliefs we already hold. This can aid in the spread of "fake news" because we are more likely to interact with people who hold beliefs similar to ours; when those people share "fake news," it is likely to appeal to us as well, so we take up the story and share it. Similarly, when we share "fake news," it is likely to appeal to our friends, and they will share it in turn. The real issue is that these algorithms are so common we may not even notice that information from other viewpoints is missing.
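To see how past clicks can narrow what an algorithm shows us, here is a deliberately simplified sketch in Python. Real recommendation systems are far more complex; the `recommend` function, the topic labels, and the click history below are all hypothetical, invented only to illustrate the feedback loop described above.

```python
from collections import Counter

def recommend(history, candidates, k=3):
    """Rank candidate stories by how often their topic appears in the
    user's click history (a toy stand-in for a personalization algorithm)."""
    topic_counts = Counter(story["topic"] for story in history)
    # Stories whose topics the user has clicked before rank highest;
    # topics the user has never clicked score 0 and sink to the bottom,
    # so past behavior steadily narrows what is shown next.
    ranked = sorted(candidates,
                    key=lambda s: topic_counts[s["topic"]],
                    reverse=True)
    return ranked[:k]

# Hypothetical click history: mostly politics, some sports, never science.
history = [{"topic": "politics"}, {"topic": "politics"}, {"topic": "sports"}]
candidates = [
    {"id": 1, "topic": "science"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "sports"},
]
feed = recommend(history, candidates, k=2)
# The science story never makes the feed: the bubble has formed.
```

Because the user's own clicks feed back into the ranking, each round of browsing makes the next feed a little narrower, which is exactly the self-reinforcing loop that pairs so powerfully with confirmation bias.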

 

The TED Talk below, given by Eli Pariser in 2011, popularized the term filter bubble and explains how these bubbles work.

Some things have changed since Pariser's TED Talk (Google made changes in response), but filter bubbles still exist. A more recent review and discussion can be found at the site below.

Cognitive Biases Chart

The chart below is taken from The Decision Lab's Biases page, which lists 50 cognitive biases grouped into six categories: Memory, Social, Learning, Belief, Money, and Politics. Read and explore the details on the site.