Fake News and Confirmation Bias

When it comes to believing "fake news," we are all at a disadvantage. As humans, we are subject to many different cognitive biases (see chart). The most prominent of these, when it comes to "fake news," is confirmation bias, defined as "the tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs" (Britannica). This bias is innate: we are often unaware of it, and even when we are aware, we need to keep reminding ourselves of it. When we interact with "fake news," we are much more likely to agree with information that matches what we want to be true, and less likely to verify information that confirms our existing beliefs. This is largely how "fake news" spreads as misinformation: we are not deliberately trying to share false information with our friends and family, but we may genuinely believe it is true because we want it to be.

Below is a graphic showing ways confirmation bias can present itself as we interact with information. You can read the full content of the graphic in the article "What Is the Confirmation Bias?" from Verywell Mind.

[Infographic: four examples of confirmation bias]


Watch the video below to learn more about the science of why our brains are drawn to "fake news."

Creating Filter Bubbles

A concept closely tied to confirmation bias is the filter bubble. Filter bubbles are "intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption. Websites make these assumptions based on the information related to the user, such as former click behavior, browsing history, search history and location" (Techopedia). As the definition makes clear, we are the creators of our own filter bubbles. Algorithms on social media sites and search engines keep track of the content we choose to interact with, and then begin to show us only what the algorithm has determined we want to see. Paired with our innate confirmation bias, this builds a bubble of information that only confirms the beliefs we already hold. Filter bubbles can aid the spread of "fake news" because we are more likely to interact with people who hold beliefs similar to our own: when those people share "fake news," it is likely to appeal to us as well, so we take up the story and share it. Similarly, when we share "fake news," it is likely to appeal to our friends, and they will share it in turn. The real issue is that these algorithms are so pervasive that we may not even notice information from other viewpoints is missing.
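The feedback loop described above can be sketched in a few lines of Python. This is a toy model, not any real platform's ranking system: the scoring rule (count of past clicks on a topic) and all of the data are invented for illustration.

```python
# Toy model of an engagement-driven feed filter. All names, topics, and
# numbers below are hypothetical; real recommendation systems are far
# more complex, but the feedback loop is the same in spirit.

def score(post, click_history):
    """Predict interest in a post from how often the user
    clicked that topic before."""
    return click_history.get(post["topic"], 0)

def build_feed(posts, click_history, size=3):
    """Rank posts by predicted interest and keep only the top few,
    so topics the user never clicks gradually disappear."""
    ranked = sorted(posts, key=lambda p: score(p, click_history),
                    reverse=True)
    return ranked[:size]

# A hypothetical user who mostly clicks on one viewpoint.
click_history = {"politics-left": 9, "sports": 2}
posts = [
    {"id": 1, "topic": "politics-left"},
    {"id": 2, "topic": "politics-right"},
    {"id": 3, "topic": "sports"},
    {"id": 4, "topic": "politics-left"},
]

feed = build_feed(posts, click_history, size=3)
# The post from the opposing viewpoint (id 2) never reaches the feed,
# and each click on the remaining posts reinforces the pattern.
```

Each time the user clicks within this narrowed feed, `click_history` skews further, which narrows the next feed in turn; nothing in the loop ever surfaces the filtered-out viewpoint again.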


The TED Talk below from Eli Pariser, who popularized the term "filter bubble," explains how filter bubbles work.