Learn To Discern: How To Take Ownership Of Your News Diet

I am tired of keeping up with the news these days. The sheer volume of information is intimidating: first you have to filter relevant news from political noise, and only then can you begin analyzing the information for its integrity and accuracy. I certainly struggle to identify subtle misinformation when faced with it. That’s why I became interested in the psychological triggers woven into the news, to better understand my own decision-making and conclusions. Pennycook and Rand wrote an excellent research paper on the human psychology of fake news.

tl;dr

We synthesize a burgeoning literature investigating why people believe and share false or highly misleading news online. Contrary to a common narrative whereby politics drives susceptibility to fake news, people are ‘better’ at discerning truth from falsehood (despite greater overall belief) when evaluating politically concordant news. Instead, poor truth discernment is associated with lack of careful reasoning and relevant knowledge, and the use of heuristics such as familiarity. Furthermore, there is a substantial disconnect between what people believe and what they share on social media. This dissociation is largely driven by inattention, more so than by purposeful sharing of misinformation. Thus, interventions can successfully nudge social media users to focus more on accuracy. Crowdsourced veracity ratings can also be leveraged to improve social media ranking algorithms.


Make sure to read the full paper titled The Psychology of Fake News by Gordon Pennycook and David G. Rand at https://www.sciencedirect.com/science/article/pii/S1364661321000516

This recent research paper by psychologists at the University of Regina and the Sloan School of Management at the Massachusetts Institute of Technology takes a closer look at the sources of political polarization, hyperpartisan news, and the underlying psychology that shapes how we judge whether news is accurate or misinformation. It answers the question of why people fall for misinformation on social media. Lessons drawn from this research can help build effective tools to intercept and mitigate misinformation online, and can further advance our understanding of human psychology when interacting with information on social media. And while the topic could fill entire libraries, the authors limited their scope to individual examples of misinformation rather than organized, coordinated campaigns of inauthentic behavior, excluding the spread of disinformation.

So, Why Do People Fall For Fake News?

There are two fundamental concepts that explain the psychological dynamics at play when we face misinformation. Truth discernment is the extent to which a person believes accurate news more than known-to-be-false information about the same event. This concept is rooted in active recognition and critical analysis of the information, and it captures how well people separate truth from falsehood. The second concept is truth acceptance. Here the accuracy of the news is not a factor; what matters is overall belief. Instead of critically analyzing the information, people average or combine all available information, true or false, into an opinion about the veracity of the news. This commonly results in a biased perception of the news.

Other concepts related to this question look at motives. Political motivations can influence people’s willingness to reason along the lines of their partisan, political identity: news consistent with their political beliefs is regarded as true, while news inconsistent with those beliefs is regarded as false. Loyalty to a political ideology can become so strong that it overrides an apparent falsehood for the sake of party loyalty. Interestingly, the researchers found that political partisanship carries much less weight than the actual veracity of the news: people rate misinformation that harmonizes with their political beliefs as less trustworthy than accurate news that contradicts them. They also discovered that people tend to be better at analyzing information that aligns with their political beliefs, which helps them discern truth from falsehood. But if people rarely fall for misinformation merely because it fits their politics, which characteristics do make people fall for misinformation?
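Before turning to that question, it may help to make the two measures concrete. Here is a minimal sketch (my own illustration, not code or data from the paper), assuming each reader rates headlines from 0 (“definitely false”) to 1 (“definitely true”):

```python
def truth_discernment(ratings, is_true):
    """Discernment: mean belief in true news minus mean belief in false news."""
    true_scores = [r for r, t in zip(ratings, is_true) if t]
    false_scores = [r for r, t in zip(ratings, is_true) if not t]
    return (sum(true_scores) / len(true_scores)
            - sum(false_scores) / len(false_scores))

def overall_belief(ratings):
    """'Truth acceptance': average belief across all items, ignoring veracity."""
    return sum(ratings) / len(ratings)

# Two hypothetical readers rating the same six headlines (first three are true):
is_true    = [True, True, True, False, False, False]
reflective = [0.8, 0.7, 0.9, 0.2, 0.1, 0.3]   # believes true news more than false
credulous  = [0.8, 0.7, 0.9, 0.7, 0.8, 0.9]   # believes nearly everything

print(truth_discernment(reflective, is_true))  # ~0.6 -> good discernment
print(truth_discernment(credulous, is_true))   # ~0.0 -> poor discernment
print(overall_belief(credulous))               # ~0.8 -> high overall acceptance
```

Note how the credulous reader scores high on acceptance but near zero on discernment: averaging everything together, true or false, is exactly the biased pattern described above.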

“People who are more reflective are less likely to believe false news content – and are better at discerning between truth and falsehood – regardless of whether the news is consistent or inconsistent with their partisanship”

Well, this brings us back to truth discernment. Belief in misinformation is commonly associated with overconfidence, lack of reflection, zealotry, delusionality, and overclaiming, where individuals present themselves as experts while acting on completely fabricated information. All of these factors indicate a lack of analytical thinking. On the opposite side of the spectrum, people determine the veracity of information through cognitive reflection and by tapping into relevant existing knowledge: general political knowledge, a basic understanding of established scientific theories, or simple online media literacy.

“Thus, when it comes to the role of reasoning, it seems that people fail to discern truth from falsehood because they do not stop to reflect sufficiently on their prior knowledge (or have insufficient or inaccurate prior knowledge) – and not because their reasoning abilities are hijacked by political motivations.” 

The researchers found that the truth has little impact on sharing intentions. They describe three types of information-sharing on social media:

  • Confusion-based sharing: the person genuinely believes the information shared is true (even though they are mistaken)
  • Preference-based sharing: political ideology, or related motives such as virtue signaling, outweighs the truth of the information shared, accepting misinformation as collateral damage
  • Inattention-based sharing: people intend to share only accurate information but are distracted by the social media environment (the sketch after this list targets exactly this case)
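Because inattention-based sharing dominates, the intervention the paper reports on is strikingly simple: nudge users to think about accuracy at the moment of sharing. Here is a minimal sketch of such a nudge (my own illustration; the paper reports that accuracy prompts work, not this implementation, and the callbacks are hypothetical stand-ins for a real UI):

```python
def share_with_accuracy_prompt(post, publish, ask):
    """Insert one reflective step between the share click and publication."""
    answer = ask("To the best of your knowledge, is this headline accurate? "
                 f"'{post['headline']}' (yes/no/unsure)")
    if answer == "no":
        print("You rated this headline inaccurate. Consider skipping the share.")
    publish(post)  # the nudge induces reflection; it does not block sharing

# Hypothetical usage with stand-in callbacks:
post = {"headline": "Scientists confirm chocolate cures the common cold"}
share_with_accuracy_prompt(
    post,
    publish=lambda p: print("shared:", p["headline"]),
    ask=lambda question: "unsure",  # stand-in for a real prompt dialog
)
```

The design point is that the prompt merely redirects attention to accuracy; it does not censor or decide for the user.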

Steps To Own What You Know

If prior knowledge is a critical factor in identifying misinformation, then familiarity with accurate information goes a long way. Knowing the familiar version is essential for judging whether the information presented is the information you already know or a subtly manipulated variant. Be familiar with social media products: what does virality look like on a given platform? Is the uploader a verified account? What is the source of the news? In general, sources are a critical signal of veracity; the more credible and established a source, the likelier the information is well researched and accurate. Finally, emotional headlines, provocative captions, and shocking images are red flags for misinformation.
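For illustration, that checklist can be turned into a rough scoring heuristic. This is a minimal sketch of my own; the marker words, allow-list domains, and equal weights are invented for demonstration, not taken from the paper:

```python
EMOTIVE_MARKERS = ("shocking", "outrageous", "destroys", "you won't believe")
ESTABLISHED_SOURCES = {"apnews.com", "reuters.com"}  # placeholder allow-list

def red_flag_score(headline: str, source_domain: str, uploader_verified: bool) -> int:
    """Count simple warning signs; a higher score means 'slow down and verify'."""
    score = 0
    if any(marker in headline.lower() for marker in EMOTIVE_MARKERS):
        score += 1  # emotional or provocative language
    if source_domain not in ESTABLISHED_SOURCES:
        score += 1  # not an established, recognized outlet
    if not uploader_verified:
        score += 1  # unverified uploader
    if headline.isupper() or "!!" in headline:
        score += 1  # shouting-style formatting
    return score

print(red_flag_score("You won't believe what this senator did!!",
                     "hyped-news.example", uploader_verified=False))  # 4
```

A high score proves nothing on its own; it is a cue to stop and apply the prior-knowledge check described above rather than a verdict.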

Challenges In Identifying Misinformation

Truth is not a binary metric. To determine the veracity of news, a piece of information must be compared against established, known information, and it may be only partially falsified or merely laced with inaccuracies. Therefore the accuracy and precision, or overall quality, of a machine learning classifier for misinformation hinges on both the clarity of the provided training data and the depth of the classifier’s exposure to the platform where it will be deployed.

Another challenge is the almost ever-changing landscape of misinformation. Misinformation evolves rapidly, coalescing into conspiracy theories, and may be (mistakenly) amplified by established influencers and institutions. This makes it harder to isolate the elements of a news story, which undermines the chances of determining its accuracy. Inoculation (deliberate exposure to misinformation to improve recognition abilities) is partly ineffective because people fail to stop, reflect, and consider the accuracy of the information at all. Successful interventions to minimize misinformation may therefore start with efforts to slow down interactions on social media, for example by changing the user interface to introduce friction and prompts that induce active reflection.

Lastly, human fact-checking does not scale, for many reasons: time, accuracy, integrity, and more. Leveraging a community-based (crowdsourced) fact-checking model might be an alternative until a more automated solution is ready. Twitter has recently begun experimenting with this type of crowdsourced product; its platform is called Birdwatch. The sketch below shows one way such crowdsourced ratings could feed a ranking algorithm.
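As one illustration of the paper’s closing recommendation, here is a minimal sketch (my own invention, not Birdwatch’s actual algorithm) of how crowdsourced veracity ratings could down-rank dubious posts in a feed. The smoothing prior and the engagement-times-veracity scoring are assumptions chosen for demonstration:

```python
def crowd_veracity(ratings, prior=0.5, prior_weight=5):
    """Smoothed mean of 0-1 crowd ratings; the prior keeps sparsely rated posts neutral."""
    return (prior * prior_weight + sum(ratings)) / (prior_weight + len(ratings))

def rank_feed(posts):
    """Scale raw engagement by crowd veracity so dubious posts sink in the feed."""
    return sorted(posts,
                  key=lambda p: p["engagement"] * crowd_veracity(p["ratings"]),
                  reverse=True)

feed = [
    {"id": "viral-rumor", "engagement": 900,
     "ratings": [0.1, 0.1, 0.0, 0.2, 0.1, 0.0, 0.1, 0.0]},
    {"id": "solid-report", "engagement": 400,
     "ratings": [0.9, 0.8, 1.0]},
]
print([p["id"] for p in rank_feed(feed)])  # ['solid-report', 'viral-rumor']
```

The prior matters: a brand-new post with no ratings keeps a neutral score instead of being punished or boosted, so the crowd signal only kicks in once enough ratings accumulate.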

This research paper didn’t unearth breakthrough findings or new material, but it helped me learn more about the dynamics of human psychology when exposed to a set of information. Looking at the individual concepts people use to determine the accuracy of information, the underlying motives that drive our attention, and the dynamics of when we decide to share news made this paper a worthwhile read. Its concluding remarks on improving the technical environment, leveraging technology to facilitate a more reflective, conscious experience of news on social media, leave me optimistic about better products to come.
