Democracy is built on informed decisions made by majority rule. As a society, we can’t make informed decisions if the majority is confused by fake news: false information distributed and labeled as real news. Fake news has the potential to erode trust in democratic institutions, stir up social conflict, and facilitate voter suppression. This paper, by researchers from New York University and the University of Cambridge, examines the psychological drivers of sharing political misinformation and offers solutions to reduce its proliferation online.
The spread of misinformation, including “fake news,” disinformation, and conspiracy theories, represents a serious threat to society, as it has the potential to alter beliefs, behavior, and policy. Research is beginning to disentangle how and why misinformation is spread and identify processes that contribute to this social problem. This paper reviews the social and political psychology that underlies the dissemination of misinformation and highlights strategies that might be effective in mitigating this problem. However, the spread of misinformation is also a rapidly growing and evolving problem; thus, scholars also need to identify and test novel solutions, and simultaneously work with policy makers to evaluate and deploy these solutions. Hence, this paper provides a roadmap for future research to identify where scholars should invest their energy in order to have the greatest overall impact.
Make sure to read the full paper titled Political psychology in the digital (mis)information age by Jay J. Van Bavel, Elizabeth Harris, Philip Pärnamets, Steve Rathje, Kimberly C. Doell, Joshua A. Tucker at https://psyarxiv.com/u5yts/
It’s no surprise that misinformation spreads significantly faster than the truth. The illusory truth effect describes part of this phenomenon: misinformation people have heard before is more likely to be believed. We have all heard a juicy rumor in the office before learning whether it is even remotely true or made up altogether. Political misinformation takes the dissemination rate to the next level: it is shared at far greater rates because of its polarizing nature, driven by partisan beliefs and personal values. Even simple measures seemingly beneficial to all of society face an onslaught of misinformation. For example, California Proposition 15, designed to close corporate tax loopholes, was opposed by conservative groups that resorted to spreading misinformation about the reach of the law. They conflated corporations with individuals, making it a family affair to solicit an emotional response from the electorate. It is a prime example of a dangerous cycle in which political positions drive misinformation, which in turn deepens political division and obscures the facts needed to make informed decisions. Misinformation is shared more willingly, more quickly, and despite contradicting facts when it aligns with one’s political identity and seeks to derogate the opposition. In the example above, misinformation about Proposition 15 was shared largely when it (a) contained information in line with partisan beliefs and (b) sought to undercut the opponents of the measure. As the paper describes, the more polarized a topic (e.g. climate change, immigration, pandemic response, taxation of the rich, police brutality), the more likely misinformation will be shared by a political in-group as ammunition against its out-groups, without further review of its factual accuracy.
Some sharers are driven by a predisposed ‘need for chaos’ rooted in feeling marginalized, which is hard to mitigate because marginalization is a complex, societal problem that no one administration can resolve. Further, political misinformation tends to be novel and to trigger more extreme emotions such as fear and disgust. It also conflates being better off with being better than another political out-group.
Potential solutions to limit the spread of political misinformation can already be observed across social media:
- Third-party fact-checking is a secondary review by a dedicated, independent fact-checker committed to neutrality in reporting information. Fact-checking does reduce belief in misinformation, but it is less effective for political misinformation. Ideological commitments and exposure to partisan information foster a different reality that, in rare extreme cases, can create skepticism of fact-checks and lead to increased sharing of political misinformation, the so-called backfire effect.
- Investing in media literacy to ‘pre-bunk’ false information before it gains traction, including offering tips or prompting critical reflection on certain information, is likely to produce the best long-term results. It might, however, be difficult to implement effectively for political information, as media literacy depends on the provider, and bipartisan efforts are likely to be opposed by their respective extreme counterparts.
- Disincentivizing viral content by changing the monetization structure to a blend of views, ratings, and civic benefit would be a potent deterrent to creating and sharing political misinformation. However, this measure would likely conflict with the growth objectives of social media platforms in a shareholder-centric economy.
This paper is an important contribution to the current landscape of behavioral psychology. Future research will need to focus on developing a more comprehensive theory of why we believe and share political misinformation, and on how political psychology relates to the incentives to create it. It will be interesting to learn how the underlying psychology can be leveraged to alter the lifecycle of political information on different platforms, in different mediums, and through new channels.