
The Thinking Behind Misinformation

January 11, 2022

While misinformation has been on the public mind for the last few years, libraries have dealt with this problem for a long time. What many librarians may not be aware of are some of the psychological factors at play when people choose to believe misinformation. Here is a summary of a few of those ideas.

Continued Influence Effect: This idea was new to me, but once I heard it the concept rang true. The Continued Influence Effect states that once a person believes misinformation, that misinformation can stay in their mind and continue to influence their future thinking, even after they have been presented with incontrovertible evidence disproving it. The person may even acknowledge that the misinformation wasn’t true, yet somehow the idea maintains a hold in their mind. There doesn’t seem to be a broad consensus on why this happens. Perhaps people cling to information they want to believe.

The Illusory Truth Effect: This theory posits that people are more likely to believe information that is simple and familiar over information that is complex and novel. Phrases that are repeated often can make their way into a person’s mind through sheer familiarity. Claims like “people only use 10% of their brain” are a good example: there is no proof of this claim, yet many of us believe it from having heard it so often. The complexity of an idea also matters to the Illusory Truth Effect. According to this theory, people are more apt to believe simple ideas than complex, hard-to-understand ones. That two plus two equals four is simple and intuitive, but the theory of relativity is complex and hard to grasp, so the mind wants to reject it. As another example, some people entertain the idea that “5G causes COVID” without any rational evidence for it. Yet, if you consider that idea in light of the Illusory Truth Effect, you can see how people might be drawn to its simplicity. The claim, however irrational, is simpler than concepts of spike proteins, viral load, airborne transmission, and so on.

Anger and Misinformation: Information is not emotionally neutral, and information that triggers specific emotions will cause different reactions. Information that triggers anger is especially likely to provoke strong reactions. With our emotions aroused, we are more likely to be swayed by illogical arguments. Misinformation can exploit this natural reaction to spread. Facebook, for example, has used algorithms to promote anger-inducing content, which then generates more clicks and ad revenue.

Confirmation Bias: Most people have heard of this one before, and the role it can play in misinformation is clear. People are more susceptible to misinformation that appeals to their preconceived notions about the world.

If you find these ideas interesting, I highly recommend continuing to read about them. Much of the knowledge shared above comes from the excellent Debunking Handbook 2020. It might give you some ideas to integrate into your information literacy teaching.

2 Comments
  1. January 11, 2022 4:27 pm

    For those interested in teaching fact checking and digital media literacy – as well as understanding what motivates people to believe in conspiracy theories – Wineburg & McGrew’s work with the Stanford History Education Group (https://sheg.stanford.edu/) and Mike Caulfield’s work – especially SIFT (https://hapgood.us/2019/06/19/sift-the-four-moves/) – have been a touchstone for my teaching since 2017.

Trackbacks

  1. Still Troubling, Still Useful: Thoughts on Twitter | CRD of PaLA
