Updated: Jul 15, 2020
What is misinformation and what does it do?
Misinformation takes various forms: fake news, "alternative facts", conspiracy theories, and the claims of growing partisanship.
These forms of information share a common basis: they are unreliable, non-factual, and frequently untruthful. They are also often constructed for a motivated cause and designed to play on negative emotions and mistrustful intuitions. This is why such information is called misinformation.
The issue of misinformation has emerged in recent years as a growing challenge to established democratic norms. Misinformation creates obstacles to the acceptance of compelling scientific evidence. It holds up public policy with pseudoscience and stubborn, irrational fears. It limits the effectiveness of public health measures. It undermines trust not just in some, but in all institutional information. In doing so, it creates partisan approaches to understanding issues, and even reality itself. In short, misinformation makes responding to the national and global challenges of our times increasingly difficult.

As Heidi J. Larson puts it, the current “deluge of conflicting information, misinformation and manipulated information on social media should be recognized as a global public-health threat”. Accordingly, the WHO now considers uncontrollable misinformation, or an “infodemic”, to be one of the major global threats, heightened by the current global pandemic.

On a personal level, it's easy enough to spot misinformation. We don't have to look far to find our friends and Facebook or Twitter followers sharing dubious stories. These stories are sometimes even deliberate disinformation, and are shared at rates that can seem threatening and even virus-like. But if we have ever tried to do something about these symptoms of the "infodemic", we know how hard they are to fight.
Why fighting goes wrong
Over the years, I've learned to reassess how I fight misinformation. This is the result of many frustrating encounters in which my attempts to be a warrior for truth and justice went awry. I know what it feels like to have long conversations about fake news stories that don't seem to go anywhere, or to fact-check someone's post and be met with angry reactions, or, more commonly, be totally ignored. I've had more failures than successes.
This cartoon sums up how many of us wish the world worked and the people around us behaved. If only we could give people information, we think, and have it simply be received. Perhaps they could pause so they aren't overfilled, but otherwise, why don't they just take the knowledge, dammit! That this assumption is the stuff of a humorous cartoon should remind us that neither the world nor the people in it behave like that. The biggest difficulty in fighting misinformation is fighting our own tendencies. In other areas of social life, we can readily spot the kind of conflict that unwanted gifts and busybodies create. But in giving information, we tend to assume that it is the duty of the receiver to accept it. It is difficult for us to escape the way we were conditioned by our schooling. High-achieving students may find information vital, but they are also often only half-aware of how influenced they are by the biases of their institutional and social circles.
We can fail to see that most people may not have liked their experience of school, and specifically that they did not accept the formal culture that validates the 'discipline' of accepting correction. Acting as technocrats of information, we forget that people do not behave as we expect. The imp of the perverse in people is always present. Most people have a stronger desire to have been right, that is, about something they have said or done in the past, than to be right in the present. The truth is, most people seek affirmation, not information. Numerous social psychology experiments and studies have confirmed this reality. That is why correcting the information people have shared, when they are not prepared for that correction, only confirms their fears.

So, considering all that, what can we do to help correct misinformation or address conspiracy theories in our friendship or social media circles? Based on my experience and research, successes and ongoing failures, these are the four simple steps to consider before addressing misinformation:
1. Pause to think before you act.
Pausing before you address misinformation is crucial. Not only do we want to be sure that the information is incorrect or, worse, fabricated, we want to understand why and how. We also want to understand the person who shared it: who they are and why they shared it. While many people share stories simply because they like the headline, they are often seeking affirmation of their own beliefs, fears or desires.
Rather than moralising about those beliefs, we should recognise that personal, social and structural sources of human behaviour may lie behind the person's decision to share the information.
We should analyse the sources that may be relevant to the particular person before making a personal judgement about them. Vitalsmarts provides a useful, short guide to six ways of thinking these through.
Once we know more about the person, we need to accept that we cannot simply fight misinformation without being prepared to learn more ourselves, and to teach only when teaching is sought. If we are to have any chance of helping people correct their own misinformation, we have to be encouraging. Approaching a conversation angrily, or even neutrally, is likely only to create conflict. The exceptions to this rule are people who have been influenced enough to form slow, considered opinions, to challenge their biases, and to seek information rather than affirmation. In those cases, these exchanges are easier.
2. Weigh the costs and benefits

Consider how much time we may have to invest in correcting the misinformation. Do a quick cost-benefit analysis. Are the person's views particularly influential, so that correcting them would have a wider impact? Decide whether to spend the time required to address the person who spread the misinformation, based on what we discovered in the analysis in step 1.
Are we going to be able, for example, to address the person's social isolation, or their lack of education in science and evidence, in any way that we can realistically hope to sustain?
In asking these questions, we are fighting our tendency to zoom in and correct any and all bits of misinformation. Remember, we want to be strategic and think of the war, not just the battle.
3. Take the right approach
So let's say we do want to spend the time to connect with someone and address their misinformation. We then need to adopt an approach focused on making the person we want to change feel comfortable, safe, and even inspired.
The person sharing the misinformation may have done so out of inattention or carelessness, but we should still expect that they will want to be affirmed for their behaviour and won't want to be caught out having been wrong. If we share corrective information immediately and visibly on their post or thread, their response is likely to be driven by fear and lack of trust, so we need to hold off and encourage them to feel safe and to trust us. Connecting with them and their emotions is vital if they are to trust us enough to ask us to share information with them. So rather than posting on their thread, we could reach out to them by private message. If we do want to correct information on their public post, we should consider our strategy: is the aim to change their mind, or the minds of the people who may be influenced by them? If it is the latter, a combination of public and personal messaging may be more effective.

Social psychology research has revealed a simple truth: people share less misinformation when they set themselves the goal of sharing the truth. The issue is getting them to that goal in the first place. Many emotions can cloud our intention to share the truth, among them emotions that prevent learning, such as anger, fear, and hatred. So we need to cultivate emotions that encourage and honour the truth. Scientists have identified interest, surprise, confusion and awe as the most helpful "knowledge emotions" for fostering learning, reflection and exploration. If we ourselves approach people with these emotions, we are much more likely to inspire the same habits of thinking in them.
We could express interest in what they have shared, confusion about what it means, surprise that something new is being shared, or marvel at the complexity of the issue. All these approaches are more likely to inspire learning responses. A useful method for structuring your conversations with anyone who shares misinformation, including science deniers, has been created by social scientist Gleb Tsipursky. Consider the process involved in EGRIP:
Emotions - reflect your understanding of another's feelings
Goals - set a goal with the person
Rapport - affirm them and connect
Information - share facts when asked
Positive Reinforcement - affirm their acceptance of facts that may be contrary to their beliefs
Information comes far down this list for a reason. All this suggests why social goal-setting and collaboration are so important in any attempt to encourage truth-seeking behaviour. A complementary way to encourage other people to work on changing their misinformation habits is to share with them a tool co-founded by Gleb Tsipursky: the Pro-Truth Pledge. This simple, evidence-based tool asks people to form the intention to share, honour and encourage the truth.
With hundreds of pledge-takers around the world, including institutions and companies, the Pro-Truth Pledge is one of the vital tools for building a new form of civil collaboration. It is precisely this kind of collaborative truth-seeking that, far more than any one social media conversation, will reduce misinformation and restore trust in truth-seeking behaviours.
4. Don't despair and get creative
If we decide against the approach in step 3, or we fail to change someone's mind, let's not despair. The effort we put into gathering useful information is not wasted.
A great service we can perform in helping correct misinformation is to provide clear and helpful information in whatever channels we can use.
Here are some related tips for this step based on the advice of experts who have studied misinformation in social media:
"[Y]ou can actually make a dent in correcting misperceptions on social media, not necessarily with the person who posted the misinformation on Facebook or Twitter but for all of the hundreds, maybe even thousands, of people in their network who are watching... [Y]ou can correct misperceptions at least somewhat by doing a few simple things...like, linking to an expert source, saying the facts as simply as possible without repeating the myth...And finally, there's power in multiple corrections."
That same power of "multiple corrections", however, can have a negative side effect: over time, it may unintentionally create a culture of fear. This cultural fear of being wrong, and of being punished for being wrong, is what prevents many people from admitting their mistakes. So what more can we do? We can also use social media to foster the habits that allow people to admit, bravely, that they are wrong. As truth-seekers, we forget that how we ourselves act provides an influential model for others. A perfect example is this post from Jose Louis Casal, which over 17,000 people shared and which became almost as viral as many stories of misinformation:
If you're reading this, try sharing it now and see what kind of reaction it receives. As my friend Tony Earthbeat suggested:
At the point when we can all think in this way, the world will be saved.