Conspiracy theories & cognitive dissonance

Conspiracy theories are on the rise. The coronavirus, 5G, vaccinations, Pizzagate, QAnon, election fraud, Wall Street, modern medicine… it’s a never-ending menu of topics for which alternative theories exist.

To me, these conspiracies seem pretty strange. None of the evidence makes any sense, none of the reasoning makes any sense, so how can people believe in them? Particularly when scientists write clear guides, reports, and explanations of why the conspiracy theories are wrong?

One explanation for these conspiracy theories comes from the theory of cognitive dissonance, from which it’s argued that “coping with the nuances of contradictory ideas or experiences is mentally stressful. It requires energy and effort to sit with those seemingly opposite things that all seem true. Festinger argued that some people would inevitably resolve dissonance by blindly believing whatever they wanted to believe.”

However, I don’t think that frame fits conspiracy theories very well. No anti-vaxxer wants the government to be injecting (poison, microchips, …) into their kids. These people are not believing what they want to believe. They’re believing what they don’t want to believe! If that makes sense.

The filter bubble

Conspiracy theorists often disregard the traditional channels of knowledge dissemination, i.e., the “mainstream” media and science writing. This can also be framed as a way to reduce cognitive dissonance: people don’t want to read about conflicting ideas, so they retreat into the famous “bubble” which feeds them things they already know. But I don’t think this is true either. No one wants to be wrong. I have often visited The Daily Express and Fox News, despite not being a right-leaning person myself. I often disagreed with the reporting and opinion writing there, which I thought was obviously misleading. But it didn’t stop me from looking at these websites. And how would you even manage to entirely remove any disagreeing thought from your life, with endless Twitter feeds that include everyone’s responses, with people’s posts on Facebook, or just talk in the coffee corner over lunch? You can’t fully seal a real-life bubble!

I think that the disregard for traditional communication channels is much more easily seen through the lens of credibility, and of an evolving concept of knowledge. Which is why I call this blog post Confirmation Bayes, after the theory that describes how knowledge is updated – in people, and in science. Bayesian inference is a theory of information and knowledge; of how to think about different probabilities. It’s the mathematical version of cognitive dissonance.
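
For the mathematically inclined: Bayes’ rule is the formula behind this kind of updating. Your belief in an idea after seeing a piece of evidence (the posterior) is your belief beforehand (the prior), re-weighted by how well the evidence fits the idea:

P(idea | evidence) = P(evidence | idea) × P(idea) / P(evidence)

In words: strong prior beliefs need strong evidence to shift, and weak evidence mostly leaves them where they were. That single line is all the math I’ll lean on below.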

Confirmation bias in science

Science is a process of approaching the truth in incremental steps. Physical constants aren’t established by the first experiment, but require many experiments slowly creeping towards the most likely correct values. Famously, if you plot the obtained values for the charge of the electron over time, you see that researchers started out at the wrong value and slowly crept towards the correct one. The reason for this slow creeping was that when researchers found a value that was too far away from the previously published values, they would re-check their equipment and set-up, and keep doing so until the discrepancies weren’t so large anymore. Once their value approached the known values well enough, they would finally become a lot less critical about their experimental set-up; they would assume it probably worked as it should.

One might say (after Richard Feynman) that it is a shameful thing that scientists worked like this. I don’t think this is so bad. (First of all, the process asymptotically moved towards the currently accepted value, so it self-corrected.) Knowledge isn’t about what you want to believe. It’s about what you think is credible. And if your experiment leads to a credible result, you probably did things right.

If that’s a bit vague, let’s consider the opposite case. Suppose you’re a physicist, and you’re approached by someone who has devised an experiment that proves that the speed of light is double the value we know today. Would you ignore this information because it stresses you out (cognitive dissonance)? Would you end the conversation as quickly as you can and go read another explanation of why the speed of light is what it is (confirmation bias)? Or would you simply laugh and say that it’s wrong (and perhaps help criticize all the things that are wrong with it)? What I’m trying to say is that you can be presented with information and not believe it, not because of any psychological factor, but because it is not a credible result given what you know to be true! I think that last option is the most likely explanation.

Carl Sagan said “extraordinary claims require extraordinary evidence”, and that’s a nice way to think about Bayes’ rule in this setting. Scientific knowledge is updated all the time, but well-established knowledge is only overturned when there is overwhelming evidence to do so. Otherwise, it is healthy to stay with what we already know.
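
To make that concrete, here is a minimal sketch of Bayes’ rule applied to an extraordinary claim. The numbers (the one-in-a-million prior, the Bayes factor of 20 per experiment) are invented purely for illustration, not taken from any real measurement:

```python
# A minimal sketch of Bayes' rule for a binary claim; all numbers invented.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(claim | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Hypothetical claim: "the speed of light is double the accepted value".
# Given everything we already know, the prior for this is tiny.
p = 1e-6

# One surprising experiment, 20x more likely under the claim than under
# accepted physics (a rather generous Bayes factor of 20).
p = posterior(p, p_evidence_if_true=0.8, p_evidence_if_false=0.04)
print(f"after one surprising experiment:   {p:.6f}")  # ~0.000020

# Even a handful of such results barely dents a prior this small.
for _ in range(3):
    p = posterior(p, p_evidence_if_true=0.8, p_evidence_if_false=0.04)
print(f"after four surprising experiments: {p:.2f}")  # ~0.14, still well below 0.5
```

Four experiments in a row that each favor the claim twenty-fold, and the claim is still more likely false than true. That is Sagan’s point in arithmetic form: the more extraordinary the claim, the more evidence it takes to overturn what we already know.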

Harry Potter and different meanings of information

The example of updating knowledge in science is no different for normal people. When Harry Potter was presented with a train ticket for Platform 9 3/4, he had no reason to believe that there was such a thing. But then he met other wizards who claimed that there really was such a thing. Hmm, maybe there was something to it. And then he saw evidence that people indeed did disappear between platforms 9 and 10. It might really be the truth then? And when he went for the platform himself, he knew it to be true.

Humans all learn things this way. We have beliefs we already hold, formed somewhere between our conception and the present moment. And then we’re presented with pieces of information, and we decide to update those beliefs (or not). I think that, this way, a person who already leans towards distrusting the government and “big pharma” can easily be swayed by someone’s story that there is a microchip in a vaccine, as it is close to their already-held beliefs. Conversely, when I read that, I just laugh about it, as it is so far away from what I believe to be true.

That way, presenting people with the exact same information still leads to different outcomes… because of what we already hold to be true.
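
As a toy illustration (the priors and the strength of the “evidence” are completely made up, and real people are of course not literal Bayesian calculators), here is what that looks like:

```python
# Two readers see the exact same story ("there is a microchip in the vaccine")
# with the same, modest evidential strength. All numbers are invented.

def update(prior, bayes_factor):
    """One Bayesian update in odds form: posterior odds = prior odds * Bayes factor."""
    posterior_odds = (prior / (1 - prior)) * bayes_factor
    return posterior_odds / (1 + posterior_odds)

skeptic = 0.001  # finds the story absurd to begin with
doubter = 0.30   # already leans towards distrusting government and "big pharma"

print(f"skeptic: {update(skeptic, bayes_factor=3):.3f}")  # ~0.003, still laughs it off
print(f"doubter: {update(doubter, bayes_factor=3):.2f}")  # ~0.56, now leans towards believing it
```

Same story, same (weak) evidence, and one reader keeps laughing while the other is nudged past the fifty percent mark, purely because of where they started.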

Moving to and fro

I think that the above outlines my thoughts about the uselessness of writing up the scientific facts about 5G to dissuade conspiracy theorists from believing all kinds of strange stories about it. They have simply sunk into a different valley of knowledge, where the scientific facts aren’t credible anymore! Just like knowledge is attained in small steps, I think that moving away from wrong ideas also only happens in small steps. You don’t start with the scientific facts and the mainstream media; you have to find something closer to their beliefs.

I just don’t know how you can do this ethically. If it requires leaning into the conspiracy theory a little bit, you’re still not doing society a service, because you’ll sway people away from the truth. I think that satire and other forms of humor work best in this case. Don’t let the scientist try to sway the conspiracy theorist, but let other people stand in between. In the way that spiritual people are fighting back against the conspiracy theorists amongst them (hence the funny neologism “conspirituality”). I don’t believe in the spiritual things myself, but if they sort out the problems with the conspiracy theorists amongst them, then they’re doing an excellent job! :-)

In short: if two people are presented with an identical piece of information, but have different prior ideas of what is true, one of them may end up sliding towards a more realistic representation of the truth while the other slides away from it! In that sense, I feel sad for the conspiracy theorists, because they were probably just presented with the wrong piece of evidence at the wrong time, leading them off a straight path.
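
How can identical information push two people in opposite directions? In Bayesian terms it happens when they don’t just hold different priors, but also trust the source differently. A sketch with invented numbers: both readers see the same debunking article, but one thinks that is what honest journalism would produce, while the other thinks it is exactly what a cover-up would publish:

```python
# The same debunking article, read by two people who model the source
# differently, pushes their beliefs in opposite directions. Numbers invented.

def posterior(prior, p_article_if_true, p_article_if_false):
    """Belief that the conspiracy is true, after reading the article."""
    numerator = p_article_if_true * prior
    return numerator / (numerator + p_article_if_false * (1 - prior))

prior = 0.5  # start both readers undecided, to isolate the effect of trust

# Reader A trusts the outlet: a debunking article is far more likely to be
# written if the conspiracy is in fact false.
a = posterior(prior, p_article_if_true=0.2, p_article_if_false=0.8)

# Reader B thinks "that's exactly what a cover-up would publish": to them the
# same article is more likely if the conspiracy is true.
b = posterior(prior, p_article_if_true=0.7, p_article_if_false=0.3)

print(f"reader A: {a:.1f}")  # 0.2 -> slides towards the truth
print(f"reader B: {b:.1f}")  # 0.7 -> slides away from it
```

The article itself is identical; the difference is entirely in how credible each reader considers the source.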

Also, my thoughts are just meandering; they haven’t yet found out how to update themselves into a clearer picture.