Why people believe in UFOs and lab leaks – Quartz
For ufologists, the highly anticipated US government report on “unidentified aerial phenomena” may prove a major disappointment. It goes further than any previous report in admitting unknowns, but conspiracy theorists will likely dismiss it as a cover-up.
But they’re not the only ones who tend to reject anything that contradicts their accepted narrative.
Take the “lab leak theory”. In January, for example, the Washington Post called the idea that COVID-19 was man-made a “debunked fringe theory.” It also called the theory that the virus escaped from the Wuhan Institute of Virology a “contested fringe theory”.
In February, Facebook banned claims that the virus was fabricated in a lab, deeming them false and debunked. It has now reversed that decision, with US President Joe Biden ordering his intelligence agencies to “bring us closer to a final conclusion” by the end of August.
The problem has been complicated by hyper-partisan media conflating the Facebook ban with censorship of the lab leak theory. But many have also been too quick to dismiss the lab leak theory, conflating it with other conspiracy theories.
We are all inclined to accept a narrative and stick to it, regardless of the evidence. This problem is not just “out there”. Behavioral research offers lessons we should all keep in mind.
See what we want to see
Even though we pride ourselves on having an independent mind, we can still fall prey to cognitive biases.
This is in part due to overconfidence in our own decision-making abilities.
This is not just the result of the phenomenon known as the Dunning-Kruger effect, in which we tend to overestimate our skills in areas where we are incompetent. Highly intelligent people can also come to believe highly irrational ideas, as shown by the list of Nobel Prize-winning scientists who have embraced scientifically dubious beliefs.
Part of it also has to do with believing what we want to be true. We arrive at most of our opinions through nothing better than snap judgments or gut instinct. Our internal “press officer” – a mental module that convinces us of our own infallibility – then justifies those opinions after the fact.
Behavioral scientists call this motivated reasoning — when your personal preferences cloud your assessment of reality.
As Malcolm Gladwell writes in his book Blink: The Power of Thinking Without Thinking (Little, Brown, 2005): “Our snap decisions are far less rational than we think.”
How long is a string? You tell me
One cognitive bias particularly magnified by social media is good old-fashioned conformism.
The power of conformist thinking was graphically demonstrated by psychologist Solomon Asch in his classic 1956 study, which showed that we can ignore even the evidence of our own eyes when it contradicts the majority opinion.
Asch brought groups of participants together and asked them which of three numbered lines was the same length as a target line. The answer should have been easy. But in each of Asch’s groups, only one person was a real participant. The other six were “stooges”, instructed to sometimes give the same obviously wrong answer before the real subject responded.
The result: about a third of the time, subjects went along with the majority opinion, even though it was clearly wrong. The painful lesson: we are social creatures, shaped by the group, willing even to sacrifice the truth just to fit in.
Locked in the echo chamber
Facebook, Twitter, and other social media sites can reinforce all of the above instincts by creating “echo chambers” that validate what we have chosen to believe.
Exposure to different ideas does not mesh well with the online media economy, in which platforms and the content creators on them compete for limited attention by appealing to our preferences and biases.
And we like our echo chambers.
According to psychologist Jonathan Haidt, we seem to be born with a “complacency gene” – an inherent need to be right. We are more inclined to defend our opinions by criticizing others. We find comfort in validation.
Once we have made our opinion known to others, we become stubbornly reluctant to change course. Appearing consistent can become more important than being right, which is why we go to great lengths to defend opinions that come under scrutiny.
These weaknesses could be endearing if they did not have such serious implications. Believing in disinformation is an undeniable problem.
But we are going to need a better way of dealing with conspiracy theories than simply banning them. Seeking to enforce a single accepted narrative is not the solution. If Facebook or the mainstream media act as arbiters of who gets to speak and who doesn’t, we will be pushed further into our own filter bubbles, and conspiracy theorists into theirs.
This article is republished from The Conversation under a Creative Commons license. Read the original article.