A neuroscientist explains the problem of ignorance and how we can fight it
David Reinert holds up a large "Q" sign while waiting in line to see Donald Trump at his rally on August 2, 2018 (Photo by Rick Loomis/Getty Images)

The great paradox of modern times is that we have access to more information than ever, but ignorance seems to be growing.

People in the United States and around the world seem to believe more bogus theories now than they did 10 years ago. Comment sections on social media suggest that many people are just as gullible as ever and, in some ways, even more likely to believe outlandish things. This ignorance has consequences of global importance, because widespread ignorance leads to ignorant people being elected to positions of power. I don’t think I need to give an example here, because you’re probably already thinking of one.

Ignorance spreads like a virus if we don’t actively combat it. But we can’t attack the problem if we don’t fully understand it. Therefore, let’s learn about what ignorance is from a scientific and philosophical perspective, then plot a course for inoculating against it.

First, we should understand that we’re all ignorant — to some degree. You could say that ignorance is a fact of life. To understand why, we have to understand the nature of life. For an organism to exist in the world, it has to accomplish certain survival goals. For example, it must be able to find food and avoid threats in a chaotic and often unpredictable world. These tasks require that the organism have a map or model of its environment.

Because humans live in a complex physical and social world, we have very sophisticated mental models of the world. But as incredible as those maps of the world are, they are still abstract, simplified representations of a much more complicated reality. And they really have to be — a map that is as complicated as the thing it is mapping wouldn’t be very useful because it would contain more information than we could process. Scientists and artificial intelligence researchers are very aware of this point. They often remark that “the map is not the territory,” and there is a common saying that “all models are wrong, but some are useful.”

This idea has been summarized as the “Principle of Incomplete Knowledge,” and it says that because our mental model of the world always contains some uncertainty or error, we all have a certain amount of ignorance.

In this context, ignorance is the difference between our model of reality and reality as it actually is. To live optimally (that is, to make the best decisions and increase our chances of success), we should always be trying to reduce the error in our model of the world. We do this by “updating our model” when evidence tells us that reality is different from what we thought it was. According to an influential new neuroscience theory called the “Bayesian Brain Hypothesis,” our ability to update the model and reduce our ignorance is central to intelligence.

Your model of the world consists of all your beliefs about reality. Minimizing your model’s ignorance means changing your beliefs when evidence and logic suggest they are inaccurate.

Let’s consider an idealized example. Imagine someone who believes the Earth is flat blasting off into space in a rocket. They will see with their own eyes that the Earth is round. If they come back down still believing the Earth is flat, they have failed to update their model in light of new evidence. This is an extreme example, but most if not all of us hold beliefs that are similarly, if less dramatically, inaccurate. In some cases, we keep holding these beliefs even when the evidence contradicts them.

It is far from easy to determine which of your beliefs are in line with the evidence reality offers. If you believe something, it is usually because you have found some argument for it convincing (though not always, since we may also believe unconvincing things that we find comforting).

This is why it is important to test our beliefs. For example, let’s say you’re into New Age medicine and you’ve been told that a certain crystal has healing powers. There is no good scientific reason to believe this is true. But because even our best scientific theories contain some amount of error, the best way to determine whether a belief has any validity is to test it. You could use the crystal only half of the time when you get sick and keep a record of your recovery times (while trying to hold other variables, such as the kind of illness, constant). To increase the sample size, you could give crystals to friends and family who would also like to try the experiment. If 10 people try the healing method for a year and there is no clear difference between recovery times with the crystal and without it, you can reasonably suspect that the crystal is ineffective and won’t cure illness.
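To make the bookkeeping concrete, here is a minimal sketch in Python of what that record-keeping and comparison might look like. The recovery times are invented for illustration, and a real test would want far more data and a proper statistical analysis:

```python
# A toy comparison of recovery times (in days) with and without the crystal.
# All numbers are hypothetical, purely for illustration.
from statistics import mean, stdev

with_crystal = [6, 8, 5, 7, 9, 6, 7, 8, 6, 7]
without_crystal = [7, 6, 8, 6, 7, 9, 5, 7, 8, 6]

print(f"Mean recovery with crystal:    {mean(with_crystal):.1f} days")
print(f"Mean recovery without crystal: {mean(without_crystal):.1f} days")
print(f"Difference: {mean(with_crystal) - mean(without_crystal):+.1f} days")

# Any difference between the groups should be judged against the spread
# inside each group; here the spread dwarfs the difference, so there is
# no sign that the crystal does anything.
print(f"Typical spread within a group: ~{stdev(with_crystal):.1f} days")
```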

Society would almost certainly improve if everyone questioned and tested their own beliefs. In practice, this is not so easy. In the example above, there is the problem of the famous placebo effect: the crystal might appear to heal not because of any intrinsic property, but because of the user’s expectations. For this reason, the best defense against ignorance is scientific literacy. When something is in question, consult the peer-reviewed literature on the topic, because empirical studies (ideally) test theories in a properly controlled way with a sufficient sample size. That said, this does not mean we should never be skeptical of our current scientific theories and the existing empirical evidence. Scientific theories, by design, aren’t immutable. They are pathways to knowledge, not final destinations. They are always being updated precisely because they contain some degree of error, and it is important to be aware of that. But our skepticism should be proportional to the evidence we have so far.

There’s a practical approach to reducing our ignorance and improving our world model’s accuracy, and it takes us back to Bayesian reasoning, named for the 18th-century statistician and philosopher Thomas Bayes. Bayesian reasoning is a procedure for updating your theory, model, or belief system in the face of new evidence. In scientific practice it involves a mathematical formula, but you don’t need to know any math to use informal Bayesian reasoning in everyday life, as philosopher Julia Galef explains in this short and accessible video.
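In its simplest form, the formula (Bayes’ theorem) says that how strongly you should believe a hypothesis after seeing some evidence depends on two things: how well the hypothesis predicts that evidence, and how plausible the hypothesis was to begin with:

P(hypothesis | evidence) = P(evidence | hypothesis) × P(hypothesis) / P(evidence)

Here P(hypothesis) is the “prior,” your belief before the evidence came in; P(evidence | hypothesis) is the “likelihood,” how strongly the hypothesis predicts what you actually observed; and the left-hand side is the “posterior,” your updated belief.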

Here’s what you do:

1.) Consider all possible explanations for something, rather than relying purely on “gut instinct.”

2.) Rank and rate each theory according to how likely it is to be true based on all the known facts.

3.) Test each theory by using it to make predictions, then check whether those predictions come true.

4.) Update your rankings and ratings of how likely each theory is to be true, to reflect what you learned from the testing phase.
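To show how the four steps fit together, here is a minimal sketch in Python. The scenario (a friend who hasn’t replied to a message) and all the probabilities are hypothetical, invented purely for illustration; the update rule itself, posterior proportional to likelihood times prior, is the standard Bayesian one:

```python
# A toy Bayesian update over three hypothetical explanations for why a
# friend hasn't replied to your message. All numbers are made up.

# Steps 1 and 2: list the candidate explanations and assign each a prior
# probability based on everything you currently know.
priors = {
    "phone died":   0.30,
    "busy at work": 0.60,
    "angry at me":  0.10,
}

# Step 3: each theory predicts what you should observe. Suppose you then
# see the friend post online. Score how likely that observation is under
# each theory (these likelihoods are guesses for illustration).
likelihoods = {
    "phone died":   0.01,  # posting with a dead phone is very unlikely
    "busy at work": 0.30,
    "angry at me":  0.40,
}

# Step 4: update each theory's probability with Bayes' rule:
# posterior is proportional to likelihood times prior.
unnormalized = {h: likelihoods[h] * priors[h] for h in priors}
total = sum(unnormalized.values())
posteriors = {h: p / total for h, p in unnormalized.items()}

for hypothesis, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"{hypothesis:12s} {p:.2f}")
```

Notice that the evidence shifts belief toward “angry at me” and away from “phone died,” but “busy at work” remains the most likely explanation. That is the point of step 4: the evidence adjusts your confidence in each theory rather than flipping you to a single certainty.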

Some of our most respected scientists, including cognitive psychologist Steven Pinker and theoretical physicist Sean Carroll, have identified Bayesian reasoning as a powerful tool in the war against irrationality. In particular, it can combat the kind of misinformation and bogus conspiracy theories that so frequently permeate our politics. At the same time, Bayesian reasoning can reveal real conspiracies, should they exist, by demonstrating that a particular theory about a conspiracy explains the facts better than the alternatives. What Bayesian reasoning provides is a universal approach to determining truth. Beliefs should not be held blindly; they should be tested continually.

If we could make some simplified form of Bayesian reasoning common practice for everyone, the collective ignorance of society would begin to shrink almost immediately. People who believe irrational things would start to shed the beliefs that reality and testing contradict. Scientists and medical professionals would likewise stop overstating their certainty, which they tend to do (studies show that physicians neglect Bayesian reasoning about as often as average people do).

So, the question is: if this form of logic is our weapon against irrationality and ignorance, how do we make it go mainstream? For one, logical reasoning and evidence-based thinking should be part of standard education curricula. New methods of education, such as gaming and virtual reality, could also help make Bayesian reasoning stick.

Being ignorant about a particular topic isn’t shameful. None of us can know everything; that is an impossible task. Ignorance does not come from a lack of education but from an unwillingness to seek it out. It is a consequence of refusing to change your beliefs when reality repeatedly contradicts them. If we want to increase our chances of success in life and minimize our ignorance, we must be willing to challenge our own viewpoints and update our models of reality in light of new evidence.

Bobby Azarian is a cognitive neuroscientist and the author of the new book The Romance of Reality: How the Universe Organizes Itself to Create Life, Consciousness, and Cosmic Complexity. He is also a blogger for Psychology Today and the creator of the Substack Road to Omega. Follow him @BobbyAzarian.