The Illusion of Knowing
How the Dunning-Kruger Effect shapes our thinking and erodes trust in science.
This post is part two in a series examining how easily we oversimplify complex ideas, especially when it comes to science and health.
We have all met someone who speaks with total confidence about a topic they clearly do not understand. Maybe it is the co-worker who suddenly became an expert in nutrition after watching a documentary, or the uncle who insists he knows more than climate scientists because he read a few articles online. If we are honest, we have probably been that person ourselves.
The Dunning-Kruger Effect
This is where the Dunning-Kruger Effect comes in. First identified by psychologists David Dunning and Justin Kruger in 1999, it describes the tendency of people with limited knowledge or skill in a domain to overestimate their ability. Experts, meanwhile, often underestimate themselves because they are more aware of the limits of their knowledge.
In the context of public health, this matters a lot. Interpreting research is complex. It requires training, statistical literacy, and an ability to weigh evidence across multiple studies. Yet the Dunning-Kruger Effect helps explain why people with little scientific background feel so confident that they can “see the truth” about vaccines, nutrition, or wellness trends. Their certainty is appealing. It is easier to trust a simple story told with confidence than a nuanced explanation filled with uncertainty.
But here is the catch: the Dunning-Kruger Effect has limitations of its own. Some researchers argue that it oversimplifies human behavior and leans too heavily on the idea that ignorance alone explains overconfidence; others have argued that part of the classic pattern can be reproduced by statistical artifacts such as regression to the mean. In reality, confidence is shaped by many factors, including personality, culture, and context. People are not always blind to the gaps in their knowledge. Sometimes they overstate their confidence for social reasons, or because the situation rewards certainty over humility. Consider politicians or influencers who understand that bold claims are more likely to go viral than careful explanations.
So while the Dunning-Kruger Effect is a useful tool for understanding why people sometimes speak with misplaced authority, it is not the whole story. Overconfidence is not always ignorance. Sometimes it is performance, sometimes it is strategy, and sometimes it is a mix of both.
For me, the lesson is not to weaponize the Dunning-Kruger Effect as a way to dismiss others. It is to remember how easy it is for all of us to slip into overconfidence when the topic is complicated. Science and public health require humility. The most honest answer is often the least satisfying one: “It is complicated, and I do not fully know.” That answer is an invitation to embrace the discomfort of nuance.
Why is this important?
When we put this all together, it is easy to see why this matters so much for how we think about science. Oversimplification is understandable. Complicated topics are overwhelming, and our brains naturally look for shortcuts. It feels better to land on a quick explanation than to sit in uncertainty. But this is also where we get into trouble.
Conspiracy theories thrive in that space. They often start with fear and then offer a simple story to explain it. At first glance, that makes sense. If something feels threatening or confusing, our instinct is to reduce it to something we can control. The problem is that these stories erode trust in experts. And when that trust breaks down, we are left with a void. That void quickly fills with louder voices who are not trained in science but who are very skilled at using fear to gain attention and influence.
It is important to note that the people who believe these stories are not foolish or broken. Fear makes these ideas compelling, and fear is a deeply human response. Having compassion for why people are drawn to conspiracy theories does not mean we agree with them. It means we understand that fear is powerful, and if we want to counter it, we need to build trust and offer something more meaningful than soundbites. I highly recommend you watch this piece from the NY Times. (Yes, it’s behind a paywall. Yes, if you’re not already subscribed to the NY Times, you’ll have to sign up. Yes, subscribing to the NY Times is 100% worth every cent.)
What’s Next?
Next week, we will continue with another way our brains trip us up: the post hoc ergo propter hoc fallacy. It is the tendency to believe that because one thing happened after another, the first must have caused the second. It is another shortcut our brains take, and it explains a lot about why conspiracy theories spread so quickly.

