Your brain is a filter. Think about that for a second. Right now, as you read this, you aren’t seeing the world as it actually is; you’re seeing it through a thick lens of past traumas, family dinners, that one weird book you read in college, and the city where you grew up. When someone asks, "Do you believe in this or that?" they aren't just asking for a fact-check. They’re asking about your identity.
Belief isn't a choice we make in a vacuum. It’s a messy, biological, and deeply social process. You don't just "decide" to believe the sky is blue or that your favorite sports team is the best. It’s ingrained. But why? Why do we cling to ideas even when the evidence says we’re dead wrong? Honestly, the answer is a mix of survival instincts and how our neurons fire when we feel threatened.
The Cognitive Glue: How Ideas Stick
We like to think we're rational. We aren't. Not really. Humans are actually "rationalizing" creatures rather than rational ones. We start with a feeling—a gut instinct—and then we hunt for the "facts" to back it up. Psychologists call this motivated reasoning.
Take the work of Leon Festinger. Back in the 1950s, he looked at a cult that believed the world was ending. When the world didn't end, did they quit? No. They believed harder. They claimed their prayers saved the world. This is cognitive dissonance in the wild. When your core belief is challenged, it actually hurts. Literally. Brain scans show that when people hear political or religious arguments that contradict their views, the amygdala—the "fight or flight" center—lights up. You aren't just disagreeing; you're feeling attacked.
Why Do You Believe the Things That Hold You Back?
So much of what we think is "true" is just a story we told ourselves to survive a specific moment. Maybe a teacher told you that you were bad at math when you were seven. Now, at thirty-five, you still say, "I'm not a numbers person."
That’s a belief. Is it true? Probably not. But it’s comfortable.
Social psychologist Jonathan Haidt argues in The Righteous Mind that our moral beliefs are like a rider on an elephant. The elephant is our emotion—big, powerful, and hard to turn. The rider is our conscious mind, trying to steer. Most of the time, the elephant goes where it wants, and the rider just makes up excuses for why they "meant" to go that way all along.
The Social Cost of Changing Your Mind
Humans are pack animals. Historically, if you got kicked out of the tribe, you died. Simple as that. This is why the question of why you believe what your neighbors believe is so often really a question of safety.
If everyone in your social circle thinks a certain way about the economy or a specific celebrity, and you start to think differently, you risk social "death." We prioritize belonging over being right. We see this in echo chambers online every single day. Algorithms don't just show us what we like; they reinforce the walls of our digital tribes.
It’s scary to change your mind. It means admitting you were wrong, which is hard enough, but it also means potentially losing your community.
The Neuroscience of Certainty
When you feel "certain" about something, your brain releases dopamine. It feels good to be right! It feels like a reward. Conversely, uncertainty feels like physical pain or anxiety.
Robert Burton, a neurologist, has written extensively about the "feeling of knowing." He argues that certainty is a mental sensation, not a logical conclusion. It’s a bit like an itch or an orgasm—it’s something that happens to you. This explains why you can’t just "argue" someone out of a belief. You’re trying to use logic to fight a chemical sensation. It’s like trying to talk someone out of being hungry.
Breaking the Cycle: Intellectual Humility
If we want to actually grow, we have to get comfortable with being uncomfortable. We have to ask ourselves: "What would it take to change my mind?"
If the answer is "nothing," then you aren't holding a belief; the belief is holding you.
Real intelligence isn't about how much you know. It’s about how much you’re willing to unlearn. This is what researchers call intellectual humility. It’s the recognition that your "filter" might be smudged.
Actionable Steps to Audit Your Beliefs
You don't have to be a slave to your subconscious programming. You can actually poke at your own convictions to see if they hold water. It’s not about being a cynic; it’s about being a conscious participant in your own life.
- Identify the "Shoulds": Pay attention to every time you say "I should" or "People should." These are usually inherited beliefs from parents or society that you haven't vetted yourself.
- Steel-manning: Instead of "straw-manning" an argument you hate, try to build the strongest possible version of it. If you can't understand why a smart person would believe the opposite of you, you don't actually understand the issue yet.
- Check Your Sources (Internal and External): Ask yourself, "Where did I first hear this?" If the answer is "I've just always known it," it's time to do some digging.
- Lean into the Flinch: When you read something that makes you angry or defensive, don't close the tab. That "flinch" is your brain's defense mechanism. Sit with it. Ask why that specific idea feels like a threat to your identity.
- Diversify Your Inputs: Follow people who disagree with you but are clearly intelligent and well-intentioned. It breaks the "us vs. them" binary that the brain loves so much.
The world is far more complex than the stories we tell ourselves. By examining why you believe what you believe, you stop being a passenger in your own mind. You start choosing your direction based on reality, not just the comfortable echoes of the past. It’s a constant process of editing, deleting, and updating. It never really ends. And that's okay.