Ever wonder why good people do terrible things? It’s a question that keeps philosophers up at night and haunts the pages of every history textbook covering the 20th century. In 1961, a young psychologist at Yale named Stanley Milgram decided to stop theorizing and start measuring. He wanted to know the breaking point. Specifically, what was the purpose of the Milgram experiment if not to see just how far an average Joe would go before saying "no" to a man in a lab coat?
People often think Milgram was just some guy who liked shocking people. That's a misunderstanding of the era he lived in. The trial of Adolf Eichmann, a major architect of the Holocaust, was happening in Jerusalem right as Milgram was setting up his lab. Eichmann’s defense was basically, "I was just following orders." Most people in America at the time thought that was a uniquely German trait. We thought we were different. We thought we were "too individualistic" to succumb to blind obedience. Milgram suspected we were wrong.
Breaking Down the Actual Purpose of the Milgram Experiment
At its heart, the experiment wasn't about cruelty. It was about the social psychology of authority. Milgram wanted to see if the "Germans are different" hypothesis held water. He set up a scenario that looked like a study on "memory and learning." There was a "teacher" (the real subject) and a "learner" (an actor introduced as Mr. Wallace). Every time the learner got a word-pair wrong, the teacher was told to flip the next switch on a shock generator, one step higher than the last.
The shocks started at 15 volts and climbed in 15-volt steps to a maximum of 450, the top switches marked with chilling labels like "Danger: Severe Shock" and, finally, just "XXX."
You’ve probably heard the results. They were terrifying. Roughly 65% of participants went all the way to the end. They didn't do it because they were monsters. Many were shaking. They were sweating. They were stuttering and digging their fingernails into their palms. But they kept flipping the switches because a man in a grey lab coat told them, "The experiment requires that you continue."
The Authority Figure and the Agentic State
Milgram’s true goal was to observe a transition he called the agentic state. This is when a person stops seeing themselves as a free-thinking individual and starts seeing themselves as an "agent" of someone else’s will.
Once you enter that state, you feel like you aren't responsible for what happens. The burden of morality shifts to the person giving the orders. It’s a scary mental loophole. Honestly, we see this today in corporate scandals or even in toxic internet dogpiles. People stop asking, "Is this right?" and start asking, "Am I doing what I was told?"
The Misconceptions We Still Carry
There’s a huge myth that the participants were just mindless drones. They weren't. They argued. They complained. They begged the experimenter to check on the guy in the other room. But—and this is the crucial part—they didn't stop.
There's a subtle difference between "dissent" and "disobedience." You can complain all you want, but if you're still pressing the button, you're still obedient. Milgram showed that talk is cheap: our actions are shaped by the social architecture around us far more than by our internal moral compass.
Another weird detail? The "learner" wasn't actually getting shocked at all. Mr. Wallace sat in another room while tape-recorded protests played on cue. At 150 volts, he'd complain about his heart condition and demand to be let out. At 300 volts, he'd pound on the wall. After 330, he went dead silent. The "teacher" had to assume the guy was unconscious or worse. And yet, the majority kept going.
The Ethical Firestorm
You can't talk about the purpose of the Milgram experiment without talking about the mess it left behind. Many participants left the lab deeply shaken. Imagine finding out you're capable of killing a stranger because someone told you to. That's a heavy realization to carry home on a Tuesday afternoon.
Critics like Diana Baumrind slammed Milgram for the psychological damage he caused. She argued the "knowledge" gained didn't justify the emotional scarring of the subjects. This study, alongside other research scandals of the era, helped usher in the Institutional Review Boards (IRBs) that now vet human-subjects research. You literally could not run this experiment today in most developed countries. The rules changed in no small part because Milgram pushed the envelope too far.
Why This Still Matters in 2026
We like to think we’re more "enlightened" now. We have the internet. We have social awareness. But does that change our hardwiring?
More recent replications (capped at lower voltages for ethical reasons) suggest the answer is no. Whether it's Jerry Burger's 2009 partial replication, which stopped the procedure at 150 volts, or various international versions, the numbers remain depressingly consistent. About two-thirds of us will fold under the pressure of a perceived authority.
It’s not just about lab coats anymore. It’s about "the algorithm," the "boss," or "the party line." The purpose of the Milgram experiment was to hold up a mirror. It showed us that the line between a "good citizen" and a "perpetrator" is thinner than a sheet of paper.
Redefining Our Understanding of "Evil"
Hannah Arendt famously coined the phrase "the banality of evil." Milgram provided the empirical evidence for it. Evil isn't always a cackling villain. Often, it’s just a guy who wants to do a good job and not make a scene.
If you want to avoid being the person who flips the switch, you have to recognize the pressure in the moment. You have to realize that "just doing my job" is the first step toward the agentic state.
Actionable Insights for the Real World
Understanding this isn't just about trivia. It’s about building a "disobedience muscle." Here is how you actually use this information:
- Identify the Grey Lab Coat: When someone asks you to do something that feels "off," ask yourself: Is this person an authority I should actually trust? Or do they just look like one?
- Acknowledge Personal Responsibility: Tell yourself, "If I do this, it is my fault." Never use the word "they." Don't say "They told me to." Say "I am choosing to do this." It changes the chemistry of the decision.
- Find an Ally: Milgram found that if a second teacher was in the room and that person refused to continue, obedience plummeted. It is incredibly hard to be the first person to say no. It’s much easier to be the second. Be the person who gives someone else the courage to stop.
- Watch for Incrementalism: The shocks didn't start at 450 volts. They started at 15. The slide into unethical behavior is almost always a slow ramp. If you find yourself justifying a "small" compromise today, you’re training yourself for a "large" one tomorrow.
The true legacy of Stanley Milgram isn't a set of statistics. It’s a warning. He didn't find that people are inherently bad; he found that we are dangerously susceptible to the social structures we inhabit. To stay human, you have to be willing to be "disobedient" when the situation demands it. The experiment ended decades ago, but the test continues every single day in offices, schools, and homes across the world.
Don't wait for a scream from the other room to decide where your line is. Decide now. Awareness is the only real armor we have against the "requirements" of the experiment.