Milgram’s obedience experiment explains why Congress is corrupt
In 1961, Stanley Milgram conducted the first of a series of now-famous psychological experiments gauging people's obedience to authority. What happens when you pit someone's conscience against the orders of an authority figure? In his own words, "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders?"
The setup was simple. A study participant came in and was told chance would determine whether he was the "teacher" or the "learner." The participant always ended up the teacher, while an accomplice of the experimenter played the learner. The experimenter directed the teacher to deliver electric shocks of increasing intensity to the learner whenever he answered a question incorrectly. Surprisingly, about two-thirds of participants were willing to deliver the maximum 450-volt shocks, across many variations of the experiment and in the face of screaming, pleading, and even mention of a heart condition.
Why are people willing to nearly kill someone they just met because a guy in a lab coat requests it? An explanation is rarely even offered, as if we're supposed to find the result obvious: people are just obedient to authority. But extraordinary results require an extraordinary explanation, and the nearly circular "authority is authoritative" is not satisfying.
Similarly perplexing is the extreme corruption of the few who have the power to do great good. Our elected officials, whom we vote for because their values match ours, are quite regularly convinced by lobbyists to act against those values, despite seeming to be in the perfect position to take a stand. Could it be that those who seek to represent us are invariably bad, corrupt people? Unlikely. This, too, demands a more satisfying explanation.
What argument might persuade representatives to compromise their values in exchange for favors? Consider how it might play out. A lobbyist offers a deal: vote his way on his issue and you will be compensated. You object that this is against your principles. The lobbyist then exploits the fact that you are only one member of a body of several hundred. "It will be done regardless of how you vote. Either you do me this favor or someone else will, and they will get my favor in return." Your individual voice carries no weight unless most others share your conviction. This changes things significantly.
With lobbyists tossed into the mix, a voting system becomes a genuine prisoner's dilemma. If most of your peers side with you, everyone wins. If too few do, everyone loses. However, any of you can jump ship and take the lobbyist's offer if you believe your side would lose regardless of your vote, securing a half victory for those who do. In game-theoretic terms, the prisoner's dilemma does an excellent job of explaining how and why people seem to lose their conscience when individuals are capable of compromising a group victory. It doesn't even need to cast them as the sociopathic drones of Milgram's experiment, merely as rationally self-interested actors.
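The payoff structure described above can be sketched in a few lines. This is a minimal illustration, not a model from the essay: the specific payoff values (10 for a principled victory, 3 for the lobbyist's favor) are arbitrary assumptions chosen only to make defection dominate whenever a member believes he is not pivotal; the 435/218 figures are the familiar House chamber size and its simple majority.

```python
def payoff(my_vote, others_cooperating, majority_needed):
    """One member's payoff, given how many OTHER members hold the line.

    Assumed (illustrative) numbers: a principled group victory is worth 10
    to every member; the lobbyist's favor is worth 3 to a defector,
    paid regardless of the outcome.
    """
    cooperators = others_cooperating + (1 if my_vote == "cooperate" else 0)
    group_payoff = 10 if cooperators >= majority_needed else 0
    favor = 3 if my_vote == "defect" else 0
    return group_payoff + favor

# Compare the two choices when you are pivotal (217 others cooperating,
# one vote short of a 218 majority) versus when your side is clearly
# losing anyway (100 others cooperating):
for others in (217, 100):
    print(others,
          payoff("cooperate", others, 218),
          payoff("defect", others, 218))
```

When you are pivotal, cooperating pays 10 against defection's 3; when your side will lose regardless, cooperating pays 0 against defection's 3. The lobbyist's pitch in the paragraph above is precisely an argument that you are in the second situation.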
Surprisingly, Milgram's experiment bears a strong resemblance to our corrupt officials' gaming of the system, and the same prisoner's-dilemma analysis applies to it. His subjects were also part of a group, and also subject to lobbying. These influences were simply invisible unless you looked deeper.
The group, the participant's metaphorical fellow congressmen, consisted of the other people brought in for the experiment. Even though participants weren't told they were one of many, they certainly weren't stupid. We all know what a study is. The elaborate setup of dials and rooms is not spontaneously dismantled if you walk out; the next person is simply brought in. This is so obvious it couldn't even be denied explicitly by the experimental protocol without sounding comically deceptive. When the experimenter asks you to throw the switch, clearly it will be thrown one way or another. He could do it himself, if he were so disposed, but apparently he is not. He is the lobbyist asking you to cast a vote against your conscience. You comply not because you are a bad person, but because he will eventually find someone who will. If it will be done regardless, why not be the one to do it and perhaps earn a favor in return? The unconscious mind takes over, and we see that it's not so much about authority as it is about power. Show this man, who somehow has the power to cage and electrocute humans, that you are capable of doing dirty work, and perhaps he will reward you when you need it.
Since this new explanation doubles as a testable hypothesis, an experiment is easily designed around it. Create a situation in which participants feel they have the power to decide whether an individual will be shocked or go home unharmed. An experimental setting with an authority figure who could clearly do the job himself does not suggest to participants that they have much control over the ultimate outcome, regardless of whether they can choose not to push a button. Alternatively, we could ask participants in the original design how strong an effect they believe their individual vote has on the outcome of an election. Serious voters should be less obedient: they already believe their actions are meaningful even in large groups.
If further experiments verify this hypothesis, we're left with the crucial task of circumventing the powerless opportunism that allows people to do harm simply because others will if they don't. We now have a pool of elected officials who are effectively delivering the deadly 450-volt shock to us all, in an obedience experiment conducted by lobbyists.