When a legitimate authority figure has spoken, your common sense seems to become irrelevant.
Dec 23, 2018 · Public post
In their book Medication Errors: Causes and Prevention, professors Michael Cohen and Neil Davis talk about a strange case of the “rectal earache”.
A physician ordered ear drops to be administered to the right ear of a patient suffering from pain and infection. Instead of writing out completely “Right ear” on the prescription, the doctor abbreviated it, “place in R ear.” The duty nurse misread “R ear” to be “Rear.” Upon receiving the prescription, she promptly put the required number of ear drops into the patient’s anus.
This is laughable and obviously made no sense, but neither the patient nor the nurse questioned it. The important lesson of this story is that in many situations in which a legitimate authority figure has spoken, your common sense seems to become irrelevant.
In the early 1960s, Yale University psychologist Stanley Milgram conducted a series of social psychology experiments, popularly known today as the Milgram experiment. They measured the willingness of participants — men from a diverse range of occupations with varying levels of education — to obey an authority figure who instructed them to perform acts conflicting with their personal conscience.
Participants were led to believe that they were assisting an experiment in which they had to administer electric shocks to a subject. In reality, the shocks were fake, and the subject was a trained actor who simulated pain and agony upon receiving them. The fake shocks gradually increased to levels that would have been fatal had they been real.
The experiment found, unexpectedly, that a very high proportion of men would fully obey the instructions, albeit reluctantly.
I observed a mature and initially poised businessman enter the laboratory smiling and confident. Within 20 minutes he was reduced to a twitching, stuttering wreck, who was rapidly approaching a point of nervous collapse. He constantly pulled on his earlobe and twisted his hands. At one point he pushed his fist into his forehead and muttered: “Oh, God, let’s stop it.” And yet he continued to respond to every word of the experimenter and obeyed to the end.
— Stanley Milgram
The experiments began after the start of the trial of the German Nazi war criminal Adolf Eichmann in Jerusalem. Eichmann was one of the major organisers of the Holocaust.
Milgram devised his psychological study to answer the popular contemporary question: “Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?”
The experiment was repeated many times around the globe, with fairly consistent results.
Authority bias is the tendency to attribute greater accuracy to the opinion of an authority figure and be more influenced by that opinion.
You’ve been trained since birth to believe that obedience to proper authority is right and disobedience is wrong, and hence this bias comes naturally to you. It’s hidden underneath your good nature.
This message of obedience filled the parental, societal, and school lessons of your childhood. It's even carried forward in all the systems you encounter as an adult.
Early on, these people (parents, teachers) knew more than we did, and we found that taking their advice proved beneficial — partly because of their greater wisdom and partly because they controlled our rewards and punishments. As adults, the same benefits persist for the same reasons, though the authority figures are now employers, judges, and government leaders. Because their positions speak of greater access to information and power, it makes sense to comply with the wishes of properly constituted authorities. It makes so much sense, in fact, that we often do so when it makes no sense at all.
— Influence: Science and Practice, Robert B. Cialdini.
The very first book of the Bible describes how failure to obey the ultimate authority resulted in the loss of paradise for Adam, Eve, and the rest of the human race. This is also what motivational writers, business pundits, startup advisors, CEOs, economists, consultants, and stock market gurus would have you believe.
You look to those in power as having something special you lack — intellect, knowledge, or some spark of something you would like to see in yourself. This plays well with your vanity, no matter how intelligent you are. This is why people sometimes subscribe to the beliefs of celebrities who endorse exotic religions or denounce sound medicine.
This is also why celebrity-endorsed ads work so well. When a famous athlete tells you to buy a particular brand of batteries, you generally don't ask yourself whether that athlete seems like an expert on electrochemical energy storage.
In general, you rarely go over the pros and cons of what an authority suggests. Hence, the worrisome situation arises when your “boss” makes a clearly wrong decision, and no one lower in the hierarchy thinks of questioning it.
If your team lead, boss, manager, or CEO knows anything about this bias, she will try to remind you and your colleagues, over and over again, not to follow her blindly or act on her decisions just because she asked. You should instead question her motives, be aware of her intentions, and, when needed, counter her arguments and ideas if you happen to have better insight.
Unfortunately, many companies are light-years from this sort of foresight, especially those businesses with domineering CEOs — where employees are likely to keep their “lesser” opinions to themselves, much to the detriment of the business.
A good rule of thumb to avoid this bias is to learn the basics of your boss's role and domain whenever possible, so that you can analyse her judgement and make an unbiased assessment. You don't have to know everything. It's enough to understand her work well enough to form an independent opinion of her decisions.
This same strategy applies outside the office as well. In fields that deal with a lot of uncertainty, when an expert is trying to give you her professional advice, try to scrutinise it extensively — especially when it is good for the advisor. This would generally apply to your wealth manager, investment adviser, or risk analyst. There’s a reason why brokers have personal yachts, but not their customers.
In conditions where the stakes are very high (e.g., stocks, startups, investments, high stakes poker), always, always double check, disbelieve, or replace much of what you’ve been suggested by an authority, or an expert, or a guru — to a degree that seems appropriate to you after giving it a good amount of objective thought. This is easier said than done, but it can be mastered with time, knowledge, and practice. It’s good to keep this credo in mind: your money is your money — not your advisor’s. Your business is your business — not your investor’s.
Having said that, I'm not suggesting that you should always be sceptical and take every word of any authority figure with a sack full of salt. If you are a novice who has just started learning a craft, it's best to follow your teacher.
In other words, should you listen to a highly trained scuba diver's advice before plunging into the depths of the ocean? Yes! Should you blindly believe that same diver when she talks about seeing a mermaid making love to a dolphin? Hell no!
P.S. I’m writing a book on avoiding and exploiting cognitive biases—both in business and in life. If this excites you and you want to collaborate in the research and creation of the book, please visit my Patreon page.