👋 Hey there! My name is Abhishek. Welcome to a new edition of The Sunday Wisdom! This is the best way to learn new things with the least amount of effort.
It’s a collection of weekly explorations and inquiries into many curiosities, such as business, human nature, society, and life’s big questions. My primary goal is to give you some new perspective to think about things.
Note: If you find this issue valuable, can you do me a favour and click the little grey heart below my name (above)? It helps get the word out about this budding newsletter. 😍
Q: What is a good way to measure the strength of an opinion?
Today, let’s talk about making decisions in a group. We never have all the information to make informed decisions. So, it’s better to make a provisional decision (based on what we know so far) instead of waiting around for the correct information to present itself. This is the first step.
The second step is to gather data to disprove this hypothesis and avoid confirmation bias. While this framework may work for personal decisions, it may not work so well in groups. Following this principle requires strong personal discipline, but when group dynamics come into play, other members may not be as diligent as an individual. In the subsequent paragraphs I discuss a couple of ways we can attack this problem.
In the mid-80s, futurist and entrepreneur Paul Saffo developed a mantra that spread through Silicon Valley in numerous company cultures: strong opinions weakly held. The saying was intended to combat a fairly common problem in startups: the indecision and paralysis that comes from uncertainty and ambiguity.
“I have found that the fastest way to an effective forecast is often through a sequence of lousy forecasts. Instead of withholding judgment until an exhaustive search for data is complete, I will force myself to make a tentative forecast based on the information available, and then systematically tear it apart, using the insights gained to guide my search for further indicators and information. Iterate the process a few times, and it is surprising how quickly one can get to a useful forecast,” Saffo wrote.
In other words, we should allow our judgement to come to a stopgap conclusion — no matter how imperfect. This is the “strong opinion” part. But we should also engage in creative doubt, think in bets, and look for indicators that may point towards a different direction. We should let our intuition guide us towards a different conclusion if the new evidence says so. This is the “weakly held” part. According to Saffo, if we do this enough times, we’ll be able to reach a useful result through a sequence of faulty conclusions.
So far, so good. But a problem arises when the “boss” has a strong opinion. No matter how weakly held their opinion is, it takes an inordinate amount of conviction and gumption to challenge it.
But, this is solvable. As long as there are good protocols to challenge somebody’s opinion in a civilised manner, this is achievable. The real problem is when this principle is hijacked for the convenience of bullshitters — people who like to have opinions loudly over and over without any acknowledgement that their statements are unsubstantiated, or possibly even damaging. In other words, the “trumps”.
For example, the biggest trump of all, the great Donald Trump, “strongly” held the opinion that Barack Obama wasn’t born in America, and that the legitimacy of his presidency was a sham. This forced the Obama administration to procure Obama’s birth certificate from Hawaii and put it up for the public to see. Trump didn’t back down until then.
You see, the problem is that trumps have a tendency to turn strong opinions weakly held into strong opinions assumed to be true until you prove otherwise. They hold a strong opinion (based on nothing) but they put the onus of proving them wrong upon others. Instead of using this principle, they abuse it.
Donald Trump may be the most obvious (and obnoxious) case, but trumps are lurking everywhere — in teams, meetings, discussions, etc. They may not be full-time trumps. They aren’t even evil. They are just self-serving folks who want to sway the group towards their own agenda. Most of the time, they aren’t even doing it consciously. I mean, let’s face it, nobody would like to have a strong opinion only to have it dismantled by somebody else. It’s against human nature.
There’s always a shrewd person in a meeting who can state their case with absolute certainty and shut down further discussion. Others either assume that this person knows best, or don’t want to stick their neck out and risk criticism. This is especially true if the person in question is senior in hierarchy, or if there is some sort of power differential in the group.
We can’t do much about the Trumps in politics. In their defence, they have to have strong opinions strongly held to get the support of the crowd. If a leader says their chances of curbing a virus are only 80%, they may not win the next election.
But in a much smaller setting — such as in a group — there are certain measures we can take to design a process that makes sure voices are heard and strong opinions are weakly held for real. And the first step in the process is to know the strength of one’s opinion.
When asked how sure somebody is about something, the late, great Amos Tversky used to joke that the human brain falls back to a simple three-dial setting by default: “yes I believe that, no I don’t believe that, and maybe.” But sound decision-making — especially in groups — needs more nuance, and for lack of a better word, more dials.
Let me illustrate this with an example to explain the gravity of the problem. In the late 40s, the Communist government of Yugoslavia broke from the Soviet Union, raising fears that the Soviets would invade. The US released a report (a National Intelligence Estimate, or NIE) based on its analysis. “Although it is impossible to determine which course of action the Kremlin is likely to adopt,” it concluded, “we believe that the extent of Eastern European military and propaganda preparations indicates that an attack on Yugoslavia in 1951 should be considered a serious possibility.”
By most standards, that is clear, meaningful language. A “serious possibility” indicates a strong likelihood of the attack. No one suggested otherwise when the report was read by top officials in the US.
But it was found later that by “serious possibility” the report actually suggested that “the odds were about 65 to 35 in favour of an attack.” Interestingly, the analysts who wrote the report had varying interpretations of the phrase themselves. When asked, one of them said it meant odds of about 80 to 20 (or four times more likely than not) that there would be an invasion, while another thought it meant odds of 20 to 80 — exactly the opposite. Other answers were scattered between these extremes, despite all having written the report together.
This floored Sherman Kent, who headed the NIE report. You see, something that is “possible” has a likelihood ranging from almost zero to almost 100%, but that’s not helpful. Kent suggested that we should narrow the range of our estimates to better communicate our forecast. Not only that, we should also assign a numerical value to our estimates, so as to avoid any ambiguity. He suggested the following:
100% → Certain
93% (give or take about 6%) → Almost certain
75% (give or take about 12%) → Probable
50% (give or take about 10%) → Chances about even
30% (give or take about 10%) → Probably not
7% (give or take about 5%) → Almost certainly not
0% → Impossible
Words like “serious possibility” suggest the same thing these numbers do. The only real difference is that numbers make it explicit, thereby reducing the risk of confusion. Bingo! Sherman Kent is often described as the father of intelligence analysis, and not without good reason.
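Kent’s chart is essentially a lookup from a numeric estimate to a verbal label. As a minimal sketch of that idea (the function name and the exact cut-off points between categories are my own illustration — Kent’s “give or take” ranges leave small gaps, which I’ve smoothed over here):

```python
def kent_label(p: float) -> str:
    """Map a probability estimate (0.0 to 1.0) to Sherman Kent's verbal category."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    if p == 1.0:
        return "Certain"
    if p == 0.0:
        return "Impossible"
    if p >= 0.87:   # 93%, give or take about 6%
        return "Almost certain"
    if p >= 0.63:   # 75%, give or take about 12%
        return "Probable"
    if p >= 0.40:   # 50%, give or take about 10%
        return "Chances about even"
    if p >= 0.20:   # 30%, give or take about 10%
        return "Probably not"
    return "Almost certainly not"   # 7%, give or take about 5%
```

So the boss’s “I’m 65% sure” from the example below would translate to Kent’s “Probable” — and, crucially, everyone in the room would translate it the same way.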
Now, how can we translate this into an office setting? First, if we state our strong opinion as a probability judgement, we are forced to calibrate the strength of our opinion. In other words, we are forced to let go of the ‘yes, no, maybe’ dial in our head.
Not only that, by framing our forecast/conclusion/opinion/prediction as a bet, we suddenly have skin in the game, and are motivated to get things right from the beginning.
When the boss replaces “there is a strong possibility that we can increase our revenue by increasing our prices” with “I’m 65% sure that we can increase our revenue by increasing our prices,” light bulbs go off everywhere. Now everyone can look into the variables involved, consider the case from the boss’s point of view, and weigh them against their own conclusions.
If the stated probability differs greatly from our own estimate, the line of enquiry flows effortlessly into exploring why the two mental models differ so much, and how to synchronise them. This way, we can argue on principle instead of person.
But many people struggle to view the world in terms of estimates and probabilities. For example, if we say that wearing a mask makes us 70% less likely to accidentally infect someone, it won’t be a strong enough argument to motivate people to change their behaviour. To motivate people to change the status quo, we must eliminate all uncertainty.
People don’t understand probabilities, and hence it’s better not to use them unless absolutely necessary. Saying something like, “There’s a 75% chance this company’s gonna go down, but let’s give our 100%,” isn’t likely to get the team motivated.
Truth is, we love catchphrases. We love drama. Poetry moves people. We should use language appropriately when we need to rally people’s support for a cause. But when we want to bring people together to make an informed decision, stating predictions in terms of estimated probability is a quicker and more effective way to draw people into the process. Because when it comes to finding the truth, it’s better to be a bookie than a poet. I’m 90% sure about it.
Interesting Finds
We are living in a time when pop culture is being increasingly determined by TikTok’s algorithms. This also tends to mean that what we’re seeing is the lowest common denominator of what human beings want to look at, appealing to our most base impulses and exploiting existing biases toward thinness, whiteness, and wealth. (Rebecca Jennings / Vox)
Humanity has completely changed the food it eats. Processed food isn’t just a modern invention, created in factories from artificial ingredients. It is as old as humanity itself and may have helped create our species. (Nicola Temple / BBC)
What are urban fish ponds and why are they important? Brilliant article on how low-tech aquaponic fish-pond sewage systems provide not only clean water but also convert otherwise wasted nutrients into fertiliser and food, with current and historical examples from India, Germany, Vietnam, China, etc.
In 2019, Instagram announced it would test a feed without likes. After more than two years of testing, Instagram announced what it found: removing likes doesn’t seem to meaningfully depressurise Instagram, for young people or anyone else, and so likes will remain publicly viewable by default. But all users will now get the ability to switch them off if they like, either for their whole feed or on a per-post basis. Turns out one-size-fits-all solutions make us miserable.
We Become What We Behold is a mini-game about the news! It’s a simple game that demonstrates what gets captured, why it spreads, and how it affects us and future news. Try it on your computer. Pretty smartly done.
Humour is a powerful tool in a parent’s arsenal. But it’s smart to stop every once in a while and consider one of comedy’s biggest rules: timing. If you crack wise to avoid serious emotions, you may seem less capable of helping your kid handle serious things. Whether you are a parent or not, it’s important to know when to use humour and when not to. (Adam Bulger / Fatherly)
Google Maps began life as a thought bubble at Where 2’s headquarters. Before starting Where 2, Stephen Ma was working at a petrol station, Noel Gordon was working as a fabric cutter in his father-in-law’s clothing factory in the inner Sydney suburb of Newtown, and Jens Rasmussen had been sleeping on his mother’s couch back home in Denmark. Even a decade ago this was not a typical start-up. Not only were the founders more middle-aged, but thanks to Gordon’s then girlfriend, now wife, they worked civilised hours. Everyone had to down tools by 6pm and take the weekends off. (Stephen Hutcheon)
Quote to Note
The thing that makes money money is trust—when we trust that we will be able to buy stuff with this piece of paper, or this lump of metal, tomorrow, and next month, and next year.
— Jacob Goldstein, Money
Tiny Thought
Take bets that have less tail risk and high tail profit, i.e., convex bets that have less downside (if you lose) and high upside (if you win).
Starting a business (even a side business) may be the best example. The downside is a few years’ worth of salary lost, but the upside is becoming a millionaire.
Before You Go…
If you’re finding this newsletter valuable, share it with a friend. Also, consider subscribing. If you aren’t ready to become a paid subscriber yet, you can also give a tip by buying me a coffee. ☕️
I’ll see you next Sunday,
Abhishek 👋