Coronavirus. It’s a crazy time to be a human being on planet Earth – travel restrictions, lockdowns, quarantines, and social distancing are all words and phrases we’ve seen repeated over and over again in the past few weeks. But how did we get here? I am not an epidemiologist, nor am I a political insider, so I can’t answer that question. What I can talk about are the reactions much of the population is having towards how the situation was, and is, being handled.
I’ve seen many people out in the world talking about how we can’t trust the government or experts because these people repeatedly downplayed the risk of coronavirus, and now we’re paying for the laissez-faire attitude we originally had. Some mention how they “knew” this was a bigger deal than it was originally thought to be; that the government was hiding the truth from them (albeit a reasonable suspicion given the news around Senator Richard Burr and others in the USA selling off stocks after a classified briefing on coronavirus); and that the experts are always wrong – whether they oversell or undersell the issue at hand. These people call for others to reject authority, to trust their own instincts (and, conversely, to trust no one), and to call out the elitist, condescending attitudes that some experts and political figures projected when talking about coronavirus. Yet this is incredibly misguided.
What many fail to realize is that trusting experts is always going to be the positive expected value decision, and that just because it is more likely to be the correct decision doesn’t mean it is always going to be right. Experts are experts for a reason – they’ve studied a particular topic for years, often decades. Simply put, they know more than the person on social media who is suddenly an expert in epidemiology. However, experts are sometimes wrong. This should be a given; no one is right every single time. Mistakes get made, estimates run too high or too low, assumptions are just that – assumptions – and oftentimes not all information is known when it’s time to make a decision. Early on in the coronavirus outbreak, no one quite knew just how fast and easily it could spread, so decisions were made based on how previous viruses behaved. This in and of itself can be a misstep (“past performance is not indicative of future performance” is a phrase we’ve all heard before), but sometimes that’s all the information you have available at the time.
Circling back to listening to experts being the positive expected value decision: what this means is that an expert is more likely to be right than a non-expert (you, your friend, your co-worker, or a President who thinks he knows everything). The odds of you getting something right could be 10% or even 90%, but the odds of the expert getting it right will be higher than yours. The difference could be 0.1% or 90% in favour of the expert; all that matters is that they are more likely to be right than you. Conversely, just because they are more likely to be right than you does not mean they will always be right. Let’s say you have a 5% chance of being right and the expert has a 10% chance of being right. You should listen to the expert, but you should also acknowledge there is a 90% chance of them being wrong. Over the long run, however, the best outcomes will come from following the expert, not yourself.
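To put a number on “long run”, here’s a minimal simulation sketch in Python using the made-up 5% and 10% figures from the example above. The probabilities, variable names, and the assumption that each decision is independent are purely illustrative, not real epidemiological estimates:

```python
import random

# Illustrative probabilities from the example above (made-up numbers, not real estimates).
P_YOU_RIGHT = 0.05     # you call it correctly 5% of the time
P_EXPERT_RIGHT = 0.10  # the expert calls it correctly 10% of the time


def simulate(n_decisions: int = 100_000, seed: int = 1) -> None:
    rng = random.Random(seed)
    you_correct = 0
    expert_correct = 0
    expert_wrong_you_right = 0

    for _ in range(n_decisions):
        you_right = rng.random() < P_YOU_RIGHT
        expert_right = rng.random() < P_EXPERT_RIGHT
        you_correct += you_right
        expert_correct += expert_right
        # Individual decisions where ignoring the expert would have "worked"
        expert_wrong_you_right += (not expert_right) and you_right

    print(f"You were right:   {you_correct / n_decisions:.1%} of decisions")
    print(f"Expert was right: {expert_correct / n_decisions:.1%} of decisions")
    print(f"Expert wrong while you were right: {expert_wrong_you_right / n_decisions:.1%} of decisions")


if __name__ == "__main__":
    simulate()
```

Run over enough decisions, the expert’s tally comes out ahead roughly in proportion to the gap in probabilities, even though the expert is still wrong most of the time – and there will be individual decisions where the expert was wrong and you happened to be right, which is exactly the kind of outcome that feeds outcome bias without making “ignore the expert” a good decision process.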
This is not to say one should blindly follow an expert. As mentioned, it’s possible the expert only has a 10% chance of being right. You should always question and think critically about why advice is being given, what factors are at play, the incentives involved, what information is being included that shouldn’t be (or excluded that should be), what information isn’t available that could impact the decision, and so on. The key is to analyze the decision-making process and to re-evaluate once the situation has played out. I’ve seen people say “I just knew coronavirus would be bad” without offering any evidence as to why they thought that or what made their decision a sound one. A feeling is not sufficient grounds for deciding how to handle a crisis. A feeling may be a good indicator in other situations – like whether to take a job or break up with your significant other – but it is definitely not useful in decisions like these.
Additionally, numerous biases come into play with everything I have just written. Here’s a list of relevant ones. It’s important to constantly review and analyze how cognitive biases are influencing decision-making – both your own decisions and the decisions others make.
- Choice-supportive bias
  - The tendency to remember one’s choices as better than they actually were
- Conservatism (belief revision)
  - The tendency to revise one’s beliefs insufficiently when presented with new evidence
- Dread aversion
  - Losses yield double the emotional impact of gains; dread yields double the emotional impact of savouring
- Dunning-Kruger effect
  - The tendency for unskilled individuals to overestimate their own ability and for skilled individuals to underestimate their own ability
- Gambler’s fallacy
  - The tendency to think that future probabilities are altered by past events, when in reality they are unchanged
- Hindsight bias
  - The tendency to see past events as having been predictable at the time they happened
- Hot-hand fallacy
  - The belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts
- Neglect of probability
  - The tendency to completely disregard probability when making a decision under uncertainty
- Normalcy bias
  - The refusal to plan for, or react to, a disaster which has never happened before
- Outcome bias
  - The tendency to judge a decision by its eventual outcome instead of by the quality of the decision at the time it was made
- Plan continuation bias
  - The failure to recognize that the original plan of action is no longer appropriate for a changing situation or for a situation that is different than anticipated
- Pseudo-certainty effect
  - The tendency to make risk-averse choices if the expected outcome is positive, but risk-seeking choices to avoid negative outcomes
I’d like to highlight the pseudo-certainty effect because it brings to light an issue in politics. Almost no politician is an expert in epidemics; they are, however, experts in getting elected. It is easy to imagine a scenario where experts were consulting with the government and advising it to shut down borders and aggressively implement social distancing much earlier than those decisions actually happened. It is also easy to see the negative effect of these measures (a massive hit to the economy). Early on, it was a politically risky decision to shut things down, as it would have damaged the economy and led to the general population asking why and being upset. The benefit would have been enormous, but a benefit you can’t see (i.e. people not getting sick) is a benefit that doesn’t exist, to most people. So the risk-seeking choice was to continue on as normal (normalcy bias) in order to avoid a negative outcome (the economy tanking, an angry populace, getting voted out at the next election). This likely led to something of a plan continuation bias until it was too late to take meaningful preliminary action, and now here we are – tanking the economy anyway, but with everyone also getting sick.
In closing, have trust in experts, trust politicians less, and question both. Consider the decision-making framework you employ, evaluate it regularly for sound processes, and don’t get overconfident if you get something right. Continue social distancing and stay safe, friends.