April 16, 2020

Think again if you believe draconian controls recommended by a few (but far from all) medical experts are saving many lives from COVID-19. Facts reported by mathematician Yitzhak Ben Israel of Tel Aviv University don’t support such beliefs. 

Professor Ben Israel found that no matter how much or little politicians quarantined the population, “coronavirus peaked and subsided in the exact same way.” Whether the country relied on politicians shutting the country down (the US and UK, for example) or on private voluntary actions (Sweden), his work shows that “all countries experienced seemingly identical coronavirus infection patterns, with the number of infected peaking in the sixth week and rapidly subsiding by the eighth week.”

In short, coercive measures imposed to protect the public from COVID-19 are as effective as throwing magical “tiger dust” in Central Park to keep tigers at bay in Manhattan. 

Why are we so enamored with experts and their tiger dust? Simply put, we don’t understand the inherent fallibility of human beings. Well-meaning experts can be as destructive as authoritarian politicians.

Nobel laureate Daniel Kahneman is a behavioral economist and psychologist. In his book Thinking, Fast and Slow, he writes, “Every policy question involves assumptions about human nature, in particular about the choices that people may make and the consequences of their choices for themselves and for society.” Mistaken or unexamined assumptions corrupt decision-making. 

Experts Have Cognitive Biases

In Thinking, Fast and Slow, Kahneman catalogs the many cognitive biases impairing human beings. Kahneman and his late research partner Amos Tversky “documented systematic errors in the thinking of normal people, and [they] traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion.” In short, “severe and systematic errors” in cognition prevent us from being the rational thinkers we’d like to think we are. 

Reporting on the work of psychologist Paul Slovic, Kahneman writes, “[Slovic] probably knows more about the peculiarities of human judgment of risk than any other individual.” 

Slovic found that an “affect heuristic” leads people to “let their likes and dislikes determine their beliefs about the world.” If you are biased towards strong government action to combat the coronavirus, you will believe its “benefits are substantial and its costs more manageable than the costs of alternatives,” such as relying more on voluntary adjustments by businesses and the public.

Kahneman writes, “The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).”

Media coverage of the impact of the coronavirus is “biased towards novelty and poignancy.” The resulting emotional reaction shapes our estimates of risk; many fear the consequences if draconian actions are not imposed by government.

If you’re thinking that’s why we need to entrust these decisions to experts, you are wrong: experts have the same cognitive biases as the rest of us.

Because of expert bias, Slovic “strongly resists the view that the experts should rule” and dissuades us from believing “their opinions should be accepted without question when they conflict with the opinions and wishes of other citizens.”

Kahneman is clear about the implications of Slovic’s work. The public and expert policy-makers often have conflicting values. The public distinguishes between types of deaths, such as that of a 90-year-old man with a heart condition and that of a 30-year-old mother in good health. Such nuances are lost in aggregate statistics.

Kahneman concludes, “Slovic argues from such observations that the public has a richer conception of risks than the experts do.” 

Risk is not objective. Kahneman writes, “In his desire to wrest sole control of risk policy from experts, Slovic has challenged the foundation of their expertise.” Kahneman quotes Slovic:

“Risk” does not exist “out there,” independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as “real risk” or “objective risk.”

Our evaluation of risk “may have been guided by a preference for one outcome or another.” “Defining risk is thus an exercise in power,” explains Slovic.

In other words, no matter how sincere the experts are, their preference for decisive action on the part of the government will warp their conception of risk and guide their policy recommendations. 

Don’t be fooled by the appearance of confidence on the part of the experts. Their confidence is no reason to trust them. Kahneman warns: “Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.”

The Arrogant Can’t Solve Problems

“Our ignorance is sobering and boundless,” philosopher Karl Popper famously observed.

The world is full of challenging problems, such as the coronavirus, and individuals have boundless ignorance. It is not surprising that Popper believed, “There are no ultimate sources of knowledge.” We can only “hope to detect and eliminate error.” We do so by allowing criticism of others’ theories and our own.

Popper provided us with what could be a credo for humble individuals willing to admit the limits of individual knowledge:

With each step forward, with each problem which we solve, we not only discover new and unsolved problems, but we also discover that where we believed that we were standing on firm and safe ground, all things are, in truth, insecure and in a state of flux.

The arrogant can’t solve problems because they are blind to their limits. Decision-makers, consumed by hubris, ignorant of their limited understanding, do not tap into knowledge held by others.  

In truth, each of us knows very little. As individuals, we are fallible decision-makers. We don’t know where solutions will emerge or, as economist William Easterly instructs us in his book The Tyranny of Experts, “what will be the solution” or “who will have the solution.”

Cognitive Diversity

In his book The Wisdom of Crowds, James Surowiecki explains “there’s no real evidence that one can become expert in something as broad as ‘decision-making’ or ‘policy.’”

Among scientists who face a “torrent of information” each day, Surowiecki points out, “reverence for the well-known tends to be accompanied by a disdain for the not so well known.” 

Surowiecki is not arguing “that reputation should be irrelevant,” yet reputation “should not become the basis of a scientific hierarchy.” Instead, a “resolute commitment to meritocracy” fuels discovery. In the current crisis, where dissenting voices are being shut out, it is hard to see how commitment to meritocracy is being upheld.

Surowiecki points to “cognitive diversity” as a key to forming teams that are more than the sum of their members. He offers counterintuitive conclusions for those who believe in decision-making by elite experts:

If you can assemble a diverse group of people who possess varying degrees of knowledge and insight, you’re better off entrusting it with major decisions rather than leaving them in the hands of one or two people, no matter how smart these people are.

Surowiecki warns of the groupthink that occurs when, in groups, we emphasize “consensus over dissent.”

Examining decision-making at the National Aeronautics and Space Administration (NASA) during the space shuttle Columbia catastrophe, Surowiecki found an object lesson: “Instead of making people wiser, being in a group can actually make them dumber.”

In that crisis, “team members were urged on many different occasions to collect the information they needed to make a reasonable estimate of the shuttle’s safety.”

Evidence of the potential consequences of the debris strike on the Columbia shuttle was there.

Yet, during the group meeting after debris struck Columbia, there was “the utter absence of debate and minority opinions.”

The team could have made “different choices that could have dramatically improved the chances of the crew surviving,” yet they “never came close to making the right decision on what to do about the Columbia.”

Why? “First, the team started not with an open mind but from the assumption that the question of whether a foam strike could seriously damage the shuttle had already been answered.” Surowiecki explains:

Rather than begin with the evidence and work toward a conclusion, the team members worked in the opposite direction. More egregiously, their skepticism about the possibility that something might really be wrong made them dismiss the need to gather more information.

Today, during the coronavirus crisis, decision-makers have begun with the conclusion that the human consequences to the economy are less critical than their preferred methods to contain the virus. 

In the case of Columbia, the “conviction that nothing was wrong limited discussion and made them discount evidence to the contrary.” All of us, including experts, are subject to confirmation bias, “which causes decision-makers to unconsciously seek those bits of information that confirm their underlying intuitions.”

In policy-making teams, Surowiecki writes, “the evidence suggests that the order in which people speak has a profound effect on the course of a discussion.” He adds, “Earlier comments are more influential, and they tend to provide a framework within which the discussion occurs. As in an information cascade, once that framework is in place, it’s difficult for a dissenter to break it down.” 

Dr. Anthony Fauci, Dr. Deborah Birx, and others recommending policy are fallible human beings. Like us, these experts are not readily able to see what they do not know. Kahneman and Slovic would tell us they are subject to the same cognitive biases as other human beings. They and other members of their team are not exempt from human frailties, such as the desire for power. They tend to be overconfident. Their expertise is likely “spectacularly narrow.” 

Surowiecki asks us to consider why we “cling so tightly” to the false belief “that the right expert will save us.” Why do we believe that the right experts will somehow charge forward and demonstrate their superior expertise? Surowiecki points us to a series of studies whose authors found “experts’ judgments to be neither consistent with the judgments of other experts in the field nor internally consistent.”

When everyone looks in only one direction, groupthink is the result. Surowiecki cautions that groupthink “works not so much by censoring dissent as by making dissent seem somehow improbable.”

There is an alternative to politicians and experts deciding when to reopen the country. Decisions made in the market, by businesses and individuals, are naturally cognitively diverse. Provided with uncensored information and free to utilize their knowledge and wisdom, people will make better decisions than experts and politicians.

Barry Brownstein

Barry Brownstein is professor emeritus of economics and leadership at the University of Baltimore.

He is the author of The Inner-Work of Leadership, and his essays have appeared in publications such as the Foundation for Economic Education and Intellectual Takeout.

To receive Barry’s essays in your inbox, visit mindsetshifts.com
