Avoid the Risk of Believing in Things Confidently Without Verifying Their Accuracy

Have you ever been sure, in your business life or during project execution, that something would happen? You were so confident, the situation seemed so clear, that you gave it no further thought and made decisions on that basis. And when something went wrong, you asked yourself: why didn’t we invest more time before we decided? Read on and learn how you can reduce such risks in the future.

It Is Often Not a Lack of Knowledge That Leads to Problems

We are confronted with smaller or larger problems every day, whether in private life, in business, or in projects. And we have often asked ourselves: “It all seemed so obviously clear, so why did it go wrong?” There is an apt quote from Mark Twain about this circumstance:

“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

Mark Twain

Mark Twain is saying that it is not a lack of knowledge that leads to problems, but rather the false beliefs that people hold as truth. It is believing in things confidently without verifying their accuracy. These false beliefs can cause people to make incorrect decisions and take actions that result in trouble. The quote emphasizes the importance of critical thinking and of being open to the possibility that what we think we know may not be accurate, encouraging us to approach information with a healthy degree of skepticism.

It is a matter of not blindly trusting facts that are generally assumed to be obviously true, and of not building on them without much thought. This is especially important in the case of information from unfamiliar sources.

“It’s frightening to think that you might not know something, but more frightening to think that, by and large, the world is run by people who have faith that they know exactly what is going on.”

Amos Tversky

How to Think Like a Scientist

Carl Sagan (November 9, 1934–December 20, 1996) was many things: an award-winning professor of astronomy who explored the universe, a voracious reader, a romantic, and a brilliant philosopher. But above all, he is remembered as one of the greatest champions of reason and critical thinking, and a master of the vital balance between skepticism and openness.

In his book “The Demon-Haunted World: Science as a Candle in the Dark”, Sagan shares his secret to upholding the rites of reason, even in the face of society’s most shameless untruths and outrageous propaganda.

In a chapter titled “The Fine Art of Baloney Detection,” Sagan reflects on the many types of deception to which we’re susceptible — from psychics to religious zealotry to paid product endorsements by scientists, which he held in especially low regard, noting that they “betray contempt for the intelligence of their customers.”
Falling for such fictions doesn’t make us stupid or bad people; it simply means that we need to equip ourselves with the right tools against them.

Through their training, scientists are equipped with what Sagan calls a “baloney detection kit” — a set of cognitive tools and techniques that fortify the mind against penetration by falsehoods.

The Baloney Detection Kit

The kit offers a tried-and-true way to protect yourself from accepting false ideas. It is used whenever new ideas are offered for consideration: if the new idea survives examination by the tools in the kit, we grant it warm, although tentative, acceptance.

But the kit, Sagan argues, isn’t merely a tool of science — rather, it contains invaluable tools of healthy skepticism that apply just as elegantly, and just as necessarily, to everyday life. By adopting the kit, we can all shield ourselves against clueless guile and deliberate manipulation. Sagan shares nine of these tools:

  1. Wherever possible, there must be an independent confirmation of the “facts.”
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight — “authorities” have made many mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course, there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  8. Occam’s Razor. This convenient rule of thumb urges us, when faced with two hypotheses that explain the data equally well, to choose the simpler.
  9. Always ask whether the hypothesis can be, at least in principle, falsified. It is important to be able to test hypotheses to see whether they are true or false; ideas that cannot be tested or proven wrong are of little value. Seek independent information to prove or disprove the idea, and allow others to examine and replicate experiments in order to verify the results.

(Source: The Demon-Haunted World, Carl Sagan, Random House Publishing Group 1997)

Our world is full of unsubstantiated claims about all sorts of things. Most of the time, the foundations on which these claims rest are missing; there is a lack of data to support them with any statistical certainty. Therefore, before you build a decision on such a claim, reduce the risk of being wrong and ask yourself: is there evidence for it, and is it credible? This applies, for example, to claims made by politicians or salespeople, to financial forecasts, and to weather forecasts.
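To make this concrete, here is a minimal sketch of how Bayes’ rule can quantify what independent evidence should do to your confidence in a claim, in the spirit of tools #1 and #6 above. It is not from Sagan’s book; the scenario, the prior, and all the probabilities are purely hypothetical:

```python
# Minimal sketch with purely hypothetical numbers: using Bayes' rule to
# quantify how independent evidence should shift confidence in a claim.

def update(prior: float, p_if_true: float, p_if_false: float) -> float:
    """Posterior probability that the claim is true after one observation.

    p_if_true / p_if_false: how likely this piece of evidence is
    if the claim is true / false.
    """
    numerator = p_if_true * prior
    return numerator / (numerator + p_if_false * (1.0 - prior))

# Start moderately confident in, say, a supplier's delivery-date promise.
confidence = 0.70

# Hypothetical evidence as (probability if true, probability if false).
evidence = [
    (0.80, 0.40),  # an independent reference confirms the supplier's record
    (0.30, 0.60),  # a measurable slip in an interim milestone
]

for p_true, p_false in evidence:
    confidence = update(confidence, p_true, p_false)
    print(f"Confidence that the promise holds: {confidence:.2f}")
```

Even this toy calculation makes the point: one confirming independent source raises the confidence noticeably, and one piece of disconfirming data pulls it right back down. Acting on the initial gut figure alone would overstate how much you really know.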

Another interesting article you shouldn’t miss: How Developed is Your Risk Awareness?

Here You Can Find Even More Knowledge

Would you like to learn more about how to make your projects more successful with Project Risk Management? My book Project Risk Management – Practical Guide takes you an important step further!

Do you know somebody who might be interested in this article? Then simply forward it or share it. Thank you!

