The Savage and Hubbard Top Ten List of Lame Excuses for Not Quantifying Uncertainty


from the Introduction of Chancification: How to Fix the Flaw of Averages

Doug Hubbard and I have been urging people to quantify uncertainty for decades. When I discovered that we had independently been witnessing the same set of lame excuses for avoiding probability, and that we had even been telling the same well-known jokes to describe these situations, I realized that we needed a top ten list. Here it is.

10. Our situation is too complex to model, so we do it in our head based on experience.

Doug says:
If you are doing it in your head, you ARE using a model. Calibrate your own estimates, then compare them to other modeling methods. You’ll be surprised how inconsistent you are without quantitative measurement.

Sam says:
Even if the model does not give you the right answer, the very act of modeling often leads to the right question.

9. We don’t have the specialized software to run simulations.

Sam says: That was a valid excuse before Excel’s Data Table function could run millions of recalculations. It does for simulation what penicillin (famously mass-produced from a moldy cantaloupe) did for antibiotics. (A bare-bones sketch follows this excuse.)

Doug says: Excel is specialized?
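
Anyone with a general-purpose language can check this for themselves. Here is a bare-bones Monte Carlo sketch in Python, standard library only; the cost ranges are made-up placeholders, not figures from the book:

```python
import random
import statistics

# Hypothetical project-cost model: labor plus materials, both uncertain.
# The ranges below are made-up placeholders, not figures from the book.
TRIALS = 100_000

totals = []
for _ in range(TRIALS):
    labor = random.triangular(80, 160, 100)      # low, high, most likely
    materials = random.triangular(40, 90, 60)
    totals.append(labor + materials)

print("average total cost:", round(statistics.mean(totals), 1))
print("chance of blowing a budget of 200:",
      round(sum(t > 200 for t in totals) / TRIALS, 3))
```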

8. We don’t know the probability distribution.

Sam says: In the land of the blind, the one-eyed man is king. In the land of averages, the same can be said of the man with the wrong distribution.

Doug says: The one distribution we know is wrong is the single-point average. Almost any distribution, regardless of shape, is more plausible than that.

7. I can’t build a model until I collect the data.

Doug and Sam say in unison: Information value theory teaches that you won’t know what data to collect until you start building a model.

6. My boss will still ask for a single number for two reasons:

a. you need to order a single number of items for inventory, and

b. you can’t do calculations with probability distributions.

a.

Doug says: The order is an exact number, but demand is uncertain. Models show that the best decision is usually to purchase either more or less than the “average” demand, depending on the penalties of missing the mark (a small simulation below makes this concrete).

Sam says: When the boss asks for a single number, you can now say: “What do you want it to be? I’ll give you the chance of meeting your goal.”
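
Here is a small sketch of Doug’s point, assuming a made-up demand distribution and made-up unit economics (a newsvendor-style setup; none of these numbers come from the book). The profit-maximizing order quantity turns out to differ noticeably from the average demand:

```python
import random
import statistics

random.seed(1)

# Hypothetical newsvendor-style setup: all numbers are illustrative assumptions.
COST, PRICE = 6.0, 10.0   # buy at 6, sell at 10; unsold units are worthless
demands = [max(0, int(random.gauss(100, 30))) for _ in range(20_000)]

def expected_profit(order_qty):
    # Average profit over the simulated demand scenarios for a fixed order.
    return statistics.mean(
        PRICE * min(order_qty, d) - COST * order_qty for d in demands
    )

avg_demand = round(statistics.mean(demands))
best_qty = max(range(50, 151), key=expected_profit)
print("average demand:", avg_demand)
print("profit ordering the average:", round(expected_profit(avg_demand), 1))
print("best order quantity:", best_qty,
      "profit:", round(expected_profit(best_qty), 1))
```

With a cost of 6 against a price of 10, running out is cheaper than overstocking, so the best order lands below the average demand; flip the economics and it lands above.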

b.

Doug says: It is true that you can’t do the calculations in your head, but Monte Carlo simulation is now everywhere (see 9 above).

Sam says: Probability management lets you perform calculations with uncertainties using the same keystrokes you would use with numbers, with SIPs (Stochastic Information Packets) playing the role of Arabic numerals.
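
A minimal illustration of those keystrokes, using two made-up SIPs (a SIP is simply an array of coherent Monte Carlo trials, one value per trial; the distributions below are arbitrary assumptions):

```python
import random
import statistics

random.seed(7)
TRIALS = 10_000

# Two hypothetical uncertain inputs, stored as SIPs: one value per trial.
sip_revenue = [random.lognormvariate(4.6, 0.3) for _ in range(TRIALS)]
sip_costs = [random.uniform(60, 110) for _ in range(TRIALS)]

# Arithmetic on uncertainties uses the same "keystrokes" as arithmetic on numbers:
sip_profit = [r - c for r, c in zip(sip_revenue, sip_costs)]

print("average profit:", round(statistics.mean(sip_profit), 1))
print("chance of losing money:",
      round(sum(p < 0 for p in sip_profit) / TRIALS, 3))
```

Subtracting the arrays trial by trial keeps the whole distribution of profit, so questions like “what is the chance of losing money?” come along for free.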

5. We don’t need to quantify uncertainty, we just need better forecasts.

Sam says: If you’re about to play craps, do you waste your time trying to forecast the numbers on the individual dice, or do you model the fact that sevens are way more likely than twos or twelves when you roll two dice? (The arithmetic is spelled out after this excuse.)

Doug says: The only way to know if you have a better forecast is to quantify the reduction in uncertainty.
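
The dice arithmetic behind Sam’s question can be enumerated rather than forecast; nothing is assumed beyond two fair dice:

```python
from collections import Counter
from itertools import product

# Enumerate all 36 equally likely outcomes of rolling two fair dice.
sums = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in (2, 7, 12):
    print(f"P(sum = {total}) = {sums[total]}/36")
# P(sum = 7) = 6/36, while P(sum = 2) = P(sum = 12) = 1/36:
# modeling the distribution beats trying to forecast each die.
```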

4. A simulation is only as good as the distributions you put into it. “Garbage in, garbage out.”

Doug says: You are always using a model (see Excuse 10). The question is how much garbage the model itself adds or removes compared to other models.

Sam says: When you shake a ladder before climbing on it, you are simulating it with a distribution of random forces to see if it’s steady. Unfortunately, the distribution of forces when you climb a ladder is different from the one when you shake it. No one has ever told me they would stop shaking ladders after I pointed out they were using the wrong distribution.

3. My situation is unique, so there is no way to estimate a distribution of outcomes.

Sam says: This is like saying, “I’m not going to shake my ladder this time because I am setting it up over broken beer bottles, which I’ve never done before.” Instead, you should ask if the chance of disaster is great enough that you should set up the ladder somewhere else.

Doug says: People simulate in their heads using recalled “experience” (again, see Excuse 10), and although the excuse itself is founded on a lack of experience, they say it with no sense of irony.

2. My quantitative model might not be right.

Doug says: This is the Exsupero Ursus fallacy. Two hikers are being tracked by a hungry bear when one pulls a pair of track shoes out of his backpack. “You can’t outrun a bear,” says his partner. “I just need to outrun you,” says the other. Your model doesn’t need to be perfect; it just needs to ‘outrun’ unaided intuition.

Sam says: George Box said, “All models are wrong, but some models are useful.” Doug adds a corollary to Box: “…and some models have been shown to be measurably more useful than others.”

… and the Number 1 lame excuse for not quantifying uncertainty:

1. I don’t have enough data, so I’m just going with a single best guess number.

Doug says: You have MORE data than you think and you need LESS data than you think to estimate pretty good probability distributions.
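
One way to see how little data can still be informative is Doug’s “Rule of Five” (described in How to Measure Anything, not in this excerpt), simulated here against a made-up population; the particular distribution is an arbitrary choice:

```python
import random
import statistics

random.seed(42)

# Hypothetical population; its shape barely matters for this check.
population = [random.lognormvariate(3.0, 0.8) for _ in range(100_000)]
true_median = statistics.median(population)

# Draw just five samples many times; how often do they bracket the median?
hits = 0
TRIALS = 20_000
for _ in range(TRIALS):
    sample = random.sample(population, 5)
    if min(sample) <= true_median <= max(sample):
        hits += 1

print("fraction of 5-sample draws that bracket the median:",
      round(hits / TRIALS, 3))
# The exact answer is 1 - 2 * (1/2)**5 = 93.75% for any continuous distribution.
```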

Sam says: Not using probability because of a lack of data is like not doing yoga because you are too stiff, not taking a shower because you are too dirty, or not using a parachute because the wing of your plane is on fire.