Nate Silver is currently predicting that Republicans will gain 47 House seats in the upcoming election—plus or minus 30. In other words, he’s fairly sure that Republicans will gain between 17 and 77 seats, although there’s a slight chance the gain will fall below that range, or above it.
What good is a prediction when it carries such a wide margin of error? Answer: Far, far better than a prediction with no stated margin of error at all.
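What can you actually do with a stated margin? Here is a minimal sketch in Python. The distributional assumption is mine, not Silver’s: purely for illustration, I’m reading the plus-or-minus 30 as the 95% interval of a normal forecast, just to show how a margin of error turns into concrete probabilities.

```python
# A minimal sketch, assuming (my assumption, not Silver's actual model) that
# "47 plus or minus 30" describes a normal forecast whose 95% interval
# runs from 17 to 77 seats.
from math import erf, sqrt

mean = 47
sigma = 30 / 1.96  # about 15.3 seats, if +/-30 marks a 95% interval

def prob_gain_at_least(n):
    """P(seat gain >= n) under the assumed normal distribution."""
    z = (n - mean) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

print(f"P(gain >= 17): {prob_gain_at_least(17):.1%}")  # about 97.5%
print(f"P(gain >= 78): {prob_gain_at_least(78):.1%}")  # ~2%: a gain above 77
```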
Understanding uncertainties, learning to live with them, and stubbornly insisting on knowing them: this is the hallmark of a good scientist. A constant awareness of uncertainties may be the most important thinking pattern distinguishing the two cultures (“techies” and “fuzzies”, we called them at Stanford) from each other.
If you study physics, we try to teach you about uncertainties through lab exercises. Suppose someone predicts that “g” (the acceleration of a freely flying projectile) should equal 9.8 meters per second squared. You measure its value and get 9.6. Have you confirmed the prediction or not? Yes, if the uncertainty in your measurement is at least 0.2. But if your uncertainty is only 0.02, you’ve disproved the prediction (and perhaps discovered a local gravitational anomaly).
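The comparison the students are asked to make is simple enough to write down. Here is a minimal sketch (mine, not an actual lab handout) of the agreement test, using the numbers from the example above:

```python
def agrees(predicted, measured, uncertainty):
    """True if the prediction lies within the measurement's error bar."""
    return abs(measured - predicted) <= uncertainty

# The g example from the text, in m/s^2:
print(agrees(9.8, 9.6, 0.2))   # True:  measurement confirms the prediction
print(agrees(9.8, 9.6, 0.02))  # False: the prediction is ruled out
```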
Our introductory physics lab students think they’ll be graded on how close they get to the “right” answer, but they’re wrong. We actually grade them on whether they make a good uncertainty estimate, and on whether they interpret their results correctly in light of this uncertainty.
Perhaps I learned this lesson in undergraduate lab courses myself; I can’t remember. What I do remember, vividly, is my graduate school days at the Stanford Linear Accelerator Center, when I must have sat through hundreds of seminars on experimental results and their theoretical interpretation. Each speaker would spend a major portion of the talk meticulously explaining how the uncertainties had been estimated. Most of the after-talk discussion would center on whether these estimates were accurate and whether the speaker had, given the uncertainties, drawn the right conclusion. In elementary particle physics, where the experiments cost hundreds of millions of dollars and the data are inherently statistical, you had better interpret your results correctly.
But uncertainties don’t play well in politics. Whenever a climatologist admits that there’s any uncertainty at all in the predicted rise in earth’s average temperature, global warming deniers loudly yell “See! The scientists admit they don’t know what will happen!” Or to take another example from the not-so-distant past: Imagine that President Bush had said in 2002 that he actually wasn’t sure whether Iraq was developing weapons of mass destruction, but there seemed to be a 50/50 chance of it. Would Congress still have authorized the war?
Nate Silver is a hero (and an anomaly) because he’s able to look at all the data, make his best prediction, and still be honest about his uncertainties—even when the subject is politics.
I’ll end (once again) with a local political example. A respected economic consultant recently predicted that the middle segment of Ogden’s proposed streetcar system will stimulate $8.5 million of investment if it follows one proposed alignment, but only $1.5 million if it follows an alternate alignment. Ten days ago I asked her what the uncertainty range was on those numbers, and she replied, “Well, you can see that we rounded them to the nearest half million.” I’m afraid I laughed at that point, and tried unsuccessfully to convince her that the uncertainties were many times larger. I knew the numbers had been calculated from property value assessments, and that these assessments can be systematically off by 50% or even more. Worse, I knew that the lists of properties to be included in the calculations had been compiled through a subjective, undocumented process.

After our conversation I looked up some of the property assessments and quickly saw that you could increase the $1.5 million prediction to over $9 million by excluding just two properties (out of several dozen) from the list. A fair estimate of the uncertainty would be much larger still.
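To show how fragile such a number can be, here is a hypothetical sketch of the leave-two-out check I did by hand. The per-property contributions below are invented, and the list is shorter than the real one (which, along with the actual method, was never published); only the overall pattern mirrors what I found: a modest total that swings wildly when a couple of properties are dropped.

```python
from itertools import combinations

# Invented per-property contributions to "predicted investment," in
# millions of dollars; a couple of properties pull the total down hard.
contributions = [-4.2, -3.5, 0.3, 0.4, 0.5, 0.6, 0.8,
                 1.0, 1.2, 1.4, 1.5, 1.5]

baseline = sum(contributions)
print(f"Baseline prediction: ${baseline:.1f} million")  # $1.5 million

# Recompute the total with every possible pair of properties excluded.
totals = [baseline - a - b for a, b in combinations(contributions, 2)]
print(f"Excluding any two properties: ${min(totals):.1f} million "
      f"to ${max(totals):.1f} million")  # -$1.5 million to $9.2 million
```

Even with a dozen invented properties, the total already ranges over ten million dollars depending on which two you exclude; add 50% assessment errors on top, and the “nearest half million” answer looks absurd.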
But economic consultants apparently aren’t in the habit of thinking about uncertainty. Undoubtedly this is because their clients don’t want to hear about it; they just want simple answers. In this case the client was the Utah Transit Authority—a government agency that supposedly represents the people. Ultimately, it is the citizens at large who need to learn to think like scientists.