People are bad at discussing probabilities—and it could kill us all.
A weatherman may predict that there is an 80% probability of rain, but then when it doesn't rain we'll often say “he was wrong.” It's rare to hear anyone say “this must be one of the one-in-five cases where it doesn't rain,” even though that is the very meaning of an 80% chance of rain.
People also shorten “the weatherman says there's an 80% chance of rain” to just “the weatherman is predicting rain,” losing the part about an “80% chance” as if it were a mere detail or, even worse, as if it were a phrase that contributed no useful information at all.
In speaking of an 80% chance, the forecaster is saying “given these particular weather conditions, four times out of five there will be rain and one time out of five there will not be rain.” Or, put another way, an 80% chance of rain is the same as a 20% chance (one chance in five) that it won't rain. So if there are 250 occasions where there is such a chance of rain, and if 200 of those 250 (4 out of 5) have rain while 50 of those 250 (1 out of 5) don't have rain, then he was 100% right—that is, he was completely correct, not 80% correct, in his claim that there was an 80% likelihood of rain.
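The arithmetic above can be sketched in a few lines of code. This is a minimal illustration of scoring the forecaster against all 250 occasions rather than any single day; the function name and the data are hypothetical, constructed only to match the example.

```python
# A sketch (hypothetical data): judge an "80% chance of rain" claim
# against the whole set of occasions, not against one dry afternoon.

def observed_rain_rate(outcomes):
    """Fraction of occasions on which it actually rained.

    outcomes: list of booleans, one per occasion on which the
    forecaster announced an 80% chance of rain (True = it rained).
    """
    return sum(outcomes) / len(outcomes)

# The example from the text: 250 such occasions, rain on 200 of them.
outcomes = [True] * 200 + [False] * 50
rate = observed_rain_rate(outcomes)
print(rate)  # 0.8 -- the 80% claim was exactly right, despite 50 dry days
```

On each of those 50 dry days, someone presumably called the forecaster wrong; taken together, the 50 dry days are precisely what his forecast predicted.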
Unfortunately, when it doesn't rain, the weatherman just takes heat. I call that The Weatherman Problem, and it must be maddening to someone who is struggling to do the right thing.
Not only is this bad for a forecaster's sense of personal happiness and self-image, but it could be actively bad for his career. If people want to shoot the messenger when the news is bad, why not bias things in a way that makes them want to do it less? Better to hide behind “Who could have known for sure?” than “I should have known better.”
So the tendency, I'm suggesting, is to bias downward. In our society, failing to know something is often more defensible than claiming to know something with too much certainty. The latter appears to be the sin of arrogance and if we even smell the possibility of that, we punish it unreasonably harshly.
Effects like this are enough to make one not want to make predictions. Not just predictions about weather, predictions about anything. Why sign up for that kind of grief? One wants to be sure, but one cannot be 100% sure. Maybe one is 80%, 90%, 95% sure. Maybe it's hard to turn that into a specific number. When is the right time to make a prediction with any confidence? Better to hold back until one really knows.
The Weatherman Problem arises in any domain, not just weather. I'm just giving it this name to remind us of how familiar it should be. It happens anywhere that requires science augmented by guesswork. Medicine, for example. Or legal liability. Or Climate science.
It's hard to say for any given individual with lung cancer precisely what the cause is. Yet if 100% certainty were the requirement, we might not have warnings on the side of cigarette packs where they belong. In spite of this lack of complete proof, we've come to believe as a society that smoking is bad for us—that it's better to reason about this as a truth than as a “questionable claim.”
Of course, there are always skeptics, and in the modern world many of them are paid by people and organizations with special interests, such as the cigarette industry or the fossil fuel industry.
Skeptics eat legitimate ethical concern for breakfast, finding ways to run on it the rest of the day. They are focused on telling us that the things Climate scientists tell us are not true, and often their flimsy rationale is that nothing about nature can ever be known with 100% certainty. Well, duh. But seriously—do they expect this to shut down all discussion of everything we're not certain of? That's just crazy talk. Or cynically manipulative talk. Either way, it's not good for society.
We know many things well enough. 100% certainty is too high a bar. For most things it's enough to know what's probably true.
We hear a lot from Climate skeptics about how they think there's too much speculation being done about Climate. They want it dialed back. But what if they're wrong? What if there's really too little speculation, because the people who mostly know the truth are afraid to speak—afraid of having their reputations ruined by people who misunderstand or manipulate this problem I'm calling The Weatherman Problem?
The skeptics would have us believe that we don't need to know what's going on about Climate. But I'm suggesting it's just the opposite. I'm suggesting not just that we need to know, but that we already might know and that people are being incentivized not to say. There's no reward for being almost right, only a huge penalty for being wrong. There's no incentive to speculate even a little, so no incentive to talk about what's “probably true.”
What if, as a consequence of this fear, the Climate problem is worse than anyone is saying, because we've been not overhyping the problem but actually underhyping it?
What if respectable scientists, not wanting to risk their reputations falling victim to The Weatherman Problem, delay telling us things they only probably know? By holding back, critical information about effects that will only probably happen is not mentioned at all.
If that happens a lot, it could spell disaster.
In the skewed conversational space we've created for Climate, there is only what's certain and “what's not discussed,” which is not the same as “what's not happening.” But it's easy to confuse the two.
But if we're going to be honest about risk, we have to be willing to be wrong by having prepared for things that maybe won't happen or maybe won't happen yet. That's far less dangerous than failing to prepare for things that do.
We would never prepare for terrorism or disease by looking only to what we are certain of and never lifting a finger against the things that probably will happen, or even just might happen. Yet somehow with Climate we won't do the same.
We need to change the way we talk about probabilities, or it could kill us. Can I say that with certainty? No. But I can say it with enough probability that we ought not be dismissing it.
If you got value from this post, please "rate" it.