Uncertainties, prudent planning, and duties.
There's a post at Majikthise that sifts through the matter of what the Federal Emergency Management Agency (FEMA) did or did not know about the likely effects of a hurricane on New Orleans. It's an interesting case of people in FEMA and the Army Corps of Engineers not being on the same page. I'm resisting the impulse to put it down to evil intent (e.g., changing your story on what you were expecting once your response to what actually happened turns out to be wholly inadequate). Rather, I think the problem may come down to how people have dealt with uncertainties given the interests they feel themselves bound to serve.
Problem #1: Ahead of time you can't be exactly sure what's going to happen. If you have a really good model of how hurricanes of various strengths will interact with your region under specific conditions, you can generate a set of predicted outcomes. You can figure out the worst-case scenario, and how bad the other cases would be. You could, perhaps, develop a good guess as to which outcomes are most likely.
But then, you have to deal with these predictions.
Problem #2: Preparing for the case you think is most likely may be at odds with making the preparations needed to keep people safe. Even if the majority of projected outcomes have the hurricane missing your region, or hitting it but not damaging the levees, the fact that there is a non-negligible chance the levees are going to get hit might mean it's prudent to reinforce the levees (if there's still time ... 2001, I'm looking at you), or to take aggressive measures to get people out of harm's way.
Being "overly cautious", though, might mark you as an hysteric. The folks you're trying to help might be less likely to listen to you next time if the worst case scenario doesn't come to pass. And, the other folks in the business of making models and predicting disasters may give you a hard time for acting as if the worst case scenario was more likely than any good modeler would have seen it was.
Of course, if you're working for the government, there's ...
Problem #3: Preparing to avoid harm has costs -- monetary costs and political costs. Even if you think a particular bad outcome is really likely, there may be pressure to plan for a better outcome instead, simply because preparing for it costs less. There are not endless buckets of money to throw at every possible harm your model predicts, and sometimes we get by on getting lucky.
So, even after you've surmounted the problem of building a good model of reality, when betting on what's going to come to pass, you have to weigh the competing pulls of:
- wanting to be right
- not wanting people to get hurt if it's avoidable
- not wanting to get canned by your governmental overlords
Different people will give these pulls different weight. But I think it's important for people to see the potential consequences of giving one of these pulls too much or too little weight, and to think about how it would feel to own those consequences.
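To see how much the weighting matters, here's a toy illustration (all weights and scores invented, on an arbitrary 0-10 scale) of how two people facing the same facts can land on different choices:

```python
# A toy illustration of how weighting the three pulls differently can flip
# the decision. Every number here is made up purely for illustration.

options = {
    "prepare for the worst case":  {"being right": 4, "avoiding harm": 9, "keeping your job": 3},
    "prepare for the likely case": {"being right": 8, "avoiding harm": 4, "keeping your job": 8},
}

# Two hypothetical decision-makers who weight the pulls differently.
weight_profiles = {
    "safety-first official":   {"being right": 1, "avoiding harm": 5, "keeping your job": 1},
    "career-minded official":  {"being right": 2, "avoiding harm": 1, "keeping your job": 5},
}

for person, weights in weight_profiles.items():
    best = max(options, key=lambda opt: sum(weights[p] * options[opt][p] for p in weights))
    print(f"{person} chooses: {best}")
# Same facts, same options -- different weights on the pulls, different choice.
```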