I’ve been reading and enjoying Nassim Nicholas Taleb’s excellent new book, The Black Swan. The book is about random, unpredictable events, but really it is about the illusions in human cognition.
Taleb shows how humans have a powerful and instinctive need to create stories in order to remember and understand the events that make up our lives. He observes that those stories don’t necessarily have to be true, but the salient point is that we will go to great lengths to create a story that makes the facts memorable. Once the events are strung together on a plausible narrative, the narrative becomes real and the facts are demoted to a supporting role. Taleb calls this the “narrative fallacy” and it’s an indication of the importance of narrative to the operation of our minds. Incidentally, it’s just one more indication, to me at least, that storytelling—narrative—is the one thing that truly separates humans from other species.
One of the many side effects of the narrative fallacy is the way that facts appear to us. As they happen, occurrences are often apparently (and actually) random. Our minds make them seem sensible by creating a plausible narrative for them after the fact. In retrospect, all events tend to make sense for the simple reason that we remember them by stringing them together into a storyline; once the facts are embedded in that storyline, they seem to make sense precisely because they are in it. But our minds magnify and distort, manipulate and rearrange the facts to fit the storyline.
When, for example, a giant airliner falls out of the sky due to some utterly random event, never before seen, our narrative powers shift into gear. We recover the tiny shreds of airplane in a tour de force of forensic investigation and scuba diving. We then force the reconstructed craft to tell us a story. Once that story exists, it is believable, and once we believe it, we can see it. Upon seeing it, we fear it, and we demand that our bureaucracies do something about it. And they create laws.
The problem is that each of those laws deals with a random event. They generally don’t address anything systemic. In 1996, TWA Flight 800, a giant 747, suddenly exploded and fell into the ocean, killing all aboard. The cause was a random electrical spark in an empty fuel tank (jets have several fuel tanks, and often one or more of them will be empty).
The Boeing 747 is among the safest airplanes ever made, and thousands of them have flown billions of miles without ever exploding and falling out of the sky. They fly with empty fuel tanks all of the time. Sometimes bad things happen to good airplanes, and this was a pretty clear example of utterly simple, random bad luck.
But the human mind is uncomfortable with a story that goes like this: “Jet airplanes fly. One day, one of them exploded.” That story is uncomfortable to the human mind because it’s just two independent facts. Our minds struggle to record the story. There is no causality. Why did it explode? This is not a question of values or responsibility. It’s just a simple fact about how the human mind categorizes and stores memories. We can’t put the story to rest until we have a reason that connects those two facts. We crave narrative.
When we defend ourselves against imaginary dangers, the only really effective tool is a compelling narrative. Unfortunately, we have constructed huge organizations chartered with making us feel “safe.” In other words, they exist to tell us stories with happy endings. There are two primary federal agencies whose job it is to make us feel safe: the NTSB and the FAA. When such a highly visible event as a jet crash occurs, they feel they must demonstrate an equally visible reaction to assure us that they deserve their pay.
The NTSB spent millions to reconstruct the events leading up to the crash, and millions more to compose a sensible, measured response to it. Just recently, after twelve years of work, the FAA mandated mechanical changes to 2,700 similar aircraft.
The bureaucracies’ narrative angst, and our own, was satisfied, but at what cost?
The retrofit may or may not fix the problem, simply because what happened was an event, which is not the same thing as a “problem.” Sometimes, sh*t happens, and things break. A tiny, unforeseen spark can be generated in an insanely complex machine because that’s the nature of insanely complex machinery. If there were a problem, there would already be other 747s lying on the bottom of the sea. The Boeing 747 did not have an “exploding fuel tank problem.” The odds of that spark in Flight 800 have always been infinitesimally tiny.
Whatever Boeing does to alter its airplanes, the only thing accomplished will be that when the next 747 explodes in midair, the spark will have come from some other, unforeseen source, followed some other, unforeseen path, and ignited some other, unforeseen component. While the odds of another fuel tank spark are lower today, that fact isn’t really very helpful. The odds were already unbelievably low before Flight 800 ever took off.
By retrofitting those planes we may or may not be enhancing safety. But there is one thing we guarantee, and that is that the cost of living in our society will go up. We must all pay the price. Not only was the cost of investigating the crash enormous, but the Department of Transportation estimates the cost of retrofitting to be around a billion dollars. Based on what Taleb says to expect about unexpected events, I would guess that the actual retrofit cost will be closer to 50 or 100 billion dollars.
Taleb calls events such as the Flight 800 spark a “Black Swan.” They are totally unexpected, and they can only make sense to us after the fact. Our minds demand a sensible narrative, so that walking down the jetway onto an airplane doesn’t appear to have any element of randomness in it. So, for billions and billions of dollars we purchase the palliative of a happy ending to a made-up story: “The plane blew up because it was defective, and now that we have fixed the defect it’s safe for us to fly again.”
I’m not immune to the narrative fallacy, and I am just as pleased with this newer, better ending as anyone else. I am not happy, though, with what we paid for it. It’s simply too high a price. For the cost of shifting the decimal point in a probability equation from five hundred places to five hundred and one, we could have erected schools in a hundred cities, or built a transit tunnel under the San Francisco Bay, or trebled our investment in AIDS/HIV research. Did we really make a good decision just because it feels like we did? Is the narrative fallacy causing us to foolishly waste our money?