Mr. Donald Rumsfeld once stated:
“Reports that say that something hasn’t happened are always interesting to me because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tends to be the difficult ones.”
Now if you are not saying, “Huh? Wait, what?” then you’re better than the rest of us.
On the day he uttered these words, Mr. Rumsfeld had inadvertently talked himself into a verbal trainwreck. What can be gleaned from this gem of a literary fail? Rumsfeld was overcomplicating the basic principle of the Socratic Paradox:
“I know that I know nothing.”
Who Was Socrates?
About 2,500 years ago, Socrates was born into a middle-class household and, for much of his life, he was unremarkable. And yet today, he is considered to be the father of Western philosophy (philo = love, sophia = wisdom … literally the love of wisdom).
As a younger man, Socrates painted and sculpted, but mostly he thought about things — a lot. So much so that by mid-life he was asking questions of the elders of Athens. He asked so many questions that he began to attract a following (including the famous Plato).
Socrates’ paradox, the idea that the only thing we know is that we know nothing at all, has become the basis for philosophical studies in the two-plus millennia since his death.
Applying the Socratic Paradox in Dangerous Jobs Saves Lives
Remember your first day on the job? Remember how you read the safety manuals from cover to cover — twice! You were the most cautious and safe one in the crew. All that you “knew” at that point was that you “knew” absolutely nothing. You didn’t know how to cut corners. You didn’t even know what corners were. Again, you knew only one thing and that was that you knew nothing.
Then, you took advice anywhere you could find it. You learned the ins and outs of your job. You learned little shortcuts that made your job easier, but not safer. Before long, and with much repetition, you began to "know" your job; eventually, you were certain you "knew" it.
Now that you knew your job, it was okay not to use safety equipment, such as goggles, gloves, hard hats and fire-retardant gear. Maybe it was too hot, too cold or too heavy. The little shortcuts weren't hurting anything, after all. It's been done like that a thousand times before, and indeed, it probably had been.
Did you notice how the more you “knew” your job, the less safe your actions became? You were more reckless, less cautious and willing to throw the rule books out the window. Ultimately, you became a danger to yourself and everybody around you. In the beginning, the only thing you truly knew — like Socrates — was that you knew nothing at all.
The Danger of Knowing
Feeling overconfident has real consequences when you're working in the field. Cutting a corner here and there may not seem like a big deal, but history proves otherwise. In 2010, the BP oil spill, the largest and most devastating of its kind, killed 11 people and injured 17 more. The tragedy aboard the Deepwater Horizon could have been predicted nine years before it happened: BP had been plagued by safety issues for decades, the result of a top-management culture of hubris, ambition and blatant disregard for safety and disaster prevention.
The crew, considered among the best wildcatters in the industry and thus chosen to run BP's flagship deepwater rig, exhibited the same management-led nonchalance toward safety. It was more important to be comfortable and personally safe than it was to look for and resolve chronic safety violations.
Newsflash: Blowing up an offshore oil rig is not easy. To do so requires an astonishing collection of failures, big and small, human and mechanical, by both individuals and organizations. It's the "Swiss cheese" metaphor: each mistake is a hole in a single slice, and it's only when the errors stack up, with the holes aligning perfectly, that a disaster results. Small incidents are warning signs that conditions are ripe for disaster, and a long stretch without a serious accident breeds complacency; people forget to be afraid. The Horizon had gone seven years without a single major safety incident until, one night, history was made.
A similar track record of systemic disregard for unresolved safety issues plagued the Exxon Valdez some two decades before.
The Challenger: A Hard Lesson to Learn
Only the best and brightest are chosen to grace the halls of NASA's elite. In spite of their impressive resumes, NASA employees are just as susceptible to overconfidence on the job as anyone else.
On the morning of January 28, 1986, it took only 73 seconds for "a broken safety culture" to kill seven people and destroy hundreds of millions of dollars' worth of scientific equipment.
Seventy-three seconds. That's how long the NASA flagship space shuttle Challenger spent in flight before erupting in an unforgettable fireball. The path to those 73 seconds was a long, winding road of denials, failed oversight and, at times, outright negligence.
Prior to the disaster, engineers had warned their bosses repeatedly that exactly what happened could, and probably would, happen. Mechanically, the failure was attributed to an O-ring known to be ineffective in cold weather. Those who worked closely with the program note that deadlines, budget gaps and political infighting took precedence over safety at all levels of the organization. Seventeen years later, in a culture that had neither changed nor learned from the past, a second space shuttle, Columbia, would meet a similar fate.
Be Safe Out There
If you’ve learned anything today, let it be that you know nothing. Always remember these famous last words: “Don’t worry, I got this … I KNOW WHAT I’M DOING!”