In his excellent book Range, author David Epstein shows us just how powerful, and sometimes deadly, blindspots can be.
He tells the story of a business school case study called Carter Racing. Students are told they own a race car team and must make a decision: will they enter the car in the biggest race of the year, despite its engine having blown up in seven of the last 24 races?
If Carter Racing enters and performs well, the team will likely gain significant additional sponsor money that propels it to greater success. If the engine blows up in such a high-profile event, the team could lose a lot and jeopardize its future.
What students learn after the exercise is over is that Carter Racing is really the story of the ill-fated Challenger space shuttle tragedy of 1986. The students, as owners of Carter Racing, were unknowingly playing the role of the NASA engineers who made the fateful decision to launch.
Recall that the issue was booster rocket O-rings failing to perform properly in the colder temperatures that moved in the night before the launch. The O-rings couldn't seal properly, hot gases escaped the booster joint, and the result was the deadly explosion 73 seconds after liftoff.
So what were the deadly blindspots of the NASA engineers?
Epstein describes a NASA culture of unquestioned devotion to data. Data is so revered that a sign in the mission evaluation room in Houston reads “In God we trust. All others bring data.” Inarguably, this devotion has served NASA well for many years. But for Challenger, it prevented NASA and Thiokol managers from acting on compelling information because that information was considered opinion, not data.
Due to the colder Florida temperatures on launch day, NASA and engineers from O-ring manufacturer Morton Thiokol gathered to assess the situation. They knew that when O-rings got cold they hardened and sometimes didn't expand quickly enough to seal the expanding joint in the booster rocket. However, the data showed that this had never been catastrophic. They knew that on two previous launches, one at a low temperature (53 degrees F) and one at 75 degrees F, O-rings had been compromised but had not failed.
But a Thiokol engineer had photos of the O-rings' performance from those two launches, and they concerned him. The photo from the 53-degree launch showed a line of black soot, suggesting that colder temperatures posed a heightened risk to the O-rings. However, he didn't have the data to prove it. It was only one launch. It was just a photo. He did tell NASA that although he couldn't quantify the risk, in his opinion it was too risky to launch. But NASA's data-driven culture would never allow a recommendation that wasn't rigorously defensible, and this one wasn't. NASA manager Larry Mulloy said that without a solid quantitative case, and photos of two launches didn't make a case, there was no way he could have taken an argument of 'too risky to launch' up the chain of command.
It wasn't a failure of his judgment. He was simply carrying out the behavior of the culture. Despite strong feelings of concern, reason without numbers was not accepted. Epstein writes, “In the face of an unfamiliar challenge, NASA managers failed to drop their familiar tools.”
This is a classic, and unfortunately tragic, blindspot. There is no single person to blame. Rather, it speaks to the power of blindspots.
Fortunately, sales coaching and leadership are not life-and-death propositions. But blindspots left untouched can still have a deep and lasting impact on people and their sales careers.
Good selling,
Mark Sellers
Author, Blindspots: The Hidden Killer of Sales Coaching (buy the book here)
and The Funnel Principle: What Every Salesperson Must Know About Selling (buy the book here)
Creator of the BuyCycle Funnel, the most time-tested, proven customer buying journey sales model on the market
Learn more about coaching and leading with short videos like this one on our website