Forget Zuckerberg, Listen To Failed Founders

Survivorship bias is a huge problem in startup thinking. As W. Edwards Deming is famously misquoted as saying: "You can't manage what you don't measure". This has become extremely popular in corporate culture as companies base their KPIs on measurable results[1].

The motivations are noble but this leads to a fatal problem: big problems that are difficult to measure are systematically ignored[2]. This has important implications for startups: it's the things you weren't guarding against that are most likely to kill you, not the things you are already worried about.

We love focusing on the attributes of successful startups, but the truth is that we can safely ignore the things they failed at as unimportant: they succeeded anyway. We can also treat what failed startups did well as examples of things to ignore. In both cases, these things probably proved irrelevant to the success or failure of the startup.

We also know that founders, being human, are as prone to hindsight bias and outcome bias as the rest of us. Their advice is still valid; just remember to take these biases into account and be active, not passive, in the advice you allow to influence your thinking.

Just make sure it's an apples-to-apples comparison and that you take context into account.

Here's a neat analogy from the youarenotsosmart.com website[3]:

How, the Army Air Force asked, could they improve the odds of a bomber making it home? Military engineers explained to the statistician that they already knew the allied bombers needed more armor, but the ground crews couldn't just cover the planes like tanks, not if they wanted them to take off. The operational commanders asked for help figuring out the best places to add what little protection they could. It was here that Wald prevented the military from falling prey to survivorship bias, an error in perception that could have turned the tide of the war if left unnoticed and uncorrected. See if you can spot it.

The military looked at the bombers that had returned from enemy territory. They recorded where those planes had taken the most damage. Over and over again, they saw the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner. Considering this information, where would you put the extra armor? Naturally, the commanders wanted to put the thicker protection where they could clearly see the most damage, where the holes clustered. But Wald said no, that would be precisely the wrong decision. Putting the armor there wouldn't improve their chances at all.

Do you understand why it was a foolish idea? The mistake, which Wald saw instantly, was that the holes showed where the planes were strongest. The holes showed where a bomber could be shot and still survive the flight home, Wald explained. After all, here they were, holes and all. It was the planes that weren't there that needed extra protection, and they had needed it in places that these planes had not. The holes in the surviving planes actually revealed the locations that needed the least additional armor. Look at where the survivors are unharmed, he said, and that’s where these bombers are most vulnerable; that’s where the planes that didn't make it back were hit.
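Wald's insight can be made concrete with a toy simulation. The sections and lethality numbers below are invented purely for illustration; the point is only the mechanism: if hits land uniformly but hits to vulnerable sections tend to down the plane, then among the planes that return, bullet holes cluster in the least vulnerable sections.

```python
import random

random.seed(0)

# Hypothetical fuselage sections and the chance that a hit there
# downs the plane. These probabilities are invented for illustration.
lethality = {"engine": 0.8, "cockpit": 0.6, "fuselage": 0.2,
             "wings": 0.1, "tail": 0.1}

sections = list(lethality)
survivor_holes = {s: 0 for s in sections}  # holes seen on returning planes

for _ in range(10_000):
    # Each sortie: five hits, landing uniformly across the sections.
    hits = [random.choice(sections) for _ in range(5)]
    downed = any(random.random() < lethality[s] for s in hits)
    if not downed:
        # Ground crews only ever inspect the planes that made it back.
        for s in hits:
            survivor_holes[s] += 1

# Holes on survivors concentrate where hits are survivable,
# not where the planes are most vulnerable.
for s in sections:
    print(s, survivor_holes[s])
```

Even though every section is hit equally often, the survivors show many more holes in the wings and tail than in the engine or cockpit, which is exactly the pattern the commanders misread.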

This quote is also a gem you should carry around in your back pocket:

You succumb to survivorship bias because you are innately terrible with statistics.

One more:

For instance, if you seek advice from a very old person about how to become very old, the only person who can provide you an answer is a person who is not dead. The people who made the poor health choices you should avoid are now resting in the earth and can’t tell you about those bad choices anymore.

But failed startup founders don't die. They hang around, and for obvious reasons have fewer people competing for their advice[4]. You should take advantage of that fact.

Footnotes:

[1] This is the exact opposite of what Deming would have wanted, as he actually said: "The most important figures that one needs for management are unknown or unknowable (Lloyd S. Nelson, director of statistical methods for the Nashua corporation), but successful management must nevertheless take account of them," and "The most important things cannot be measured." http://en.wikipedia.org/wiki/W._Edwards_Deming

[2] Example: let's say your company is choosing between two television commercials it wants to air. One is more likely to be controversial than the other. By choosing the "safer" one, have you done well? Let's say your metrics indicate the campaign hit its targets. Does that mean you made the right decision? Not necessarily: you have no way of knowing how the other campaign would have gone. For various reasons (corporate politics), the people involved will conclude that it was the "right" decision and use that to justify future behavior. This is called outcome bias.

[3] I strongly recommend you read the entire article.

[4] Quote from the piece: "Also, keep in mind that those who fail rarely get paid for advice on how not to fail, which is too bad because despite how it may seem, success boils down to serially avoiding catastrophic failure while routinely absorbing manageable damage."