30 September, 2020
When you have a high cost of failure, it can be tempting to reduce the chances of failure occurring. This is wise, but it is not the whole story.
After a while, the cost of such reduction exercises follows the law of diminishing returns. Failure will occur sooner or later, so any complete strategy must assume it will. Thus, I would suggest any efforts to reduce the chance of failure are best paired with equally strenuous efforts to reduce the cost of failure itself.
Build systems that tolerate failure, perhaps even expect it. An ideal target might be to build a system that can have anything thrown at it and recover instantly. If building a bridge, perhaps assume that most rivets are badly made. How would your design differ? If building software, perhaps assume that code is insecure and prone to crash. How would your architecture differ?
If building a process, perhaps assume that people are stressed and forgetful. How would your compassion differ?
It may also be prudent to consider how incentives align along such assumptions. If the bridge tolerates shoddy work, does that encourage cowboys? If your software tolerates sloppy code, does that encourage hacks?
If your team practices empathy, does it encourage those that would take advantage?
To which I submit: these are different problems. While one might prove a house's fire safety by stuffing oily rags and scrunched-up newspaper into every crevice, it is probably not the only approach.