Seems to be related to the financialization of everything, where financial engineering supplants actual engineering and financial culture replaces engineering culture at the C-level. Here’s a good article on Boeing about that:
I have an essentially identical question: by virtue of profit-seeking capitalism, factory owners do not want their factories to explode and lose ten million dollars. Yet being profit-oriented while neglecting basic safety and preventative maintenance is a very common phenomenon; it happens all the time, is seen as the vice of "business people", and occurs even in the most powerful companies. I don't understand it.
Is it simply cognitive bias at work, as LessWrong often says? But I think there must be deeper reasons than that. Has anyone written good books on this subject? A sociology, psychology, management & decision-making, or economics perspective is all welcome.
It's short-termism, which is a common threat to society at large. Think of cities that build on sites vulnerable to earthquakes or floods, for example. After the event hits, everyone says, "Oh, we will never do that again," and that holds for approximately one generation. Then, unless the culture has evolved to value metrics that align with these kinds of long-term sustainability issues, the preventative policies start to relax again.
In business, all the cycles are shorter, there's always a "new thing", and next to nobody has the kind of deep institutional experience you would see in a big topic like city planning. Thus the valuation metrics of all agents will quickly fall to current market pricing, and competition may act to hold those metrics in place - if you are the only restaurant in town that doesn't cut corners, customers will complain about how overpriced you are. Regulation, labor and consumers exercising their power all have a role in changing what a business can or can't do by raising the floors on acceptable practice.
But let's say you are exceptionally good at operating a restaurant on your own - quality everywhere - and grow to have a chain. Now you need to hire managers, and the viable hiring pool consists of the people who were cutting corners before - because there is literally nobody else out there. Good luck retraining them!
Plus, at the corporate scale, you end up with fiefdoms and power struggles leading to metrics that agree with the current internal political situation, not industry or marketplace factors, and certainly not sustainability metrics. A business is a "machinery of people" and needs periodic tune-ups and reprogramming to go in vaguely the right direction.
In a lot of ways what it all comes down to is one of my personal favorite phrases, "fix ordinary things." Most of the time, we don't. We have a habit of putting off fixing all sorts of little things in our lives, even if our intentions are good, so of course we're caught by surprise by the disasters.
Does the difference in expected value (from preventing rare, expensive issues) actually exceed the cost of adding preventative maintenance? It is not obvious that this is the case.
That's why it's not done. However, that means something disastrous happens every few years. On average the company is still better off: the increased profit in the good years outweighs the losses in the occasional bad year. This is where black swan theory comes in: how do you calculate the probability of a rare event?
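The trade-off can be made concrete with a back-of-the-envelope expected-value calculation. All of the numbers below are hypothetical, chosen only to illustrate the structure of the decision:

```python
# Hypothetical figures for illustration: a $10M disaster, a 1% annual
# probability of it without maintenance, 0.1% with, and $50k/year of
# preventative maintenance.
disaster_cost = 10_000_000
p_without_maintenance = 0.01    # annual disaster probability, no maintenance
p_with_maintenance = 0.001      # annual disaster probability, with maintenance
maintenance_cost = 50_000       # annual preventative maintenance spend

# Expected annual loss under each policy.
expected_loss_skip = p_without_maintenance * disaster_cost
expected_loss_maintain = maintenance_cost + p_with_maintenance * disaster_cost

print(f"skip maintenance:  ${expected_loss_skip:,.0f}/yr expected loss")
print(f"do maintenance:    ${expected_loss_maintain:,.0f}/yr expected loss")
# With these assumptions: $100,000/yr vs $60,000/yr, so maintenance wins.
# But if p_without_maintenance were 0.004, skipping would "win" ($40k vs
# $60k) -- the answer flips entirely on an estimate of a probability that,
# for genuinely rare events, nobody can measure from experience.
```

The fragility of the conclusion is the point: the decision hinges on a small probability estimated from little or no data, which is exactly where black swan reasoning says naive expected-value calculations break down.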