
Perils of Big Government

UrukaginaOct 29, 2019, 8:26:41 PM

When humans act, they act toward a chosen goal, and there is some chance of success but also some chance of failing to achieve that goal. To do really well in the world, you need a process that self-corrects.

If you've ever seen a squirrel walking on--or even running along!--a telephone wire, you may be amazed. The reason the squirrel can walk (and even run!) the wire is that its brain can sense imbalance in real time, even at a minute scale.

In other words, the squirrel's balance system is self-correcting. To be able to correct oneself means being able to detect points of failure before catastrophe ensues.

If you are driving on a highway and your car starts veering toward the centerline, you will see the shortening of the distance between your car and the line--you will detect the failure of not driving perfectly straight--and you will correct by turning the steering wheel just a little in the opposite direction.
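The steering correction just described can be sketched as a tiny feedback loop: measure the error, then act against it. Everything below (the drift rate, the correction gain) is a made-up number for illustration, not a model of a real car.

```python
# Minimal sketch of a self-correcting loop: at each step, measure the error
# (drift from lane center) and steer slightly against it.
# All parameter values are illustrative.

def drive(steps=50, drift_per_step=0.2, gain=0.5):
    """Simulate lane-keeping with proportional error correction."""
    position = 0.0                       # 0.0 = lane center; positive = toward the centerline
    for _ in range(steps):
        position += drift_per_step       # the car drifts a little each step
        correction = -gain * position    # detect the error, steer against it
        position += correction
    return position
```

With the correction active, the drift settles at a small, bounded offset; with the gain set to zero (no error detection), the same drift accumulates without limit. That is the difference between a self-correcting process and one that isn't.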

Having lots of points of failure, and being able to detect all of them, would pretty much guarantee success. In human endeavors, success is not guaranteed but systems can be set up so as to increase probability of success and decrease probability of failure.

When decisions are dispersed in markets, there are more points of failure and the variety of knowledge inside the individual heads of the market participants increases the ability to detect them.
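The arithmetic behind this point is simple: if several observers each have an independent chance of spotting a flaw, the chance that at least one of them catches it grows quickly with the number of observers. The 30% figure below is an illustrative assumption, not data.

```python
# Probability that at least one of k independent observers detects a failure,
# given each has probability p of detecting it on their own.

def detection_probability(p, k):
    """Chance that at least one of k independent checks catches the flaw."""
    return 1 - (1 - p) ** k

# One observer with a 30% chance of spotting a flaw: 0.3.
# Ten dispersed observers of the same individual skill: about 0.97.
```

Dispersing a decision across many independently-informed participants is, in effect, raising k.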

In a bureaucratic process, such as a government-run enterprise, the number of points of failure is reduced, and even the ability to detect them can suffer. In 1986, the Space Shuttle Challenger exploded just over a minute after take-off.

A human endeavor had failed (and 7 people died).

It was discovered that the launch took place when the temperature (31 degrees F) was too cold, so that rubber O-rings at the joints of the solid rocket boosters--shown providing thrust in the image above--were not pliable enough to seal in time, and hot gas escaped and ignited.

Most importantly, data existed at the time which, when analyzed correctly, would have shown this to be the case. In other words, not only did the launch fail, but it could have been predicted to fail.

Bureaucrats in charge of whether to launch or not were not fully aware of what the data showed--the bureaucratic process resulted in a single point of failure, and a low ability to detect it.

Each of the 2 solid rocket boosters contained 3 primary O-rings, for 6 in total. Damage to these O-rings had been recorded on previous space shuttle launches, but a simple regression of that damage against launch-day temperature didn't show a clear pattern.

The dots on the cartoon sketch below show the number of the 6 primary O-rings that were damaged on previous launches, and the yellow line shows a rough approximation of the expected number of O-rings damaged at various launch temperatures, all adapted from a 2005 re-analysis [1].

You can see that, if you look only at the launches with recorded damage (ignoring the no-damage cases at the bottom-right), temperature doesn't appear to affect damage in a simple, clear way.

However, if you start to think of what was needed for the successes, as well as for the failures--such as when using Bayesian reasoning (as the 2005 re-analysis did)--then a new pattern of risk emerges, even though the data did not change.

In other words, by allowing the instances of "no damage" to offset the recorded instances of damage, you reveal that there really was a knowable relation between temperature and O-ring damage at the time of this launch.

At 31 degrees Fahrenheit, it was more likely than not that multiple O-rings would be damaged--putting the astronauts at risk.
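A rough re-fit in the spirit of the analysis described above can be sketched in a few lines: keep the zero-damage launches in the model instead of discarding them. The flight data below are loosely patterned after the commonly published pre-Challenger record (23 flights, 6 primary O-rings each) and should be treated as illustrative rather than authoritative; the fitting method here is a plain binomial logistic regression by gradient ascent, not the Bayesian procedure of the re-analysis itself.

```python
import math

# Illustrative pre-Challenger flight data: launch temperature (deg F) and
# number of primary O-rings (out of 6) with recorded damage.
# Values loosely follow the commonly published record; treat as illustrative.
temps  = [66, 70, 69, 68, 67, 72, 73, 70, 57, 63, 70, 78,
          67, 53, 67, 75, 70, 81, 76, 79, 75, 76, 58]
damage = [0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0,
          0, 2, 0, 0, 0, 0, 0, 0, 2, 0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_binomial_logit(temps, damage, n_rings=6, lr=1e-3, iters=50000):
    """Fit P(one ring damaged) = sigmoid(a + b*x) by gradient ascent on the
    binomial log-likelihood; temperature is centered and rescaled so the
    fixed step size behaves well."""
    mean_t = sum(temps) / len(temps)
    xs = [(t - mean_t) / 10.0 for t in temps]   # centered, in tens of degrees
    a = b = 0.0
    for _ in range(iters):
        ga = gb = 0.0
        for x, y in zip(xs, damage):
            r = y - n_rings * sigmoid(a + b * x)   # observed minus expected
            ga += r
            gb += r * x
        a += lr * ga
        b += lr * gb
    return a, b, mean_t

a, b, mean_t = fit_binomial_logit(temps, damage)
expected_at_31 = 6 * sigmoid(a + b * (31 - mean_t) / 10.0)
print(f"slope per 10 deg F: {b:.2f}, expected damaged rings at 31 F: {expected_at_31:.1f}")
```

The slope comes out negative (colder means more damage), and the expected number of damaged rings at 31 degrees comes out well above one--a pattern that only emerges once the no-damage launches are allowed to weigh in.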

Would you ride in an airplane if it was known in advance that it is more likely than not that multiple parts of the plane will fail during your flight?

This level of risk exceeds the maximum acceptable level of risk for space shuttle missions--risk for loss of life, or for permanent disability, of crew members--which was 1% back in 1986, and was still as low as 1-in-90 (1.1% risk) back in 2011 [2].

A commissioned government report came out 6 months later identifying the cause of the accident as a failed O-ring in the solid rocket boosters, but the stock market--a system with many points of failure, and a high ability to detect them--had already identified the cause within 21 minutes [3].

It's important not to have what is referred to as Big Government--a federal government that dictates what happens in markets and other broad enterprises, such as one heavily involved in health care, environmental protection, education, etc.

This is because governments are not self-correcting like markets are, and when governments make mistakes, many people pay--sometimes dearly (sometimes with their very lives).

To minimize human harm, it is important to downsize government back to its level in 1950, when it spent less than 16% of GDP, had fewer than 10,000 pages of federal regulation, and was not heavily involved in education, the environment, health care, gun control, etc.

The current, bloated size of the federal government comes with unacceptable risk to ourselves.

References

[1] Maranzano, C. and Krzysztofowicz, R. (2005). Bayesian Re-analysis of the Challenger O-ring Data. Risk Analysis, 28(4), 1053-1067, 2008. http://www.faculty.virginia.edu/rk/Bayesian%20Re-analysis%20of%20the%20Challenger%20O-ring%20Data.pdf

[2] Foust, J. (2017). Commercial crew vehicles may fall short of safety threshold. Space News. https://spacenews.com/commercial-crew-vehicles-may-fall-short-of-safety-threshold/

[3] Wisdom of Crowds blog. https://wisdomofcrowds.blogspot.com/2009/12/stock-market-reaction-to-challenger.html Excerpt:

Within minutes, investors started dumping the stocks of the four major contractors who had participated in the Challenger launch: Rockwell International, which built the shuttle and its main engines; Lockheed, which managed ground support; Martin Marietta, which manufactured the ship's external fuel tank; and Morton Thiokol, which built the solid-fuel booster rocket.

Twenty-one minutes after the explosion, Lockheed's stock was down 5 percent, Martin Marietta's was down 3 percent, and Rockwell was down 6 percent. Morton Thiokol's stock was hit hardest of all. ... By the end of the day, its decline had almost doubled, so that at market close, Thiokol's stock was down nearly 12 percent. By contrast, the stocks of the three other firms started to creep back up, and by the end of the day their value had fallen only around 3 percent.

Image Attribution

Page URL: https://commons.wikimedia.org/wiki/File:Atlantis_taking_off_on_STS-27.jpg

Attribution: NASA [Public domain]