Francis Bacon. Image: Paul van Somer / Pinterest via Wikipedia

A basic lesson learned from science is that human understanding is very limited. Science does not depend on ordinary categories of understanding, but instead turns to mathematical relationships between the entities in a system to form knowledge, and to tests to gauge the validity of the mathematical description.

Reason may lead to a proposed theory, but reason alone is insufficient for validation. Carefully designed protocols must be used to ensure that the predictions of a proposed theory are concordant with physical observations.

A huge problem facing science is the inability to formulate adequate mathematical descriptions of complex systems. Mathematical formulations tend to be incomplete (simplistic) and uncertain, so that tests are bound to fail. The practical import is enormous. How do we base engineering on incomplete and uncertain models of such systems? And when we act without scientific justification, to what extent is that action based on subjective dogma rather than on facts?

In his great work Novum Organum (1620), Francis Bacon identifies four prejudices in human thinking that are obstacles to objectivity. He calls these idols of the mind. In particular, idols of the theater involve the uncritical acceptance of dogma, ideologies, and speculative theories. These are appealing because they are philosophically attractive or satisfy human passion. They provide simplistic explanations.

Their current ubiquity in academia is a clear sign that the great scientific epoch running from 1905 (Albert Einstein’s miracle year) to 1969 (moon landing) is long behind us. Suspicion should be aroused whenever one hears the word “theory” in the absence of proper validation.

Of the idols of the theater, Bacon writes,

“Many various dogmas may be set up and established on the phenomena of philosophy. And in the plays of this philosophical theater you may observe the same thing which is found in the theater of the poets, that stories invented for the stage are more compact and elegant, and more as one would wish them to be, than true stories out of history.”

A narrative is invented that simplifies worldly phenomena in a manner that is “more as one would wish them to be,” and all behavior is then interpreted within the framework of the narrative, even when contradictions arise. The ideology, that is, the network of assumptions accompanying the narrative, is held axiomatically and can be refuted by neither reason nor observation. An idol of the theater is more than a simplifying assumption that makes the world more comprehensible; it is a filter that distorts both perception and thought.

The world is too complex for human understanding, so it is not surprising that people turn to “compact and elegant” stories rather than live with existential uncertainty. The horror arises when those sharing a common derangement band together to crush those who would threaten their ideology by pointing out its logical and factual contradictions. Implemented on the scale of the state, the result can be an Orwellian dystopia or a socialist gulag.

Engineers treat system intervention in the context of optimal control: the system is mathematically modeled, a desirable objective is posited, and an operational policy is mathematically derived that best achieves the objective. A difficulty arises because even if the objective can be achieved, there will be other effects resulting from intervention. These unintended consequences can be devastating. To avoid calamity, a less than optimal policy may be adopted.
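To make that workflow concrete, here is a minimal sketch in Python of the model-objective-policy pattern described above, using a toy linear system with a quadratic cost. The matrices are hypothetical, chosen only for illustration, and the “conservative” scaling at the end merely gestures at deliberately backing off from the optimum; it is not a prescription from the text.

```python
# A toy linear-quadratic regulator (LQR), illustrating the workflow above:
# model the system, posit an objective, derive the policy that best achieves it.
import numpy as np
from scipy.linalg import solve_discrete_are

# 1. Model the system: x_{k+1} = A x_k + B u_k  (hypothetical matrices)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])

# 2. Posit the objective: minimize the accumulated cost x'Qx + u'Ru
Q = np.diag([1.0, 0.1])   # penalty on state deviation
R = np.array([[0.01]])    # penalty on control effort

# 3. Derive the optimal policy u_k = -K x_k from the Riccati equation
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.inv(R + B.T @ P @ B) @ (B.T @ P @ A)

# A deliberately "less than optimal" policy: scale back the gain to temper
# aggressive interventions whose side effects lie outside the model.
K_conservative = 0.5 * K
print("optimal gain:\n", K)
print("conservative gain:\n", K_conservative)
```

Everything the derived policy “optimizes” is relative to the model; whatever the model omits is where the unintended consequences live.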

If this problem bedevils systems for which all possible consequences can at least in principle be calculated, consider the potential for catastrophe with massively complex systems, where it is impossible to calculate even a tiny portion of the possible consequences, and where small perturbations to the system can have large effects that are seemingly distant from the point of intervention. A prototypical example is drug intervention. Even should a drug yield the desired result, the side effects might be worse than the original disease. The physician should be prudent: “First, do no harm.”

As Francis Bacon intuitively understood, and as we understand today from a mathematical-scientific perspective, ideological thinking is necessarily distorted. Acting upon such thinking is extremely risky because actions are deduced from a mental construct that is distant from the real world and whose relationship to it is unknown. Nevertheless, in politics and economics, which concern highly complex systems, overly simplistic ideologies abound.

Now, it is obviously true that acting with limited understanding is often necessary. Refraining from acting until one has a comprehensive understanding would leave one impotent in situations where quick action is imperative, and waiting is a default action that may have significant consequences. The prudent person, recognizing the risk, refrains from overly ambitious schemes.

Those who wish to impose grand centralized programs also ignore a basic engineering principle: operational decisions should, wherever possible, be made at the local level. Hierarchical control exhibits several flaws. First, it cannot respond efficiently to changing local conditions, leading to critical delays in decision-making. Second, hierarchical control is fragile: the longer the chain of command, the more likely it is to break. Third, hierarchical control can place the decision in the hands of a person lacking the qualifications or information needed to address the local issue.
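The latency point can be illustrated with a toy simulation in Python. The parameters are hypothetical and model no real organization: identical control loops differ only in whether corrections are computed from current local measurements or from information that is several steps stale by the time it travels up and back down a chain of command.

```python
# A toy comparison of local versus hierarchical control (hypothetical parameters).
import numpy as np

STEPS, UNITS, GAIN = 500, 5, 0.3

def mean_abs_error(delay):
    """Simulate UNITS independent units, each nudging its tracking error toward
    zero with a proportional correction. With delay > 0, corrections are based
    on measurements that are `delay` steps old, mimicking a chain of command."""
    rng = np.random.default_rng(0)                    # same disturbances for both runs
    error = np.zeros((STEPS + 1, UNITS))
    for t in range(STEPS):
        stale = error[max(t - delay, 0)]              # information the decision-maker actually has
        correction = -GAIN * stale                    # proportional response to (possibly stale) error
        disturbance = rng.normal(0.0, 0.1, UNITS)     # fluctuating local conditions
        error[t + 1] = error[t] + correction + disturbance
    return float(np.abs(error).mean())

print("mean tracking error, local decisions (no delay):", round(mean_abs_error(0), 3))
print("mean tracking error, decisions routed through a 3-step delay:", round(mean_abs_error(3), 3))
```

The numbers themselves mean nothing; the sketch only shows how a reporting delay alone degrades responsiveness to local fluctuations, before fragility or misplaced authority even enter the picture.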

Consider biological systems, in particular, metabolism. The functions most commonly shared and heavily engineered by selection have extremely local regulation of activity. The fineness of this control is sufficient to produce both high levels of rapid adaptability to fluctuations anywhere in the network of operations and a level of stability that centrally driven regulation cannot achieve.

Only the ignorant (or malignant) would propose grandiose political, economic, or sociological theories to be imposed upon society. Only someone bewitched by an idol of the theater would propose massive intervention in an impossible-to-model complex system, especially when similar interventions have previously yielded catastrophic results.

It is often argued that the grand scheme failed (perhaps with the cost of a hundred million lives) because it was not implemented properly. A little tweaking of the program will alleviate the problems, as if one could possibly predict the effects of tweaking a catastrophic intervention strategy applied to political or economic structures, or to cancer.

We may hear that the intentions were good but the plan was foiled by unintended consequences, as if this should excuse those responsible for the consequent suffering. Admittedly, the ideologue cannot accurately predict the specific unintended consequences (nor accurately predict the good), but he cannot claim that he did not intend there to be unintended consequences. To take action on a complex system is to intend unintended consequences. They are inevitable.

The person whose concern is authentic accepts this limitation. In an effort to achieve the beneficial, he takes modest actions, observes their results in real time, and proceeds prudently, always humble before incomprehensible Nature.

Proverbs 16:18: “Pride goeth before destruction, and an haughty spirit before a fall.”

Edward Dougherty is distinguished professor of engineering at Texas A&M University.