Design for Prevention
Man is a prisoner of his own ways of thinking and of his own stereotypes of himself.
His machine for thinking, the brain, has been programmed to deal with a vanished world.
This old world was characterized by the need to manage things – stone, wood, iron.
The new world is characterized by the need to manage complexity. Complexity is the very stuff of today’s world. This mismatch lies at the root of our incompetence. Stafford Beer
Introduction
The aversion to proactively preventing loss is a distinguishing characteristic of the dysfunctional organization. It’s all in the loss record. The large drop in accidents and losses brought about by the transmutation amounts, in practice, to watching the backs of the people around you.
The Design for Prevention is a practitioner’s manual supporting his due diligence for prevention law compliance. It is a handbook of proven practice for delivering effective prevention systems, pursuant to the professional engineering standard of care and conditions of license. The Design for Prevention provides a framework of strategy and tactics developed by practitioners with more than a thousand applications worked over a half-century to their credit. While the focus here is prevention, the practice described is of extreme generality. The engineering process is the assault on complexity.
Since no institution can come to grips with forming a standard of care that translates from the primary function to the work face, this manual serves as a surrogate for the individual practitioner and students of the discipline. It stands as short as any other. Rather than wait a decade for the venerable tort process to define a standard of care top down, one already obsolete by the time it is released, the standard of care herein builds from the stuff you can deploy in complete confidence today, layer by layer, up to the prime directive.
As with a code of ethics, the law provides no format guide for a standard of care, or even a scope definition. Unlike the codes of ethics that, over time, have gravitated to a facsimile, formats of the standard of care vary all over the map. The nursing discipline maintains one of the best standards of care available, but it has to be stitched together from four separate documents issued by discipline headquarters. Many disciplines, including engineering, deliberately avoid assembling a standard of care document altogether. The guiding principle for the practitioner’s standard of care provided here is a plan of work, with derivations, that will contemporaneously and with certainty comply with prevention law. That means no retroactive audit of work will find nonfeasance to the law. In short, abide by the work plan and compliance is a given. Keep to the principles and you generate your own authority to act. This completes the authority-responsibility loop with prevention law.
These axioms, theorems and procedures are rooted in the foundation book (1/3), Platform for Prevention, attached, which derives the incontrovertible basis for pragmatic foresight by scrutable connection to natural law. The platform is designed for the practitioner as a “snowblower,” a tool to use for clearing various institutional obstacles from goal-seeking paths. Accordingly, the platform tool is in syntax appropriate to the snow and proven quite effective to that end.
The keystone’s standard of care herein compares favorably to others because its basis in natural law is complete. There are no boxes on the blackboard formula labeled “and then some magic happens.” The mathematical physics of natural law is a universal language, reading the same to lawyers and engineers alike. Authority cannot overrule natural law. As long as the work is done to the best available technology and fully scrutable to natural law, it is invulnerable. That is the basis for you to assess the work plan. Any flaws found in the connections will be repaired.
The Design for Prevention is a fundamental problem-solving strategy especially adapted to deal with the outrageously complex. Specifications can be for performance and quality as well as for damage avoidance, such as in safety. The task actions of precaution, intelligence-driven to anticipate the future, are geared to arrange a particular bucket of that future to meet design-basis scenario specifications. That’s it.
Prevention delivery presents a surprisingly robust challenge. Its reconnaissance gauntlet is information- and data-processing intensive. The prevention design attributes of future and particularity alone quickly build the possible unique states to sort through to infinity. To most, it is endeavoring to push a pyramid. The tools of complexity reduction, i.e., systems engineering with intelligence amplification, must be brought to bear on prevention design, and it’s all or nothing. The part-way default is a snap-back to the rulebooks of hindsight and a lost cause. This manual describes practitioner-proven complexity reduction procedures.
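To give a feel for the scale of the state space, the following minimal sketch, in Python and with purely illustrative numbers of our own choosing (the manual specifies none), counts the scenario space for even a modest prevention problem once future time and particularity are admitted.

    # Illustrative sketch only: the component count, state count and horizon
    # below are assumptions chosen for demonstration, not figures from the manual.
    components = 40        # distinct elements whose condition matters to the scenario
    states_each = 3        # conditions each element can be in
    time_steps = 10        # future intervals a design-basis scenario must span

    scenario_space = (states_each ** components) ** time_steps
    print(f"unique scenarios to sort: about 10^{len(str(scenario_space)) - 1}")
    # Roughly 10^190 states. Varying the assumed counts changes the exponent,
    # not the conclusion: no rulebook enumerates a space like this; it has to be
    # collapsed by complexity-reduction tooling before design work can begin.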
The complexity of prevention design contains both social and technical snags. Accordingly, the compass of competency covered in this manual must of necessity encircle both technological and institutional realms. Since the responsibility for effective prevention system design (ends) cannot be subdivided, the practitioner must exercise skills effectual in both domains. His facility for leveraging natural law serves him well in confronting the motley array of challenges.
Complexity reduction tools for social as well as technical systems have become much more powerful and convenient to deploy for running the prevention triathlon. The advantage for the engineering practitioner centered on natural law is that he can confidently ricochet between social system and technical system affairs as needed. The same control theory seamlessly covers both.
The central organizing principle for the work of delivering prevention and this manual is duplicated from prevention law. Since compliance to the various laws for preventing stakeholder damage must be confronted sooner or later, it has proven most efficient to organize the assault on complexity by the primary distinctions of hindsight/foresight, rules/process, analysis/synthesis, general/local and means/ends aligned with how the law has evolved in the professional obligation of foresight-based loss control. Compliance is both the reason and the benchmark for prevention delivery. When you do the job with transparency and efficiency, compliance is automatic. There are no non-compliant, but productively superior, prevention design practices.
The practitioner standard of care begins with the NSPE code. The National Society of Professional Engineers husbands the code for all engineering disciplines. It is not the only condition of license, but it is a standard feature in the law of every state. The following is excerpted from the current NSPE code, echoed in the World Federation of Engineering Organizations.
“Preamble
Engineering is an important and learned profession. As members of this profession, engineers are expected to exhibit the highest standards of honesty and integrity. Engineering has a direct and vital impact on the quality of life for all people. Accordingly, the services provided by engineers require honesty, impartiality, fairness, and equity, and must be dedicated to the protection of the public health, safety, and welfare. Engineers must perform under a standard of professional behavior that requires adherence to the highest principles of ethical conduct.
I. Fundamental Canons
Engineers, in the fulfillment of their professional duties, shall:
1. Hold paramount the safety, health, and welfare of the public.
Engineers shall advise their clients or employers when they believe a project will not be successful.
A stakeholder is any group or individual with a vital or essential interest tied to what engineers do.
Note: In regard to the application of the Code to corporations vis-à-vis real persons, business form or type should not negate nor influence conformance of individuals to the Code. The Code deals with professional services, which services must be performed by real persons. Real persons in turn establish and implement policies within business structures. The Code is clearly written to apply to the Engineer, and it is incumbent on members of NSPE to endeavor to live up to its provisions. This applies to all pertinent sections of the Code.”
The top theme of the standard of care, fundamental canon I.1, has been number one for a century. It is impossible to hold stakeholder safety, health and welfare paramount by hindsight, crisis response, insurance, or retroactive legal process. You cannot meet this duty by escalating the intensity of emergency response practices. This canon can only be abided by foresight. There is no rearward alternative to prevention.
It is equally noteworthy that the venerable canon is an end, not a means. While many newer matters in the Code use the subjective word “strive” as sufficient due diligence, rather than “attain” as an end, when it comes to preventing preventable stakeholder damage the measure is strictly and unequivocally delivered results. Even though the law claims your duty only applies to services, when litigation reaches a climax, the law holds the engineer’s feet to the fire of ends by a standard of care it formulates on the spot. In the operational reality, the idea of wiggle room on services for engineers is an illusion.
If law lets the engineering function off the hook of results, there is no entity left to hold accountable for the damage. If no one is ends-accountable, there is no purpose to litigation. With no accountability for safety, health and welfare, both the law and the professions are in violation of the social contract. Engineering has always been the logical scapegoat in the courtroom of prevention law. Can you imagine the law blaming bridge collapses on science?
There is no special ethics belonging to keystones. Keystones are not, simply because they are keystones, exempt from the common obligations, duties and responsibilities that are binding on ordinary people. They do not have a special moral status that allows them to do things that no one else can. Ethics requires autonomy of the individual while a code assumes the legitimacy of an external authority imposing rule and order on that individual. Obedience to moral law for autonomous individuals, such as keystones, is motivated by respect for the moral law. Obedience to civil law is motivated by fear of punishment.
The fundamental fact remains that the institutional context makes meeting the prevention mandates of the fundamental canon with business as usual impossible – a fact the keystone is obliged to embrace. Psychiatrists label this a destructive dilemma. As a condition of his license to perform engineering services, the practitioner is obligated to prevent stakeholder damage and, at the same time, to deal with institutional contexts hostile to the rigors of prevention design. This is no small assignment. The law mandates, in effect, that this constraint be handled by the practitioner, not the institution. At climax, the law judges on results, not on intentions: on ends, not on means.
While operation of the law is tangential to prevention design, accept that lawyers have seized the agenda and go on. The legal system, over time and trial, bifurcated prevention law into separate hindsight and foresight branches. The basis of the bifurcation reflects the intrinsic social system reality of prevention delivery. Legal process did not impose a bifurcated prevention law first. Over centuries, the law gradually adapted to the operational reality of the institution – at a price. There are currently over 800,000 practicing lawyers in the USA, in the same range as the number of graduate engineers. In 1953, there were seven engineers for every lawyer. Do the math.
The central obligation in the standard of care is to avoid a mismatch of attractor to undertaking. This duty includes the chreod and cognitive wherewithal to detect a mismatch in the making. Once mismatches leave port, for all practical purposes they are impossible to recall. The duty to preempt a mismatch is so primary that the practitioner may not continue his engagement if he fails to prevent it. For obvious reasons, the professional engineer is not permitted to remain associated with a project in mismatch and therefore en route to certain failure. There are some projects that professionals do not want their name associated with.
There is no conflict between what the law specifies for compliance and the best practices to reach the goal. There is no advantage to skirting the law, hoping you are not found out and punished. Design by whim and fancy is non-compliant, to be sure, but it is not productive either. Otherwise, you would race to the goal willy-nilly and then backfit the appearance of compliance as occasion suggested. Methods that don’t meet the standard of care account for nearly all stakeholder damage. Projects in mismatch are always inefficient, ineffectual and late. Perseverance with methods that don’t work, which means they don’t comply, is a strategy with nothing to recommend it.
The keystone subject to the standard of care cannot exercise due diligence as a member of the institution. It is impossible to be subject to a chain of command and meet due diligence at the same time. It is not just that compliant prevention delivery coming from the hindsight attractor is rare; it can’t happen. The due-diligent keystone must be embedded in the ends context – masterless. In the basin of the foresight context there is no hierarchy. Commitment to attaining goals is an individual, not an organizational, matter. This is why no equivalent code of professional practice for institutions exists. Ever wonder about that?
The word “attractor,” introduced in the Platform, is used here interchangeably with the word “context.” Since there are only two possible social system attractors in this universe, there are only two possible working environments. Humanity cannot support a third. To zoom along, it will be necessary to embrace the view of the two social system attractors, natural law as the law of experience, and pragmatic foresight as data-gathering reconnoiters into possible future dynamics. If you hold conflicting views of these core precepts, we will pray for you.
Background
The arrow of time is several billion years in flight. It contains history, the incredibly improbable “what” that actually transpired, and a limitless future of what might yet be. In times of super stability, wisdom sufficient for perpetuation of the species can be supplied entirely from hindsight, folklore and the lessons of experience. Running on tradition and legends during periods of countrywide equilibrium, even though history cannot repeat, is a plan as good as any. In fact, using hindsight navigation to deal with disturbances is how we got here. It was natural to install retrospection, and all that attends it, at pantheon central. So much so that the intellectual domain of pragmatic foresight is aggressively excluded as a contaminant to institutional operations. It is.
Downplay of intelligence-based foresight can be seen embedded throughout society. For example, more than 99% of all courses offered in the educational system, top to bottom, are hindsight-based. The subject matter taught, art or science, is based upon history, lessons learned, and past discoveries. Pragmatic foresight or foresight engineering, the soul of design, is not offered at all. The few professors of engineering who do teach control theory, the fundamental authority for loop thinking, are considered academic aliens. Harvard’s Chris Argyris, of Organizational Development fame, promoted the great benefits of loop thinking and exposed the limits of linear thinking to institutions for decades. The unvarying reception he received from institutional management provided an independent validation of Starkermann’s work and a convincing lesson to any practitioner.
The available literature produced by mankind is exclusively a product of hindsight. On occasion a book will include commentary about the consequences of neglecting foresight, but that’s about it. Suppression of pragmatic foresight, regardless of the significance to perpetuating the species, is ubiquitous. Soldiers and generals alike reject it. You can notice the bias just by looking for it. Institutionalism is a twitch in the great nervous system of humanity that connects all organizations – the living and the liquidated.
The significance of pragmatic foresight to society historically ebbed and flowed as a function of war. As soon as one side used superior technology to defeat the other in battle, e.g., the Egyptian chariot design advantage for pharaoh’s armies, the throne supported the military engineer. Without the crises of warfare, however, engineering foresight was shelved. As it did from the first, so it does today: advances in arsenal technology come only from the foresight attractor and, as such, operate apart from the chain of command. This requisite never sits well with the power of generals, kings or gods.
The usual dismantling of wartime foresight attractors, like operations research (OR), followed WWII on cue. Dismantling was about complete by 1957 and along came Sputnik. The threat of Soviet space supremacy triggered a massive resurgence of work in the systems sciences. Suddenly, the scattered champions of systems think could get all the government funding they could use merely by asking for it. By 1970, the sustained emphasis on pragmatic foresight had reached critical mass and the chain reaction we have ongoing today began. By 1990, advancement of pragmatic foresight had reached the stage where the artifacts it produced induced a demand for more artifacts. For the first time in history, except for building massive institutional artifacts like pyramids and cathedrals, pragmatic foresight didn’t need a war to flourish.
The brute fact is that circumstances have made the realm of foresight just as important to human survival as applied hindsight. Foresight is significant and material to understanding and dealing with the plagues of novelty and complexity that confront us. Issues become plagues when assigned to practices impotent to the challenge. At this time, as all know, the nation has a dozen or so major threats non-responsive to government process. The energy and infrastructure messes are but two current examples that have flown by the window of opportunity for a painless remedy.
For millennia, a concerted systematic looking ahead for preventing damage was institutionally unthinkable. All prevention design, if any, was carnage-driven. Prevention of personal damage was up to those individuals exposed to the prospects, not their squires and business owners. For production of dangerous materials, like gunpowder, workers devised ingenious ways to ensure owners were attentive to prevention matters. The Du Pont family at the Brandywine powder mill, for one, had to live adjacent to the mixing sheds before workers would staff the job. By 1900, the railroads were well aware of the wholesale slaughter caused by fire from the combination of wooden passenger cars and frequent train wrecks. For traveling railroad executives and their families, steel cars were provided.
In the great record book of human affairs, there is no instance of an institutional move towards intelligence-driven prevention that preceded killing on a grand scale. Before safety valve prevention gear was required for ship steam boilers, as an instance, 30,000 Americans lost their lives in vessel explosions. The unbroken record during the millennia of the institutional era serves as a rattlesnake’s warning rattle to the practitioner. Prevention work is not a neutral dispassionate activity to the institution. Do not bank on either cooperation or appreciation. Delivering effective prevention is often a quasi-clandestine operation. The only reason you and those in your foresight attractor are institutionally tolerated is that the institution fears giant litigation costs for noncompliance to foresight law. Compliance is your empowering permissive – your only one.
Unlike war and crisis response, the subject of prevention has no place in history. Once the opportunity to prevent damage is erased by the passage of time, prevention records are jettisoned. The history of damage is a history of the consequences of stagnant hindsight competency enforced by the fixed operational limits of hindsight. In times of super stability and equilibrium, the potential impact of these limits is concealed. It is clear from the record that the paramount purpose of an institution is to preserve the configurations and practices that identify and define it as an institution.
The fact that the institutions of government, confined to the practice of crisis response as they are, continue to convince the population that, in their bureaucratic hands, crisis response will be effective in controlling such plagues as terrorism and climate change, is astonishing. Once the window of opportunity for crisis avoidance has closed, control is relinquished to an unmitigated Second Law delivering the crisis. The history of these occasions in past times suggests that society will not learn any lessons from the experience. Absolutely nothing in the standard suggests or addresses methods of crisis response. Not our bag.
Answering the call of disaster is a noble and popular activity with much to recommend it. Calamity rejoinder is an important and necessary competency of civilized society. There are many risks that are best managed by competent emergency response services, such as fire, police and ambulance corps. The principles and procedures for beneficial response to after the fact wreckage comprise a discipline greatly esteemed by society. Since crisis response functionality is a basic element of business as usual, it is spontaneously provided. Firefighting is separate, unrelated and apart from the standard of care for foresight engineering.
With the unmovable restrictions natural law puts on navigation by hindsight, the scope of competency of institutional ideology and process is no different today than it was in the Bronze Age. The Achilles’ heel of institutional methodology, because of its hindsight fixation, is a rate of disturbance in the environment it depends upon for sustenance faster than its response cycle. As the pace of cultural change escalates, driven by new engineering virtuosity, the gap between ordinary business and stakeholder protection widens. The measure of this gulf is the litigation handle. Growth here signals that the cost of negligence has moved further from the trivial fines for regulatory rule-book violations to the serious amounts attending foresight law – roughly a thousand to one.
It is easy to cite examples where the Establishment has encountered fast-developing complexities to which lag-bound hindsight only adds damage. The refugees of Katrina can attest that Homeland Security is not particularly good at aftermath either. The instances observable today, as Dilbert documents, are at every level of every institution. Of the three mandates of the top canon for the practitioner – health, safety and welfare of stakeholders paramount – none can be remedied retroactively. By the time indication arrives that these life qualities have needlessly suffered, no remedy exists to erase the suffering incurred.
The nested ubiquity signals that natural law is the law of this experience – and nothing else. Because consequences loop back to shape the goal sought, the destiny of all strategies is married to control theory. As institutions can do naught about the rate of change to their environments, they can do naught about the intrinsic delays of their hindsight response. Control theory governs all such looping operations and calculates system instability generated as rate of response increasingly lags rate of disturbance.
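The instability mechanism can be exhibited in a few lines. The sketch below is a toy model of our own devising, not anything drawn from the Platform; it applies a fixed corrective gain to information that is increasingly stale, and once the response lag crosses the stability margin, each correction arrives too late and amplifies the very swing it was meant to damp.

    # Toy illustration (assumed gain, horizon and impulse; not the author's model):
    # a goal-seeker corrects its deviation using the deviation observed `delay`
    # steps ago, which is all that hindsight can ever supply.
    def peak_deviation(delay, gain=0.5, steps=200):
        history = [0.0] * (delay + 1)          # deviation record, oldest first
        peak = 0.0
        for t in range(steps):
            observed = history[-(delay + 1)]   # stale reading driving the correction
            current = history[-1] - gain * observed + (1.0 if t == 0 else 0.0)
            history.append(current)
            peak = max(peak, abs(current))
        return peak

    for lag in range(6):
        print(f"response lag {lag} steps: peak deviation {peak_deviation(lag):10.2f}")
    # Small lags damp the disturbance; past the margin the same gain produces a
    # growing oscillation -- instability generated purely by response lag.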
It is tempting to list current examples in the large, such as the environmental and healthcare messes, but you will have no problem generating a very long list on your own. Because natural law governs these proceedings, instances are exhibited at every social system level. Note this list is cumulative and never reduced by a “problem solved.” When the requisites of resolving the particular disturbance slump outside the zone of institutional competency, the institutional attractor draws attention to the very points at which an attempt to intervene will fail. That is why the list of unresolved threats is oblivious to applied institutional resources. Congressional hearings parade this exercise in futility before the public several times a week. The Congressional hearing is a staged event exhibiting the development of a significant threat Congressional process can no longer attenuate. Unwittingly, these televised hearings advertise the limitations of political process to solve complex problems. For the issues that sprawl beyond the scope of institutional ideology, Congress is powerless.
Each major peril to the nation that bulges beyond the span of institutional competency follows the same pattern. The methods to placate the public and preserve the status quo ante are exactly the same. Only the subject matter varies. What is never discussed is the lack of control and a stop rule. No matter the amount of damage, the institution is powerless to stop its generation. These issues also share in common imperviousness to crisis response and a time window in the past where control still existed and prevention would be a slight course correction taken in stride. The entire strategy of hindsight rests on the public continuing to believe that the invading threats can only be handled by institutional process. Even the notion that a foresight attractor exists remains taboo. This illusion cannot be maintained forever and sooner or later a day of reckoning dawns.
Response time lags now govern the proceedings. The makeup of the cast doesn’t matter. The number of issues and their rate of growth have already overwhelmed institutional process capacity. The outcome is set in concrete. The only uncertainty left is how the multi-pronged calamity will evolve. It is calculable from lags alone that the system is unstable and headed towards collapse. It is, exactly, like Chernobyl, where plant operations quietly went past the bright line where controls still worked and prevention was possible. Crisis response activities will only spread the damage.
The challenge of ideological change
Cultural change is not brought about by combat and displacement, but by using the new artifacts instead of the old. Over time, habits associated with the old artifacts fade away by neglect. When you preferentially use your cell phone to the neglect of your landline, eventually landlines disappear with all the attendant protocols. This displacement by obsolescence is happening big time in the military and you can watch. It is the engineer rapidly advancing robot performance, not the general, who determines how the next conflict will be engaged. Bottom up has displaced top down.
The stream of foresight engineering artifacts circumventing traditional institutional governance is at flood stage. Society is no longer insulated from the perils of engineering ingenuity by institutional sabotage and veto – a power formerly enabled by the engineer’s dependence upon capital budgets only an institution could afford. As long as the foresight engineer is willing to launch his artifacts without credit or recognition, he can compound fracture cultural norms. The Establishment dares not officially recognize that the bypass process is flourishing. No institution will acknowledge the actuality.
The state of affairs today, regarding prevention delivery, finds abulic institutional process still the dominant constraint in an era of exploding foresight aptitude. The prime movers that are expanding engineering virtuosity in the prevention field are neither connected to nor dependent on institutional operations. The more crisis response fails the damage prevention goal, the more prevention competency matters. The abyss between the institutional attractor and prevention capability is expanding. The only functioning bridge between hindsight and foresight attractors is law and law is foremost an institution. That is why the legal framework is used to hang prevention knowledge on. The sole motive is parsimonious compliance.
It is a tribute to the proficiency of social conditioning that the population thinks the laws of experience that apply to them as individuals don’t apply to their institutions as collectives. The masses are content to obey authority and support institutions that are not solving the significant problems entrusted to them. The public also supports Las Vegas, fully aware that the house will end up with its money. Yes, it is the same ubiquitous irrationality and it is to be accounted for.
The keystone role
All successful endeavors to arrange a desired future, from any perspective, for any goal, necessarily engage the engineering process. There is no way to command the complexity to reduce, even for wizards. Systems engineering is the only proven process for rendering complexity down to fit a cranium. There is no way to escape from this labor. Complexity reduction is a function of knowledge developed.
The dogged application of inappropriate methods is called long-cycle RBF (run, break and fix). Intelligent applications are short-cycle RBF. Governments, having no stop rule, can only run on long-cycle lessons learned. In past eras, ultrastability allowed the intrinsically slow revision process to get the repairs done before the society became unstable. In the 21st century, disturbances to society are so large, so unprecedented, and so numerous, learning the hard way destabilizes business as usual – in case you hadn’t noticed.
The network of institution and stakeholders contains several nonreciprocal relationships. The ability to produce damage in one direction is not balanced by an ability to cause equal and opposite damage from the other direction. The practitioner, like the stakeholder, has no reciprocity with any institution and he cannot alter the stakeholder’s lop-sided arrangement. Only by preventing damage can the reciprocity disadvantage of the stakeholders be neutralized. After damage is inflicted, when it’s too late to avoid, the stakeholder gets his lesson on reciprocity. Nonreciprocal relationship exposures should be measured early on and stakeholders informed. No more can be done.
When the institution moves to backfit a “safety” function, the decision invariably follows prolonged loss experience and policy coverage stipulations from insurers. It is then too late for the designated specialist for safety to engage the prevention function for damage past – what with history being irreversible. Because he is lag-offset from design, the safety engineer designate can only operate with the hindsight-blinkered rulebooks from the regulatory spawn of hindsight law. This asynchronous arrangement, fully compatible with institutional operations, is not the stuff of foresight law compliance. Its record is awful.
It is easy to tell when a persistent stakeholder damage issue, like safety, is being handled by hindsight process. The safety record will remain the same, oblivious to applied resources – behaving like a universal constant. The fact that there is no correlation between hindsight practice escalation (or reduction) and the safety record says it all. Workman’s Comp, for one index, has been running horizontal lines of cost/effectiveness, oblivious to management fads and resource variations, for decades.
The fact the entropy reduction engineer has proven content with his lot facilitates an operation that flies far below institutional radar. As in the days of the craftsmen and guildsmen, the engineer can now readily afford the equipment essential to the best available foresight engineering technology on his own. Since the 1980s, he has been able to get the prevention job done for any size goal-seeking project without the need for large capital equipment and proving grounds. By forging ahead of the institution on his own hook, the practitioner can avoid the sabotage of institutional veto power at release. Because the engineer no longer has to “explain” things to management in order to get enough capital stuff to seek his goals, management is progressively clueless as to what mission engineers are on. Since the engineers are quite content with the obscurity, generating their own authority to act, management grabs the credit for system design winners about which it knows absolutely nothing.
At any moment anyone is free to act toward the future desired. The future will be as wished and perceived to be. This comes as a shock to those who depend on the principle that only the rules observed in the past shall apply to the future. In the institutional attractor, the concept of “change” is inconceivable, for change is the process that obliterates the rules of the past. In the foresight attractor, change is a way of life.
This stalemate will keep the current arrangement intact, for it has every rule but a stop rule. It doesn’t matter whether or not the participants in this affair are informed and conscious of the new operational truth. At this time, everything is on automatic: spontaneous, stable, and outside the reach of institutional control. Most of the work pushing the prevention affair uphill is social system chores. The question of technological skill doesn’t even surface until the later stages, after the sociology scores are settled. This is the proven template, exactly, to be used for foresight law compliance.
Overview
The standard of care for the practitioner of prevention system design is arranged by preparation and the sequence of work. Appropriate foundations and structures must be in place before work on a specific application commences. The standard of care begins with clearing the obstacles to the requisite context of the foresight attractor. Since all prevention is future, the hindsight attractor cannot and will not meet the standard of care. The path of least organizational resistance is, by definition, never the best route. The lesson of context mismatches to issues has been well learned.
Accordingly, the standard of care consists of a foundation section regarding cognitive preparation followed by the protocols for dealing with the social system constraints so as to preempt a context mismatch. The climax of this section is the formal establishment of a triage step, performed by the institution, to divert each incoming issue to the appropriate attractor. The last section of the standard is devoted to the technical system tasks for prevention design, assuming the foresight attractor has been installed. Because attractors lock in solidly, the great bulk of the standard is devoted to removing institutional constraints in arranging the requisite context to design prevention.
Tools proven to increase productivity in implementing the standard are provided in an appendix. There are many tools in systems engineering and different practitioners will favor different tools. The best tool set for an application cannot be determined until after the goal is reached.
The sections of the standard
The first section provides key generalizations for aligning your mindset to deal with the coming attractions. The constraints and issues you face cause anxiety and stress because they have no fix residing in your institutional comfort zone. The size of the mountain you are about to climb cannot be exaggerated. The damage institutions inflict on stakeholders, being non-reciprocal, is disregarded. The appropriate philosophical basis will serve as benchmark for the host of decisions that attend prevention delivery. The time to check alignment of your perceptual reference is right at the start. Misalignments in this framework cause havoc downstream. If you ride off in all directions, you get what you deserve.
The second section provides generic tooling for the required institutional negotiations. No amount of performance or perfection in process can have any impact on institutional decision-making. Unlike most activities, where you plunge in doing things and the working environment takes care of itself, prevention delivery requires context first. All motivation is derived, directly or indirectly, from the enforced obligation to comply with prevention law – as perceived. The assault on complexity, to the institution, is the handiwork of an alien attractor. In effect, The Enterprise is being hailed by another spaceship, from a known and troublesome extraterrestrial culture, asking it to lower its shields. The intellectual alibi of The Enterprise commanders must be shaped to do so. The first chore is to determine if business as usual will suffice for doing the project.
If not, the next duty prescribed by law is to perform inform/consent. This legal sacrament is where the practitioner informs the institution that prevention law compliance is the supreme authority for the work to be done. The law directs that the institution is to understand and accept that obedience to the law trumps any conflicting decisions from its chain of command. For this step, the practitioner better have his act together. Failure to get this clandestine concession of veto power in deference to prevention law spells curtains for project success.
After inform/consent the task is to develop the factual knowledge necessary to certify that business as usual cannot succeed in prevention law compliance. This work is carried out by the practitioner as a solo act. He is the only one with a passport authorized by the institution, during that period, to engage the requisite “alien” practices. Using brute fact, he makes the case to the institution that assigning the prevention project to business as usual must lead to mission failure. This is the climax of the saga where the institution unknowingly strips itself of project veto power and the foresight attractor is “authorized” by default.
Last in this section is the development of the triage protocols to be used by the institution to divert its incoming issues to the appropriate attractor. Members of the institution temporarily join a foresight attractor cohort to compose the benchmarks and protocols of triage, with the practitioner as facilitator. This completes the context formation and safeguard process, starting from philosophical abstractions and transformed in steps to the language of the institutional attractor – stories and task action checklists.
Section three begins after the institutional attractor applies the triage protocols developed by the cohort. The technical work of prevention delivery begins on those issues diverted by triage to the context in-basket. With the institutional predispositions and obstacles neutralized, the masterless workers in the cohort can devote all their energies to prevention design duty. The order of battle and tooling for the assault on technical system complexity are presented in appendices. It is up to the practitioner, of course, to select which tools are most appropriate to the application. This is the section subject to rapid technological advance, thanks to enormous computer power now available to cope with the mathematical physics, the language of nature. Since the span of engineering competency can only increase, regular updates to this section of the manual are necessary.
Section 1
Philosophical Footings
Personal Perspective Management
Mindset Mentors
The thought that humanity just might be subject to the laws of nature in like manner to everything else material can be found in ancient history. For several reasons institutional, however, such thought is aggressively discouraged. For any device so effective in the operational reality, the taboo nature of the topic will matter not to the practitioner. The connection of natural law to social behavior is a magnificent goal-seeking tool. It avoids pursuits of the impossible and keeps a happy productive workshop. Experience has shown the application to be convenient, quick and flawless. Neither contrary examples nor rumors of contrary cases have ever been encountered. It works 100% of the time.
Engineers are instinctively drawn to the concept that natural law in general and control theory in particular have a lot to do with shaping the behavior of organized humanity. Scientists are repelled by the thought. While natural laws are institutional undiscussables in general, the universal law most violently rejected as blasphemy is control theory. With this natural law, first assembled in mathematical physics form by James Maxwell in 1868 (to explain the genius of Watt’s flyball governor), operational consequences are compared to espoused goals for control purposes in an endless loop. This comparison step is intelligence-based, going directly against the grain of rule-based functionality. University professors teaching the heresy of control theory are considered alien to academia and abused accordingly. Control theory is the natural law that clashes frontally with institutional ideology.
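For the practitioner who has never written the loop down, a minimal sketch follows; the numbers are invented for illustration, and the simple proportional correction stands in for the spirit of Watt’s governor rather than for any formula in the Platform.

    # Minimal compare-and-correct loop (illustrative values only): operational
    # consequences are measured, compared to the espoused goal, and the error
    # drives the next action -- the comparison step no rulebook performs.
    goal = 100.0      # espoused goal, e.g. governed shaft speed
    actual = 60.0     # current operational consequence
    gain = 0.3        # how hard the governor leans on the error

    for step in range(12):
        error = goal - actual        # intelligence-based comparison against the goal
        actual += gain * error       # corrective action fed back into the plant
        print(f"step {step:2d}: actual {actual:6.2f}  error {error:6.2f}")
    # A rulebook prescribes the action in advance and never inspects the error;
    # the loop above adjusts every pass, which is the whole point of control.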
The mentor who first derived the characteristic equations of institutional ideology on a sound, objective basis is Thorstein Bunde Veblen. He proved that institutional behavior is anything but ad hoc. His work a century ago put social behavior on a scientific foundation – a derivation that allowed falsification using control theory. The genius of his theses triggered all sorts of hostile repudiations. The cruelty he received from various institutions was a destiny explained by his own theories. While the propositions of Veblen’s contemporaries in economics and sociology have fallen by the wayside, his concepts have proven to be timeless and indestructible. About every ten years, Veblen is “rediscovered.” His works are discussed in view of current knowledge, affirmed valid, and put back on the shelf. The institutional aversions derived by Veblen do not go away with time either. Veblen was the first to recognize that society is composed of two attractors in operation and that they are totally incompatible and completely complementary.
The mentor who derived the mathematical physics connection between Veblen and natural law is Rudolf Starkermann. He demonstrated that Veblen had been preserved against the worm of time by the amber of natural law. Beginning in 1953, this distinguished professor of mechanical engineering worked out and tested a scheme to compute social behavior patterns using Veblen’s inputs that matched Veblen’s outputs. In 1994, using powerful computers, Starkermann and other gifted dynamic simulation experts worked together to run systematic tests covering the spectrum of behaviors built in to institutional ideology. This data was compiled, charted and distributed – evoking the same hostility, rejection and knowledge disavowal endured by Veblen a century earlier.
The importance of Starkermann to the practitioner, however, is in the robust and rigorous application to prevention system delivery. The strategy revealed in this manual to deal with the sociology has been lifted directly from the work of these two mentors. Starkermann, ensconced at home in Switzerland, is still actively engaged in the tie between control theory and social behavior and, since retirement, has published several books on the subject. Veblen’s many books, including “Theory of the Leisure Class,” are widely available for free download on the Internet. If you’re not familiar with Veblen, your perceptual frame of reference is a personal menace. If you’re not familiar with Starkermann, you’re missing out on 20/20 foresight for social system dynamics.
The mentor who worked out a viable strategy for assaulting complexity through mathematical physics is William Ross Ashby. In “An Introduction to Cybernetics,” published in 1956 and now available free on the Internet, Ashby forged the engineering methods from science for reducing actual complexity to a size that will fit a cranium. Ashby was a key member of the great systems science frenzy funded by the national response to Sputnik. Ashby provided the linkup of control theory with the assault on complexity, eliminating the last vestiges of subjectivity in pragmatic foresight. Ashby’s Law of Requisite Variety, very unpopular with institutions, is prominent in the engineer’s pantheon. Like Veblen, his works do not lose their voltage with time.
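Ashby’s law can be exercised in a few lines. The toy regulator below is our own invention for illustration, not one of Ashby’s tables; under the assumed coupling it shows that the goal can be held only when the regulator’s repertoire of responses is as varied as the disturbances arriving.

    # Toy illustration of the Law of Requisite Variety (assumed coupling, not
    # Ashby's own example): the outcome stays at the goal (0) only when the
    # chosen response exactly cancels the disturbance.
    disturbances = range(9)               # nine distinct ways the environment can jump

    def outcome(d, r):
        return (d - r) % 9                # 0 means the goal was held

    for n_responses in (1, 3, 9):
        responses = range(n_responses)
        held = sum(1 for d in disturbances
                   if any(outcome(d, r) == 0 for r in responses))
        print(f"{n_responses} responses available: goal held against {held} of 9 disturbances")
    # Only variety in the regulator can absorb variety in the environment; a
    # repertoire poorer than the disturbance set leaves the difference unregulated.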
The great mentor for system philosophy and logic is Charles Sanders Peirce. The mentor who demonstrated the awesome sweep of thermodynamic laws in the operational reality is J. Willard Gibbs. You don’t want to know the savage treatment every one of these mentors received from their respective home academic institutions. Gibbs used the laboratories at Yale off the payroll. Harvard still hides the original papers of Peirce in some attic. The purpose of this manual is to help you navigate around the ubiquitous organizational quicksand and get the job done anyway. The personal price of success is obscurity.
The Prime Directive
Vision without action is a daydream; action without vision is a nightmare. Japanese Proverb
The standard of care for the professional engineer is oriented to reach the objective of preventing preventable stakeholder damage. What other route is available to assure stakeholder safety, health and welfare – the prime directive? As the vision of prevention is end results, the challenge and obligation is to get there efficiently and productively. Experience has revealed effective action strategy to realize that vision.
In a nutshell, the master scheme leverages natural law to power the ride. It takes advantage of the intrinsic properties of attractors to direct and amplify goal seeking. This is done in a two-step process. First the practitioner does the work to lower entropy to the inflection point where the appropriate attractor cohort can properly take over. Secondly, the empowered attractor spontaneously and automatically grinds towards the goal. Like the roller coaster, first a lift to attain sufficient potential energy – then an inflection and the transferring of potential to kinetic.
In order to leverage natural law to meet the prime directive, it is requisite that the appropriate attractor is assigned according to the characteristics of the issue. Any mismatch here leverages natural law in a mission-defeating direction. There is no neutral all-purpose ground. You can be sure that if indifferent natural law is not pushing goalward, it is driving in a counterproductive direction. The problem for the practitioner is that attractors promptly lock in, making assignment mistakes incurable.
The allocation of issue to its appropriate attractor is so crucial to success that the bulk of the standard of care is devoted to arranging it. Instead of a balance of social system aspects with technical system stuff, the standard of care spotlights institutional matters until the proper diversion of issue to attractor is accomplished. Since matters at that point are essentially spontaneous and automatic, technical system savvy is anticlimactic to the context wars. The system design work also leverages natural law to get the job done best. Goal-seeking progress, natural-law amplified, is the sustaining essence of the foresight attractor.
The reason institutional constraints are featured is that the prime directive is future-centered and the institution invariably commissions the hindsight attractor to proceed, thus locking in a mismatch. It is the first duty of the practitioner to preempt that automatic choice of hindsight and see that a mismatch is not commissioned. He is required by law, should he fail, to report the mismatch “up the ladder” and, if not corrected, withdraw from the engagement. Guess how the law came to establish that duty?
By law, the PE is forbidden to be associated with a project doomed by noncompliance to the standard of care – and he is obliged to make a reliable assessment of this permissive as he goes. The professional engineer cannot be found still active in a crisis caused by noncompliance. Because it erodes public confidence in professionalism, it is clearly dereliction of duty to be standing among the wreckage you were engaged and duty-bound to prevent. Note that the watchdogs that habitually enable the damage they were commissioned to preempt are not licensed individuals.
The preponderance of emphasis in the standard of care on institutional affairs is because once the attractor wars are settled success is assured. Technical genius doesn’t count if the context is mismatched. When the context is aligned, continuing on to the goal is downhill and automatic. This is not to say the technical phase is easy, because it is not. It does say that worry over institutional constraints and sabotage to goal-seeking can be suspended.
Win Win
There is a place for the blame game in some program botch jobs, but it serves no purpose to blame the cast when the mess is simply the inescapable detritus of grinding natural law. Program failures are typically not caused by the slings of outrageous fortune, erroneously called accidents, but are the inexorable consequences of an assignment mismatch of attractor to issue.
The methodology and technology discussed in the standard of care for compliance to prevention law is extremely general in application. Prevention design and delivery is just one of several places where learning and new knowledge can use the foresight engineering approach to deliver the goods. While it has been developed primarily by and for the foresight attractor cohort, it can also be useful in institutional husbandry. For example, it can be a long long way between the good stuff you learned at training camp and getting the institution to reap its benefits. It so happens that the standard of care provides a proven method for achieving what improvements you can and a stop rule. The institution benefits from the better technology and method and you avoid the angst associated with the thankless and usually disappointing task. By matching your expectations to the foreordained outcome, it becomes a bumpless transfer of ideology.
There is a close parallel to the practitioner’s dilemma experienced by every member of the institutional attractor. As all know, it is customary for institutions to send loyal subjects off to conventions and conferences for the stated purpose, as formally expressed by all parties, to gain knowledge directed to improving the work performance of the individual or of the institution – but usually both. The training material offered is leading-edge technology and/or methodology. That is, something important not already done by habit in the institution. In practice these convocations are either aimed at social recreation or technology, never in equal amounts.
Invariably, the conference sessions deliver on the promise. The attendee is filled with the latest and greatest stuff directly related to improving the situation back at home base. So far, so good. The bad news comes with the sermon appending the vision as to how the institution will wildly prosper, thanking you profusely the whole time, when you get back home and help the institution install the improvements. This glorious outcome is assured by imposing the personal obligation to persuade the institution to make use of the stuff you went there, at institutional consent and expense, to acquire in the first place. The duty to sway the institution to adopt your superior know-how, so that the benefits promised will be realized, is imposed by the trainer to bring closure to the proceedings. You are abandoned with a wad of new knowledge dressed in a terrible obligation.
Between leaving the convention and getting back to work, despair sets in. The feeling that the more you push change, beneficial or not, the more the institution will resist is overpowering. You have found no way to good-deed due diligence that won’t get you punished for your trouble. You are defeated before you can start. The guilt trip begins when you elect to make little reference to what you learned in class while the institution never asks what it got for its money. Since you have acquired the guilt for making no impact on the status quo ante, you get to repeat the same cycle next year.
The standard of care for prevention delivery, dealing with the same condition, is the roadmap for guilt-free, punishment-free action. In principle and in fact, there is no difference between designing prevention systems and getting institutional support for process improvements you learned in school. As the challenge to status quo ante is the same, the methodology best able to deal with the institutional resistance is the same. Foresight engineering fills the niggling gap between knowledge of what is known to be operationally better and what the institution will adopt without a war. The tools of pragmatic foresight show you what is achievable and what must be the pursuit of the impossible. The standard of care shows how to determine which is which and when you have done all due diligence the law requires.
When individuals in the institution have the savvy to use the standard of care appropriately for intra-institutional affairs, like technology transfer, the institution wins in the benefits derived from better methods. The technology promoter wins in that he has made a contribution without being punished for his trouble. The stakeholders win because well husbanded institutions inflict less damage.
The benefit, to anyone, of knowing institutional ideology and complying with the law is relief from the emotional stress of the guilt trip for failing the duty to do the impossible. The largest contributor to ubiquitous angst is the lack of familiarity with the principles that explain why things are as they are. The same tools that work to show all that can be done to improve institutional operations, through knowledge of the principles, enable incisive perception and uncanny prediction. When you know with certainty that you have done all that prudence would allow – the guilt trip comes to an end.
Survey Irony Mountain
Humanity lives in a world that is designed. The great pool of humanity supplies the folks who design this world as well as all the folks who live in it – and the irony begins. To celebrate the beginning of the new century, the National Academy of Engineering published a list of the 20th century’s most notable engineering achievements. The list includes: electrification; automobile; airplane; water supply and distribution; electronics; radio and television; agricultural mechanization; computers; telephone; highways; spacecraft; internet; imaging; household appliances; health technologies; petroleum technologies; and high-performance materials. Viewed as a whole, engineering as represented by these and other achievements has transformed the way we live and the modern world as we experience it.
The engineering mode of thinking differs in its essence from both the scientific modes of analysis on which it is built, and from the intellectual traditions of social sciences that are called upon to evaluate and accommodate its impact. While science and social science use many common strategies such as abstraction and modeling, the engineering method for problem solving uses these concepts in a way that is informed and constrained by the physical world, mathematical physics, on which it is based and the human world in which it is applied. As interdisciplinary and multidisciplinary problems accumulate on the great millstone worn by technological society, the role of pragmatic foresight and prevention delivery as described in this standard of care explodes in significance. In the foresight attractor loyalty to a single discipline is not the goal; the goal is the goal.
Since humanity populates social system attractors, of course, the individual humans operating in each are largely interchangeable. Adding in the fact that the two human-powered attractors are at once incompatible and complementary powers a far-reaching engine of irony. The paradox generated by this engine is so pervasive, so ubiquitous, it flows about by stealth, unnoticed and undiscussed. The institution maintains a strong hold on the organs of the obvious. The irony of low radar cross section is ubiquitous, patiently waiting to be perceived.
The time to take the measure of the upcoming ordeal is right at first. Appreciate that because of POSIWID blinders (the purpose of a system is what it does), vast areas of important knowledge for this quest are deliberately excluded from the institutional scope of interest. Why should cognitive effort be invested in a process the institution finds of no application value and, worse, threatens the hierarchy? It is a shock to discover the ridiculous information base upon which the institution operates. Why should an institution go to the trouble and expense to gather and maintain reliable information for a process it will never use? The institutional control volume is restricted to rule-based operations. Rules originate from hindsight. That’s it.
There is great personal value in being irony-cognizant. When you find yourself in a situation floating on incongruity, it is easy to project the future. Once established, the irony engine persists. There is nothing personal about an irony-driven scene. You can review your options and weed out pursuits of the impossible. You do have control of your task actions, but the irony engine runs oblivious to disturbance (control). Accept this condition induced by the genes of society as unalterable and design around it. All else fails.
The prime mover of this irony is that the institutional attractor aggressively denies the existence of another attractor at the same time it has become wholly dependent upon its functioning to survive. The deification of hindsight is necessarily attended by the persecution of foresight. We live in a world that is designed, without doubt, but the existence of the world of designers is denied. Institutions have become dependent upon artifacts the institution is incapable of fathering.
We act as if the ends attractor (foresight) didn’t exist at the same time we exhibit acute awareness that it does. For instance, the term engineer is routinely used by hierarchs in a pejorative sense, but correctly, to signal a non business-as-usual process. To engineer around some constraint means to do whatever it takes – rules and all – unhampered by conventions, traditions and cultural norms – to manipulate things to attain an end. It is a goal-seeking process of synthesis and ingenuity unencumbered by the addictions of hindsight. The institution is incapable of pragmatic foresight itself. It can only sponsor pragmatic foresight to be done in its behalf. Pragmatic foresight locked to natural law is so rich in information it transcends the scope of philosophical theory and remains simply what it is – unique, ineffable, unaffiliated, insubsumable, irreducible.
The truth is individuals, to maintain viability, vault between the two attractors all day. There is no choice. Confine yourself to either attractor and your end will come quickly. It is ironic that the individual hierarch can accept that he trampolines attractor to attractor, because he does, but believes his institution doesn’t have to. Meanwhile, the institution is well aware of the foresight attractor. The first thing an institution in big trouble does is to summon it in. No nation goes to war without first gathering its engineers and setting them to work advancing weaponry and logistics. Even Leonardo Da Vinci was pressed into wartime service more than once.
There is one noteworthy example of a single community that openly maintains both attractors. It did not begin that way, but circumstances on an aircraft carrier require it. There is no possible way for a carrier to launch and land airplanes via the institutional attractor. All attempts to conduct flight deck operations by the chain of command were dismal failures producing great damage and loss of life. Accordingly, the ends attractor is maintained for flight deck operations with the context it requires. In flight deck operations, all mishaps and errors are blamed on the system and remedied accordingly. Such are not considered accidents, but consequences. The workers who coordinate to get the aircraft in and out work under the foresight attractor; when off shift, they must enter the institutional attractor and follow the rules.
Einstein has been credited as the originator of “A solution cannot exist in the same system that generates the problem.” This axiom means that the novel and complex issues stranded in the hindsight attractor cannot be solved by hindsight practices. Institutions present themselves as knowing everything and having known it all along. However, it is easy to identify matters that sprawl outside of institutional competency. They are the ones that go unsolved oblivious to time and resources invested. The Einstein thesis exhibits the irony in the institutional attractor as a hotbed for spawning issues it cannot then resolve. History contains no contrary examples. Welcome to Iraq.
Albert’s dictum is one of several that attended the great expansion of systems think in the twentieth century ignited in response to the proliferation of new and complex issues rattling the status quo ante. The irony is that the artifacts of systems engineering, powered by pragmatic foresight, now eagerly ingested by society, contribute to the hand-wringing instability. This irony flourishes today because the hindsight attractor, disconnected from the incessant looping of goals and consequences, has no way to cope with cycles – especially those in which it participates. In these times, the institution encounters a world full of strangeness it can’t steer clear of.
There is no value in pushing an institution to do the work of pragmatic foresight, because it can’t, and it doesn’t want you to do it either. The institutional attractor operates by checklists of task action rules executed in linear fashion. Prevention system delivery is of necessity a process of cycles where each next cycle of work is different from the previous. The basis for the cycle of redesign is called run, break and fix (RBF). It is a ratchet process, where the consequences of the previous cycle are sorted into the keeper pile and the jettison pile. Since the RBF process will always reach the goal held in common, it is necessary to take great pains to make sure the goal is well defined and correct.
Composing the information necessary for prevention system design manufactures an “Emperor’s Clothes” condition. The whole point here is not to attempt to change the unchangeable, but to engineer around the constraints. Once you become good at identifying the constraints and accepting the irony for what it is, you will become good at building bypass highways. The trick is to make a contribution without causing a commotion.
The paucity of information, especially good quality information for decision-making, fashions several ironies invisible from within the institutional attractor. When you find one in play, you will find them all. This irony survey provides an idea of what you’re up against. An incongruity awareness checklist includes:
Institutional
- Institutions know the solution to a problem cannot be derived at the same level causing the problem (Einstein) and then they become progressively hostile when their doomed efforts to do so keep failing.
- The institution aggressively drives out information required for pragmatic foresight at the same time it decays from the lack of it.
- Institutions forbid their members to engage in practices now critical to institutional survival.
- Institutions deny the existence of the foresight attractor while it’s the first thing institutions summon when calamity strikes.
- Institutions live in a world that is designed at the same time they forbid their members to engage in the design process. Institutions are now dependent upon artifacts the institutional process is incapable of fathering.
- Academia places hindsight over foresight skills by a factor of a thousand to one and then sends its alumni out to cope with an operational reality requiring incessant foresight in order to survive.
Population
- The population supports institutions it entrusts to cure plagues caused by institutions.
- The population believes the laws of experience that apply to them don't apply to their institutions. Then, it forgives institutional failure to achieve its goals and promises, but believes more institutional focus will work. Political process becomes mass entertainment.
- The population believes character of institutional leaders matters to the ability of the institution to solve its problems and protect its stakeholders. There is no recorded example anywhere to support this notion.
Individual
- You want to be considered obedient to authority at the same time you want the right to complain about the consequences.
- The scope of action you find comfortable generates problems not susceptible to your comfort-zone competency.
- Individuals incessantly trampolining between the two attractors to remain viable are conditioned to believe that a collection of individuals composing an institution can prosper by staying put.
Legal
- By the time law composes a retroactive standard of care for a prevention case, it is already obsolete for operational use.
- Law sets the standard for “foreseeability” duty only after damage from noncompliance has registered – when all foresight becomes immaterial.
- By the time regulatory agencies use history to update the rules, the conditions of damage are obsolete.
Leverage Natural Law
The fundamental chore for the practitioner is to reduce entropy – at the right places at the right times. Since you are a solo act, you can use any trick at your disposal. The second law is anisotropic. It goes only in one time direction in conflict with the rest of natural law, in which the future and the past appear symmetrically, with the laws themselves indifferent to the direction of time. As time progresses, there are more routes toward greater disorder (higher entropy) than there are to hold at the previous, more-ordered state. Yes, there is a chance that a Martini will autonomously separate into gin and vermouth, but the odds are ridiculously small.
If you don’t account for entropy (S =k log probability W), entropy will run your life. This fundamental directs the events that combine to produce stakeholder damage. This incontrovertible, immutable force is, exactly, the center of attention in prevention design. Prevent entropy explosions and you comply with prevention law. In prevention, once the system configuration is fixed, it is necessary to compensate incessant entropy increases with countermeasures allowed by the Second Law.
Thinking prevention is always thinking system entropy decrease. That means prevention design consists of imposing structure and doing work – your choice on each. It is the soul of husbandry. This requisite is also why the designer strives to compose a system that will automatically improve itself with operating experience – self-regulation. Technology and methodology to do this trick, using neural networks, get better every day. There are two entropy-reduction lifts to make in the context-forming phase.
The critical climax for the practitioner, laboring within the foresight attractor, is to leverage natural law apart from the ubiquitous institutional predispositions so that task actions for prevention system design will be spontaneous “downhill.” The first lift is done to arrange the setting for the necessary work of engineering foresight – the process by which prevention is delivered.
The second practitioner lift is done in collaboration with hijacked members of the institution, temporarily working with the cohort in the foresight attractor, so that the institution can carry on with business as usual. In this way, the mathematical physics of engineering foresight (engineer speak) ends up with scouting stories and a checklist of rules of action (institution speak). What starts as home for the engineer is translated stage by stage until it is fit for the home of the institution – mythology and task checklists.
The engagement scope of this manual reaches from the very beginning and ends at the summit where the project can safely be released to natural predispositions of the institutional attractor. The misery of the practitioner in the Atlas role does have an early end. Once the second benchmark of spontaneous success has been reached, the role of the practitioner becomes advisory and subtle. The black-hole pull of the appropriate attractor will automatically see to it that the goal is attained. It won’t be fashioned with the same élan as the practitioner would have exhibited, but the destination will be reached nevertheless. Goal seeking is very personal. The cohort gets to do run, break and fix (RBF) their way.
No matter what, diligently preserve scrutable connectivity to natural law. Hierarch and engineer, when dropped from a great height, fall at the same rate, the descent unaffected by ideology or purpose. Natural law is in charge here. Remember, the universe is run rather like a vast prison.
The Platform established incontrovertible transparency and your work must conserve it. Transparency is your only route to zero subjectivity at release. In the pragmatic foresight attractor, proof of design boils down to a sequence of symbols. The complete natural law connection works so well in institutional exchanges, you will wonder why you ever suffered with anything else. It paralyzes legal process. Famous jurists have declared natural law supreme for centuries.
Since the foresight attractor is ends-oriented, there is no value to risk transfer or responsibility tag. It’s a clear duty to reach the goal. It is pointless to disclaim, find a safe harbor, or stop short. In this attractor, loyalty to practices is not the goal; the goal is the goal. Whatever works best to get the job done fit for service. These days, the half-life of the leading techniques is less than a year. Foremost, make sure the city is worth the siege.
Leverage Scrutable Connectivity
Positioning the design for prevention within the institutional attractor exploits the principles of rhetoric and argumentation that comprise institutional life. The Establishment holds that the tools of formal logic are essential and even definitive for mathematics and programming computers but inadequate to decide controversial issues, such as those teeming at the hindsight/foresight interface. The practitioner strategy here is an operational scope completely engulfed by the rules of formal logic. Any claim made by pragmatic foresight, by definition, provides evidence that links the inference to the claim in the form of mathematical physics.
While argumentation in legal proceedings can be advanced by social consensus and personal credibility as well as objective data, the design for prevention proceeds only by objective data. Scrutable connectivity to natural law takes care of any disputes over the definition of “objective.” Practitioner strategy has no role for authority, consensus or personal credentials including his own. The strategy makes the inference, connecting product to the claim by “cause,” namely natural law – and nothing else. The other inferences used in legal proceedings – example, sign, analogy, narrative, form – are abandoned to the lawyers. Make your case scrutably connected to natural law and the argument is over – by the rules of the institutional attractor. The effectiveness of this strategy has been proven by experience.
Play to your strengths
The practitioner has strong advantages. As a PE, the practitioner stamps the prevention system design as the engineer of record. If another modifies his design, he is no longer responsible for outcomes. The biggest advantage is developed by competency with natural law especially for the inspection trips into the future. The reconnoiter typically goes from what is obvious to what is entirely unsuspected. There is no alternative strategy and no other discipline competing for the assignment. Only the human intellect, and not any of its senses, is given to roaming the corridors of time, searching backward and forward for pattern and order with a compelling purpose. The natural thinking in terms of calculus and control theory is both critical to success and unique to engineering.
Most of the differences between life today and the way it was experienced in the 17th century emerged because of the technical advances that rely on the calculus. It is a glorious engineering tool exposing the rational workings of the world. While calculus is part of our collective intellectual heritage, it is mainly the engineering discipline that deploys it and thereby directly benefits. Before calculus, engineering was a discipline of great wartime interest; afterward it became a discipline of great power.
Calculus is not only the most fruitful strategy for understanding the dynamics of our world ever devised, it is central to the field trips into the future conducted by pragmatic foresight. Even though calculus is a crowning achievement of humanity that all people can appreciate, enjoy and understand, it is the engineering profession that puts calculus to work in its operational reality. It is how, exactly, you got your smartphone.
Calculus originated in our quest to understand motion, which is change in position over time. The calculus drives the mathematical physics that lies at the core of seemingly unrelated issues. The self-consistent nature of mathematical physics highlights an essential difference between it and ordinary language. The latter is about meaning while mathematics is about structure, form and pattern. Mathematical physics is not bound by limitations of physical perception but only by the rules of its grammar, which has an uncanny way of revealing hidden physical relationships. This is why it is so useful for the assault on complexity.
The unique feature of calculus is its means to quantify continuity. No other rational system is available to deal with continuous functions, including time. When engineers employ the calculus to study continuity for goal-seeking, they proceed alone. No other discipline or science is concerned about reconnaissance. Because it precipitates insights to system dynamics, past present future, calculus is the supreme problem-solving tool in the engineer’s workroom. Mathematical physics makes use only of the combinatorial character of the symbols used to express the natural laws. Provability is defined within the cage of its symbols. That’s how zero subjectivity is achieved.
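As a toy illustration of such a reconnoiter, the sketch below integrates a simple first-order system forward in time with the most elementary of calculus tools, a forward Euler step. The logistic dynamics and the coefficients are stand-in assumptions chosen only to show the mechanics, not a model taken from this manual.

```python
# Minimal sketch (hypothetical dynamics): projecting a system state
# forward in time by integrating dx/dt = f(x, t) -- an "inspection trip"
# into the future of the dynamics.
def project_forward(f, x0, t0, t_end, dt=0.01):
    """Integrate dx/dt = f(x, t) from t0 to t_end with forward Euler."""
    t, x = t0, x0
    trajectory = [(t, x)]
    while t < t_end:
        x = x + f(x, t) * dt   # Euler step: follow the local rate of change
        t = t + dt
        trajectory.append((t, x))
    return trajectory

# Stand-in dynamics: logistic growth toward a capacity of 1.0.
growth = lambda x, t: 0.8 * x * (1.0 - x)

future = project_forward(growth, x0=0.05, t0=0.0, t_end=10.0)
print(f"state now: {future[0][1]:.3f}  projected state at t=10: {future[-1][1]:.3f}")
```

The same mechanics scale up from this toy to the full algorithms of dynamics: given the governing relations and the present state, the future states fall out as hard data rather than guesswork.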
The ability, through the calculus, to obtain hard data on future dynamics – in a process scrutably connected to natural law – is what sets engineering foresight impervious to the processes of law. None of the various subjective techniques used in law to circle the truth is a match for a calculus-supported position. Furthermore, the practitioner is able to attain this absolute supremacy by his own labors. He is thus able to manufacture his own authority to act. Foresight prevention law litigation that attempts to counter a position based completely on mathematical physics has lost before it gets started.
For all these reasons and more, the engineering discipline has become the insatiable glutton that translates computer power into dynamics by calculus as fast as more computer power becomes available. Calculus power converts directly into more and better scouting trips into the future. Intelligence amplification has transformed engineering into a discipline diamond hard and diamond bright in its transparency. Reconnoiter competency expands virtuosity. Engineering has every incentive to embrace advancing intelligence-amplification power. It puts engineering into a league of its own.
The application of the calculus in algorithms of natural law that can move system states from the now into the future for the end purposes of design is an engineering process tool. While anyone is free to exploit the same awesome tools, other arts and sciences are ideologically restricted to using the artifacts that result from this objective reconnaissance process. Only engineering is socially legitimized to engage calculus-driven algorithms as pantheon central. Mathematical physics is so fundamental to engineering competency any related advances anywhere are eagerly sponged up and incorporated. Engineering would welcome any such efforts by other arts and sciences as usufruct for escalating advancement. It drives the compelling purpose of engineering to instantiate itself in matter. Engineers are shameless in pinching better practices.
The cynosure of engineering is advancing competency in formulating calculus-laden algorithms of the operational reality. An algorithm, as used here, is a linked series of rules, a guide, an instruction manual, an adjuration, a way of getting things done, the pilot’s checklist. Algorithms constructed from the mathematics of dynamics provide the only horse you can ride into the future to make inspection trips – finding out the true significant variables. All alternatives to the sturdy steed of calculus are defaults to subjectivity – judgment and guesswork on significant variables and invariably wrong. Design is purely an abstract intellectual activity undertaken chiefly for the magical moment in which things that formerly stood distinct and separate fall together in a workable system. Such is intellectual bliss, pale in comparison to other forms of bliss, but bliss still.
Any foresight technique not based overtly on the calculus becomes demonstrably inferior. No other discipline but engineering engages calculus as a fundamental discipline tool. It is the new computer-aided capability to use calculus to a larger degree, and nothing else, which has catapulted pragmatic foresight to such breakthrough levels. Any discipline not married to calculus in the fundamental process of the guild is not able to exploit new computer power to any degree comparable to that of engineering.
The largest advantage, by far, of quantitative reconnaissance is productively operating the cycle of run, break and fix (RBF). In goal seeking, engineers are supremely gifted at making mistakes. The consequences lead to insights as to what to jettison as “bad” and what to retain as “good” features for a fit solution. Since each process cycle contains a stop rule, cycles of this ratchet process cannot fail to satisfy the mission. It is the anodyne of entropy. The faster you work the ratchet, the faster you can reach the goal. This is where, exactly, inspection trips into the future of interest fit in. It is how the essential is identified and separated from the noise and clutter on a production-line basis. Over time, things only get better.
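A minimal sketch of the ratchet follows. The function names, the random variation, and the scalar toy goal are my own illustrative choices, not the author's procedure; the point is only the mechanics of keep the good, jettison the bad, stop when the mission criterion is met.

```python
# Minimal sketch (illustrative names): the run, break and fix (RBF) ratchet.
# Each cycle runs a trial variation, keeps it only if the consequences
# improve (the keeper pile), discards it otherwise (the jettison pile),
# and halts once the stop rule -- the goal criterion -- is satisfied.
import random

def rbf_ratchet(candidate, vary, goodness, goal, max_cycles=10_000):
    best = candidate
    for cycle in range(max_cycles):
        if goodness(best) >= goal:            # stop rule: mission satisfied
            return best, cycle
        trial = vary(best)                    # run
        if goodness(trial) > goodness(best):  # fix: retain only improvements
            best = trial                      # the ratchet clicks forward
        # otherwise the trial is jettisoned and nothing is lost
    return best, max_cycles

# Toy goal: drive a scalar design parameter to within 0.01 of 3.7.
target = 3.7
solution, cycles = rbf_ratchet(
    candidate=0.0,
    vary=lambda x: x + random.uniform(-0.5, 0.5),
    goodness=lambda x: -abs(x - target),
    goal=-0.01,
)
print(f"reached {solution:.3f} after {cycles} cycles")
```

Because accepted steps are never surrendered, the accumulation is irreversible; the only cost of a bad trial is the cycle spent discovering it was bad.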
The institutional attractor has no equivalent cycle for finding and correcting mistakes. Its process is linear so that consequences of its actions do not advise future tasks. Over time, as the record shows, the second law, unopposed, sees to it that things only get worse. Since the institution has no process consequences stop rule or a parachute, it progressively degenerates as entropy increases.
Contemporary compliance
Triage is critical to meeting the “foreseeability” standard of tort (foresight law) and the law’s “rule of reason.” Triage is taking precautions from the start to establish and maintain contemporaneous legal compliance. It is, by far, more efficient to avoid non-compliance than it is to respond to negligence litigation retroactively. Compliance to requirements of the law is not inconsistent with efficient, productive goal seeking. Avoiding non-compliance litigation is entirely compatible with preventing preventable stakeholder damage.
Compliance as you go eliminates the risk of litigation altogether at the same time triage ensures project success. While hindsight will get due assignments, it takes foresight methodology to make the allocation on the basis of brute fact. The “rule of reason” examination involves “an assessment of the totality of the circumstances including an evaluation of all pertinent evidence.” Triage engages the “totality” map and develops the necessary and sufficient evidence pertinent to allocate by incontrovertible reason.
The venerable triage procedure applies at several levels of size, complexity and urgency. To ensure reliability of choice, the information-rich operation is connected entirely to fundamentals – transparent. The triage assessment criteria cover knowledge pertinent for reliable, appropriate selection, i.e., intelligence. The triage process initializes with the generic field of requisite knowledge, “totality,” as a template for gathering and arranging information. The difference between the information possessed and that required to intelligently select, i.e., the field of requisite knowledge, is called the field of ignorance. At some threshold of template information density, all traces of subjectivity are eliminated in selecting the matching process domain.
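A minimal bookkeeping sketch of the field of ignorance follows; the knowledge categories are hypothetical placeholders of mine, since the manual does not enumerate the template here.

```python
# Minimal sketch (hypothetical categories): triage bookkeeping.
# The field of ignorance is the requisite-knowledge template ("totality")
# minus the information actually in hand.
requisite_knowledge = {
    "goal definition", "stakeholder list", "damage scenarios",
    "system dynamics model", "operating history", "regulatory requirements",
}
information_possessed = {
    "operating history", "regulatory requirements",
}

field_of_ignorance = requisite_knowledge - information_possessed
information_density = 1 - len(field_of_ignorance) / len(requisite_knowledge)

print(f"template information density: {information_density:.0%}")
print("field of ignorance:", sorted(field_of_ignorance))
```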
Initialization
For assaults on complexity, such as with prevention delivery, getting off the launch pad in the appropriate direction is crucial. The reason the blastoff vector is paramount is that once a trajectory is selected, inertia takes over. The responsive control knobs you had at the crossroads as practitioner – fall off. It is analogous to the snap of a latch-in relay. The practitioner is the first one on the site and he orchestrates the launch alone. Once a vector is established, because these are attractor phenomena, there is wiggle room for course variation but options are limited by the respective attractor ideology. Once you're in a basin, and it has an event horizon, getting out takes more strength than you possess. Think Black Hole. Inappropriate launches are irretrievable losers.
The role of philosophy is perception husbandry to make “safe” initial choices via triage. Some practitioner’s lessons-learned for choosing necessary and sufficient are gathered here. There is no one best way to start, but there is a definite list of requisites. The trick to this work is to displace your subjective instincts with brute facts. By the time experience has shown the error of your judgment, it’s often too late to recover. The record of the very many executive proclamations for changing corporate culture provides an example. Attempts in the thousands; cultures actually changed – zero. This formidable resistance to disturbance is characteristic of attractors.
Most of the philosophical footings are driven by social system pile drivers. Your technical system skills don't matter much until after the appropriate context is established and secure. The institution does not tolerate your position because of your credentials or your technological prowess. You obtain sanctuary to do the necessaries because the institution fears the penalties of non-compliance more than it distrusts you. There is a lot riding on how you position your strategy. The inclination to delve in and start cutting metal is strong but this predisposition must be held back until triage is done. Getting off on the wrong foot down the wrong road has sent more projects irreversibly to an early grave than any other reason.
The generalized abstractions of philosophy are all there to support your start. However unnatural, the place to start is a coherency check on approach and value systems of the practitioner. There is a conservation law for coherency. The pieces and parts must always add up to unity – no more and no less. Being incomplete is a default to blind chance.
Alignment at primary function gives permission to go on to the level of generalized functions and then to physical functions and then, last of all, to part numbers. Coherency checks are exactly like a Sudoku challenge where elements have to add up to the same number within a level and across levels. The transformation from the prime directive to part numbers must be coherent, vertically and horizontally, all the way. Any error or lapse changes the goal and the consequences.
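To show the Sudoku-like arithmetic of a coherency check, here is a minimal sketch with a hypothetical three-level breakdown; the function names and the share figures are invented for illustration and carry no prescriptive weight.

```python
# Minimal sketch (hypothetical breakdown): a coherency check in the
# Sudoku spirit. Allocations must add up to unity within each level,
# and every child breakdown must account for exactly its parent's share.
from math import isclose

primary_function = 1.0
generalized_functions = {"sense": 0.3, "decide": 0.3, "act": 0.4}
physical_functions = {
    "sense":  {"sensor A": 0.2, "sensor B": 0.1},
    "decide": {"controller": 0.3},
    "act":    {"actuator": 0.25, "interlock": 0.15},
}

def coherent():
    # Horizontal check: the generalized level must sum to the prime directive.
    if not isclose(sum(generalized_functions.values()), primary_function):
        return False
    # Vertical check: each physical breakdown must sum to its parent share.
    return all(
        isclose(sum(parts.values()), generalized_functions[name])
        for name, parts in physical_functions.items()
    )

print("coherent:", coherent())   # True for this sample breakdown
```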
Mindset management begins at home. The primary operative rule here is to avoid pursuits of the impossible. Goal seeking is productive only when it is within the realm of the achievable. Attractor mixtures are instant disasters. To avoid the impossible it is necessary to know where the boundaries are. In the operational reality, there is but one fence and everyone needs a map showing it.
Attractor Awareness
The two-attractor world of humanity is the product of natural law and genetic endowment. As natural law is the law of all experience, it means that the two-attractor template is nested – operative at every level. The use of “attractor” herein refers to the garden variety of attractors – stable basins toward which a dynamic system settles – and not the pop versions of “strange” or “chaotic.” The value of attractor perception for social systems is to simplify what would otherwise appear too complex to comprehend.
There is nothing strange or chaotic about institutional behavior and using the mathematics of attractors makes it easy to predict. How else but by an attractor could institutional behavior be identical generation after generation? Institutional behavior remains the same whether the population considers its effect good or bad. Revolving about the axis of desire and belief, the institution may not know anything about natural law, but it knows what it wants. It is pointless to communicate in mathematical physics to an institution still baffled by long division.
Once natural law is revealed as the prime mover, all the attributes associated with natural law must be exhibited and accounted for. When that fact is coupled with the absolute certainty of natural law, it means that the human record can contain no contrary cases. When you incorporate the two-attractor template into your perceptual reference, you will observe examples in process everywhere. Knowing the principles of attractors, you have acquired a magnificent navigational aid.
In assessing any situation, any condition, it is necessary to determine the attractor field – the sooner the better. Attractor awareness is a critical success factor. The formation and viability of a context is not driven by a chain of command but spontaneously and automatically by natural law. The primary principle is that all isolated state-determined dynamic systems are selective. Whatever state these systems start with, they go towards states of equilibrium. These states of equilibrium are characterized by being exceptionally resistant to disturbance. Attractors have quiescent basins of productive activity – ultrastability. As from the Dead Sea basin, the energy to escape the basin is very high. In an institution, mavericks and loose cannons have no chance.
Since the subject matter and the task actions distinguishing the institutional attractor are composed only of stories and checklists from hindsight, any other syntax is a silhouetted alien target. Institution man doesn't even have to know the topic to make an accurate attractor determination just from grammar.
The syntax of the foresight context is goal definition and goal seeking for which no checklists can yet exist. Likewise, until surveillance into the marked future has been made, there are no stories to tell. This is why the goal must be defined in a set of stories (scenarios) so that at the end the practitioner can communicate in a comprehensible grammar to the institutional attractor.
Except for science fiction, the future, a lottery among an infinity of choices, has first to be inspected to form a story. The great adventures of humanity's famous explorers had to occur before they could become folklore. This is why the imaginary stories of science fiction are so pale in comparison to the discoveries of a Magellan. If there is anything astronomy teaches these days, it is how incredibly unlikely the journey that brought us here has been.
There is no difference between the improbability of history and the future. You can never return to a past event any more than you can repeat the experiment of creation. But, as long as super stability prevails, it doesn’t matter if the general population believes that history is the mark of nature’s preference. In super stability, tradition and case history work as well for navigation as anything else.
———
Application of attractor awareness through triage immediately separates pursuits of the impossible from those achievable. Cement the attractor concept into your perceptual framework – in the first row. It is easy, convenient and without risk of a disconfirming example. It is easy to distinguish the hindsight hierarchy from the foresight hooligans. The thing about concepts rooted in natural law is that they work 100% of the time.
The institutional attractor places a limit on how much cognitive exertion will still qualify as obedience to authority. Institutional hierarchy wants a “push-button staff” that does what it is told but has no opinions or feelings of its own — certainly none that would lead it to question its leaders’ leadership. It wants soldiers that wait calmly, even apathetically, until told to do something – and then act instantly and unquestioningly. It wants a staff that takes prescribed precautions cheerfully – and takes no other precautions period. This is not the stuff of pragmatic foresight.
The goal-seeking ends attractor is a world swimming in all manner of ingenuity as necessary to fuel the RBF engine that gives it life. Of what use is ingenuity to the realm of means-oriented rules-based hindsight? To those ensconced in the institutional attractor, an idea can be at most no help at all and quite likely interpreted as a mark of disloyalty. There is no value in attempting to make goal-seeking ideas accessible to those forbidden to acknowledge them by thought, word or deed.
The key to understanding institutional attractor dynamics is what contemporary psychologists and neurologists refer to as an “affect program.” These programs are a syndrome of neurological, muscular and hormonal reactions that are hard-wired. The programs are event triggered and once initialized the program executes beyond any voluntary control. Affect programs, including obedience to authority and hostility towards anything that threatens it, such as pragmatic foresight, hold the institution together. The practitioner secret is to look for these affect programs as progress benchmarks. They are entirely impersonal and quite reliable.
The devout hindsight process featured in the institutional attractor has strict limits of effectiveness in solving problems. Denial of those limits does not change them. If the times are steady and stable with disturbances being handled by the institutional attractor at a rate no less than the rate of incoming disturbance, the god of hindsight can reign supreme – until further notice. Using hindsight exclusively for controlling disturbances brings no harm to the population. Thinking that hindsight is all there is to intelligence may not help matters any but it doesn’t hurt anything. The significance that business as usual has no stop rule is dormant.
The hindsight plan for living works fine as long as incoming disturbances fall within the envelope of hindsight and crisis response competency. Significant problems arise when disturbances intrude that show no respect for the boundaries of institutional effectiveness. There is no graceful degradation. There is no gradual, manageable reduction in productivity. The situation implodes.
The institution limits cognitive demands to “operator” level and less. It wants workers to be skilled operators but remain clueless about design and control. Rule-based behavior is the badge of loyalty while curiosity about goals and consequences triggers suspicion. In every social context you encounter, classify the issue context as hindsight or foresight oriented.
The cognitive “quantum rings” of social organization are omnipresent. Computer gaming provides a blunt exhibit. The folks that buy and fixate on sophisticated computer games confine themselves to operator level. Tournaments staged by the computer game industry award prizes on the basis of operator skill. The folks that design computer games, which are dynamic simulations, occupy an intellectual and cognitive demand realm unintelligible to the players. If you were to put the game design engineers into the same room with those who operate their simulations, the gamers wouldn't understand a word. You can test this by asking any gamer about the design of the software/hardware that provides his entertainment. The switch to hostility is immediate.
If hindsight reigns, you know the maximum cognitive exertion allowed is well below design level. Limit your mental injunctions to operator level (the lower level is called experiencer level) and flak will be minimal. More cognitive demand can be loaded on the maintainers, who have to understand their systems in detail and think in order to diagnose and repair.
If foresight reigns, the cognitive demand is inescapable maximum. Design consumes all the cranial capacity available. This condition makes distinguishing hindsight from foresight realms rather easy and quick. If you somehow err in classification, the context will promptly inform you. There is and there can be no higher cognitive demand than design. System design work consumes everything thinking-wise the brain can deliver. It always did.
The existence of the pragmatic foresight context and its essential role in design is both undiscussable and widely known. The logic of many positions regarding crises of the times is ludicrous without the link of foresight engineering. One recent example was provided by Intel’s CEO Craig Barrett.
On an Internet video, he points to the astonishing improvements in efficiency and miniaturization in Intel's semiconductors, which around 1972 came loaded with 2,000 transistors that could be seen with the naked eye. Today's integrated circuits, 11 generations down the road, bear 1-2 billion transistors that can be seen only with a scanning electron microscope. Intel has had to make other improvements too, says Barrett, as they moved into the nanoscale, attempting to improve functionality and performance without increasing power dissipation. Dual and quad core microprocessors now permit parallel computing within a single PC. “The challenge is in the next six to eight years, going to exascale, getting up to a million teraflops,” through multiple core processors, he says, and then there will be a “huge challenge in terms of software paradigms.”
This source of computer power is announcing an endless increase in the performance of its products. These changes must come, says Barrett, if the world is to confront its “grand challenges,” such as making solar energy affordable, solving issues of carbon sequestration, and figuring out the hydrogen cycle. Those extra teraflops and exaflops will also prove essential to the next generations of visual computing, where scientists (and gamers) want the feel of HD reality on their computer screens. Barrett says silicon photonics will help pave the way for such improvements.
Operational Reality Acceptance
A good place to start is to wonder at why prevention delivery should be such an emotionally stressful issue to the institution in the first place. As derived in the Platform, contemporaneous compliance to prevention law is not only the best way to deliver prevention; it is, by far, the most economical. Contemplate why the institution deliberately maintains a process, status quo ante, corrupted at its source, inferior and noncompliant – against all experience. Appreciate that you are not operating on a level playing field. The dice are loaded, the cards are stacked, and the game is fixed. If it were an organizationally neutral engagement, your foresight attractor services would not be needed.
The most successful strategy for arranging the requisite context for prevention design takes the ubiquitous institutional behavior profile and the humanity that energizes it as an unchangeable prominence of social conditioning and genetic endowment. Universal propensities and proclivities of the institution are, by its ideology, hostile towards efforts to prevent foreseeable stakeholder damage. It is not in the nature of an institution to allocate resources to avoid damaging anything but itself. That is why there are prevention laws and why your services are engaged in helping the institution comply with those laws – nothing else. To the institution, preventing stakeholder damage is a deeply resented cost of doing business. Should you forget that, the institution will remind you.
The process by which the institution gets itself entropy saturated is well known to the anthropologists. When disturbances the institution can't escape from do not respond to hindsight institutional processes, the hierarchy becomes sensitized that the condition presents impossible choices. As with Congress and the immigration mess, when impossible choices have to be made, the institution chooses to avoid making choices at all. Neglect provides the opportunity entropy needs to escalate. The immigration snafu spontaneously becomes a fubar.
Understand clearly that institutional hostility is directed at the process of prevention system delivery. It’s not the operation and husbandry of prevention systems that institutions revile – quite the contrary. Once delivered, the institution will adopt the system of prevention as a child of its own. All traces of the tumultuous design process will be erased from the record books. For example, no one now remembers that it was Lee Iacocca who convinced the President to veto mandatory air bag legislation.
The dance of the two social system attractors can clearly be seen in how the institution relates to the work cycle of preventing foreseeable damage to stakeholders. The fossil record is filled with examples. The modern automobile is loaded with dynamic damage avoidance (e.g., ABS) and occupant-protection gear (e.g., airbags) that are now considered standard equipment produced by standard institutions. The lobbies sent to Congress demanding relief from seatbelt requirements now serve other Establishment needs.
The process of delivering prevention ignites the ubiquitous attractor fireworks. The institution rails against the work of pragmatic foresight because engineering the future by intelligence is, on many counts, an institutional anathema. When the necessary work to deliver effective prevention is benchmarked against institutional ideology, it details why the institution freaks out. Affect programs drive this hostility.
The institution well knows it can't do the foresight necessaries of prevention design. If the process of delivering effective prevention could be done by rules, prevention law compliance would be quite routine. Having an illusion of omnipotence to uphold at the same time, it doesn't want its stakeholder-damaging limitations in competency made public. The many disturbing complexities important for the institution to resolve, however, require good intelligence about future dynamics. Therein lies the rub. As history shows, all attempts to bring risk-informed decision making to the head shed have failed. The institution doesn't want to know how faulty its information systems are and it certainly doesn't want an example of reliable intelligence-based decision-making going around the hallways entertaining the troops.
One example of the limits of hindsight law to prevent preventable stakeholder damage is healthcare appliances and drugs. The public is generally unaware that producers of the drugs and devices approved by the FDA have been granted preemption of litigation. Once approved, the producer can operate with impunity. Drug manufacturers are not required to report or respond to issues that develop after approval. The public thinks the FDA is responsible for their safety, not appreciating that the regulatory agency disclaims all responsibility for outcomes. As with all regulatory bodies, the FDA cannot be held accountable for malfeasance through litigation. The same law also provides those with FDA approval with protection from legal retaliation. Accordingly, obtaining the FDA seal is free rein for both regulator and regulated to menace society. Congressional hearings on this disconnect are televised weekly to advertise the damage.
Efforts spent to inform the institution of any other reason than compliance to the law for incentivizing prevention delivery have nothing to gain and much to lose. There can be no practitioner obligation for getting institutional “buy-ins” or persuading misguided hierarchs to become angels. The signature trait of institutional ideology, classic autism, will host the project product when the work of prevention system design is released. In great contrast to crisis responders, prevention system design work always proceeds in a hostile environment. To remain fit for use after release, it must accommodate operators and caretakers that will relate autistically. This is why, exactly, the pragmatic foresight attractor is the only context where prevention design can proceed. There are no contrary examples.
This condition of prevention design is why so much fuss is made after a calamity by the institutionalized champions of prevention. There is a small post-disaster time window where the institution will feign concern by launching prevention initiatives. As soon as the media fuss over the crisis responders passes, the Second Law takes care of the rest. Stuck in the wrong attractor, the initiatives are soon granulated to oblivion. This is why Challenger was followed by a Columbia. This is why New Orleans remains catastrophically vulnerable to Category 4 hurricanes. It makes no sense whatsoever to be pushing and persuading an institution to act in a way contrary to its nature. This doomsday trail is littered enough with human debris. Adding your own to the pile is, at the least, very unprofessional.
Accept institutional ideology and the humanity profile that occupies its fortress turrets – as exhibited non-stop throughout recorded history – as is. Don’t push in any vector other than that selected by the institution on its own. Their lot is obedience to authority: your lot is goal seeking. You inform, professionally; the institution consents. Incompatible and complementary – your Indiana Jones adventure.
Competency limits of institutional ideology
We are what we repeatedly do. Aristotle
Institutional business is conducted on a highly emotional basis, void in times of crisis of any coherent system of control. Any system regulated by emotion is necessarily focused on hindsight and all matters subjective. This preoccupation limits the span of competency to negotiate the incoming disturbances to status quo ante. These limits are to be respected by quietly accounting for them in your strategy. A strategy that requires a change to corporate culture in order for the mission to succeed is not a strategy; it is a wish.
One of the many indicators that the institutional attractor is in over its head is the use of the “case history,” a story, to narcotize the students to believe that case history “points” at the principles which gave rise to it and that by going through enough case histories you will somehow absorb the principles. When case history is the primary mode of instruction, as Harvard Business School uses to train executives, it signals that the instructors are clueless as to the drivers. Case history cannot teach the principles by which it developed. It only has value in illustrating the stated principles. If you knew the principles, you wouldn't have to resort to the case history ruse.
It is difficult to find an initiative more futile than pushing an institution to do what it cannot. If your excuse for doing so is ignorance, you are in violation of the professional engineer conditions of license. The global monotony of institutional news, generation after generation, attests to the immateriality of the institutional roster. Institutional ideology and its manifestations do not depend on a particular cast of players. Solidified in hindsight methods featured in its attractor, an institution allows the acid of its own devising to drip on itself. The quality of life in an attractor falls by entropy to the level set by the population group that accepts the lowest standards.
The practitioner’s rule to not ask the institution man to do what he cannot requires knowing the borders. Plan your strategy and arrange your tactics to call for support only when your task requests fit within the limits. Breaching limits is the most common practitioner error and it progressively degrades productivity. Members of the institutional attractor know their authorized range of motion exactly and are annoyed by requests to exceed it. When the same individuals become legitimate members of the foresight attractor cohort, however temporary, you can assign any missions you wish.
The institution cannot do whatever it says it wants in order to achieve whatever it says it wants. The range of institutional motion, confined by its ideology, limits the range of whatever it can want to achieve. When the institution faces the choice of purpose or submission to its rules, it automatically chooses obedience to authority. When the stated purpose of the institution exceeds its range of motion, POSIWID converges the various stated purposes to one, i.e., damage stakeholders. Because that's what it does. In mismatch conditions, things only get worse with time.
The institutional bind is conspicuous and it has no stop rule. Ashby's Law of Requisite Variety dictates that the joint and several complexities plaguing our great institutions just keep growing. Occasionally the growth becomes media-worthy news. The Congressional hearings resume, ending with the message that more money or more institution will get the job done – even hearings about the massive fraud and waste of Iraq contractors. No one ever asks: if the Establishment has the power to control the situation, why didn't it act to prevent the damage in the first place? If you actually have the power to regulate, why didn't you preempt the damage? You mean your power to remedy only works after the damage is done? You mean you had issue control and let this issue get out of hand? If you don't have disturbance control, what makes you think more institution is the remedy? Believing that crisis response is equivalent to prevention has set up a lot of stakeholders for calamity. Institutions have done a remarkable job, like Vegas, in convincing a gullible public to that effect. The scam is in plain sight and no stakeholder admits seeing it. Welcome to the casino.
As the arrow of entropy enlarges the challenge with time, what the institution is potentially capable of doing in the future – its problem-solving capacity – decays. In the latter stages, when entropy explodes, institutional potentiality is lower than its current capability. Since the ideology of pragmatic foresight is purpose – ends, POSIWID shows alignment of goal-seeking to mission. In matchup conditions, things only get better with time.
Section 2
Institutional Interaction Aids
Accept Full Responsibility
Foresight engineering succeeds with prevention design because it takes full charge of prevention content, context and process as a bundle. The responsibility is not granulated and dispersed, anticipating the circular blame carousel sure to come. There is no risk transfer, no reliance on hindsight agencies to contribute vital information, and no hierarchy. Take this inevitability as a condition of design. When the social system transfer function common to all institutional attractors is accepted as is, rejoice. You are still ensconced in the foresight attractor, free to do whatever you think best, and by your own efforts you can spare yourself the agony of defeat. Ways are usually found to get the job done anyway.
As competency of the foresight attractor advances – while hindsight scope remains locked in place – the job gets more straightforward. It is already easier to deploy pragmatic foresight and get the right job done right, even with the high intellectual investments of full responsibility, than it is to get bogged down in trying to get hindsight to do what it can't. It amounts to an exchange of intellectual effort for maintaining psychological health – and a bargain. Besides, law has an elaborate dance of the seven veils, although extremely unlikely to be summoned, that will nail you with the responsibility anyway.
Taking full responsibility means that you will have to handle the entropy reduction task by yourself. The practitioner must forge the project to the requisite low-entropy work summit. At first, he is the only one wallowing in the pragmatic foresight attractor. Working within institutional ideology, the goal of the “Platform for Prevention” is to help orient the institution to that requisite without triggering institutional defenses.
Use proven process, order of battle
The key to practitioner success is to delay all technical system work, which you are quite good at, until the critical social system matters, for which you are maladapted, are settled and stable. Premature technical system work is a fatal error that locks in the hindsight attractor as controlling. In prevention delivery, the context issue comes before process and content. The requisite working environment for prevention delivery cannot be left to chance. To ask the institutional attractor to do that which it cannot is to lose. To divert objectives to the wrong attractor is to lose. To not leverage spontaneous action is to lose. This mandate is reflected herein where application of the algorithms of systems engineering technology, the comfortable satisfying part, comes dead last.
Intuition in engineering must always be structurally supported. When the social system minefield is cleared for appropriate procedures, you will find that all the systems engineering stuff works. Because of its run, break and fix (RBF) basis, the goal-seeking methodology is shaped over time in increments. It doesn't matter how it goes at any particular moment. The RBF principle of engineering is so powerful and robust that any type of variation or trial, whether guided by reliable foreknowledge or not, followed by the discarding of the “bad” or “unfit” trials and the retention of the good, will result in progress. The fact that successful steps are retained leads over repeated cycles to an irreversible accumulation, a “ratchet effect” which only allows overall movement in a goal-seeking direction. Nifty.
As long as you operate by intelligence-driven RBF, trial and error, suck and see – the initial schemes don’t matter. It is not necessary to work out a detailed systems engineering strategy that all subscribe to, in advance. RBF tolerates and accommodates all varieties of philosophy and approach to the complexity assault. There can be an illusion that only by a commonly shared value system can success be attained. Getting down to work quickly dispels this notion. A few cycles of RBF and convergence to the best takes care of itself. Rule-based behavior is linear-think and, as such, inaccessible to the RBF entropy-reduction ratchet. Since the hindsight attractor is linear, it does not “allow” failure. No failure, no RBF.
As your technical competency potential has been expanded by IT, so too has your toolbox for dealing with the sociological constraints. The same natural laws and engineering principles of systems think have enriched your toolbox to deal effectively with institutional roadblocks. If you go at it right, the law and your conditions of license can serve as allies. While considerable, thankless effort is involved, the professional engineer today can engineer context control. Unintentionally of course, law is the only motivator available to bring foresight technology to bear on prevention. To push intelligent prevention design as the “right thing to do” is a colossal waste of time. Keep the topic of stakeholder protection morals and ethics on the sidelines.
The invention, design and delivery of effective prevention systems are intrinsic components of the venerable engineering design process. There is no such thing as a damage prevention discipline functioning apart from engineering process itself. While no engineering discipline exists dedicated to crisis response either, all system designers are safety engineers. There is no way to segregate, for independent treatment, any of the attributes of safety, quality, performance, transparency and economy of the pragmatic foresight process of engineering. When the designer of prevention reconnoiters into the future of concern, the system dynamics observed include all interactions and behaviors as a package – as automatically integrated by natural law.
Mathematical physics is indifferent to the distinctions between the future and the past. The algorithms that enable the ride into the future are time independent concerning what has come or what is to come. Any contact made by the foresight engineer with the future is symbol mediated. The reconnoiter into the future is a device of fastidious formality serving to keep track of the associated symbols. Their meaning is ignored. As the record clearly shows, no matter how the institution decomposes the system functions and attributes for separate treatment, large or tiny chunks, the requisite methodology of entropy reduction is one and the same.
In working on prevention delivery, you always have to do something else first. Before design can safely proceed there must be a good grasp of just what, exactly, is to be prevented and it has to have a scrutable pedigree. The answer to what comprises damage comes from the perceptions of stakeholders. Even though they are often institutions, the array of stakeholders to protect comes from goals and the human network tied into them. The matter of collateral and propagated damage depends on reciprocity and the strength of the various relations among stakeholders. When the practitioner first arrives on the scene, no one will have any idea of goals or a stakeholder’s list. It is perfectly safe to assume the stated goals are materially defective and received knowledge about stakeholders is worse than useless. These are just not data that ordinarily matter to an institution.
The practitioner’s role, like the roller coaster scheme, is to lift the whole train up to the elevation where the remainder of the trip, thanks to the reliability of natural law, will be automatic and spontaneous. In the foresight basin, the actual work done is always between capability and potentiality. It is common for applied pragmatic foresight to raise guild “potentials” during the course of the engagement. This means better methods and tools are constantly sought out, regardless of disappointments. Advancement encounters speed bumps on the goal-seeking path as not all design method experiments succeed. What counts is an assured destiny – reaching the envelope of demand on the back of “Old Reliable” RBF.
Embrace the engineering take on the institution
As I grow older, I pay less attention to what men say. I just watch what they do. Andrew Carnegie
The addiction to navigation by hindsight is revealed in the ratio of foresight stuff to hindsight stuff used for decision-making. Since the purpose of social conditioning is perpetuation of the species, it would be expected that the ratio would vary as a function of novelty and the severity of possible threats on the yellow brick road to Emerald City. The lessons of POSIWID clearly show that institutions are more than willing to liquidate in order to protect the value most cherished – privatization of profits and socialization of losses.
The institution gets its action profile from elements that share the same profile. Contexts evolve as a function of compelling purpose. For institutions, it is the addict’s value system. You cannot help any system in denial. Entropy will out. Institutions grade subject loyalty by immediate purging of alien material just as much as by obedience to authority. There is no master control room that manipulates the individuals composing the organization to an action profile that conflicts with their genetic endowment. Like a hologram, every piece of the institution contains the whole image. Every soldier contains the instructions for behaving as a general.
It is crucial to understand the characteristic equation of the institutional context – that which makes it an attractor. It may seem backwards, but the overriding benefit is avoiding pursuits of the impossible. This does not mean that evading certain failure equals success. What it does mean is that pursuit of the impossible is flat-out fatal to goal achievement. The idea that the institution/government can somehow know, using only hindsight, the future’s possibilities and can and should control the future’s unfolding, was labeled the “fatal conceit” by Friedrich Hayek. The institution operates to enlarge its supervision of ideology, narrowing individual choices in the name of institutional wealth.
Since the institutional context is so hostile to the requisite context for prevention design, the first order of practitioner business is to address the sociological constraints. The seemingly inordinate amount of attention to sociological affairs in the standard of care is necessary because getting the context of work right is a condition for proceeding with design. Put the other way, proceeding within the wrong attractor is certain death of the mission. The battles over context selection can only be won upfront. Once serious work gets underway, the institutional context is supreme. Too late, you lose.
Presenting the brute facts about the institutional profile of behavior is in no way a disparagement. To attempt to change an attractor is a mission that goes well below zero payoff. All attempts fail. The least your impetuous raid can do is annoy the institution; the most it can do is ruin you. A historical parallel to this context challenge can be found in the Court Jester era. In many medieval monarchies, this was the only safe way for the general staff to sneak facts concerning the true state of affairs to the king. Unlike today’s corporate mavericks who are allowed to live after their stint, jester careers were remarkably short. History does not disclose how jesters were recruited and trained for the job at Jester University.
The strategy of change or else has nothing to recommend it. Avoid any criticism of the hindsight attractor. Treat the matter as a condition – not as a problem to be solved. In the hindsight attractor half the time is spent learning the ropes and the other half leaning against them. Reshaping the institution is never an intermediate goal; the goal is the goal. You must develop tactics that take the immutable condition into account while seeking the objective. Foremost the design for prevention is a tactic for avoiding setback and defeat at the hands of institutional process. In mismatch, the hierarchs deserve sympathetic understanding.
The stratagem for design success is unhampered systems engineering RBF (run, break and fix). The practitioner job is to remove anything that inhibits the free flow of the ratchet process for as many cycles as it takes. Your job is to continually improve ratchet productivity, reliability and cycle speed. Remember, the institution operates on linear. It cannot countenance cycles.
The characteristic equation for the institutional attractor, its operational genetics, includes the following DNA. These patterns of exhibited behavior reflect those issues important to the foresight attractor. If you assume the institution will react as the DNA suggests, you will never be disappointed. Foremost, institution man is addicted to hierarchy and hindsight – autistic grade.
An abbreviated directory of characteristics includes:
- Addicted: in perpetual denial; defensive of institutional ideology and hostile to perceived threats
- Autistic: no corporate conscience; closed to disconfirming evidence
- Perceptually limited: nearsighted, blinkered and astigmatic
- Communications impaired: legends, history and task action checklists; no knowledge of information quality
- Obedient to authority: rule-based, operator-level cognition, linear think
Addiction: The institution is addicted to itself to the exclusion of all else. In turbulent times it constantly needs a “fix” to maintain its illusion of omnipotence. It is perpetually in a state of denial of its addiction, responding exactly as do individual addicts. All the psychology of addictions is fully applicable to the institutional attractor. The foresight attractor is just as addictive to itself.
Autism: All the psychology of autism fully applies to the institution. It is an affliction contracted from linear think. The more knowledge you gain about classic autism, the more you will observe the congruency. Never practice methods known to fail on autism.
Perception: The institutional attractor is sharply restricted in what it can “see” and how clearly it can see it. A great spectrum of knowledge is blocked out of view. Institutional tunnel vision is undiscussable and its undiscussability is undiscussable.
Hindsight and Hierarchy: The syntax of institutional communications reflects its guarded perceptual restrictions. Its cognitive range of operations is limited to hindsight and rules of action in checklist form. All descriptions are legends, stories and case studies. The institution cannot understand information not in story form attended by a checklist of task actions. It ignores information quality. The language of the foresight attractor is perceived as useless gibberish – which it is. In the foresight attractor there are no legends and task action checklists last no more than a cycle.
Loyalty paramount: Any cog in the institutional attractor seeks to exhibit the attributes considered loyal by the institution. This means a maximum cognitive level of “operator.” The focus on loyalty denies the vast difference in knowledge needs between hindsight and foresight. In pragmatic foresight you have to know a lot of stuff hindsight has no use for. Thereby ignorance is a badge of loyalty.
The degree to which the system rejects the stuff it needs to prevent damage is the degree of its addiction to hindsight and hierarchy. As for any addiction, denial reigns supreme. It repels anything new or critical to understanding the limitations and the consequences of ignorance. You cannot help a basin in denial, addicted as it is to hindsight and the chain of command. What you can do is avoid the angst associated with futile attempts to alter it.
Stakeholder Spotlight
Once the mission profile is described quantitatively, stakeholder identification can proceed. A complete network of stakeholders, where the definition of damage begins, must be compiled. It’s a big job. A stakeholder is any entity nonreciprocally subject to the consequences of institutional operations that would be entitled under the law to a litigated proceeding. The institution can learn the identity of its stakeholders by waiting for legal actions. You do not have that luxury.
That portion of consequences which meets the legal definition of unforeseeable reduces this field of concern. The practitioner is as careful to define “unforeseeable” as he is to map those zones on the field of ignorance that are foreseeable by the best available technology (BAT) of pragmatic foresight. Advances in the BAT will change the boundaries during the course of the engagement. All targets creep. Competency advance is a ratchet process where experience is used to discard the bad and retain the good for the next cycle. Continue the cycle. It can’t miss.
This dynamic is a significant problem now for large civil engineering projects. The capability to go better, faster, cheaper – without adding to stakeholder loss exposures – improves during the work. By the time the projects complete, the realm of the foreseeable has doubled from that at inception. This escalation factor is handled by the prevention design process through anticipation and husbandry. It is why the dynamics work of design is embedded into the system controls as a benchmark, with intelligence such as neural networks used to automatically improve the reference model as operating experience is gathered.
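A minimal sketch of that last idea, with assumed names and illustrative numbers only: an embedded reference model serves as the benchmark and is revised as operating experience accumulates. Exponential smoothing stands in here for richer learners such as neural networks.

```python
class ReferenceModel:
    def __init__(self, expected_response: float, learning_rate: float = 0.1):
        self.expected_response = expected_response   # the embedded benchmark value
        self.learning_rate = learning_rate           # how quickly experience revises it

    def update(self, observed_response: float) -> float:
        # Move the benchmark toward what operation actually shows.
        error = observed_response - self.expected_response
        self.expected_response += self.learning_rate * error
        return self.expected_response

# Each operating cycle refines the benchmark the system controls compare against.
model = ReferenceModel(expected_response=100.0)
for observed in [104.0, 103.0, 106.0, 105.0]:
    model.update(observed)
print(round(model.expected_response, 1))   # 101.6: drifting toward the observed ~104
```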
Because prevention design works from the possible damage scene backwards in time to the potential causes, called event trees, stakeholder damage pours the foundation of work. Damage that is likely to be perceived by stakeholders as plausible actionable damage comprises the scope for prevention design. To survey this terrain, stakeholders have to be identified and evaluated. The key process is parallel to the now-routine task of accident reconstruction. The scope of plausible actionable damage will expand during the project as well as during the life of the system.
Starting from the wreckage at the scene and working back through the pathways allowed by natural law to arrive there, the movie of the damage generator can be mathematically constructed, frame-by-frame. Once the sequence of events that can give rise to the wreckage is determined by running the virtual clock backwards, design schemes to preempt the sequence can be evaluated by running the same virtual clock forward. This approach is the cornerstone of intelligence-based prevention system design, a.k.a. pragmatic foresight. This is the process that will benchmark any specification of the “foreseeability” standard of care past, present and future. It is incontrovertible, timeless and robust. All other schemes are demonstrably inferior. It is by demonstration, exactly, that reconnaissance of the future is held supreme.
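A minimal sketch of the clock-reversal idea, using hypothetical events and a toy cause table: run the virtual clock backwards from the damage state to enumerate the pathways that can produce it, then replay each pathway forward with a candidate prevention measure in place and check whether the damage state is still reached.

```python
# Hypothetical events: effect -> possible immediate causes
causes = {
    "basement flooded": ["levee breach", "sump pump failure"],
    "levee breach": ["storm surge"],
    "sump pump failure": ["power outage"],
}

def reconstruct(damage, chains=None, path=None):
    """Enumerate causal chains (the virtual clock run backwards)."""
    if chains is None:
        chains = []
    if path is None:
        path = [damage]
    for cause in causes.get(damage, []):
        new_path = path + [cause]
        if cause in causes:
            reconstruct(cause, chains, new_path)
        else:
            chains.append(list(reversed(new_path)))   # root cause first
    return chains

def replay(chain, blocked_links):
    """Run the same chain forward; a blocked link stops propagation."""
    for earlier, later in zip(chain, chain[1:]):
        if (earlier, later) in blocked_links:
            return False            # prevention measure interrupts the sequence
    return True                     # damage state is reached

scenarios = reconstruct("basement flooded")
prevention = {("power outage", "sump pump failure")}   # e.g., battery backup on the pump
for chain in scenarios:
    print(chain, "->", "damage" if replay(chain, prevention) else "prevented")
```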
The advantage of the damage scenario reconstruction procedure is illustrated by the global exposure to large asteroid collisions. Computing back from earth’s craters associated with mass extinctions, the range of objects of interest was determined. This step was followed by a search for those objects hurtling about space eventually having an orbital rendezvous with earth. The further out the object can be detected, the easier it is to intervene. Catch them early enough and asteroid trajectories can be altered by simple, low-cost measures. If you wait until the approach to earth is imminent, preventing the collision becomes impossible. This attribute is built in and is parallel to the issues of energy, healthcare costs, immigration, pollution, water, education, conservation, terrorism, infrastructure, global climate change, etc.
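A rough, first-order sketch of why lead time dominates, using illustrative figures only: a small transverse velocity change delta-v applied with lead time t buys a miss distance of roughly delta-v times t, so the earlier the nudge, the smaller the nudge required.

```python
delta_v = 0.01                       # m/s imparted to the object (illustrative, not mission data)
earth_radius = 6.371e6               # m, a minimum useful miss distance
year = 3.156e7                       # seconds per year

for lead_years in (0.5, 2, 10, 30):
    miss = delta_v * lead_years * year          # first-order miss distance ~ delta_v * t
    print(f"{lead_years:>4} yr lead: ~{miss / earth_radius:6.2f} Earth radii of deflection")
# Caught early, a tiny nudge suffices; caught late, no feasible nudge does.
```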
This set of issues, ignored, has quietly passed the inflection point where massive protracted damage can no longer be prevented – by anybody or anything. The window of opportunity to prevent the damage by low-disturbance, low-cost measures closed, unheralded, decades ago. The fact that this ubiquitous condition is nested right down to home base signals human predispositions married to natural law. It is a consequence of navigation by hindsight. “If it ain’t broke, don’t fix it” passed its sell-by date and no one noticed.
The practitioner of foresight engineering is endowed with the dispositions of humanity, of course, but he is also able to handle them in a more objective manner – especially when goal attainment is at stake. While the predilections associated with genetic endowment are instrumental to viability, it has become essential to step back and examine these inclinations in the spotlight of the issues at hand. In too many cases related to species survival, the automatic, instinctive response is no longer getting the job done. The least cognitive demand that will support continued existence has gone up.
Knowledge about stakeholders and their actionable exposures is developed in several ways. Since the institution has no interest in process consequences, it can hardly be bothered by stakeholder concerns. Usually, the practitioner can leverage several uses of the stakeholder knowledge. The design requirement here is to examine stakeholder risk of incurring damage – as defined by the stakeholder.
Risk, the noun, is characterized by the form:
Condition A is endowed with a transformation pathway that is triggered by chance T, producing a situation B – where A is an acceptable system state and state B damages stakeholders.
If condition A were not transformable into situation B by trigger T, there would be no dice to toss. Risk involves a chance the trigger for transforming A into B will not get pulled. If the activating events are not a lottery, there is a continuous condition from A to B and no risk. Chance must be in the delivery chain of B. If it is continuous, the issue lands in the engineering realm of process control, not prevention.
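A minimal sketch of the form above, with hypothetical names: condition A, chance trigger T and damaging situation B held in one record, plus the test that separates genuine risk from continuous degradation belonging to process control.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    condition_a: str          # acceptable system state
    trigger_chance: float     # probability per period that chance trigger T fires
    situation_b: str          # stakeholder-damaging state

    def is_risk(self) -> bool:
        # Chance must sit in the delivery chain of B for this to be risk.
        return 0.0 < self.trigger_chance < 1.0

flood = Risk("river at normal stage", 0.02, "levee overtopped, basements flooded")
corrosion = Risk("pipe wall thinning steadily", 1.0, "pipe wall below code minimum")

print(flood.is_risk())       # True  -> prevention design territory
print(corrosion.is_risk())   # False -> continuous A-to-B, process control territory
```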
To deal with risk, you must define all the B situations and that means finding out the details of what triggers and comprises stakeholder damage. A stakeholder is one who can take legal action about damage sustained from institutional operations at A or B. You bound the A conditions by the Bs and that is the design basis for prevention. It is best to characterize a stakeholder as an institution. When it comes to understanding their goals and consequences, a stakeholder is functionally no different than the institution that damages it. How else could humanity’s mess have evolved?
Risk is real. Risk can only be engaged by first making it tangible “part numbers” real. Whatever it is that comprises what you consider a risk cannot be “controlled” by abstractions. While individual risks are classified in arbitrarily named sets, like floods, and protection measures are likewise classified in arbitrarily named sets, like levees, the prevention of flood damage to your basement depends on what has been done in your yard. There are no generic, blanket risk attenuators. All prevention is local.
The risk-transfer industry flourishes on the treatment of risk as ephemeral. The more business can be conducted by ill-defined abstractions, the more control the insurer retains in selecting which specific claims it will actually settle – if any. For example, by deliberately breaking up event-caused damage from a hurricane into separately insured abstractions, such as wind vs. water damage, the insurer can deny any claim. Insurance amounts to a nonreciprocal risk. It is in principle equivalent to casino gambling.
It is best and necessary to get the client to express and define just what laws and litigation it deems most significant for compliance. Then, connect whatever you do to their profile of significance. Since what has to be done does not vary with fears and aspirations, it doesn’t matter to you how you phrase your approach. Eliminate all reference to morals and ethics regarding safety/quality et al. We hold stakeholder welfare paramount because the law explicitly requires us to, not because we are more moral.
The standard checklists of risk have evolved, of course, to serve the vested interests of the institution and its hierarchy. As the record of results shows, the usual fuss over risk management is a waste, a diversion, and often counterproductive. The fact that risk is leached from history as fast as it forms does not deter the institution from its preoccupation with hindsight methodology centered on obedience to authority. It well knows that pragmatic foresight bypasses the chain of command. As all who work with the future know, the realm where uncertainty resides requires the efforts of masterless men engaging systems engineering practices forbidden under institutional ideology. By definition, two attractors cannot occupy the same space at the same time.
When the stakeholders and their damage particulars have been identified, a matrix of stakeholder interactions is engaged. While damage can be inflicted directly by the institution, the network of relations among stakeholders provides pathways for propagating collateral damage. In stakeholder work, it is necessary to assume the institutions of stakeholders, those being damaged, are in no way operationally different from the institutions causing the damage. Roles could be seamlessly reversed in a heartbeat. For many stakeholders, if they had their informational act together they would never have entered into a lop-sided nonreciprocal relationship in the first place. The institutional attractor characteristic equation is identical across the board. The characteristic equation for the pragmatic foresight attractor, so incompatible to institutions, is likewise identical across the board. It is not necessary to learn a third.
The stakeholder matrix is a tool in the system study that includes communications, lags, information quality, and the consequences of error in transmissions. When the matrix task is near completion, it is time to define the design basis scenarios. It is the sole responsibility of the institution to define the boundaries of satisfactory and acceptable operations in the extreme of design basis events. This is the objective definition of goal you can affirm on your own.
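A minimal sketch of a stakeholder matrix, with hypothetical stakeholders and coupling strengths: each entry gives the strength of the relation along which damage inflicted on one stakeholder propagates as collateral damage to others.

```python
stakeholders = ["plant operator", "adjacent landowner", "municipal water utility"]
# coupling[i][j]: share of damage to stakeholder i that propagates to stakeholder j (0..1)
coupling = [
    [0.0, 0.6, 0.3],
    [0.0, 0.0, 0.1],
    [0.0, 0.4, 0.0],
]

def propagate(direct_damage, coupling, rounds=3):
    """Spread direct damage through the relation network for a few rounds."""
    total = list(direct_damage)
    frontier = list(direct_damage)
    for _ in range(rounds):
        spread = [0.0] * len(total)
        for i, dmg in enumerate(frontier):
            for j, weight in enumerate(coupling[i]):
                spread[j] += dmg * weight
        total = [t + s for t, s in zip(total, spread)]
        frontier = spread
    return total

direct = [1.0, 0.0, 0.0]            # institution damages the plant operator directly
print([round(x, 2) for x in propagate(direct, coupling)])   # collateral damage appears downstream
```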
Section 3
Preparation for Inform/Consent
Prevention Law Due Diligence
The “foreseeability” standard of tort marks the only foothold society has allowed, inadvertently at that, for the legitimization of foresighted prevention. The origins of tort law were legal expedients to handle disputes between wealthy English, not their serfs. For centuries, the basic and highly successful defense for tortious behavior was to show the events of damage to be unforeseeable. Hindsight provided no history and therefore no lessons learned applicable to the particular circumstances causing the damage to plaintiff. The few occasions where plaintiff won the litigation kept tort law as an inconsequential legal specialty.
Professional due diligence is a given of the social contract. Social system stability depends upon submission to the rule of law and the licensing of professionals provides direct oversight. The reason the laws exist and are enforced is that what institutions are predisposed to do, unregulated, habitually exceeds the bounds of the personal social contract. The great historical record shows that institutional ideology, left to its own devices, is a stakeholder disaster – no exceptions. Evolved law contains a variety of obligations and constraints for how things are to be done by institutions and individuals alike. For the most part, noncompliance to the rule of law will have a day of reckoning.
The only motive for the institutional attractor to tolerate anything but its own ideology is provided by law. It is only the fact that the professional engineer is duty bound to comply with the same overarching law institutions are also subservient to that allows any consideration of prevention at all. Since it can neither be avoided nor attenuated, the wise engineer will put prevention law to work. This skill is part of practitioner aspirant schooling.
Obedience to the law first requires understanding the law. Learning about legal requirements after the stakeholder damage is done is not the stuff of intelligent prevention. The subprime mess is a recent example of obsolete schemes of prevention by rules always lagging reality. The pyramid scheme developed without difficulty with rules and watchdogs in place to prevent it. The money is gone, the damage is propagating, and nothing can undo the efficient cause. The watchdogs fail miserably in their stated purpose and, in another irony, the government increases regulatory agency budgets. Guess what that cycle reinforces.
Prevention law is bifurcated into two distinct branches – hindsight and foresight. Hindsight law is the most widely understood and is incorporated in the institutional routine at nominal cost. Foresight law, repulsive to institutional ideology and rejected, is where all the money is. The numbers in construction alone are staggering. At least a quarter of the construction industry’s annual handle of one-plus trillion dollars goes to prevention law matters. None of the costly fussing about risk transfer, insurance and bonding disputes, and contract litigation has ever laid a brick. Law-contrived adversarial relations among project members result in a stream of pure waste now settled into habit. All sorts of hand-wringing and Band-Aid measures flood the construction media while the operational reality only gets worse.
Juridical Rules for the Standard of Care
When the supreme gatekeeper finds it necessary to develop a standard of care for a particular time relevant in a case, he must comply with a set of rules put forth by the US Supreme Court in 1993. Lawyers today refer to the rules as Daubert and progeny.
The Daubert Court ruled that the trial judge must ensure that all scientific evidence admitted is not only relevant, but reliable. In Daubert, the Supreme Court identified several factors to consider when the trial court undertakes its review of the reliability of proposed expert testimony:
(1) Whether a “theory or technique . . . can (and has been) tested;”
(2) Whether it “has been subjected to peer review and publication;”
(3) Whether, in respect to a particular technique, there is a high “known or potential rate of error” and whether there are “standards controlling the technique’s operation;” and
(4) Whether the theory or technique enjoys “general acceptance” within a “relevant scientific community.”
The Court emphasized that the admissibility inquiry must focus “solely” on the experts’ “principles and methodology,” and not the conclusions that they generate. The Court explained “cross-examination, presentation of contrary evidence, and careful examination of the burden of proof, rather than wholesale exclusion under an uncompromising general acceptance standard, is the appropriate means by which evidence based on valid principles may be challenged. The Rules are not designed to seek cosmic understanding but to resolve legal disputes.”
———-
The bifurcated system of prevention law recognizes that the duty to deliver effective prevention has only one discipline – engineering – that traditionally understands and embraces the obligation. Prevention design fits well within the engineering standard of care. Advancement in foresight competency tied to natural law has created a novel condition for litigation in that no superior basis is possible. Scrutable connectivity to natural law is the ultimate “principles and methodology” which is what defines a natural law in the first place.
There are several benefits to a natural law based design and it handles many institutional constraints at once. The strategy to gain benefit revolves around the incontrovertible fact that only natural law “works” in dynamic simulations – the horse you ride off to reconnoiter the future. Modular modeling dynamic simulation programs serve as the objective disinterested gatekeeper basis to determine which constructions are based on natural law and which are not. Dynamic simulations do not run on theories.
Dynamic simulations of systems and settings, as in computer games, must be described to the computer in natural law elements or the program will neither compile nor run. Natural law itself (control theory) provides the only integrated consistent system of rules for taking a real system for a ride into a particular future. Modular modeling programs consist of modules tightly constructed on mathematical physics. The configuration and parameterization of a dynamic simulation requires a large quantity of physical facts concerning the system and its setting. If you don’t bring in the necessary and sufficient facts, the program simply waits until you do. There is no part score.
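A minimal sketch of that discipline, using a hypothetical module built on a simple mass balance: configuration fails outright when any required physical fact is missing, and only a complete parameter set will run. There is no part score.

```python
REQUIRED = {"tank_area_m2", "inflow_m3s", "valve_coeff", "initial_level_m"}

def configure(params):
    missing = REQUIRED - params.keys()
    if missing:
        raise ValueError(f"cannot compile model, missing physical facts: {sorted(missing)}")
    return params

def simulate(params, dt=1.0, steps=5000):
    """Integrate dh/dt = (q_in - C*sqrt(h)) / A, a simple tank mass balance."""
    h = params["initial_level_m"]
    for _ in range(steps):
        outflow = params["valve_coeff"] * max(h, 0.0) ** 0.5
        h += dt * (params["inflow_m3s"] - outflow) / params["tank_area_m2"]
    return h

try:
    configure({"tank_area_m2": 2.0, "inflow_m3s": 0.05})        # incomplete facts
except ValueError as err:
    print(err)                                                  # the model refuses to run

level = simulate(configure({"tank_area_m2": 2.0, "inflow_m3s": 0.05,
                            "valve_coeff": 0.02, "initial_level_m": 1.0}))
print(round(level, 2))   # ~6.25 m, the level where inflow balances outflow
```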
When the rigor of dynamic simulation is met, you have your vehicle to reconnoiter the future and it can be demonstrated as such by anyone anywhere. This is the ultimate barrier to competing propositions. Anything short of an all natural law foundation and dynamic simulation is not a viable option. There is no way to engage a consistent system of physics with a qualitative abstraction. Any link in the chain of reason that cannot be tied down with physical data cannot be simulated on a computer. Inconsistent systems can be simulated by “magic” to fool an audience, but even the magicians have to obey natural law in doing so. Since the computer is the antithesis of magic, any system containing a subjective, generalized, abstract or an undefined element will not work on a computer. In a courtroom, only one side can run a dynamic simulation with a completely objective basis open to inspection. The other side, excluded from dynamic simulation by computer, is limited to rhetoric. Case closed.
Using the Platform for Prevention, the practitioner has no problem setting the standard of care for prevention delivery that meets his conditions of license, his code of professional conduct, and the judicial rules at the same time. Doing the right job right takes care of it. Scrutable connectivity to natural law handles Daubert 1-4 with ease and sets a trap for contrary-evidence experts. The heartburn it works on law is that no superior basis is possible. It is an unforgettable occasion when some other party challenges a solid natural law position, necessarily armed with an inferior basis, and we have witnessed several. The reviewer smoothly rocks along espousing his theories until the moment he realizes his argument is natural law challenged. At that instant, vocal paralysis strikes. He has placed himself in the unhappy position of a man who manages to see the ultimate consequences of his own argument reach out and bite him squarely on the ass. It’s a hoot.
Since pressuring the hindsight attractor to be responsible for solving complex matters, such as prevention delivery, is the pursuit of the impossible, duty levied on the institution is limited to having the work done.
Hindsight Prevention Law
The “hindsight” branch of prevention law is not addressed in this manual. Hindsight law compliance has become standard and automatic in business as usual, handled contemporaneously and without the need for incentives, simply because it includes itself. Hindsight law is composed of familiar statutes, codes, licensing, permitting, standards, rules and regulations expressed as task action injunctions – if-then statements of compulsory means commonly established as permissives. All forms of hindsight law provide checklists of what-to-do and what-not-to-do particulars gleaned from protracted loss experience “lessons learned.” Always fasten your seatbelt – unless …
Since most hindsight law operates by real-time permissives, like a turnpike tollbooth, due diligence is institutionally spontaneous, downhill and unalterable. Significant enforcement disputes are handled efficiently and solely by administrative law judges. Hindsight law is a large part of any prevention design project. It is automatic and necessary but not sufficient. Most of the compliance money, these days, goes to foresight law after the project is done. Since all prevention is future-based, foresight law governs the aggregate proceedings.
The institution labels its hindsight approach to prevention law compliance “risk management” and “loss prevention.” The choice of words itself implies a focus on the future, since there is no risk left in the residuum of history to manage. The long operational record of “risk management,” organizationally neutral, speaks for itself. Set against the backdrop of incessantly growing quantities of foreseeable stakeholder damage, conventional risk management and loss prevention efforts, regardless of amount, show no corresponding influence. The monotonous failure of institutionalized “risk management” practice to attenuate risk and avoidable stakeholder damage is a testament that the purpose of “risk management” is not to prevent loss – because it doesn’t. The purpose of the institution is what it does (POSIWID).
As it must be, risk management is a hindsight procedure navigating by the rear-view mirror and related rule checklists. The bulk of what loss prevention does is crisis response design and arranging for insurance. During times of ultrastability, hindsight compliance by whatever name eventually gets the prevention job done. Back when blanket insurance policies were readily available, insurance took care of the costs of retroactive compliance. There was no need to know the particulars of prevention engineering.
Hindsight law compliance is largely prepared in the basin of the means attractor via regulatory agency-supplied task checklists. The cognitive demand of linear-think obedience is minimal. Information loads are slight. Ingenuity is neither necessary nor tolerated. In times of ultrastability, this arrangement is efficient and effective.
Unfortunately for “risk management,” these times are anything but a repetition of history. There are two principal drivers of the lag problem. First, the high rate of cultural change caused by this explosion in competency is unexampled. It was not many years back, for instance, that corporate officers took pride in being computer illiterate. It was the defining badge of high office. Second, the field of the legally foreseeable has grown so large that there is but a small and vanishing residue of unforeseeables for shielding business as usual. As these two forces of change persist, the gross disparity increases. Few appreciate just how far pragmatic foresight has advanced and how fast the novel engineered artifacts it enables change social norms. What took decades for the telephone is now compressed into months. The cycle time of cultural change brought about by engineered artifacts is now much faster than the cycle time of institutional adaptation.
Foresight Prevention Law
Carpe diem, quam minimum credula postero. Horace
Seize the day, but put no trust in tomorrow.
In foresight law, the backdrop of actual damage puts the litigation spotlight on “foreseeability.” The plaintiff has the considerable advantage of a specific damage scenario in the record – stipulating that whatever prevention was considered, if any, did not work in this particular instance. The defendant has carved a history he is helpless to change, and efforts to cover up past noncompliance often make matters worse. It is up to the defendant to prove the damage was unforeseeable. Since the damage is history, it is easy to show the many ways the particulars of efficient cause could have been foreseen and the damage prevented. It’s already a routine engineering practice.
The plaintiff is further and greatly aided by the fact that the measure of “foreseeability” is a legal standard that advances in lockstep with advances in foresight engineering competency. Over the last two decades, the scope and competency of pragmatic foresight has rocketed to the level where very little stakeholder damage caused by the institution remains legally unforeseeable. At this time, advancing technology is still expanding the scope of that which is included under the standard-of-care. Using the appropriate process for the matter at hand is destined to succeed.
It is to the distinct pecuniary advantage of the law profession, as the financial data show, for institutions to conduct business as usual oblivious to the Big Bang expansions in pragmatic foresight scope. The foresight law handle is a direct measure of engineering foresight competency. Status quo ante and usufruct provide a continuing windfall as the engineering profession goes about its standard-of-care duties. The more pragmatic foresight advances, the more the considerable trouble of bringing suit appeals to damaged stakeholders. As foresight competency expands, retroactive compliance to foresight law extracts larger and larger tolls. Business as usual has no compensating leverage. It can’t prevent the damage, it can’t obstruct the advances in pragmatic foresight, and it can’t attenuate the escalation in litigation and insurance costs that result. There is no stop rule for the progressive degeneration of institutional defenses from stakeholder litigation. The institutional reaction to this threat, as always, is to fund lobbyists to obtain compensating legislation.
Unlike the seamless compatibility of institutional operations with hindsight law, compliance to foresight law is not institutionally possible. To accommodate this basic condition, foresight law is administered by the legal system as a deferment. Until stakeholder damage has materialized, the friends in foresight law give business as usual a pass. After damage has been tallied, the friends morph into adversaries who revoke the pass with a retroactive inquisition of compliance. Institutions just pay; compliance is impossible.
The institution is totally free within the law to choose retro compliance as operating policy ad infinitum. It amounts to a lottery ticket on whether, what and when damage will accrue. The winning draw is when no damage occurs and the prize is no defense expense. The record in some industries, such as big construction, shows it to be a sucker’s bet.
Since the retroactive definition process lags the damage by several years, the practitioner can preempt the legal definition by explicit definition of the discipline standard of care as he goes. A husbanded standard of care based on natural law, prepared and documented by the practitioner, will trump any effort contrived by lawyers starting years after the fact. Formulating the standard of care for the engagement does not add to the workload. On the contrary.
In the aftermath of Three Mile Island, for example, the Nuclear Regulatory Commission (NRC) issued a “clarified” set of instrumentation and control requirements for all nuclear power stations. For those stations where pragmatic foresight did not lead the design, only the regulations existing at the time of engineering release were followed. This policy eliminated the need for operational thinking in design. Backfit of the new requirements for the rule-based configurations was so expensive several nuclear stations were decommissioned instead. For those constructions based on foresight engineering, little supplemental investment was necessary. The new NRC mandate for licensing called for nothing that wasn’t already included by intelligence-based prevention. The difference in cost averaged two hundred million dollars per unit.
Although it seems incredible that the body of foresight law would operate exclusively by 20/20 hindsight, this temporal trick to accommodate institutional ideology is a suspension, not a pardon for negligence, and it makes the task of proving noncompliance a breeze. The time-shift accommodation to status quo ante comes at high cost, and its main beneficiary happens to be the profession of law itself. For the prevention practitioner, the adventitious legal quirk sets up a conflict between the institutional penchant for retroactive compliance and his profession’s unequivocal commitment to contemporaneous due diligence. The suffering of damage, necessary or not, can no more be undone than pain already endured can be relieved.
The professional engineer (PE) is driven to embrace contemporary compliance to foresight law by two pile drivers. First, it is a duty specified as a paramount canon of his discipline, part of his legal standard of care, and a condition of his license. To reinforce the foremost canon, the professional engineer is forbidden by his code of conduct to remain associated with a project based on retroactive compliance to foresight law. He is required further to detect such a mismatch as he goes.
Second, compliance as-you-go is, by far, the most cost-effective basis for prevention delivery. The pecuniary difference between the recorded costs of retroactive compliance and those attending contemporaneous compliance, averaging more than ten to one, is compelling to any engineer. Financial waste can be prevented, of course, in like manner to any other foreseeable damage. The risk of litigation expense is zeroed by compliance as-you-go and that target condition can be independently verified as-you-go.
Accordingly, the proven best practices that deliver contemporaneous “foreseeability” compliance with productivity and efficiency are engaged. Engineered foresight is a process in which the practitioner continuously produces his own authority to act. The risk-informed design of prevention systems in compliance to foresight law is up entropy hill all the way. At great expense, juries ultimately decide the contentions in foresight law. The objective of this manual is to keep the necessary time and cost to attain risk-free absolute compliance to foresight law at a minimum – as-you-go.
The span of competency of foresight engineering is in an unexampled period of rapid expansion deliberately and enthusiastically powered by the engineering guild. Competency advances are of great benefit to the discipline of engineering, especially when no other disciplines are rising to the occasion. Technology advances lead to better practices that quickly displace the lesser ones. At this time, the half-life of the leading edge of pragmatic foresight is about a year. There is no stop rule for this learning cycle and the improvement draws compound interest. The lag in foresight law application coupled with the incessant extension of engineering virtuosity, especially since due process takes so many years, works to the distinct advantage of plaintiff. Over 95% of the cases are settled out of court.
It is natural that the defenses of the great institutional attractor would be active when the alien other attractor system is about. There are just too many practices endemic to the foresight attractor that are taboo in institutional operations. Common ground between those obedient to authority and the masterless men of prevention design is scant. As Voltaire observed, “It’s dangerous to be right when the institution is wrong.” The practitioner recognizes that the crucial obstacles to prevention delivery are institutional and not bandwidth limits of prevention technology – like it used to be. The application of appropriate methods delivering prevention is a separate matter apart from coping with the institutional roadblocks. If these obstacles of attractor mismatch are not removed up front, the mission of effective damage prevention becomes mission impossible.
Since the institution has and will have ultimate and exclusive dominion in all situations where prevention is deployed, it has little appetite for co-existence, no less for cooperation, with certified heretics. Taking institutional ideology as an eternal given of humanity, the design of this manual is specifically directed to those proven strategies and practices circumventing the joint and several roadblocks to pragmatic foresight all institutions instinctively erect. In prevention system design, the first tasks are devoted to preventing institutional predispositions from blocking the formation of the necessary context. After release, the institution gets it all back.
Leveraging Foresight Law
The key to foresight law compliance as-you-go is to assemble and husband your own standard of care based on the best available technology and the Rule of Reason. The practitioner doesn’t wait years for the judiciary to synthesize a “foreseeability” standard of care for a moment in past history – a standard to which the engineer then responds with passion, perhaps, but with zero influence, for certain. By doing this chore first and formally, as a professional in the only discipline entrusted with prevention delivery, you, not the gatekeeper, retain control.
Experience has shown that the judiciary does everything in its power to avoid the convoluted and messy business of setting a standard of care – especially for engineering. No gatekeeper judge wants to go to the considerable trouble, expense, and intellectual investment to fabricate a retroactive standard of care for foreseeability conflicting with the one deliberately installed at the time by a professional that has documented its roots in natural law. The Rule of Reason would then work against the gatekeeper. Remember, your conformance to the standard of care can be dynamically demonstrated in a courtroom. No competing standard can do the same.
Scrutable connectivity to natural law answers the current legal uniform definition of “strong inference.” Courts have held that a “strong inference” is one that is “more than merely plausible or reasonable — it must be cogent and at least as compelling as any opposing inference of non-fraudulent intent.” Having a linkage directly to natural law, and no other, always trumps any opposing inference. While the natural law platform can always be dynamically demonstrated, other inferences are necessarily subjective and, accordingly, cannot be dynamically exhibited by contemporary modular simulation programs. This comparison settles the question. It may be difficult to pull off, but scrutable connectivity is impervious to hostile assaults.
Professional engineers have every reason to take charge of the foresight law compliance matter. There is nothing to be gained by using less than the best available technology (BAT) for goal seeking. As long as the practitioner lashes to the natural law platform, he has eliminated the risk of litigation and ensured a direct shot at the goal. By eliminating subjectivity and delivering transparency, the risks of institutional interference and veto vanish. For the engineer, there is neither glory nor personal advantage in careening to the goal using a second-rate methodology. When the design is fit for service, all memory of his contribution is going to be erased anyway, no matter how he got there.
The PE handles compliance to foresight law as the design of a poison pill to potential litigation. Because the retroactive legal process is so long and costly, the designer gets a large cushion to work with. He doesn’t have to be perfect. Just good enough to make plaintiff’s task of proving foresight nonfeasance overly risky to the lawyers will suffice. The fact that a formal compliance effort was conducted to best available technology will invariably comprise adequate deterrence – with a factor of safety. The efficacy of the poison pill can be independently validated by legal expertise as-you-go. There is no bar to validating foresight compliance as-you-go to eliminate the litigation risk. The conventional assumption that foresight law compliance can only be determined retroactively to damage does not square with the facts.
It is an anchor to productivity to think that the foresight laws protect your goal-seeking by holding you responsible only for services and not quality results. This proviso is a contrivance of the law to protect the non-engineering disciplines, not yours. It was stuck on to the engineering profession as an expedient. The hindsight genre of professionals, those not operating by run, break and fix (RBF), does not engage the necessary process to assure attaining ends. Their focus is due diligence to established discipline protocols. To the professional engineer, obedience to institutional ideology is not the goal; the goal is the goal.
The professional engineer’s standard of care does not mention the legal emphasis on means, i.e., services, as a relief for defective outcome responsibility. The fundamental canons of the profession are about the priority ranking of ends, not means, with stakeholder safety, health and welfare held paramount. If there is no discipline duty-bound to attain ends, what safeguards stakeholders?
The responsibility-for-results issue is a plague for stakeholders and law alike. Granulation and distribution of responsibility for foresight issues like prevention design cannot and do not work. There is always a day of reckoning in litigated proceedings. Eventually the blame game spirals down to which functionality is objectively tied to foreseeability – which functionality cannot pass the buck. The single answer is that whoever designed the system must, by virtue of the task, own the duty to make it fit for service. That rules out the hindsight disciplines, all science, and all licensed professions but the discipline of design, which is perforce the engineering process.
Because successful design requires dealing with the future, engineering has no history of denying legal responsibility for outcomes fit for the specified service. The master documents of engineering are exclusively about outcome requirements. The engineering drawings of construction to be used for subcontracting and site work, for example, have no legal status until sealed by a professional engineer functioning as the engineer of record. Any separation is an artifact of law, not engineering, and eventually the day comes when law, by blind drift, has to pin the tail on the donkey.
After all, the stakeholder has been damaged to the extent that he has taken legal action. Some designer somewhere had the foresight role attending all design. Who else but the designer could be held responsible for the consequences of design? Who else but the designer could be held responsible for seeing to it that the design was in shape for service – in the future of the application? Everyone else has an airtight alibi.
Engineering design marks the collision of services only with results. The engineer must not think that, like the doctor, he is only duty bound to services. It is a deception in that when the crunch comes, law will take you and your pragmatic foresight process to task. There is no choice for law when the damage is caused by incompetent foresight. Identify any other discipline that will accept outcome responsibility.
The first profession, engineering, was created by Hammurabi circa 1750 BC. The matter of enforcement was handled exclusively by outcomes, not services. Building falls down and harms occupants; building design engineer put to death. When medicine got around to organizing itself as a discipline around 400 BC, the Hippocratic oath was limited to “do no harm” to the client. This injunction is not the same as “cure the patient’s affliction.” The sheaf of papers you sign before surgery attests that you understand and accept the services-only obligation of the operation.
Since you’re going to be de facto legally responsible in any case, should one arise, why bother with the blame game? Why get into the risk transfer scherzo when your duty for results cannot be delegated? Your tie to natural law and your ability to scrutably connect your pragmatic foresight to universal law is your great equalizer. Leverage natural law and the rest falls into place. You can well afford to take responsibility for outcomes when law cannot blindside you. The law of society has always known it is, like everything else, subordinate to natural law.
AQI
In order to contemporaneously comply with foresight law, it is necessary to make goal-seeking choices and decisions based upon necessary and sufficient information. The armature to organize and hold the data is called the framework of requisite knowledge. The empty data bins on the structure comprise the field of ignorance. The kind and amount of knowledge necessary to design and husband effective prevention measures are independent of context. The field of requisite knowledge is a standard, oblivious to which attractor is in charge. The method of proof of design is also an instrument of discovery. There is no intrinsic role for subjectivity, institutional or otherwise.
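A minimal sketch of the armature, with hypothetical bin names: the framework of requisite knowledge as a fixed set of data bins, where the bins still empty at any moment constitute the field of ignorance.

```python
# Hypothetical bins on the framework of requisite knowledge; None marks an empty bin.
requisite = {
    "mission profile (quantitative)": None,
    "stakeholder network and relations": None,
    "design basis damage scenarios": None,
    "system dynamics model, parameterized": None,
    "applicable foresight and hindsight law": None,
}

def field_of_ignorance(framework):
    """The bins not yet filled by reconnaissance."""
    return [name for name, content in framework.items() if content is None]

requisite["mission profile (quantitative)"] = "throughput, availability, release limits"
print(field_of_ignorance(requisite))   # what remains to be learned before design can proceed
```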
Operationally, the distinguishing differences between hindsight and foresight law compliance are found in the amount and kind of requisite knowledge to be developed and the fuss made over information quality. A compelling reason foresight law is ignored (with impunity) is that the information quantity is relatively enormous and the kind of information that has to be developed is seen to oppugn the chain of command. Both claims are true. Designers delivering prevention derive their authority to act from the kind and amount of learning they do by reconnaissance into the future. That is, by their own knowledge-building efforts.
Reality is a dynamic that disappears as it takes place. Living takes place in no time, without past or future. Past, present and future are notions that we human beings, we observers, invent to explain our occurrence in the now. Time arises in the experience of the observer with directionality and irreversibility.
The designer of prevention must have his perceptions grounded in the future. If he cannot image the future, he doesn’t know how to act now. All prevention is anticipation and taking precautions based on the fidelity of what particularly has been anticipated. Action to prevent stakeholder damage takes place in the now.
To deliver prevention under foresight law, risk-informed design, a great deal must be known about the dynamics of what it is, exactly, to be prevented. At the end, the competency of the prevention system is settled by scenario demonstration. Since the goal of prevention evolves with time (the Second Law), the knowledge base must be embedded into the system and updated as needed (husbandry). Intelligent prevention system design works from a full dynamic description of stakeholder damage backwards to the conditions that create it. The basic process is accident reconstruction where the damage is virtual rather than actual. The designer needs to be able to run the system clock forwards and backwards so that the prevention designs can be tested for local efficacy.
Pragmatic foresight economics
What you do speaks so loudly that I cannot hear what you say. Emerson
Triage is necessary to make absolutely certain that the resolution process to be employed is appropriate for the incoming. Prevention begins at home. The obscene cost of engaging a process-issue mismatch is the first condition to be prevented. Because mismatches, as in Iraq, have no stop rule, they are quite capable of generating major waste and protracted stakeholder damage. Mismatch is the hotbed for litigation. The great doom of the “Big Dig” in Boston was well established in the first month. Everyone associated with that boisterous blatant project botch job knew after month two that it was litigation-bound. These travesties do not happen overnight and they come as no surprise.
The whole subjectivity-ridden issue of all economics associated with the effort can be quenched at the start. The administrative dimensions of every project – accomplishment, schedule and cost – are not separable. A prevention project is a bird’s-nest system. Change any element and the system changes. The economics annoyance is dispensed with by revealing just how much waste is going on under business as usual. Key information is unreliable or missing; understanding of the mission is, to be charitable, discordant; the wrong goal has been chosen. You can then, early on, declare all economic matters off the agenda of discussion. It will stick for the remainder of the project.
Recorded history generously attests to the necessity of getting the right resolution process married to the incoming issue. A major study by the GAO of Pentagon acquisition experience measured the average increase in project cost after triage at one percent. Modest projects bypassing triage and losing the gamble, assuming business as usual would work out, averaged 41% over budget. The acquisition of major weapon systems without triage runs multiples over projected costs. The V-22 Osprey program, for one example, was launched twenty years ago and now has a second generation entrenched community of vested interests.
Given that damage is an obvious failure of the risk management process to foresee and take appropriate precautions, the question for litigation to decide is: was the damage event legally foreseeable? As the standard of care for compliance to foresight law is the best way to run a project, what is the point of noncompliance? It’s not to save money, because it doesn’t. It’s not to speed up the project, because it can’t. It’s not to attain a quality product with an excellent safety record, because it won’t. What economy could there be in a risk management process that consistently fails to get the job done?
The economic benefit of triage is a no-brainer. The differential cost between retroactive compliance to foresight law, which the law taxes for orchestrating the deferment, and contemporaneous compliance, which law ignores, is not a close call. Even for the retro projects that luck out, the cost difference is seldom below double. For the retro projects that kick up foreseeable damage, the life cycle cost difference will be many multiples. The differential is so large, insurance costs to cover nonfeasance litigation are an insignificant component of the financial advantage. While contemporaneous compliance has front-end costs retro does not have to pay, retro expenses race ahead long before the project is over. A project management philosophy based on retroactive compliance to foresight law signals a project in serious trouble. The penalty is paid in schedule and accomplishment, which exchange directly into cost.
The litigation cost differential between risks that are managed by foresight and prevention and risks that are addressed retroactively by hindsight and emergency response is the economic driver for triage. Because the foreseeability benchmark devised by law in retro cannot exceed the foreseeability benchmark in as-you-go compliance, litigation risk involving foresight law is reduced to zero. No one knows the standard of care better than those working in proactive compliance at the time. Not only are damage costs averted, all the associated administrative losses attending stakeholder damage litigation are avoided as well. Preventing damage, unlike emergency response, is not glamorous work.
The pecuniary overkill is that a project run from the start with the appropriate process will be a winning project overall. As the record attests, it is impossible to be in contemporaneous compliance with foresight law and have a troubled project at the same time. Since compliance to foresight law and good design are cut from the same cloth, any gain for foresight compliance is an advance for design.
Assaulting Complexity
There are issues proven to be critical for success in prevention design. It is always helpful to refresh your knowledge base for comparison to the principles that form the “Design for Prevention.” While there will be slight misalignments between the bases presented here and your own experience, to be sure, there should be no conflicts in the fundamentals that shape the design process. The measure of intelligence is determined by how well the assault is waged. Complexity reduction is elimination and unification. The finding that most of the constraints and roadblocks are social system matters will not shock you. None of us has flaws in technological and methodological matters in our assault on complexity so significant as to preclude reaching the goal. The ratchet of trial and error accounts for errors of those emotionally ablaze with the engineering rapture.
Keep in mind that the best goal-seeking methods are scrutably connected to natural law, and from thence cometh your distinguishing difference and overwhelming advantage. The capability to collect hard dynamics data from reconnoiters into the future is unique to the engineering process. Further, this podium has no peer in the pantheon of problem solving. It not only gets the best job done best; you need have no concern for criticism and sabotage at release and turnover. It is the ultimate poison pill for prevention law compliance. Scrutable connectivity to natural law, providing the attribute of transparency, is impervious to the armamentarium of institutional ideology. For this, there is no plan B.
Stakeholder damage is not the result of bad luck but a consequence of (ideological) choices. Choosing an ineffective or detrimental policy for “coping” with a complex system is not a matter of random chance. The intuitive processes will select the wrong solution much more often than not. A complex system (a class to which a corporation, a city, an economy, or a government belongs) behaves in many ways quite the opposite of the simple systems from which we have gained our experience. Complex systems are counterintuitive. That is, they give indications that suggest corrective action that will often be ineffective or even adverse in its results. The policies governments habitually adopt for correcting a difficulty intensify it rather than produce a solution. Welcome to Iraq.
Most of our intuitive responses have been developed in the context of what are first-order, negative-feedback loops. Such a simple loop is goal-seeking and has only one important state variable. The simple feedback loops that govern walking, driving a car, or picking things up all train us to find cause and effect occurring at approximately the same moment and location. In complex systems cause and effect are not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system of prevention has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). The cause of a difficulty usually lies far back in time from the symptoms, or in a completely different and remote part of the system. Causes are usually found, not in prior events, but in the structure and policies of the system. This fact is what makes prevention design so thorny. Structure and policy are what define an institution in the first place.
The complex system is even more deceptive than merely hiding causes. When hindsight looks for a cause near in time and space to a symptom, it usually finds what appears to be a plausible cause. But it often is not. The complex system presents apparent causes that are in fact coincident symptoms. The high degree of time correlation between variables in complex systems suggests cause-and-effect associations between variables that are simply moving together as part of the total dynamic behavior of the system. Conditioned by our training in simple systems, we apply the same intuition to complex systems and are led into error. The great advantage of dynamic simulation is as a reliable means of finding the significant variables. Typically, they come as a surprise. When you treat symptoms, not causes, the outcome lies between ineffective and detrimental.
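For the quantitatively inclined, a minimal simulation sketch of the coincident-symptom trap follows. Everything in it is hypothetical and invented for illustration; the stock names, rates and parameters are not taken from any case in this manual. It only shows how two visible variables can correlate almost perfectly while neither causes the other, because both are lagged responses to a slowly drifting system state.

    # Illustrative stock-and-flow simulation; all names and parameters are hypothetical.
    # Two visible "symptoms" (backlog and overtime) rise together because both are
    # lagged responses to one hidden, slowly drifting driver (staffing shortfall).
    # Their near-perfect correlation invites a false cause-and-effect reading.

    def simulate(steps=200, dt=1.0):
        shortfall = 5.0    # hidden system state: the real driver
        backlog = 0.0      # visible symptom 1
        overtime = 0.0     # visible symptom 2
        history = []
        for _ in range(steps):
            shortfall += 0.05 * dt                                # slow policy erosion
            backlog += (2.0 * shortfall - 0.5 * backlog) * dt     # first-order lag
            overtime += (1.5 * shortfall - 0.8 * overtime) * dt   # first-order lag
            history.append((shortfall, backlog, overtime))
        return history

    def correlation(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    if __name__ == "__main__":
        run = simulate()
        backlog = [row[1] for row in run]
        overtime = [row[2] for row in run]
        print("symptom-to-symptom correlation:", round(correlation(backlog, overtime), 3))

Running the sketch prints a correlation close to 1.0 between the two symptoms; intervening on either one leaves the hidden driver, and therefore the difficulty, untouched.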
The practitioner knows that complexity is not a system property. Things are called complex or not by the arbitrary gauge of a single human cranium. If the benchmark points to complex, then part of the task is to gather enough system knowledge to reduce apparent complexity to manageable levels. This process is the soul of “intelligence.” Ask any group for a definition of “intelligence” and you will observe great variety leading to heated debates, seldom coherent.
Society grades intelligence by testing hindsight proficiency. The daily Isaac Asimov “Super Quiz” in your newspaper, purportedly measuring your intelligence, consists of nothing but remembered history, where the highest points go to those who also know the lower levels of detail. For engineering, there is only one definition of intelligence, Ross Ashby’s “appropriate selection.” The more you study the matter of intelligence, the more Ashby’s parsimonious take appears as genius. There is and can be no superior definition of intelligence. Just try to appear intelligent by making inappropriate selections. Just try to appear dumb by making appropriate selections.
Ashby’s definition of intelligence says nothing about hindsight and memory work. Appropriate selection is simply a function of information received. That information can be time stamped with hindsight, the immediate present, or foresight. The best way to achieve appropriate selection is the most intelligent. The engineering view has always been preoccupied with appropriate selection, ignoring any intelligence attributions. Being considered intelligent as individuals has no play in appropriate selection. The record of professional judgment in anticipating the future is awful.
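For readers who want the point in operational form, here is a toy sketch of appropriate selection as a function of information received. The scenario, option names and numbers are all invented for illustration; the only claim carried over from the text is that selection quality tracks the information held by the selector, not the selector’s reputation for intelligence.

    # Toy illustration, not a method from this manual: appropriate selection is
    # scored by how close the selected option's actual outcome lands to the goal,
    # and that score is a function of the information used to predict outcomes.

    def appropriate_selection(options, predict, goal):
        """Pick the option whose predicted outcome lies closest to the goal."""
        return min(options, key=lambda opt: abs(predict(opt) - goal))

    # Hypothetical prevention scenario: the true (initially unknown) loss of each option.
    true_loss = {"do_nothing": 9.0, "patch_symptom": 7.5, "redesign": 1.0}
    goal = 0.0  # zero stakeholder damage

    # A hindsight-only predictor remembers that patching "worked" last time.
    def hindsight_predict(opt):
        return {"do_nothing": 5.0, "patch_symptom": 1.0, "redesign": 6.0}[opt]

    # A foresight predictor, informed by reconnaissance, approximates the true losses.
    def foresight_predict(opt):
        return true_loss[opt] + 0.5

    for name, predict in [("hindsight", hindsight_predict), ("foresight", foresight_predict)]:
        choice = appropriate_selection(true_loss, predict, goal)
        print(f"{name}: selects {choice!r}, actual loss {true_loss[choice]}")

With these invented numbers, the hindsight predictor selects the symptomatic patch and incurs the larger actual loss, while the better-informed predictor selects the redesign; the selection machinery is identical in both cases, only the information differs.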
By starting right and avoiding the minefields of conventional diversions to waste, the destiny of success approaches certainty. Once the proven method is underway, it quickly justifies itself so much so that institutional management finds something else to occupy its time. Institutional process is firmly governed by the authority of management – and nothing else. Your authority comes from reflections of the living truth embedded in your work. Staying Velcroed to natural law often conveys hindsighters to the edge of madness.
Foresight engineering produces its own authority to act. The scouting trip into the future makes videos no one else has. Videos of the future enable stories no one else has. This attribute is at the core of pragmatic foresight. If work in the ends attractor were subservient to higher authority, it would assume that the authority already had the risk-informed knowledge base. Since this cannot be true, by definition, any design process for prevention must be a cyclic bootstrap operation. Knowledge developed by task actions is the permissive to select the next task actions to develop more knowledge. In the operational reality, the workers are looking for instructions from scouts reconnoitering the future, not the institutional shamans.
At all times engage the foresight attractor and accept that it is apart and distinct from the hindsight attractor. No efforts should be invested in explaining one to the other. The institutional attractor, by its identity, will always be hostile to your foresight attractor. Get over it. Never ask institutional humanity to do what it cannot. Such is the pursuit of the impossible.
Intermediate summary
This first section was intended to prepare the site and place the foundation for what is to come. The guidance was largely philosophical, abstract and generic. It amounted to preaching to the choir. The following section is about the initial tasks for securing the right context for the right issue. Foundation set, it is time for the “front end.” Erecting the structural steel for the armature that will support the prevention project begins the transition from generalizations to part numbers. The institution detects the transition immediately, switching over from offense to neutral to defense. Up go its shields.
The work of the front end is, by far, the toughest and most thankless job the practitioner has to face. In any mature institution, the requisite information for the prevention project simply does not exist. The head shed abhors the very idea of risk-informed decisions and, further, is annoyed by any initiative to make them. The hierarchy panics at the thought that someone is going to reveal the ridiculous informational basis, the dark matter, actually used to direct activities. Institutions are not managed by reliable, timely information because that is the way they want it.
There is no point to going ahead with foresight engineering for prevention until the institution understands and accepts that the work of foresight law compliance can only be done in the basin of the foresight attractor. It must tolerate this alien context until release. The institution must not be allowed to hold prevention delivery as a threat to its power; otherwise prevention design collides with design prevention. Veteran knowledge is necessary to pull off this tightrope-walking act. The task sequence presented has proven to be the most direct and least troublesome route to a hassle-free design stage. It is not without peril.
This marks the end of the preoccupation with social system constraints. Once the foresight attractor is humming along, goal seeking is the measure of worth. Technical accomplishment takes priority over socialization. In the foresight basin with its flat organization, social matters take care of themselves. Teamwork of equals is a hallmark of life in the basin of the foresight attractor. Without consciously recognizing the occasions, everyone in every institution has at one time or another spent time in the foresight basin. Those are the small-group, goal-seeking cohorts where everyone, masterless during the time, pulled together to achieve amazing things. To its credit, Lockheed gave Clarence “Kelly” Johnson free rein to get design results. The “Skunkworks” he formed in the 1940s has been in continuous service to this day with an unexampled list of achievements.
Project initialization is mostly about social system knowledge, Starkermann grade. Foresight technology must be deferred until the various institutional constraints have been neutralized. Starkermann’s work shows how natural law forms the attractors. Run his simulations and you can witness the formations driving for system stability – one for hindsight and one for foresight. You can see the tug of the respective basin on deviation attempts. Thanks to Starkermann, there is no need to bounce around experimenting with different arrangements. His stuff works 100% of the time – anywhere humans interact.
The challenge of pragmatic foresight, whether it is for damage prevention or for advancing institutional productivity by using stuff learned at training camp, is the same. It is instructive to check the records of history for any example where the instinctive approach based on logical reasoning was successful. Who in your institution came back from a conference loaded with operational goodies that were promptly embraced by the hierarchy? Note that executive management, including the President’s Cabinet, goes through the same cycle and makes the same impact back at home base – zilch. By all means check the record at other institutions and in other nations. An employee urging the institution to improve operations by the force of logic and fact is indistinguishable from a maniac.
On several occasions over the last three decades, as both test and demonstration, large assemblies have been arranged in groups and given the same goals to meet as benchmarks. The same roster in different arrangements delivers the variations exactly as computed by Starkermann’s models. Individuals go from high productivity contexts to low productivity contexts without recognizing the differences. Do yourself a big favor and use Starkermann’s derivations as-is. Run your own tests with these affect programs to gain confidence. Forming the optimum group for the work takes another potential problem off the pile.
The fact that the foresight attractor is viable shows that the front-end struggle with institutional predispositions in intellectual hygiene has come to an end. It’s now time to move your social system skills to the background and gird for the assault on complexity. No matter how good you have become at palace politics, an acclaimed Court Jester, natural law remains deaf to persuasion and there is much left to do.
Section 4
Order of Battle
Overview
Only after the struggles to establish the requisite context of work are over can systems engineering matters safely take center stage. The penalty for placing the prevention issue into the wrong attractor basin is so large, it is necessary to allocate very early and without doubt. Accordingly, the first order of business is triage to organize and benchmark the diversion of issue to attractor. While parts of triage can go quickly and easily, the allocation must be based on brute facts. The institution is not going to relinquish veto control without a barroom brawl. To secure success, you must make your stand up front. After attractor selection, it’s too late.
As the Platform for Prevention derived, the two operational realms are entirely incompatible and, for delivering damage prevention, fully complementary. It is important to commence with a triage phase to get the allocations of which issue to what process system attractor correct. The client is owed an inform/consent session and it must come early. Once an attractor forms, it auto-ignites.
The purpose of triage is to help the institution keep its shields against pragmatic foresight lowered. Only when the institution is in crisis mode is doing the front end, temporarily, somewhat easy. For anything else you have to finesse the institution, which despises the attractor you represent as a matter of principle, to “allow” the necessaries of prevention delivery to be done. The institution is well aware that the professional delivery of effective prevention is a cold-shower operation. There is no way to dupe the institution into believing that your design process is organizationally neutral – a belief that would make its concern about compliance to prevention law pointless.
The institution is defined by a character profile it holds in common with all institutions. When you are dealing with an institution of any size, you will have to deal with the consequences of this temperament characteristic. The institution generates one set of responses when things are in equilibrium with its environment and another different set when it perceives a threat. The personality profile responds to non-members in ways dramatically different from how it responds to those under its command.
The institution has good reasons to hold pragmatic foresight at bay. Scandalous truths about the actual base of information used to “lead” the institution are exposed. The actual field of ignorance, compared to the field of knowledge appropriate and rational for navigation, is stunning. Since institutional decision-making has little use for hard data, it is discarded as fast as it becomes available. With a practitioner on the loose, the cover is going to be blown.
Another good reason for the institution to sabotage pragmatic foresight is that the foresight attractor is so productive and efficient for what it does – solve tough problems. Since you cannot change the damage reports the problem has already placed in the record book, solving a problem can only be done with an eye towards the future. The institutional attractor, which can only address problems by hindsight, does not appreciate being shown up by an attractor it is trained to despise. So, not only does pragmatic foresight make the institution appear to be governed by whim and fancy, it runs circles around the barge of hindsight in remedy design.
You can only leverage value when you shape your requirements to fit the predispositions of the species. At no time is any of this shaping and leveraging to appear as a disparagement. One of the best critics at this confluence was C. P. Snow and he didn’t make a dent. There is a reason and a history of the species in evolving to what it is. Not only is there no value to disparagement, you wouldn’t know what to disparage. It’s best to take the character profile of an attractor as a given condition of the operational reality.
Descriptions of the order of battle phases are first set in abstract and general form. There is no all-purpose recipe or task-action checklist. Given the immediate goal, the practitioner selects from his toolbox (Appendix and book 3/3) those procedures he thinks best suited to reach the objective. It is necessary to widely separate the social system matters from the technical system stuff for two reasons. First, the institutional attractor is aggressively uninterested in how pragmatic foresight is done. It is knowledge it can neither use nor admit knowing. The audience that cares already knows the drill, which is why it cared in the first place.
The second reason is that, before the triage issue is settled, any premature emphasis on the technical system side sends control to the institution. When the institution is in control, all technical system matters are settled in favor of the hindsight attractor. It doesn’t matter how sound and logical, how productive and efficient, the pragmatic foresight methodology is for prevention. The quantity and quality of benefits to the institution is incidental and immaterial to its decision process.
Initialization
The first task, either before or immediately after the kickoff, is to assess if business as usual is a domain match to the issue. While the odds approach zero, if there is a match, you must report the assessment and bring your engagement to a close. While the law forbids you to remain associated with a program spiraling to the black hole of mismatched projects, you have no business with a project the client can properly do by itself. If the situation doesn’t need your foresight practitioner services, off you go.
Once you are guaranteed intra-institutional safe passage, you can usually make the assessment in a day. Like mismatches, attractor matches exhibit characteristics that come in bundles. When alignment is good, it is very very good. Like a hologram, every part of the operation contains the same image. A strategy for so evaluating a construction project, shaped by considerable experience, provides an example of benchmarking by the foresight engineering standard of care.
Inform/consent
The top strategy of the standard of care is to assure success as early as possible and certainly before the information organized into requisite knowledge is seen by management as a threat to its authority. After the initial survey for business as usual, the law requires the delivery of an inform/consent overview to the client. It is about to undergo a surgical procedure and the law requires a review of options and risks. This step secures the beachhead for the early-success scheme. As introduced above, it is your duty to inform the host of the overarching principles that will govern your engagement. While the client has considerable authority to exercise, he is to understand and accept the legal principles that govern you both. You have no choice but to put your standard of care paramount and hold that all conflicts will be resolved according to established legal doctrine – to which all are subservient.
This chore is not as antsy as it appears. If the client insists on his way over the law, better to learn it now and avoid association with a certain disaster. Understand that the path to attaining the goal under legal compliance is just as certain to succeed as the path under noncompliance is to fail. You may not like the client response to the law-first edict but all alternatives are much worse. You must obtain consent to establish the foresight engineering attractor/context in order for foresight methods to work. In short order, once the work begins, the whole permission issue goes away. Intelligent goal seeking can only be killed; it cannot be castrated. The later the project is killed, the tougher it is for the institution.
If the client chooses to exercise supremacy, holding on to the right to control by sabotage and veto, the engagement for the practitioner is over. When the inform/consent session leaves foresight law as the benchmark, it shifts the spotlight towards technical system affairs. For all practical purposes, enabling choices made at inform/consent remove the institutional barriers to design success. From then on, the institution can only kill the project. It can no longer retain control.
The inform/consent step marks the end of the road for professional judgment based on the information available, taken as-is. Although quiet and without fanfare, passing the cognitive inflection point is a blaring reveille to the practitioner. He can no longer appear organizationally neutral. It’s knowledge-building time.
The Assault on Complexity Begins
At this stage, the practitioner has determined that business as usual will not suffice for the objective and the institution has accepted that prevention law will be the benchmark of operations. The next milestone to be reached shows by brute fact that business as usual cannot accomplish the mission. While the institution can certainly run a mismatch, as long as it has the resources, it cannot hammer institutional process into the mold of achievement. It’s not a case of will and protracted determination; it is flat out impossible.
Making the case that institutional usual is mission impossible sets the stage for obtaining institutional consent for a triage program. When the institution permits the development and installation of a triage procedure of the institution, by the institution, and for the institution, project success is assured and the ball game is quietly over. A matchup will not only reach the objective, it can’t be institutionally vetoed at turnover. Triage lays the groundwork for a bumpless transfer. Release is not an exchange of messages between attractors, but a nonverbal demonstration that the solution meets the specification.
As the practitioner’s clandestine goal line, everything is riding on his ability to secure institutional “permission” to insert a triage step between the incoming issue and its infallible diversion to the appropriate attractor. What would have been an automatic assignment to business as usual is replaced by an intelligence-driven triage protocol. Once the barrier to triage is removed, the practitioner has no legitimate excuses to fail. As retrospect will reveal, the true climax of success is getting the institutional green light for triage. For the first time, attention can be concentrated on technical system matters without having to guard your rear.
Institutional misfit
After inform/consent, knowledge building begins. The task is to connect the situation to the natural laws that collectively forbid institutional process to succeed. Considerable information is necessary to do this, but it’s information necessary for project success in any case. The strategy of transparency with scrutable connectivity to natural law for a nonverbal system release is used here to establish that business as usual can’t hack the mission – regardless of applied resources. That is, the conclusion is driven by brute fact. In effect, this intermediate milestone is a prototype of the final release performance.
Making the case requires gathering the kinds of information and developing the kinds of knowledge that are not organizationally neutral. Up to this point, nothing you have done appears on the institutional radar screen. In showing that no combination of established rules can meet mission requirements, alarm annunciators go off and battle stations are manned. You have to show, by using situation particulars, that natural law stands between institutional ideology and mission success. It’s hard to imagine anything more unpopular with the hierarchy. The case is composed of technical system, social system, and communications system matters.
The work reveals that the institution is not being directed by anything approaching a coherent body of timely and reliable data. Risk-informed decision-making is not the institutional standard of care. You will be finding that the most basic of navigational aids are either corrupt or missing altogether. You will, in fact, have difficulty in finding anything ready-made about the following (a minimal checklist sketch follows the list):
- Goal definitions, specifications and mission profile
- Scope and interfaces with external entities
- Key term definitions
- Compliance priorities: permits, licenses, codes, prevention law
- Information quality and tagging protocols
- Priorities and value systems
- Stakeholders and stakeholder damage specifications
- Design basis scenarios
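One practical way to keep the delta visible is to carry the items above as an explicit checklist and score what is actually in hand. The sketch below is hypothetical; the field names and the scoring rule are inventions for illustration, not a format prescribed by this manual.

    # Hypothetical front-end checklist: each item from the list above is tracked
    # with whatever evidence has actually been located, and completeness is scored
    # so the delta between what's available and what's essential stays visible.

    FRONT_END_ITEMS = [
        "goal definitions, specifications and mission profile",
        "scope and interfaces with external entities",
        "key term definitions",
        "compliance priorities: permits, licenses, codes, prevention law",
        "information quality and tagging protocols",
        "priorities and value systems",
        "stakeholders and stakeholder damage specifications",
        "design basis scenarios",
    ]

    def completeness(evidence):
        """Fraction of checklist items backed by at least one located source."""
        found = sum(1 for item in FRONT_END_ITEMS if evidence.get(item))
        return found / len(FRONT_END_ITEMS)

    if __name__ == "__main__":
        evidence = {item: [] for item in FRONT_END_ITEMS}
        evidence["key term definitions"] = ["draft glossary, unofficial"]
        print(f"front-end completeness: {completeness(evidence):.0%}")

Expect the first pass to score near zero in a mature institution; the point of the exercise is to make that fact undeniable rather than anecdotal.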
There is no one place to start or any one best sequence of development. The delta between what’s available and what’s essential cannot be filled by business as usual methods. The fact that the necessary information is not available signals that the institution is unable to generate it. Rule-based practices can support pragmatic foresight but they cannot replace it. Obedience cannot substitute for ingenuity.
It will be helpful to remember that this milestone seals the deal so it’s worth the considerable effort to find and lash this information together. It is your ticket to a successful mission, but it’s all or nothing. The institutional attractor can support neither the practices nor the context necessary for them to take place. There is no halfway to it. If you can’t complete the necessaries, don’t commence.
Quantify the compliance value system
Triage formation must begin with getting the client’s take on compliance clarified and detailed – a profile of expectations: just what laws are to be featured, what their priorities are, what litigation is to be avoided, and why. The relevant institutional history in this area will help. Getting the client’s take on the legal driving force does not change what you do, just how it is shaped for presentation. It sets a good tone to begin with law-driven as the basis because that is your excuse for doing it right. The more you can get the institution to elaborate on which prevention laws it considers most important to comply with, the better you can align the engagement progress reports with what the institution values.
There is no such thing as blanket prevention. There can be umbrella insurance policies, of course, but the dance with the future and local complexity of delivering prevention forbids “one size fits all” schemes. Intelligent prevention system design cannot proceed without developing the requisite knowledge about local dynamics first. And, there is a lot of local. The prevention design process generates its own authority to act by virtue of generating this knowledge.
In principle, it is the same sort of knowledge necessary to assess a non-umbrella insurance policy. Until you know the details and exclusions of the policy, you really don’t know what, exactly, would qualify as a claim. Your ignorance of the true coverage creates a nonreciprocal risk to the advantage of the insurance company. It gets to define coverage as it chooses after you, in your ignorance, submit a claim. After every hurricane, for example, claims litigation surges as policy holders find out what they actually got for their premiums. The litigation dockets are never cleared before the next hurricane lands.
In prevention delivery, a great deal must be known about the institutional value system. This compilation will be unique as no institution concerns itself with the value system it exhibits on automatic by POSIWID. Doing it takes the place of understanding it. The practitioner assembles the value system in steps by safe, small-scale testing.
Goal definitions
Embedded within the task of goal and damage specification is the necessity to develop objective definitions of the critical success factors. Never assume that the central words that frame the mission are universally defined. Words are symbols for and passageways to ideas. Philosophy matters. Words, including governance, performance, risk, compliance, safety and quality, have to be defined to some common ground. It is no secret that the terms used in mission and damage description by the stakeholders are just as nebulous and diverse as the description. No stakeholder wants to find out how error-laden their notions of key terminology are and how dangerous the discrepancies in definitions can be to project success. Great and undiscussed variety in definitions of key elements of mental models is a hallmark of the institutional attractor. To not know of the variety is another sign of loyalty to institutional ideology. Informationally, the institution runs on lagniappes.
To the practitioner, the most significant distinction between hindsight and pragmatic foresight is in the amount and kind of information necessary and sufficient for operations. Faithful to the incompatible/complementary arrangement, the institutional attractor does not seek, retain or maintain the information foresight requires to function. Hindsight operations have no need for and do not keep the information essential to the foresight process. In practice, such information is aggressively driven out.
The degree of difficulty of triage is measurable at the outset during the institutional response to the systematic investigation of its prevention objectives. Experience has shown that, for control purposes, institutions have flimsy and entropy-filled notions of their goals. Even their process for setting goals is unsupported by facts and coherent reasoning. For reasons described in the Platform for Prevention, institutional goal statements and specifications are consistently eroded by entropy. Further, the information claimed to be used by the institution for goal-setting is invariably outdated, spotty, rambling, disjointed and unreliable. To the institution going through the pragmatic foresight initialization process, facing the awkward quality of the information on which the goal specification rests is a shattering experience.
The basic initialization procedure concentrates on matters ordinarily undiscussable in institutional affairs. An early chore is to get serious about goal definition in a top-down breakdown format with a bottom composed of part numbers corresponding to physical functions. This struggle attends validating that the goal espoused bears some resemblance to the goal needed. All keywords used in goals and specifications have to be defined in worker terms. The brute lesson from experience is that institutions never set goals that correspond to stated needs. They are always far off the mark. Accounting for stakeholder interests is never done. The same applies to keyword definitions. The practitioner has his hands full enough when the objectives are properly aligned and defined with keywords shared in common. When the mission is ill chosen and ill defined, prevention design defaults to the hindsight attractor.
Nothing expresses the institutional ideology better than the quality and husbandry of its information. A mature institution has drifted so far from its original goals and consequences, a study of the information it operates by serves as a tree-ring dating system. While institutional keywords signal corporate identity, no official definitions are ever promulgated. In time and turnover, no two members will share the same meanings of the keywords freely used to describe goals, operations and consequences. Information for building knowledge has to be more reliable than the reliability of the knowledge required for design. Information for control purposes has to be particularly reliable. Using information without unambiguous definitions of keywords used to describe it is the borrowing of trouble.
The record finds less than 0.5% of institutions have pellucid goals sufficiently defined and husbanded. Those rare cases that do have their act together, goal-wise, will have no need for your services. The field-of-ignorance problem for goal-seeking is so uncomfortable for the institution, the necessary work cannot be done by regulars. It takes an outsider with a social system mandate of equal or greater authority, a royal pass for safe passage through institutional ideology, to power the knowledge development up to the tipping point where spontaneous institutional process can take over. Finding out upfront that the stated goal to meet stated needs is the wrong goal dispenses with the economics issue. The crossover point where cost savings outgrow triage expenses is invariably reached during the second week of the engagement.
Several benefits accrue from building common definitions of the keywords with the institution. The illusion of institutional omnipotence is turned to rust. Major roadblocks and obstacles are revealed for the first time. Members can work out significant misunderstandings formed when keyword meanings are significantly dissimilar. Often, the improvements in communications are quick and substantial. For the realm of prevention, start with the words safety and quality and the need will reveal itself.
In practice, the earliest knowledge-development tasks are done in random order with many loops back to hang additional data on the racks. Insights to goals and stakeholders, for example, can emerge anywhere and anytime. It is of the utmost significance that institutions maintain no running archive of informational stuff vital to their survival. It is key to understanding the restrictions of institutional ideology and how it shapes the engagement. If you align your approach to appear organizationally neutral and never ask a hierarch to do what he cannot, you will have smoothed out a lot of bumps in the road.
The variety in keyword definitions, the axles around which institutions rotate, will be sobering. Keywords like damage will have as many disparate definitions as there are contributors. Not only are things not on the same page, they are not even in the same book. Questions such as “If you have no common basis for your goals, how can you reach them?” must go unexpressed. The experience is shattering enough as it is.
There are several steps in the Design for Prevention that will precipitate the same reaction. Defining the communication channel traffic in the “penetrations” procedure is sure-fire bedlam. Since it is original data and institutionally significant, the underground will give it wide distribution. This process begins the shift of the center of “power” to the engagement. The appetite for reliable information about institutional viability is insatiable, bottom up. The head shed will object, of course, so you downplay your role.
A full and complete goal specification, abstract to tangibles, is absolutely essential in order to benchmark stakeholder identification. The damage to be prevented is the particular damage stakeholders are subject to, nonreciprocally, by institutional operations. A great deal must be known about the primary drivers of the damage, the trigger events, and the progression of effects to the local scene of each stakeholder. This information is necessary to design proficient prevention systems. If any bona fide stakeholder is left out of the proceedings, the odds that its needs will somehow be met by the prevention designed for the other elements is akin to “Jump and the net will appear.”
Reconnoitering the future
No one can deal with possible future dynamics without a model of the system. Even the gazer on a crystal ball needs to know subject matter particulars. The model may be subconscious or the model may be a robust computerized simulation based on mathematical physics. There will be, by whatever name, a model. It is the benchmark of perception. A great deal of the practitioner’s work in the early stages is to discover the mental models and value systems in service by the institution and its stakeholders. No two models among the roster will be the same and it’s the differences that drive the contentions. The sooner this variety is measured and structured, the better.
Think of the future of interest as a chunk of space-time, where your 4-D zone of interest progresses dynamically as a system (arbitrarily bounded) within a larger system, itself within a still higher-order system. Making inspection trips into the future is pure discovery. There is no control to it. You can only observe and measure. If you could force a future oblivious to local circumstances, you wouldn’t need to investigate it prior to design.
It is the supreme significance and importance of knowledge of the future that gives those who obtain it first the authority to act. Those that make decisions without discovering the future first are always inferior in rank. Knowledge-based authority is derived from scouting trips into the future. The squad would rather follow a corporal who went on ahead to observe what’s out there than a general using his great experience from the rear.
Safety/Quality/Prevention in Compliance
The work of prevention delivery cuts a wide swath. There are several issues typically handled as separate activities that fit well within what has to take place for prevention. One example is safety/quality. Process-wise there is no difference between preventing harm and preventing unacceptable quality. It is impossible to segregate these issues out for detached treatment. Any goal such as quality, safety and performance can only be “engineered” to reality by taking appropriate precautions in the immediate present in contemplation of a desired future. These objectives, by definition, are welded to what lies ahead. Achievement will be far better guided by knowledge about the future than by the record of the past.
There is no positive value in separating the litigation over safety from the litigation over quality for independent treatment. Creating artificial specialties signals the institutional intent to maintain status quo ante. Process-wise and compliance-wise they require the same competency. The failure of specialization is so obvious it goes unnoticed.
The favored definition of safety/quality/prevention, as it relates to prevention law is:
A specified section in future space-time that has been engineered to assure that conditions within this niche of the future will remain within stated specifications under specified disturbances for the system life cycle.
Quality/safety/prevention is future, local, engineered and husbanded. The future of concern for preventing damage is a moving target and therefore subservient to the natural laws of control. This means that the design basis for safety is predicated on the possible scenarios of unsafety expressed in objective specifics. The control of these unsafe possibilities and their triggers requires the engineering process. The demonstration of safety attained for the design basis events is how compliance to foresight law is secured contemporaneously. Safety issues change with time enough so that safety has to be placed on a husbandry basis for the life of the system.
Safety/quality/prevention is totally future. Nothing can be done about safety/quality/prevention registered in past times. While the record is piled high with valuable lessons learned and the rule-books spawned from history, the relevance of those lessons learned to the issue at hand can only be settled by reconnaissance. The engineering process must intelligently select from history in whatever form presented. The perceptual reference for selection is knowledge of future dynamics in the zone of interest. The benchmark cannot be history itself.
Safety/quality/prevention is totally local. There is no such thing as umbrella safety, generic measures for application that will handle generic safety matters. Safety can only be defined in terms of the specific stakeholder damage to be avoided. The presence or absence of the events and transients of unsafety measures the attainment of “safety”. The trigger events and the progression to unsafety are always driven by tangible particulars. No abstract concept of unsafety has ever caused damage on its own. Likewise, no abstract concept of safety attainment has ever prevented damage. Whether it is Chernobyl or a fender bender, it is always local specific. This attribute is called “particularity.”
Safety/quality/prevention is totally engineered. It is the engineering process, and nothing else, that deals with the future and its control interventions. Safety is preventing the transmission of variety from environment to system. In safety, there has to be an issue of unsafety triggered, a transient from satisfactory to unsafe, which requires deliberate intervention to return to the safe column. Safety issues that are untreated by engineering process do not go away by themselves. A safety concern that requires no actionable remedy does not qualify as a safety matter. It becomes an acceptable risk. Perceiving loss as an excessive difference between actual conditions and the benchmarks, safety/quality is the intelligence and precautions developed to control foreseeable loss-causing events.
Safety/quality is applied husbandry. Because of the Second Law, the zone of interest, the unsafety features it contains, and the safety objective all migrate. Any prevention design must travel with the future as it unfolds. There is change, entropy increases, and feedback about change that calls for timely adjustments to husband the same level of prevention. This puts prevention design into the realm of control technology. Prevention begs the question “What have you done for me lately?”
To retain safety achieved, an active feedback loop must be maintained to signal altered circumstances from the prevailing reference. There are several ways to update the reference standard for safety. The automatic variety is called self-regulating systems.
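To make the control-technology point concrete, here is a minimal husbandry-loop sketch. The variable names, thresholds and update rule are hypothetical, chosen only to show the shape of a self-regulating loop: measure drift from the prevailing reference, intervene when drift exceeds tolerance, and update the reference itself when persistent drift signals genuinely altered circumstances.

    # Illustrative husbandry loop; all thresholds are hypothetical. Each pass compares
    # the measured condition against the prevailing safety reference, intervenes when
    # drift exceeds tolerance, and flags a stale reference when the drift is persistent.

    def husbandry_step(measured, reference, tolerance, drift_log, stale_after=5):
        """One pass of the feedback loop; returns (action, updated_drift_log)."""
        drift = measured - reference
        drift_log = (drift_log + [drift])[-stale_after:]
        if abs(drift) <= tolerance:
            return "hold", drift_log
        # Persistent one-sided drift suggests the environment itself has changed:
        # update the reference (self-regulation) instead of repeating the same correction.
        if len(drift_log) == stale_after and all(d > tolerance for d in drift_log):
            return "update_reference", drift_log
        return "intervene", drift_log

    if __name__ == "__main__":
        reference, tolerance, log = 10.0, 1.0, []
        for measured in [10.2, 11.5, 12.0, 12.4, 12.6, 12.9]:
            action, log = husbandry_step(measured, reference, tolerance, log)
            print(f"measured={measured:5.1f}  action={action}")

The sketch holds while drift stays inside tolerance, intervenes as drift grows, and recommends updating the reference once the drift has been one-sided for several consecutive readings; a real prevention system would tie those branches to its design basis events.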
It is compliance to foresight law, where uncertainty resides, that provides the opportunity for preventing stakeholder damage. The institution cannot bring itself to comply with foresight law contemporaneously as it does for hindsight law. What the law does to accommodate this ancient taboo is to give the institution a pass on due diligence nonfeasance until after damage has been recorded. Then the law applies foresight law, if you can believe this, retroactively. The institution pays a premium for this legal retro service. In USA construction alone, the handle for retro compliance amounts to several hundred billion a year.
The engineer doesn’t wait for the judiciary to define what he should have done years ago, in retrospect, as the gatekeeper of “foreseeability” and the Rule of Reason. As long as the engineer uses the best available technology and ties his work to the platform, there is no risk the judiciary will compose a greater standard. It is only when subjectivity enters the fray that the judiciary has outcome power. Engineers have several incentives, including the instinct of workmanship, to use the best available process and compliance is only one of them.
Practitioner triage protocols
Application of the principles of triage to engineering design is common in Australia. Requests for services there are customarily met with an “engineering triage form.” The information provided on the form is then processed in a standardized manner to determine the priority of issues receiving the limited resources. In Sweden the same routine procedure is called “requirements triage.” MIT assures engineering undergraduates they will learn “’engineering triage,’ as it is necessary for the choice of abstractions appropriate to a problem.” In the design for prevention, the same proven principles of triage are employed.
Triage is a PE obligation for those unavoidable institutional issues that come under its umbrella of “risk management.” Triage is defined as a method of allocating limited resources to issues. Since there is no risk managing to be done for matters already transpired, the realm of risk management is a future-based process whose challenge advances and evolves with the arrow of time. The Second Law assures us that no finish line, where institutional risks have finally been managed, exists. Crisis management is an activity frequently assigned by the institution to its risk management department, but the realms of emergency response and damage prevention are intrinsically incompatible. The management of a crisis has an end. The prevention of a crisis does not. If you want acclaim from humanity, join the institutional attractor and get into crisis response. If you want acclaim while in the foresight attractor, to hear praise booming up from the floor of Hindsight Valley, an enthusiastic yodel ricocheting from one stakeholder to another, you’re in the wrong trade.
The sociological basis for triage is the fact of two incompatible, complementary process domains each forming a separate attractor swirling at great psychological distance from each other – even though individuals are completely interchangeable from one attractor to another! One social system attractor is hindsight-based (means). The other attractor is foresight-based (ends). There can be no third.
The functional objective of triage is to divert incoming issues to appropriate resolution process domains with absolute certainty. Once an attractor has been given an assignment and begins operations, it is practically impossible to impeach – which is what defines an attractor in the first place. The high resistance to incentive and disturbance is the same whether good or disastrous for goal-attainment.
The practitioner is driven first by his legal obligations, no secret, so he presents the work plan in terms of his due diligence needs and not those of institutional management. When due diligence is met by the PE, the institution, which has a lesser standard of care, is protected from litigation. The idea is to circumvent institutional veto power over the work. It can do yes/no, but not process – and no is scienter.
The master strategy for the practitioner is to power the front end uphill to the tipping point where spontaneous progress kicks in. There are two tipping points, one for each attractor, negotiated in sequence. In so doing, the professional carries a significant load of knowledge development. After passing the inflection point, the practitioner can confidently transfer responsibility for developing the requisite information to the project team. Instrumental at first, he does not act in a prime mover capacity past the inflection point where automatic kicks in. Goal seeking then becomes downhill and spontaneous. Beautiful.
The burden on the practitioner is to reach the auto-ignition point with all deliberate speed and parsimony. The major milestones:
1 – Business as usual can’t fail (one day): tipping point #1a
2 – Business as usual can’t succeed (several days): tipping point #1b
3 – Development of the triage protocols with a cohort including the institution (many days): tipping point #2
The first triage goal is to determine confidence levels regarding actuality, capability and potentiality that business as usual will suffice. When business as usual is a fit to the prevention issue at hand, all signs to that effect will be aligned. There are many ways to affirm a matchup and judgment will work as well as any. Everywhere you check, everyone knows about and approves what is being done to prevent damage. Matchup validation does not take long. For issues that fall within business as usual scope and have established a record of success, status quo ante will satisfy the assignment. Determining issues that fit business as usual is a task that usually takes less than a day. Confirming evidence will be found everywhere, as the matchup is a matter of pride. The field of requisite knowledge to conduct business as usual is simply the set of rules of action to emulate. As Turing proved, rules of action taken as infallible cannot also be intelligent. When the institution holds its rules as infallible, intelligence becomes heresy.
For triage purposes, if business as usual will suffice, intelligence is unnecessary and unwelcome. Once affirmed, the intelligence development activity for triage is halted. If the allocation to business as usual is not clear-cut, the professional moves on to prepare for and deliver inform/consent. He must first show, by objective benchmarks, that practices established by the institution cannot succeed. This task, performed by the practitioner, requires structured knowledge development. The conclusion that business as usual is fundamentally unable to attain the stated goal is driven by brute fact.
The contrary is also true. When business as usual is a misfit to prevention delivery, no signs of alignment will be found. Everywhere you check, no one knows about what is being done to prevent damage. Knowing that a matchup has not been found does not prove business as usual cannot do the job. If the minimum conditions necessary for business as usual to succeed are not definite, knowledge development for prevention triage enters a momentous new period.
Where the first goal could be reached reliably on quasi-subjective grounds and professional opinion, if business as usual can’t make the cut, the judgment party’s over. The same learning mountain looms when insurance policies are no longer “blanket” and all-inclusive. When routine practice is not clearly suitable, knowledge development enters the hard-way portal. Today, insurance policies are unintelligible to the “insured.” You have much learning and knowledge development ahead.
Connectance
The second milestone is where it is factually clear and certain that the project dimensions sprawl beyond the limits of business-as-usual competency. The field of requisite knowledge for the foresight engineering process is significantly larger and more varied than that sufficient for the rules of regulatory agencies. The goal here is to establish unequivocally that the application of business as usual to the issue at hand cannot succeed. Because it is the institutional mainstream, it must be proven with objectivity and transparency. In triage, it’s one or the other. No partials.
Begin with damage definitions followed by study of the mechanisms that can give rise to them – the event tree. Damage descriptions suitable for framing prevention specifications are stakeholder-dependent. The list of stakeholders is an initial task and one invariably full of surprises for the practitioner. The only stakeholders recognized by the institution as such are those who have successfully taken legal action as plaintiff. No litigation, no stakeholder. Just getting the institution to authenticate stakeholders, so that scope boundaries can be set, can be a daunting exercise.
In the same genre as stakeholder authentication is the vital matter of goals and objectives. There is a significant aversion to rigorous goal definition by institution and stakeholder alike even though the definitions of damage to be prevented are married to the goals. Usually the shopping list of damages proclaimed by a stakeholder is inconsistent and incoherent with respect to the stated goals. Since the practitioner cannot rationally go forward with the design until coherency between goals and damage is established, this task has to reach a stop rule. Work on a mutually acceptable and objective stop rule is done first.
The clincher for concluding that the institutional attractor cannot succeed on an objective basis is a counting exercise provided by control theory and Shannon’s theorems, called connectance. Not every project that passes the connectance test is assured to succeed, but every project that fails connectance is doomed. The surety of connectance is found throughout the fauna and flora of nature. In any ecosystem, if more than 15% of the possible connections between elements are simultaneously active, the system is unstable. The task is to count the total relationships and then those among them that are routinely active.
Connectance was featured in Ashby’s approach as a requisite assessment step before going forth. Those who evaluate the viability of ecosystems now accept it as a critical success factor. It is a property of any system and when projects pass through the 15% connectance level, things become unstuck. Starkermann shows the same collision with instability as working group size is increased in the foresight attractor. In the institutional attractor, there is no scaling limit. Connectance is another system feature subject to natural law that runs counter to intuition and the institutional image of omnipotence. Testing this principle on seminar groups is surreal as the switch is uncanny and abrupt.
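The counting exercise itself is plain arithmetic, which a short sketch makes explicit. The element roster and the set of active relationships below are invented for illustration; only the 15% threshold is carried over from the text.

    # Connectance check: active relationships divided by the total possible pairwise
    # connections among the elements. The 15% stability limit is the figure cited in
    # the text; the element and relationship lists are hypothetical.

    from itertools import combinations

    CONNECTANCE_LIMIT = 0.15

    def connectance(elements, active_pairs):
        possible = len(list(combinations(elements, 2)))        # n * (n - 1) / 2
        active = len({frozenset(pair) for pair in active_pairs})
        return active / possible if possible else 0.0

    if __name__ == "__main__":
        elements = [f"unit_{i}" for i in range(12)]            # 12 elements -> 66 possible links
        active_pairs = [("unit_0", "unit_1"), ("unit_0", "unit_2"), ("unit_1", "unit_3"),
                        ("unit_2", "unit_4"), ("unit_3", "unit_5"), ("unit_4", "unit_6"),
                        ("unit_5", "unit_7"), ("unit_6", "unit_8"), ("unit_7", "unit_9"),
                        ("unit_8", "unit_10"), ("unit_9", "unit_11"), ("unit_10", "unit_11")]
        ratio = connectance(elements, active_pairs)
        verdict = "unstable" if ratio > CONNECTANCE_LIMIT else "within the stability limit"
        print(f"connectance = {ratio:.1%} -> {verdict}")

With twelve elements there are sixty-six possible connections; twelve routinely active ones already put the system at roughly 18%, past the threshold. The counting is the easy part; identifying which relationships are genuinely and routinely active is where the field work lies.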
Once business as usual has been eliminated on surveyed and reliable grounds, the third and last goal, the triage top, is developing the information prerequisites for triage operated by the institutional attractor. Since the reference template for the third level of triage is developed with and by the situation owners, the triage top template is completed as far as practical. It provides emphasis allocation for the second-tier work and the formulation of design basis events. Prevention delivery can only proceed with a concrete specification and image of what is to be prevented.
Information gathering is, of course, the process by which knowledge is acquired, and knowledge is the process that integrates past and future experience to suggest new activities in the now, either as nervous activity internally perceived as thought and will, or externally perceivable as speech and movement. These attributes fit the basic requisites of entropy reduction – imposed structure (thought) and work (movement).
Developing the triage protocol benchmark with the institution
Once the infeasibility of business as usual is put to bed, the task becomes to work with institutional agents to customize their triage protocol and benchmark manual. This is the impartial objective process that will be used by the institution to divert incoming issues to the appropriate attractor. It represents the decisions already made about what comprises appropriate for hindsight and what requires pragmatic foresight.
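As a closing illustration of the shape, not the content, of such a protocol: the diversion logic ultimately reduces to a small decision rule whose inputs are the brute-fact findings described above. Every field name and rule in the sketch is an invented placeholder for whatever the institution and practitioner actually agree on during protocol development.

    # Hypothetical triage diversion rule. The real protocol is developed with and by
    # the institution; this only shows the shape of the allocation decision.

    def divert(issue):
        """Allocate an incoming issue to an attractor based on brute-fact findings."""
        if issue["business_as_usual_record_of_success"] and issue["within_established_scope"]:
            return "hindsight attractor (business as usual suffices)"
        if issue["connectance_ratio"] > 0.15 or issue["requires_foresight_knowledge"]:
            return "foresight attractor (pragmatic foresight required)"
        return "hold for further knowledge development"

    if __name__ == "__main__":
        sample = {
            "business_as_usual_record_of_success": False,
            "within_established_scope": False,
            "connectance_ratio": 0.22,
            "requires_foresight_knowledge": True,
        }
        print(divert(sample))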