The recent tragedy at Fort Hood was only the latest in a series of crises that likely would have been prevented had the U.S. Government adopted a logical, holistic system design when I first began making the argument more than a decade ago. Since then we have witnessed trillions of dollars and tens of thousands of lives lost: 9/11 and two wars, Katrina’s turf battles and incompatible communications, the mortgage bubble and global financial crisis, and now the Fort Hood massacre. The current trajectory of systems design and dysfunction isn’t sustainable.

“The care of human life and happiness, and not their destruction, is the first and only object of good government.” – Thomas Jefferson

While this particular tragedy is still under investigation, patterns are emerging that closely resemble previous crises, including 9/11. So let’s take a closer look at this event against what is currently possible in organizational design and state-of-the-art technology, in order to better understand how to prevent the next crisis, for it will surely occur unless a logical, holistic system design prevents it.

Crisis prevention by organizational design

It is true that some crises cannot be prevented, but it’s also true that most human-caused crises can be, particularly those that are systemic, including all of the cases cited here. In fact, many tragedies are reported to have been prevented by intelligence agencies without our detailed knowledge, some of which would undoubtedly help inform our democracy if declassified. But we are still obviously missing preventable catastrophic events that we can ill afford to endure as a nation, economically or otherwise.

“In times of change, learners inherit the Earth, while the learned find themselves beautifully equipped to deal with a world that no longer exists.” – Eric Hoffer

In each of the cases mentioned here, including Fort Hood, actionable evidence was available either on the Web or within the content of digital files residing on agency computer networks, but it was not shared with the appropriate individuals or partners in the decision chain, usually due to careerism, turf protection, and justified fear of retribution.

It is difficult for leaders to understand that members of a hierarchical bureaucracy are often punished by micro social cultures for doing the right thing, such as sharing information or acting to prevent tragedy. A good report from the field on 9/11 is Coleen Rowley’s 2002 memo to FBI Director Robert Mueller.

Interests are not aligned: Denial does not a better system make

“The really valuable thing in the pageant of human life seems to me not the State but the creative, sentient individual, the personality; it alone creates the noble and the sublime…” – Albert Einstein

The reality is that the interests of the individual and those of the organization are often not well aligned, so system designs need to include intentional realignment. In the case of the Fort Hood massacre, however, red flags were so prevalent that many of us are asking the logical question: how explicit must a threat be before the system requires action?

Red flags were hidden from those who needed to know

In the case of Fort Hood, as with 9/11, the U.S. Government apparently again ran into a data firewall between agency cultures, supported in previous cases by fear-induced interpretation of regulations and defensive micro cultures within agencies. The Washington Post reported that an FBI-led task force was monitoring emails of the suspect, Army Maj. Nidal M. Hasan, some of which were shared with a Washington field office but not with the military, apparently including Hasan’s supervisors, who clearly had a need to know. A properly designed architecture, as described in our recent hypothetical use case scenario for the DHS, would have automatically alerted those in the decision chain pre-determined to need to know whenever certain phrases were present, in this case including the base commander and security officer, who might have prevented the tragedy in a manner that did not compromise the subject’s rights to privacy or freedom of religion.

“The status quo is the only solution that cannot be vetoed.” – Clark Kerr

One example of a semantic phrase that should probably be shared immediately with base commanders and counterterrorism experts: “communicating with known terrorists”. No one in the chain of command, including criminal investigators, should be empowered to prevent that information from reaching those in a position to prevent tragedy, whether the threat is national or localized. Indeed, logic suggests that local surveillance might be necessary in order to define the threat, if any.
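As a rough illustration of this kind of rule-driven routing (a sketch only, not Kyield’s actual implementation), consider the minimal Python example below; the phrase patterns, role names, and `notify` callback are all hypothetical. The essential design point is that the rule table, not any single official, decides who must be informed.

```python
import re

# Hypothetical rules mapping semantic phrases to pre-determined
# 'need to know' roles; a real system would load these from the
# organization's policy layer rather than a hard-coded table.
ALERT_RULES = [
    (re.compile(r"communicating with known terrorists", re.I),
     ["base_commander", "security_officer", "counterterrorism_desk"]),
]

def route_alert(document_text, notify):
    """Scan text and notify every role that a matching rule names.

    No single official in the decision chain can veto the routing:
    the rule table, not a human gatekeeper, decides who is told.
    """
    for pattern, roles in ALERT_RULES:
        if pattern.search(document_text):
            for role in roles:
                notify(role, pattern.pattern)

route_alert(
    "Subject observed communicating with known terrorists.",
    notify=lambda role, why: print(f"ALERT -> {role}: matched '{why}'"),
)
```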

Crisis prevention by technical design

Among the many academic disciplines influencing modern enterprise architecture are organizational management, computer science (CS), and predictive theory, which manifest in the modern workplace as network design, computer languages, and mathematical algorithms. The potential effectiveness of these disciplines depends primarily on three dynamically interrelated factors:

1. Availability and quality of the data

“A popular government without popular information, or the means of acquiring it, is but a prologue to a farce or a tragedy, or perhaps both.” – James Madison

The problem reflected in the decades-old computer science phrase GIGO (garbage in, garbage out) influenced the holistic semantic design of Kyield more than any other factor. Rather than attacking the root of the problem at the source and investing in prevention, CS in general and consumer search in particular have teetered at the edge of chaos, combining clever algorithms and massive computing power to convert unstructured data (GI) into relevance (GO). While search and conversion of unstructured data have improved substantially in the past decade, they cannot compare to a logically designed QIQO (quality in, quality out) system. Evolving from GIGO to a QIQO environment in organizational computing requires a holistic solution focused on prevention, improved work quality, and enhanced innovation.
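To make the GIGO/QIQO contrast concrete, here is a minimal sketch of quality enforcement at the point of data entry. The record fields and validation rules are invented for illustration; any real system would derive them from its own data standards.

```python
from dataclasses import dataclass

@dataclass
class Record:
    source: str     # provenance: who or what produced the data
    timestamp: str  # ISO 8601 capture time
    body: str       # the content itself

def validate(record):
    """Quality gate at the source: reject garbage before it enters
    the store, instead of mining relevance out of it downstream."""
    errors = []
    if not record.source:
        errors.append("missing provenance")
    if not record.timestamp:
        errors.append("missing timestamp")
    if not record.body.strip():
        errors.append("empty body")
    return errors

def ingest(record, store):
    errors = validate(record)
    if errors:
        print("rejected:", ", ".join(errors))  # GI never becomes GO
        return False
    store.append(record)  # only vetted, quality data is stored
    return True

store = []
ingest(Record("sensor_7", "2009-11-05T13:34:00Z", "field report..."), store)
ingest(Record("", "", ""), store)  # rejected at the gate
```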

During several years of extensive applied R&D shortly after the commercialization of the Internet and the WWW, it became apparent that embedding intelligence in files would yield far more functionality and efficiency, particularly within enterprise networks.

Without the availability of high-quality data that provides essential transparency while protecting privacy, the potential of enterprise computing is severely hampered; in some cases it has already become more of a problem than a solution. Once essential data containing carefully tailored embedded intelligence is collected, the task of preventing crises can be semi-automated.
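One simple way to picture “embedded intelligence” is a self-describing envelope carried with each file, so downstream systems can act on the content without guessing at its meaning, origin, or sensitivity. The sketch below is illustrative only; the envelope fields are assumptions, not Kyield’s actual format.

```python
import json
from datetime import datetime, timezone

def wrap_with_intelligence(content, author, tags):
    """Wrap raw content in a self-describing envelope (hypothetical
    fields) so any downstream system can act on the file without
    guessing at its meaning, origin, or sensitivity."""
    return json.dumps({
        "content": content,
        "meta": {
            "author": author,
            "created": datetime.now(timezone.utc).isoformat(),
            "tags": tags,               # e.g. ["threat", "report"]
            "sensitivity": "internal",  # drives the sharing rules
        },
    }, indent=2)

print(wrap_with_intelligence("Field report ...", "analyst_17", ["report"]))
```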

2. Breaking through data barriers

“It doesn’t work to leap a twenty-foot chasm in two ten-foot jumps.” – American proverb

Unlike in previous technical revolutions, in information technology the U.S. has generally embraced a laissez-faire approach to technical standards, resulting in proprietary standards that are leveraged for market share. Unfortunately, the result in technology has been much like that in finance, although largely invisible, with the costs of non-interoperability transferred to customers. Unfettered innovation can have tragic consequences. In the network era, systems that cannot interoperate have increasingly contributed to some of our greatest challenges, including failures in crisis prevention, cost and inefficiency in healthcare, and reduced innovation and productivity in the workplace. So in our case, even though voluntary standards are less than ideal, we’ve embraced the W3C standards for public transactions.
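As an example of what those W3C standards buy, data expressed in RDF can be read by any compliant tool, with no proprietary bridge in between. A small sketch using the open-source rdflib library follows; the namespace and triples are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# A made-up namespace for the example; a real deployment would use
# an agreed, published vocabulary.
EX = Namespace("http://example.org/alerts#")

g = Graph()
g.add((EX.case42, RDF.type, EX.SecurityAlert))
g.add((EX.case42, EX.summary, Literal("suspicious communications")))

# Turtle is a W3C serialization that any compliant tool can parse,
# so no vendor-specific format stands between agencies.
print(g.serialize(format="turtle"))
```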

3. Data constructs and analytics

“Our major obligation is not to mistake slogans for solutions.” – Edward R. Murrow

Once the essential data is collected, solutions to many of the great challenges facing our organizations come within reach:

  • Red flagging can be automated while protecting jobs and privacy (a brief sketch follows this list).

  • Interests of the individual and the organization can be realigned.

  • Accountability and meritocracy become far more achievable.

  • Original work by individuals and teams can be protected.

  • Information overflow can finally be managed well.

  • Creativity and innovation can be enhanced.

  • Predictive and ‘what if?’ modeling and algorithms become much easier.

  • Formerly essential unknowns about the organization become known.

  • The organization can become more adaptive to change.

  • Cultural management and continuous learning become manifest.

  • Rich visual metrics of formerly unknown patterns become routine.
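Here is the promised sketch of the first item: red flags accumulate against a one-way pseudonym, so patterns can be correlated and escalated before anyone learns the identity behind them. The threshold, names, and escalation rule are hypothetical.

```python
import hashlib
from collections import defaultdict

def pseudonym(identity):
    """One-way pseudonym: flags on the same person can be correlated
    without revealing who that person is."""
    return hashlib.sha256(identity.encode()).hexdigest()[:12]

flags = defaultdict(list)

def raise_flag(identity, reason, threshold=3):
    pid = pseudonym(identity)
    flags[pid].append(reason)
    if len(flags[pid]) >= threshold:
        # Only now is de-anonymization escalated, and only to a
        # pre-authorized reviewer, protecting privacy and jobs.
        print(f"escalate {pid}: {flags[pid]}")

raise_flag("employee_417", "contact with watch-listed address")
raise_flag("employee_417", "attempted access outside clearance")
raise_flag("employee_417", "explicit threat phrase in email")  # escalates
```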

Crisis review

To his credit, Secretary Gates has called for a system-wide review of the Fort Hood tragedy, which will coincide with reviews by the Army, the White House, and Congress.

However, it would be irresponsible not to emphasize that the underlying stresses that likely contributed to this tragedy are directly related to failures to prevent previous crises. Because of those failures to adopt logically functional systems, our macro-fiscal situation in the U.S. is now so degraded that future prevention requires a much greater effort than it would have a decade ago.

Preventing systemic crises and providing the related security (economic and military) are among the foremost reasons for our government agencies to exist, and they were the primary motivation for creating Kyield, even if the holistic design provides many other benefits. The system problem has now been solved by design; it has yet to be adopted.

“I am not an advocate for frequent changes in laws and constitutions, but laws and institutions must go hand in hand with the progress of the human mind. As that becomes more developed, more enlightened, as new discoveries are made, new truths discovered and manners and opinions change, with the change of circumstances, institutions must advance also to keep pace with the times.” – Thomas Jefferson

Mark Montgomery

8 thoughts on “Preventing the next Fort Hood tragedy, by design”

    1. I was an entrepreneur in the Seattle area, had some success and started consulting — one thing led to another, including work with a fair number of the industry’s leaders — and founded one of the early pioneer incubators specializing in knowledge systems, 15 years ago now, then moved into venturing. We live in Santa Fe now — just click the Kyield logo for our web site. There is a great deal of history on this topic, including three administrations, many members of Congress, dozens of agency CIOs… most top labs… as we watch much of the world listen and adopt. A negative spiral I’m afraid, although one that can be reversed — only with sufficient desire and energy. Thanks, Mark

  1. I forget where I read it, but someone (I’ll eventually remember) pointed out that what’s good for humans is generally not good for systems, and vice versa. I’m going to have to remember; it seems to explain practically everything we consider a failure in government.

    It’s also difficult for the general public to understand that bureaucratic cultures punish people who do the right thing. This is why there are so few good cops anymore.

  2. Maybe organizational leaders have to realize that the user needs to control the technology, not the technologist (CIO) controlling the user. You do not have to be conversant with technical terms to build an application or share data; you have to have someone who knows it can be done. With over 55 years in IT and communications, I cannot understand the significant role the CIO plays in making user decisions. Infrastructure needs to be built to accommodate user applications, not fit behind some firewall or squeezed into some communications pipe not designed for flexibility. Study the articles in our trade journals and I’m sure you will see more and more that the role of CIO drives the application. It is the application that should drive the technology. To me that is one of the greatest underlying problems why information is not shared: the infrastructures are not designed to accommodate information sharing. Today the technology and communications systems are there to promote sharing, but they are held back by individuals dictating what can and can’t be done because they control the systems. In closing, I recommend the book “The Soul of a New Machine” by Tracy Kidder. Read the book and compare it to the problems of today. Get the people who say it can’t be done (information sharing) out of the picture, and put in charge a good person steeped in technology who wants to prove it can be done. There are plenty of young folks anxious to take the challenge on. All it takes is the backing of the user.

  3. Thanks for your comments Michael and Tom.

    Tom, I checked your bio and see that you have direct personal experience in the CIO role. I was briefly a board member of the U.S. CIO KM Work Group a few years ago, invited as the only outsider at the time, I think due to a combination of our experience with knowledge systems — two technical leaders with significant market adoption (one that many leading universities used as a pattern) — and some hope that I would be a reformer.

    I wasn’t surprised, but I quickly learned that reform from inside the CIO structure was very unlikely. There were many issues, beginning with the KM culture itself (see alternatives to the CKO: http://wp.me/po8Dc-49 ), the issues with the CIO role and KM leaders reporting to CIOs, and the often unhealthy relationship between IT leaders and vendors — careerism. Of course the voluntary standards system for what is an essential utility doesn’t help.

    When I concluded that internal reform wasn’t possible, I had no choice but to resign. By regulation, for example, any multi-organizational reform effort (I was told by the head of OMB at the time, if memory serves) must be sourced from the White House agenda. Only after Katrina and the ensuing report did that finally happen, in part from my Katrina business case ( http://www.kyield.com/publications.html ), but even that disaster wasn’t sufficient to cause real reform. There is still no national knowledge system like other countries have — Australia recruited me a decade earlier for their national system — and I have often wondered what if I had gone…

    The financial crisis did create a firestorm of interest, and we are seeing progress for the first time (in government knowledge systems reform), but I am still concerned. Tom says “the infrastructures are not designed to accommodate information sharing.” One of the fundamental problems with internal architecture (there are also benefits) is that it allows agency heads, unions, and IT departments to protect turf rather than focus on their reason for being, aka the mission. We saw that at the core of every crisis, the most obvious being Katrina and Fort Hood, the most costly being 9/11. What we don’t see, however, is huge: it occurs every day, showing up in the form of the most costly and increasingly dysfunctional systems in the developed world (healthcare, education, crisis prevention, economics…).

    Tom is quite correct about control. Of course security issues are paramount, but ironically security is enhanced with a logically designed semantic system; in Kyield, for example, the CKO module sets the system parameters. It’s not just about empowering the individual — it’s a very complex balance of aligning interests that must then be simplified greatly.

    BTW, if you haven’t seen it yet, we just released the first in a series of hypothetical use cases. The first is a DHS case involving facility security and international terrorism:

    http://www.kyield.com/images/SCENARIO_3-_Roger_the_maintenance_man_at_the_hydro_dam.pdf
