
Systemic Failures, by Design

Over the past dozen years, the US has experienced a series of dangerous and costly systemic failures throughout our security and regulatory framework. The unchecked technology bubble, the missed opportunities to prevent 9/11 (which led to two ongoing wars), the tragic response to Katrina, the largest financial crisis in history, the Fort Hood massacre, and the “underwear bomber” incident on Christmas Day all share one commonality.

In each of these cases, US government agencies had already collected data that, had it been recognized and acted upon within the window of time circumstances allowed, would very likely have prevented or substantially mitigated the event. In case after case, repeated warnings from recognized experts, both internal and external, were ignored or suppressed.

Concurrent with this series of historic failures, advances in the multidisciplinary area of knowledge systems have dramatically improved our ability to predict and prevent crises. In the specialized field of computer science generally known as semantics, digital files are embedded with predefined, machine-readable meaning and processed in an automated or semi-automated manner that can reduce or eliminate common human failures, regardless of cause.

This technology is successfully deployed today in other large-scale data environments where human errors, conflicted decision-making, lack of interoperability and misinterpretation of data have long been associated with systemic failures. When each digital file carries rich metadata describing the interrelationships of topics, organizations and people, these types of human-caused systemic failures simply need not occur.
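The core mechanism can be sketched in a few lines. The idea is that each record carries machine-readable relationship metadata (subject, predicate, object triples, in the style of RDF), so software, not a human reader, can connect reports filed by different offices. All identifiers and data below are hypothetical illustrations, not any agency's actual schema.

```python
# Toy triple store: index relationship metadata embedded in records
# and find cross-report connections that a keyword search would miss.
from collections import defaultdict

class TripleStore:
    """Minimal RDF-style store: (subject, predicate, object) triples."""

    def __init__(self):
        self.by_subject = defaultdict(list)

    def add(self, subject, predicate, obj):
        self.by_subject[subject].append((predicate, obj))

store = TripleStore()
# Metadata embedded in two separate reports, filed by different offices:
store.add("report:A", "mentions_person", "person:X")
store.add("report:A", "filed_by", "office:Phoenix")
store.add("report:B", "mentions_person", "person:X")
store.add("report:B", "filed_by", "office:Minneapolis")

# Automatic linkage: every report that mentions the same person,
# regardless of which office filed it.
reports_on_x = sorted(
    s for s, links in store.by_subject.items()
    if ("mentions_person", "person:X") in links
)
print(reports_on_x)  # ['report:A', 'report:B']
```

Production semantic systems use standardized vocabularies and full triple stores rather than an in-memory dictionary, but the linkage principle is the same.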

For example, if a state-of-the-art semantic architecture had been deployed prior to 9/11, the Phoenix Memo would have contained sufficient embedded intelligence to automatically elevate the red flag warning – not just within one agency where internal conflicts are common – but also to notify pre-selected decision-makers in partner agencies with built-in tracking to ensure accountability as well as instant audit reporting. This would have significantly increased the probability of preventing two wars.
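The routing-and-accountability pattern described above can be illustrated with a hedged sketch: an alert above a severity threshold is delivered not only within the originating agency but to pre-selected contacts in partner agencies, and every delivery is recorded for audit. The agency names, severity levels, and threshold values here are invented for illustration.

```python
# Sketch of cross-agency alert routing with a built-in audit trail.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AlertRouter:
    # (minimum severity, recipient) pairs: who is notified at which level.
    routes: list
    audit_log: list = field(default_factory=list)

    def raise_alert(self, alert_id, severity):
        recipients = [agency for level, agency in self.routes
                      if severity >= level]
        for agency in recipients:
            # A real system would track delivery receipts; here we only
            # record who was notified, about what, and when.
            self.audit_log.append({
                "alert": alert_id,
                "agency": agency,
                "time": datetime.now(timezone.utc).isoformat(),
            })
        return recipients

router = AlertRouter(routes=[(1, "originating-agency"),
                             (3, "partner-agency-A"),
                             (3, "partner-agency-B")])
notified = router.raise_alert("memo-2001-07", severity=4)
print(notified)               # severity 4 reaches all three recipients
print(len(router.audit_log))  # 3 audit entries, one per notification
```

The audit log is what provides the accountability and instant audit reporting the article describes: every notification leaves a timestamped record that cannot later be denied.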

In the Fort Hood massacre, a logical semantic system would have triggered an alert to the base commander and appropriate security personnel about Maj. Nidal Malik Hasan, whose behavior reportedly raised red flags on a continuing basis.

In the most recent incident, on Christmas Day, a properly designed system would have profiled not just another youth succumbing to militant religious extremism, but also the quality and relationship of the information source, and would then have automatically placed Umar Farouk Abdulmutallab on the no-fly list.

Within the financial regulatory arena, a properly designed system for banking regulators around the world would have automatically linked incoming data from university and independent researchers that clearly showed dangerously spiking discrepancies between earnings and mortgage levels in multiple regional markets, making it difficult if not impossible to ignore, or later deny knowledge of, a multitrillion-dollar financial crisis in the making.
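The anomaly flag described above amounts to comparing each region's mortgage-to-earnings ratio against its own recent baseline and flagging sharp spikes. The following sketch assumes hypothetical quarterly figures and an arbitrary 1.5x spike threshold; real regulatory monitoring would use far richer models.

```python
# Flag regions whose latest mortgage-debt-to-earnings ratio spikes
# well above the average of their earlier readings.
def flag_spikes(series_by_region, threshold=1.5):
    """Return regions whose newest ratio exceeds `threshold` times
    the mean of all prior readings for that region."""
    flagged = []
    for region, ratios in series_by_region.items():
        baseline = sum(ratios[:-1]) / len(ratios[:-1])
        if ratios[-1] > threshold * baseline:
            flagged.append(region)
    return flagged

# Hypothetical quarterly mortgage-to-earnings ratios by region:
data = {
    "region-1": [2.0, 2.1, 2.0, 2.1],  # stable market
    "region-2": [2.0, 2.2, 2.6, 3.9],  # dangerously spiking
}
print(flag_spikes(data))  # ['region-2']
```

The point is not the statistics, which are deliberately simple here, but the automation: once researchers' data feeds are linked into such a system, the spike is surfaced mechanically rather than depending on any one official choosing to act.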

In our highly complex, interrelated and rapidly changing world, these types of crises are simply too dangerous not to prevent, which is why so many countries around the world have targeted semantic systems research and development as a high priority.

In order to achieve the higher level of functionality required to prevent these types of crises, leadership must first acknowledge that misalignment of interests exists throughout their organizations, not just in sharing data, but also in the design and adoption of enterprise systems. More than a decade of attempting to improve knowledge systems in federal agencies has repeatedly demonstrated that effective solutions cannot emerge from the institutions implicated in the systemic failures. Despite tens of billions of dollars invested over the past decade, US government agencies remain a decade behind the state of the art in knowledge-systems functionality.

To grasp the breadth and scale of both the need for and the potential of improved knowledge systems, consider that during the past decade the US national debt more than doubled, net job creation stood near zero, and the US stock markets delivered their worst performance in 200 years, handing investors a negative return.

The correlation between this series of systemic failures and the precipitous decline of the American economy could not be more evident. No greater priority exists, for all other goals depend in varying degrees on successfully overcoming this challenge.

We are wasting precious time.
