Avoiding the Threat Posed by Predictive Certainty

The University of Maryland Center for Health and Homeland Security (CHHS) was recently tasked with developing the security plan and value assessment for a large metropolitan transit agency that uses close to 400 transit coaches to carry almost 30 million riders annually. During peak travel hours, the agency (which for professional reasons cannot be identified more specifically) has more than 300 of its 30- to 40-foot coaches in service at any given time.

The intent of developing a security plan was to establish a broad spectrum of strategies that could be implemented over the course of several years. To develop such a plan, it was determined that an initial assessment would be needed of the various sites controlled by the agency – the agency’s executive building, depots, transit centers, bus stops, and various park-and-ride lots, as well as the agency’s training academy. In short, the agency had to deal with all of the typical problems and operational issues facing any other large local agency. Because of its proximity to one of the nation’s largest metropolitan areas, however, the number of potential targets significantly increased the risk profile.

The CHHS effort highlighted several fundamental concerns with potentially far-reaching effects, including the fact that risk modeling often relies on extremely technical and complex formulas that provide limited value and can actually increase risk rather than reduce it. Such models also can be both expensive and time-consuming. Another major concern was that it is extremely difficult to predict acts of terrorism – and efforts to do so can lead to a false sense of security. Moreover, relying on highly complex risk models to accurately forecast future events can sometimes lead to higher levels of risk exposure simply because of a false belief in the accuracy of what is almost always an extremely complicated assessment.

Quick Overview of a Complicated Process 

In developing the threat profiles of the various sites analyzed, the CHHS team used many of the usual sources of information that transit agencies across the country would be able to access within their own jurisdictions. The first step was to consider the types of assets involved – for example, operational centers, staging areas, transit staff personnel at those locations, as well as their current duties and responsibilities.

Also taken into consideration were flood maps and the crime statistics for each area, the number of commuters likely on any given day, after-action reports of previous incidents, and the opinions, insights, and recommendations provided by agency employees. The team also took careful note of the number of vehicles and passengers likely to be present at each site at any given time.

Although risk assessments come in a variety of forms and employ different approaches, there are certain core elements common to most of them, including: (a) a careful analysis of the value of the physical assets involved; (b) the potential impact of a dangerous incident or event; and (c) the probability of such an event occurring. The value of the assets involved is then quantified in economic and operational terms based on variables such as the construction costs of the facility, the revenue generation provided by the site, the potential cost of using alternate sites in times of sudden emergencies, and the contributions that the site brings to the company’s operations as a whole.
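To make that framing concrete, the following minimal sketch scores a site using the familiar value/impact/likelihood product. The 1-5 scales, field names, and multiplicative rule are illustrative assumptions made for this example, not the model the CHHS team actually used.

```python
# A minimal sketch of the common value / impact / likelihood framing described
# above. The 1-5 scales and the multiplicative scoring rule are illustrative
# assumptions, not the CHHS team's actual model.

from dataclasses import dataclass


@dataclass
class SiteAssessment:
    name: str
    asset_value: int   # 1-5: construction cost, revenue, operational contribution
    impact: int        # 1-5: severity if the site is lost or degraded
    likelihood: int    # 1-5: rough probability category, not a prediction

    def risk_score(self) -> int:
        """Simple multiplicative score; useful for ranking sites, not forecasting."""
        return self.asset_value * self.impact * self.likelihood


if __name__ == "__main__":
    depot = SiteAssessment("Main bus depot", asset_value=5, impact=4, likelihood=2)
    stop = SiteAssessment("Suburban bus stop", asset_value=1, impact=2, likelihood=3)
    for site in sorted([depot, stop], key=lambda s: s.risk_score(), reverse=True):
        print(f"{site.name}: {site.risk_score()}")
```

The value of such a score lies in comparing sites against one another, not in treating the number itself as a forecast – a point the remainder of this article develops.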

To understand the possible impact of various types of incidents, the first question usually asked is, “What would happen if the company were to lose all or part of this site?” After that question is answered, the next steps are to consider the severity of past events, physically inspect the sites involved, and determine as accurately as possible how various types of incidents would adversely affect the physical integrity and operational status of the site. By using this reductive approach, the CHHS team was able to make certain reasonably accurate determinations about the impact that various types of incidents could have not only on the individual sites but also on the transit agency as a whole.

The likelihood that a certain type of incident will occur, though, is much more difficult to determine. Even the most detailed and accurate consideration of crime and usage statistics, flood maps, and after-action reports to determine probable trends will usually generate a number of best-guess answers and estimates. As the project team discovered while researching the data sources, models, and formulas used by other jurisdictions, the more technical the model, the more accurate its results appeared to be. Unfortunately, the more technical the model, the more difficult its results are to fully comprehend, interpret, and act upon.

Earthquakes & Overcoats – Limits of Prediction 

This was not, nor should it have been, a surprise. Experience shows that the use of predictive models is often not worth the time and effort involved, if only because the world at large is so incredibly complex. Moreover, most people, experts included, tend to overestimate their own ability to predict what is likely to happen under, or because of, certain assumptions and circumstances.

This tendency is both reasonable and understandable. When people choose their clothing based on the weather prediction for the day, they can quickly adapt to sudden changes by donning a sweater, doffing a winter coat, or using an umbrella to compensate for an inaccurate prediction. However, earthquakes, terrorist attacks, and other major unexpected incidents are much more difficult to prepare for. Although planners have long understood that long-term projections must be reasonably flexible, they often fail to incorporate enough flexibility into their projections of potential risks and likely threats.

There is another risk factor involved in such projections that is not always acknowledged: human nature. Whether considering crisis management, stock market predictions, or the outcome of various sporting events, the theoretical “lesson learned” is usually the same: The analysts and planners involved ascribe correctly predicted outcomes to their own insights and personal expertise. But when the outcomes are incorrect, the same experts are quick to point out previously unknown (or misunderstood) factors that adversely affected the outcome. Unfortunately, there are far too many factors and variables involved in major projects to fully identify, let alone quantify, the numerous future risks, potential dangers, and broad spectrum of probable, possible, and unlikely outcomes required to produce the reasonably accurate risk scores that are needed and expected. The false expectation of at least some planners, therefore, is that if more refined – i.e., more complex – risk models are used, more accurate predictions can be developed, the correct actions can be recommended, and the overall risks can be reduced.

The Dangerous Use of Overly Complex Predictive Models 

There are, in short, numerous concerns – specifically including the inefficient use of increasingly scarce resources – when overly complex risk models are used. These concerns are magnified when planners overestimate the ability of such models to fully capture the nature and types of risk involved and recommend spending additional resources to develop increasingly complicated formulas designed to capture the essence of such risks within a particular context. Adhering more closely to some of the basic values postulated in the National Incident Management System (NIMS) – especially flexibility and adaptability – would significantly mitigate such concerns.

Another risk involved in developing relatively complex risk assessments is the creation of misplaced confidence. As such assessments become more complex, and consequently less understandable, a perplexing inversion sometimes takes place: Those who use the assessments to develop operational plans often gain more confidence, despite the fact that they may not fully understand the additional complexities involved. Deciding how best to use resources to reduce risk can be a daunting task in any circumstances, and an accurate assessment should provide at least a starting point that is supposedly grounded in hard numbers. But when a completed assessment does not provide a comprehensive understanding of all possible risks, the managers and decision makers involved may fail to fully consider threats that received lower or no priority in the assessment.

Additional problems develop, of course, when risk assessments are so general in their doctrinal approach that the results and recommendations developed are little more than speculation. Another danger that should be taken into consideration is that highly technical assessments seem to provide a solution to the problem by defining a course of action that is solely quantitative – ignoring the fact that the underlying reality of today’s dangerous world defies classification in purely mathematical terms.

The Way Forward: Numbers Are Not Reality 

In some ways, models are a necessary evil. Ideally, planners would consider a broad spectrum of scenarios, recommend approaches that would be taken when unlimited time and resources are available, and postulate flexible protocols that would provide helpful guidelines without adversely affecting emergency needs. However, the usual scarcity of time and lack of material resources make it impossible to bring all necessary participants into the more detailed type of planning structure needed to consider and implement the actions required to counter the dangers that must be confronted.

In short, when analyzing the risks facing various assets and operations, it is necessary to: (a) categorize various overarching risk levels; (b) facilitate a comprehensive and accurate discussion of the various types of risks involved; and (c) establish a common nomenclature that would facilitate both an understanding of the various risks involved and the promotion of an open discussion of the actions that must be taken.

Fortunately, the CHHS project team adopted a numerical approach for classifying the various risks facing the transit agency, but it also recognized that it was important to adhere to a single guiding principle – namely, that reality cannot be expressed in numbers alone. To be both useful and significant, the numbers used must be accompanied by enough descriptive language to build an understandable and “doable” context for the decision makers responsible for using the assessments made by the project team.
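As a rough illustration of that principle, the sketch below pairs each numeric score with a shared label and a plain-language description a decision maker can act on. The tier boundaries, labels, and wording are assumptions made for this example, not the project team’s actual nomenclature.

```python
# A minimal sketch of pairing numeric risk scores with a common nomenclature and
# descriptive language, as recommended above. The thresholds, labels, and
# narratives are illustrative assumptions, not the CHHS team's actual tiers.

RISK_TIERS = [
    (20, "High",     "Loss would halt core operations; mitigation is a near-term priority."),
    (10, "Elevated", "Loss would significantly degrade service; plan mitigation this cycle."),
    (0,  "Guarded",  "Loss is absorbable with existing capabilities; monitor and revisit."),
]


def describe(score: int) -> str:
    """Translate a raw score into a shared label plus a decision-ready sentence."""
    for threshold, label, narrative in RISK_TIERS:
        if score >= threshold:
            return f"{label} ({score}): {narrative}"
    return f"Unscored ({score})"


print(describe(40))  # High (40): Loss would halt core operations; ...
print(describe(6))   # Guarded (6): Loss is absorbable with existing capabilities; ...
```

The point of the descriptive text is not precision but shared understanding: two decision makers reading “Elevated” should reach roughly the same conclusion about what to do next, which a bare number alone cannot guarantee.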

Conscientious Avoidance to Enhance Operational Resiliency 

Because some of the predictive quantities were so amorphous, and therefore somewhat unreliable, the team put special focus on determining the value of the assets in question. From there, the team used reductive reasoning to measure various impacts that could affect each of the assets involved. While maintaining an all-hazards approach, the team focused on determining what capabilities were in fact needed to enhance the resiliency of the various assets being considered. Finally, after these foundational principles had been determined, the team considered the probability of various potential events – but conscientiously avoided the development of specific predictions. The goal from the beginning was to: (a) define understandable protocols that would focus on minimizing loss, no matter what the cause; and (b) ensure that the valuable insights and information that staff members possess were fully and effectively integrated into a robust and useful assessment.
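The ordering described above can be sketched in a few lines: value the asset, enumerate impacts, identify capability gaps, and only then assign a deliberately broad likelihood category rather than a specific prediction. The function name, record fields, and category labels below are hypothetical, introduced only to illustrate the workflow.

```python
# A minimal sketch of the assessment ordering described above. Field names and
# likelihood categories are illustrative assumptions, not the team's tooling.

from typing import List

LIKELIHOOD_CATEGORIES = ("unlikely", "possible", "probable")  # no point estimates


def assess(asset: str, value: int, impacts: List[str],
           capability_gaps: List[str], likelihood: str) -> dict:
    """Build an assessment record; likelihood is kept deliberately coarse."""
    if likelihood not in LIKELIHOOD_CATEGORIES:
        raise ValueError("Use a broad category, not a specific prediction.")
    return {
        "asset": asset,
        "value": value,                      # economic and operational worth
        "impacts": impacts,                  # what is lost if the site goes down
        "capability_gaps": capability_gaps,  # resiliency work still needed
        "likelihood": likelihood,
    }


record = assess(
    asset="Transit operations center",
    value=5,
    impacts=["dispatch disruption", "loss of radio coordination"],
    capability_gaps=["no alternate dispatch site", "single point of power failure"],
    likelihood="possible",
)
print(record["capability_gaps"])
```

Keeping likelihood as a coarse category forces the conversation back toward capabilities and minimizing loss, which is where the staff insights mentioned above carry the most weight.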

Today’s world is an extremely complicated place, and there are too many complex factors, too many variables, and too many random events and circumstances to accurately predict exactly what will happen in the future. As various risk models become increasingly complex, the information developed by those assessments also becomes more complicated – and thus less useful to those responsible for implementing and acting upon it. In addition, overconfidence in an agency’s understanding of its risk profile can lead to courses of action that may dramatically increase the organization’s exposure to risk.

A more effective approach might be to begin an assessment from the perspective of valuing the asset and determining the likely impact if it were damaged, destroyed, or otherwise made unavailable. Plans, procedures, and protocols could then be developed to enhance the resiliency of the asset. The bottom line is that, by emphasizing flexibility and adaptability, most physical assets and resources can become more secure and less exposed to threats across the full all-hazards spectrum.

_____________

For a thorough discussion of risk and the limits of forecasting, please see Nassim Nicholas Taleb, Antifragile: Things That Gain from Disorder (2012).

Michael Vesely

Michael Vesely is a certified instructor of COOP, Incident Command Systems (ICS), and other DHS homeland security courses. He led the team responsible for rewriting the Homeland Security Strategic Plan for the National Capital Region, and also worked as a planner for the Mid-Atlantic Regional Center of Excellence for Biodefense and Emerging Infectious Diseases Research. He holds a J.D. degree from the University of Maryland School of Law and currently plays a leading role on economic security issues in the University of Maryland’s Center for Health & Homeland Security.
