Dynamic Model of Risk Control

Organisations and individuals constantly collect and interpret messages from their operational environment. These messages may be accurate and complete, or they may be distorted. The distortion may be introduced by the transmitter, the communication channel, or the receiver. Such communication problems may be unavoidable, or they may be deliberate. The data in the messages is used to develop a hypothesis about an entity’s intent and capability, or about the likelihood of some random event. The hypothesis will indicate that the entity or event is, or is not, a threat. Any new data that becomes available may be categorised in a way that supports the prevailing hypothesis.

Risk analysis failure is an inability to predict the potential or actual behaviour of an entity, or the likely occurrence of a random event. It is a cognitive failure that can arise from many factors. Politico-cultural biases will determine what data is collected, how it is interpreted and what action is taken.

The risk image is compared with a datum reference: the perception of acceptable risk. The difference between the acceptable risk and the risk image may be amplified socially and by various overt and implicit actions or decisions. This amplification, or gain, will affect the operational and future environments, as will other disturbances acting on those environments.
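The comparison described above is a classic feedback loop: an error between the risk image and the datum reference is amplified by gain and fed back into the environment. As a minimal sketch only (the function name, the proportional gain, and the numeric values are illustrative assumptions, not part of the author's model), one cycle might be written as:

```python
def risk_control_step(risk_image, acceptable_risk, gain, disturbance=0.0):
    """One cycle of the risk control loop: the perceived risk (risk image)
    is compared against the datum reference (acceptable risk); the
    difference is amplified by the system's gain and fed back into the
    operational environment, alongside any external disturbance."""
    error = risk_image - acceptable_risk      # deviation from the datum reference
    corrective_action = gain * error          # social/decision amplification (gain)
    # The corrective action, plus disturbances, shifts the operational
    # environment, which reshapes the next risk image.
    return risk_image - corrective_action + disturbance

# Illustration: a risk image above the acceptable level is damped over
# repeated cycles when the gain is moderate and disturbances are absent.
image = 0.8
for _ in range(10):
    image = risk_control_step(image, acceptable_risk=0.5, gain=0.3)
```

With too high a gain, the same loop overshoots and oscillates rather than settling, which is one way to read the article's point that the degree of gain introduced into the environment matters.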

The dominant cosmology of the risk decision maker will determine the importance of organisation design, lead times, lag times, the load on the risk control system and the degree of gain introduced into the operational environment. Risk analysis failure and surprise may follow a systemic increase in disordered data (entropy) or a systemic increase in noise.
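The notion of a systemic increase in disordered data can be made concrete with Shannon entropy: the more evenly spread and unpredictable the incoming messages, the higher the entropy of the stream reaching the analyst. A small sketch (the message strings are invented for illustration):

```python
import math
from collections import Counter

def shannon_entropy(messages):
    """Shannon entropy in bits of a stream of messages: higher values
    mean more disordered, less predictable data reaching the analyst."""
    counts = Counter(messages)
    total = len(messages)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# An orderly, repetitive stream carries low entropy; a disordered stream
# of conflicting messages carries more, leaving more room for surprise.
orderly = ["all clear"] * 9 + ["alert"]
disordered = ["all clear", "alert", "unknown", "alert", "all clear",
              "unknown", "jammed", "alert", "jammed", "unknown"]
```

On these two streams the disordered one yields the higher entropy, matching the article's suggestion that surprise tends to follow a systemic rise in disorder or noise.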

Communication in complex operational environments can affect not only the psychological environment but also the operational environment. Biases are ubiquitous whenever security analysts and decision makers attempt to monitor operational environments. These biases can affect risk data indexing systems, which in turn affect images of risk and the operational environment itself; they also have an indirect effect on the future environment. Such biases may be imposed on the analyst and decision maker by complexity, social context, political culture and routinised heuristics. Issues of complexity and simplicity are also important within the context of cultural biases. A contingency-based risk control framework, relating different cultures to differing types of decision processes, is needed as the basis of a new model for studying the dynamical behaviour of risk management systems.

Building such a model requires a deductive modelling process: the fusion of theoretical concepts from several disciplines into a unified, dynamic cybernetic model that describes risk and surprise in a simple and practical way.

Risk Control

This new model of dynamic risk analysis and its application will be explained in later editions of Security Solutions Magazine.

Kevin Foster
Dr Kevin J. Foster is the managing director of Foster Risk Management Pty Ltd, an Australian company that provides independent research aimed at finding better ways to manage risk for security and public safety, and improving our understanding of emerging threats from ‘intelligent’ technologies.