‘Victory smiles upon those who appreciate the changes in the character of conflict, not upon those who wait to adapt themselves after the changes occur.’ – Sun Tzu
Sun Tzu was a great ‘Red Teamer’, so good that we are still quoting him two and a half thousand years later.
Security professionals must always be looking for new ways to protect the enterprise. ‘Red Teaming’ is a powerful tool for aiding understanding and improving decision-making in business and security planning: a trained and knowledgeable team attempts, through the eyes of a competitor (‘competitor’ is used throughout this article to represent commercial, social or national adversaries), to understand, challenge and test a system, plan or perspective.
Red Teaming is not just about playing at being the bad guy. Knowing the competition and their thought processes, motives and drivers is an important element, but Red Teaming is more than that; it is a discipline that requires immersion in the operating environment of the organisation. Red Teaming is a structured system for understanding the enterprise, the threats facing it, and its assumptions, policies, procedures and processes.
Red Teaming is about asking sound questions that stimulate thinking about the assumptions that underpin the enterprise and the way it pursues its goals. These questions arise from the application of structured, analytic techniques – diagnostic, contrarian, and creative – that hedge against the natural tendency to perceive information selectively, to search too narrowly and to be unduly influenced by expert opinions.
It is the deliberate, challenging nature of Red Teaming that distinguishes it from other management tools. Red Teaming helps overcome the tendency towards success-oriented testing where the assessment is designed to prove that the current system works. Effective Red Teaming is used when the systems, policies, procedures and underlying assumptions are to be legitimately challenged.
Red Teaming may be used at all levels within the enterprise. At the strategic level, Red Teaming is used to challenge assumptions and visions when the enterprise faces either a major threat or opportunity. At the operational level, Red Teaming provides critical analysis of how the enterprise conducts its business, in order to anticipate, rather than react to, problems.
Red Team findings objectively assess whether current policies, plans and procedures would defeat adversary attacks, and what must be done to modify and improve the current position. Red Teaming is a tool that identifies vulnerabilities before they are exploited by competitors’ tactics.
Success may blind organisations to the need for change. This phenomenon, called paradigm blindness, asks: why change what has worked in the past? Meanwhile, new, agile and adaptive competitors are able to observe protective arrangements, identify vulnerabilities and shape their tactics accordingly.
Traditional threat analysis relies on historical data to assess the likelihood of known threats attacking known vulnerabilities at some time in the future. We rehearse the past using staged exercises with master event lists that direct competitors to act according to our own motivations and values. These traditional practices, based on our knowledge and understanding of our operating environment, often fail to see the world through the competitor’s eyes.
As the 9/11 terrorist attacks demonstrated, past behaviour is not the best predictor of future behaviour, for the following reasons:
- There will never be enough information to predict all possible attacks;
- Historical analysis will not identify a never-before-seen attack; and
- Necessity may lead analysts to propose only those threats that match existing protective security measures.
In the face of an uncertain and dynamic threat environment, Red Teaming has developed as a specific management discipline with its own body of knowledge, and it is being applied in increasingly sophisticated ways:
- To deepen our understanding of the options and responses available to competitors;
- To critically analyse existing assumptions, strategies, plans, concepts, programs, projects and processes; and
- To be a sounding board during business and security planning.
Just trying to think like the competitor, known as role thinking, is not Red Teaming. Indeed, research by Green and Armstrong (2011) suggests that the forecast accuracy of this approach, for both novices and experts, was close to chance. However, when participants adopted the roles of competitors and interacted with each other in a structured, simulated operational environment, forecast accuracy improved significantly.
One explanation for these research findings is that people are unable to think through complex interactions between competitors with different roles in ways that realistically represent the operational environment. Red Teaming, with its emphasis on immersion, challenge and adversarial decision-making processes, offers a better basis to assess the adaptive behaviour of current and future competitors.
Red Teaming uses two broad approaches: virtual and physical.
- Virtual Red Teaming is employed in discussion exercises or desk-level research to analyse adversary plans, looking for indicators and warnings, key decision points, and vulnerabilities in those plans. In this context, Red Teaming may be closely aligned to future planning.
- Physical Red Teaming is employed as part of a practical exercise involving staff and others. It assists in portraying realistic adversary moves and countermoves according to the adversary’s assessed motivations, capabilities and intent. Here, Red Teaming is aligned to threat and vulnerability assessments, but looks beyond what is known.
Red Teaming produces benchmarks for security planning. These benchmarks provide a baseline to compare how enterprise policies, plans and procedures respond to a broad range of competitors’ tactics. Over time, the baseline allows us to assess the impact of security specifications, either during implementation or in terms of their relative effectiveness.
Despite its many advantages, Red Teaming is not a silver bullet. Red Teaming will not guarantee the avoidance of surprise. But Red Teaming can better prepare enterprises to deal with surprise by thinking the unthinkable before it occurs. As Sun Tzu advised at the outset of this article, it is the skilful, intelligent adaptation to the actual conditions of conflict that best leads to victory.
The credibility of the Red Teaming output hinges on the quality and experience of the team, the team’s approach and toolset, the quality of the leadership and the overall context of the effort. An uninformed, overconfident or culturally biased team is unlikely to add value and may be detrimental to the project. Furthermore, the product of a successful Red Team will be of no benefit if it is rejected or not considered by executive management.
In summary, building a Red Team capability requires a deep understanding of the enterprise’s operating environment, business drivers and available resources, as well as the plausible threats and hazards to which the enterprise may be exposed. Key factors in building a Red Teaming capability include: mandate, enterprise culture, robust interaction, staff selection and timing.
Mandate. Red Teaming needs a scope, charter and reporting relationship that fit the management structure. A Red Team should be expected to raise issues that might not be welcome throughout the enterprise; it needs support, sometimes from the very top levels of the enterprise, to maintain the integrity and confidentiality of the process, and to ensure Red Teams do not become marginalised or compromised to support the status quo.
Enterprise Culture. Red Teaming thrives in an environment that not only tolerates, but values, internal criticism and challenge. Unfortunately, it is often the case that organisations in need of Red Teaming have a culture inimical to its use.
Robust Interaction. Red Teams and managers must interact to establish a win-win environment in which the ‘blue’ side learns from the process and comes out with sharper skills, more robust solutions and/or a greater appreciation of the issues the organisation must face.
Staff Selection. Staffing Red Teams presents special challenges, as many very talented individuals are unsuited, temperamentally or motivationally, to be effective Red Team members. Furthermore, the resource constraints normally imposed on Red Teams necessitate judicious selection of the right mix of talents and perspectives. Imagination is a particularly desirable attribute. As with any management tool, Red Team members require knowledge and understanding of the discipline, not just the mechanics of how to do it.
Timing. Too often, Red Teams are used only after major problems have arisen, or after too many resources have been expended, when an earlier use of Red Teams could have anticipated the problems and made a change of direction less painful. However, if used too early, or with too heavy a hand, promising ideas may be prejudged as failures.
Red Teaming is a powerful tool to aid understanding and improve decision-making for business and security planning. Red Teaming is especially important to business at a time when the operating environment is becoming increasingly complex and dynamic. Security professionals must always adapt proactive measures to anticipate, defend against and pre-empt new types of threats. Moreover, one should not expect past trends to reveal future attack patterns because competitors always seek to exploit new vulnerabilities and adopt new and innovative modes of attack. Used properly, Red Teaming will make the organisation stronger, more robust and more competitive.
Brett Peppler FAIPIO is the Managing Director of Intelligent Futures Pty Ltd, an independent, Canberra-based consultancy specialising in futures studies, intelligence support and risk management. He is a career intelligence officer with over 30 years’ experience, as well as a keen observer and commentator on security issues. Brett is the President of the Australian Institute of Professional Intelligence Officers (AIPIO), and Vice Chair (International) of the International Association for Intelligence Education (IAFIE). Brett can be contacted at firstname.lastname@example.org.
Don Williams CPP holds qualifications in Security Management and Security Risk Management, as well as Project and Resource Management, and is a Certified Protection Professional (CPP). Don has provided professional consulting services and conducted strategic security analysis for over 20 years and is the secretary of the Australasian Council of Security Professionals. He has a particular specialty in bomb safety and security. Don can be contacted at email@example.com.