Red Teams: Too Close to See


Is it possible to be too close to something? To be so familiar with the situation that the view has become ordinary? I once asked the then security manager of the Sydney Opera House, as we stepped out onto the forecourt with the sun sparkling off the wakes of the ferries, the Bridge towering off to one side and the Botanic Gardens lush on the other, if they ever forgot where they worked. The answer was “yes, sometimes”.

It is the same with our risks. We become so familiar with our work environment, with normal business and the ‘usual suspects’, that we can fail to see what has changed, or what has always been there but never been noticed.

Managers all talk about foreseeable risks, but foreseeable to whom? What if we are so comfortable that we don’t see the things around us that others do, particularly those sitting in a courtroom in the future, who have had time to prepare and the advantage of hindsight?

How can we find out if we have become blind or, at the least, blinkered? Two options present themselves.

One option is to conduct another security risk review, this time from first principles, putting previous assessments aside and starting from scratch. If done objectively, as if for a new business, it should produce, if not a different range of security risks, then at least an honest review of assets, threats, risks and outcomes. Because the potential for self-propagating sameness will remain, engaging an external consultant to conduct an enterprise-level security risk assessment, without access to the previous reports, may assist.

The second option is to have outsiders test the validity of the existing risk mitigation methodologies: the use of a “Red Team”. However, care must be taken when engaging an external agency to deliberately identify and exploit holes in the security structure.

The client needs to clearly define how far the Red Team may go. Are they allowed to conduct social engineering against staff? Can they cut fences or pop doors? It is also necessary to remind the team that they are not to break any laws, although they may report on how illegal means could be used.

Discuss the possible attack vectors and motives, and ensure they are realistic and relevant to the organisation. There has been a tendency to “blue sky” threat vectors and thereby include all sorts of unrealistic scenarios; consideration should be given to what is ‘probable’ rather than merely ‘possible’.

A ‘first principles’ review can help establish a new or revised baseline for security. However, it can still be biased by corporate prejudices and blind spots. A Red Team review, if appropriately managed and controlled without inflicting ‘known wisdom’ on the external assessors, can provide a valuable and relevant snapshot of exposures and vulnerabilities, possibly affecting assets and functions that have not previously been identified or considered.

Obviously, it is essential to ensure confidentiality is maintained and that the weaknesses are only reported to those who need to know.

One example where this was of benefit involved an organisation that dealt with sensitive corporate data and had processes in place to manage and control the flow of information, but had failed to notice that it employed foreign nationals from countries that were direct competitors. While there was no obvious evidence of corporate espionage, the possibility certainly existed, and the potential cost was great.

We need to step back and see both the beauty and terror of the wider view and remember where we work.



Don Williams
Don Williams MIExpE, IABTI, CPP, RSecP is convenor of the ASRC Explosives 2014 forum. Don is a member of the Institute of Explosives Engineers, the International Association of Bomb Technicians and Investigators, the Venue Managers Association, ASIS International and the Australian Security Research Centre’s Activities Committee. He is the author of “Bomb Incidents – the manager’s guide” and numerous other publications relating to explosive and bomb safety and security.