Multi-Site Security Reviews: Getting Both Consistency And Accuracy

    By Rick Draper.

    The practice of undertaking security reviews is generally well understood. However, delivering reliable, consistent, accurate and useful assessments across multiple sites can be difficult, particularly if there are teams of reviewers involved.

    It is important to draw the distinction between security risk assessments and security reviews. Essentially, a security review may serve to inform a risk assessment, but by definition does not necessarily involve qualifying or quantifying the level of any specific risks. This article relates to security reviews, but the principles can easily be expanded to incorporate collection of additional data to support multi-site risk assessments.

    The Output Drives the Approach

    It is not possible to generalise about the requirements that might exist across the many different contexts that may apply. As with any single-site security review, the outputs necessary from multi-site reviews will set the requirements for the approach that needs to be undertaken. The format that the reports need to take, timing and key areas of concern are all driving factors that will influence the reviewer’s approach.

    A good starting point is to mock up sample reports that reflect the required outcomes and then deconstruct them to establish the variables that apply, and the information and observations that are needed from each site. Security review reports generally include a mix of factual data and observations, conclusions derived through analysis, and recommendations based on a range of factors and influences (for example, a policy stating that every site must have a CCTV camera providing clear images of persons entering the reception area, as opposed to a conclusion based on an understanding of the likely risks).

    The structure of reports varies so widely across applications that it is not possible in an article of this nature to provide specific guidance. However, in a multi-site review, similar observations will be expected to be reported consistently, although context may require a different set of recommendations at one site compared to another; the reader should be able to understand the rationale for any perceived discrepancies. This is where reverse engineering sample reports is so valuable in the planning stages of a project involving multi-site security reviews. The process enables foresight of the issues and the opportunity to refine the approach according to the specific project needs.

    Background Data Gathering

    While it is important for reviewers to prepare prior to any single-site security review, the importance is amplified when dealing with multiple sites. Having the right background information for each location and reviewing it prior to going on-site will greatly enhance efficiency during fieldwork. In doing so, reviewers need to be cautious about assumptions that might arise from preliminary information and should validate the background data when on-site.

    Depending on context and outcomes required from the reviews, required background data may include:

    • names of the sites/facilities (as they are referred to internally) and brief descriptions
    • list of contacts, including key stakeholders
    • location maps, site plans, floor plans and elevation drawings
    • descriptions of primary purpose (what happens where)
    • as-installed drawings of security-related systems and copies of maintenance records
    • copies of relevant plans, policies, procedures, standing orders and instructions
    • job descriptions and duty statements for relevant positions
    • occupancy and activity levels in various areas of the site, including seasonal variations
    • date and time information, such as normal hours of operation, staff arrival and departure times, and regular out-of-hours activities such as cleaning and maintenance
    • details of alarm monitoring, guard and patrol services and other contracted security functions
    • copies of security awareness materials
    • photos of relevant areas (including dates taken)
    • details of recorded security-related incidents at the sites/facilities
    • information about security incidents and/or crime occurring in similar facilities or in the vicinity of the subject sites.

    Collating this background information can be time-consuming for the facility, so it is important that the reviewer is clear about his or her requirements. It helps to become familiar with the language and terminology used by the organisation and to request information using terms that are consistent with those used internally. It can also be useful to provide a checklist in the form of a spreadsheet that incorporates columns/fields to assist the site with managing the collation process (such as source of information, task assigned to, target date, date supplied and so on).
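
    As a minimal sketch, such a checklist could even be generated with a short script and issued as a CSV file that opens in any spreadsheet application. The column names below mirror those suggested above; the items requested are illustrative only.

        # Sketch: generate a background-information checklist as a CSV file
        # that the site contact can complete in any spreadsheet application.
        # Column names follow the suggestions above; the items are examples.
        import csv

        COLUMNS = ["Item requested", "Source of information", "Task assigned to",
                   "Target date", "Date supplied", "Notes"]

        ITEMS = [
            "Site plans and floor plans",
            "As-installed drawings of security-related systems",
            "Relevant policies, procedures and standing orders",
            "Recorded security-related incidents",
        ]

        with open("background_checklist.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(COLUMNS)
            for item in ITEMS:
                # Tracking fields are left blank for the site to complete.
                writer.writerow([item, "", "", "", "", ""])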

    Recognising that background materials take many different forms, it can be useful to provide a secure web-based facility to upload at least some of the documentation and collect responses to preliminary questions. However, care needs to be taken to ensure the facility being used reflects the security needs of the organisation, which often rules out a range of cloud-based tools.

    On-site Data Collection

    Depending on the complexity of the security reviews being undertaken, the options for gathering consistent data across multiple sites can be limited. At the most basic level, a paper-based checklist or scalar survey form can be created. Such tools can then be adapted for use in electronic form to enable data collection on iPads, tablet computers or even smartphones. With a wide array of traditional applications (such as spreadsheets) now available for mobile devices, it has never been easier to collect information in the field in a way that minimises post-processing.
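
    To illustrate the idea, a scalar survey template might be defined once and reused at every site so that the questions and the response scale never drift between reviewers. The question wording, scale and file names in the following sketch are assumptions for illustration, not a prescribed checklist.

        # Sketch: a simple scalar survey template that keeps the questions and
        # the response scale identical at every site. Wording is illustrative.
        import csv
        from datetime import date

        SCALE = "1 = poor ... 5 = excellent"
        QUESTIONS = [
            "Q01: Condition and integrity of perimeter fencing",
            "Q02: Clarity of CCTV coverage of the reception entry",
            "Q03: Visibility of signage directing visitors to reception",
        ]

        def blank_form(site_name, reviewer, path):
            """Write a per-site survey form ready for completion on a tablet."""
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["Site", site_name, "Reviewer", reviewer,
                                 "Date", date.today().isoformat()])
                writer.writerow(["Question", "Score (" + SCALE + ")", "Comment"])
                for question in QUESTIONS:
                    writer.writerow([question, "", ""])

        blank_form("Example Site A", "J. Reviewer", "site_a_survey.csv")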

    There are also a number of web-based tools and applets available to assist with data gathering in the field. For example, if the security reviews are occurring across a larger area where geolocation of observations is important, tools like Plot & Audit can be extremely effective. Similarly, if the data collection is limited to a single area on each site, applications such as iAuditor can be a good choice to streamline data collection.

    Irrespective of the approach to collecting the data, it is highly recommended that any tools developed be tested multiple times on the same sites by all of the project team members to ensure that the required level of consistency is being achieved. It is important to recognise that whilst consistency does not equal accuracy, both are important in undertaking multi-site reviews.

    Testing of the data collection and site review tools should also involve practising contingency plans for events such as damaged equipment or a lack of internet connection. If the sites are all relatively accessible, there may not be an issue with rescheduling the on-site aspect of the review. However, if the sites are remote, the opportunity to revisit may be limited and very costly.

    As with all data collection, back-ups are essential. For paper-based records, scan the originals and make physical copies. If data is collected in spreadsheet applications, synchronise the files with a back-up server at the earliest possible opportunity. And, of course, consider the security of the back-up choices.
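
    As a rough sketch of that synchronisation step, a short script could copy any new or changed field files from the device's working folder to a secured back-up location. Both paths below are placeholders rather than a recommendation of any particular storage arrangement.

        # Sketch: copy new or modified field files from the working folder to a
        # back-up location (e.g. an encrypted drive or a mounted server share).
        # Both paths are placeholders.
        import shutil
        from pathlib import Path

        WORKING = Path("fieldwork")        # files collected on site
        BACKUP = Path("backup/fieldwork")  # secured back-up destination

        BACKUP.mkdir(parents=True, exist_ok=True)
        for src in WORKING.rglob("*"):
            if src.is_file():
                dest = BACKUP / src.relative_to(WORKING)
                dest.parent.mkdir(parents=True, exist_ok=True)
                # Copy when the back-up copy is missing or older than the source.
                if not dest.exists() or src.stat().st_mtime > dest.stat().st_mtime:
                    shutil.copy2(src, dest)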

    A Picture’s Worth…

    While a picture might be worth a thousand words, it is important to provide annotation so readers understand the image being presented. The context might be obvious to the photographer immediately after he or she leaves the site, but it will not be obvious to readers, nor to the photographer weeks later. The following information should always be recorded with photographs (a simple logging approach is sketched below the list):

    • date/time taken
    • photographer’s name
    • where the photo was taken (site/area/location)
    • description of key matters of note in the photo/reason for taking the photo.
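
    As a minimal sketch, assuming these details are keyed in at the time each photo is taken, the annotations could be appended to a running photo log. The file name, field names and example entry below are illustrative only.

        # Sketch: append one row per photograph to a running photo log so the
        # details listed above stay attached to each image file name.
        import csv
        from datetime import datetime
        from pathlib import Path

        LOG = Path("photo_log.csv")
        FIELDS = ["filename", "date_time_taken", "photographer",
                  "site_area_location", "description_reason"]

        def log_photo(filename, photographer, location, description, taken=""):
            """Record the annotation details for a single photograph."""
            new_log = not LOG.exists()
            with LOG.open("a", newline="") as f:
                writer = csv.writer(f)
                if new_log:
                    writer.writerow(FIELDS)
                writer.writerow([filename,
                                 taken or datetime.now().isoformat(timespec="minutes"),
                                 photographer, location, description])

        log_photo("IMG_0421.jpg", "J. Reviewer", "Example Site A - loading dock",
                  "Gate left unsecured during deliveries")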

    Photos taken for orientation purposes can be extremely valuable, but may not be suitable for inclusion in the final reports. However, they should still be annotated and stored in a format that allows them to be used in the future. Plot & Audit and iAuditor include facilities for linking photos to data collected during the review, with the capability to generate PDF outputs that include the photos.

    Generating the Output

    Regardless of the preparation and data gathering time, the success of a multi-site security review project will be judged on its written outputs, whether they are as basic as a photo survey or as detailed as a full report for each site. Writing each report from the available data can of course be done manually, perhaps with a good smattering of copy and paste. However, as the number of sites increases, so do the time and tedium associated with generating the final output.

    Notwithstanding the tools and applications available, one really simple technique for generating a large number of reports in short order from a set of consolidated data is to use the mail merge facility in a word processing application. The data collection spreadsheets can incorporate calculations that place words, sentences and even paragraphs in mail-merge fields, which will then appear in context in the report. For example, the output in relation to the attractiveness of a particular cash handling point as a target for robbery, dependent upon a specific set of variables, might appear in the relevant reports as one of the following:

    • “[some recorded observations] make this location unattractive as a target for robbery”; or
    • “[some recorded observations] make this location only moderately attractive as a robbery target”; or
    • “[some recorded observations] make this location potentially very attractive as a robbery target”.

    Having the ‘recorded observations’ collected consistently in a compatible data source, along with the associated variables, means the reports will virtually write themselves. In practice, the reports should be output to a merged file and the wording fine-tuned after proofreading. This technique can save many hours of work and deliver high-quality, consistent outputs for multi-site security reviews.
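
    The same calculated-wording idea can also be sketched outside the spreadsheet. The short example below derives the merge-field sentence from a few recorded variables and writes a merge-ready data source; the scoring rule, field names and example sites are illustrative assumptions rather than the author's actual criteria.

        # Sketch: build a merge data source in which a calculated field converts
        # recorded observations into ready-made wording about robbery
        # attractiveness. The scoring rule and field names are illustrative only.
        import csv

        def attractiveness_sentence(obs):
            """Derive the merge-field wording from a few recorded variables."""
            score = 0
            if obs["cash_on_hand_high"]:
                score += 2
            if not obs["cctv_covers_counter"]:
                score += 1
            if obs["escape_routes_nearby"]:
                score += 1
            basis = obs["observation_summary"]
            if score >= 3:
                return basis + " make this location potentially very attractive as a robbery target."
            if score == 2:
                return basis + " make this location only moderately attractive as a robbery target."
            return basis + " make this location unattractive as a target for robbery."

        sites = [
            {"site": "Example Site A", "cash_on_hand_high": True,
             "cctv_covers_counter": False, "escape_routes_nearby": True,
             "observation_summary": "High cash holdings and limited camera coverage"},
            {"site": "Example Site B", "cash_on_hand_high": False,
             "cctv_covers_counter": True, "escape_routes_nearby": False,
             "observation_summary": "Low cash holdings and clear camera coverage"},
        ]

        # Write the merge data source; the word processor's mail merge then drops
        # the Robbery_Attractiveness field into each site's report.
        with open("merge_source.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["Site", "Robbery_Attractiveness"])
            for s in sites:
                writer.writerow([s["site"], attractiveness_sentence(s)])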

     Rick Draper is the principal adviser and managing director at Amtac Professional Services Pty Ltd. Rick has over 30 years’ experience in the security industry; the last 21 years as a consultant. He is an adjunct senior lecturer in security management and crime prevention at Griffith University and a member of the ASIS Loss Prevention and Crime Prevention Council. Rick has been involved in the development of web applications and tools to assist in undertaking security and crime prevention reviews since the 1990s, including the development of Plot & Audit (https://PlotandAudit.amtac.net). He can be contacted at Rick.Draper@amtac.net