The Path to Peak Performance
When GANNETT CO., INC. was preparing to move into a new headquarters facility in McLean, Virginia, a Washington, D.C., suburb, the company’s security director set out to identify innovative solutions to protect the 1.5 million-square-foot building and 25-acre campus, which when completed would house more than 1,600 of the media giant’s employees.
Gannett has a minimal in-house security staff consisting of one corporate security and safety manager (the author), who falls under the facilities function. The company relies on contract officers for its security needs, which include patrolling the building and grounds, assisting with parking, taking care of visitor access, handling special security needs for large or high-profile events, and performing a host of other services.
At the time of the move, the company hired a consulting firm, UMS Group, to reexamine its facility management outsourcing strategies. With regard to security operations, the objective was to maximize guard performance while minimizing costs and staffing levels. While still at its previous location, Gannett had decided to switch service providers at the time of the move to the new headquarters.
The plan. Performance-based management (sometimes referred to as management by objectives) was selected as the tool of choice for all of the major service contracts, not just those involving security. As a part of this process, Gannett management had to identify meaningful goals and metrics for contract staff and establish a system of rewards for service providers when those goals were met.
In Gannett’s view, a performance-based contract meant that the provider had to be willing to support these efforts financially, putting some of its profit at risk if goals were not met. As a first step in creating these metrics to identify, measure, and reward success, Gannett established a working partnership based on honesty, candor, and open communications. From the beginning, the company met repeatedly with its selected provider to brainstorm approaches and to answer two essential questions: What constitutes quality service, and how can quality be defined at a level where it allows staff to be aware of expectations and motivated to meet them?
Metrics. Following this extensive dialogue, six elements were chosen as quality assurance indicators: job knowledge, customer service, customer satisfaction, innovation, turnover, and building stewardship. When combined, these measures provide a representative mosaic of the overall quality and performance of the security staff, particularly as judged by security’s customers, defined as the building’s occupants, whether visitors or staff.
These quality-assurance indicators made it clear that security staff should focus primarily on customer care rather than on housekeeping issues such as timeliness and appearance. Additionally, they provided a framework in which the contract security manager could design any number of creative training and testing mechanisms to keep the work stimulating for staff.
Each element was measured quantitatively so that a numerical rating could be assigned, with ratings tracked separately by shift. Building stewardship, for example, was measured solely by the number of work tickets turned in quarterly and carried a weight of 10 percent of the overall rating. Job knowledge, by contrast, was given a weight of 25 percent and used drill scenarios, pen-and-paper tests, and secret shoppers to rate 100 percent of supervisors and 80 percent of line staff (the vendor decided whom it wanted to test and when each test would be given).
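The weighting scheme can be sketched as a simple weighted sum. Note that only the 10 percent (building stewardship) and 25 percent (job knowledge) weights come from the article; the other four weights below are hypothetical placeholders chosen so the total is 100.

```python
# Weights per quality assurance indicator. Only building stewardship
# (10%) and job knowledge (25%) are stated in the article; the other
# four values are hypothetical placeholders that bring the total to 100.
WEIGHTS = {
    "job_knowledge": 25,
    "customer_service": 20,       # hypothetical
    "customer_satisfaction": 20,  # hypothetical
    "innovation": 15,             # hypothetical
    "turnover": 10,               # hypothetical
    "building_stewardship": 10,
}

def overall_rating(scores):
    """Combine per-metric scores (0-100) into one weighted rating."""
    assert set(scores) == set(WEIGHTS), "every metric must be scored"
    return sum(scores[m] * WEIGHTS[m] for m in WEIGHTS) / 100

# Example: perfect marks on every metric yield an overall rating of 100.
print(overall_rating({m: 100 for m in WEIGHTS}))  # 100.0
```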
All of the metrics are evaluated quarterly to ensure that the focus remains on the overall trend rather than on daily fluctuations. One exception is customer satisfaction, which is calculated annually using an employee survey. (More on this later.)
Likewise, innovation measures the steps the vendor and staff have taken to save money, improve service, and so on. In this case, the staff person must lay out the idea—the current situation, the proposed solution, and the measurement to be used—and, if the idea is approved by Gannett, must follow the idea through to implementation to receive credit. For example, someone might suggest that by restriping certain traffic lanes, the possibility for vehicle accidents could be reduced. Thus far, the most successful innovations have related to parking and traffic control issues.
A team of four—the vendor’s director for business development and local manager, the author, and a facilities analyst at Gannett—developed the first set of metrics and the reward system. Now, the vendor is responsible for designing the tests and self-assessments, though all of the test material is reviewed jointly with Gannett management prior to use, and review meetings are held.
Reward system. The metrics become motivational tools through a reward system structured around the six factors. Under this system, a percentage of the vendor’s profit is put at risk, depending on how well the security staff meets the requirements of the six quality assurance indicators. Ranges of scores correspond to the percentage of the at-risk profit the vendor receives for that score, and each of the six metrics is worth a different share of the total profit at risk.
Let’s say that the profit at risk is $1 for each billable hour. If 1,000 hours are worked during the quarter, this is equal to $1,000 at risk. As job knowledge scores carry a weight of 25 percent of the total metric, the job knowledge ranges will apply to $250 of the profit at risk. For our example, assume that a shift has an average score of 88. In this category, average scores between 85 and 89 percent earn 90 percent of the profit at risk, or $225.
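The payout arithmetic above can be sketched as follows. The article gives only one score band explicitly (85 to 89 earns 90 percent of the metric’s share), so the other bands in this sketch are hypothetical.

```python
def job_knowledge_award(billable_hours, avg_score,
                        profit_at_risk_per_hour=1.00, weight=0.25):
    """Quarterly award for the job knowledge metric.

    Only the 85-89 band (90 percent payout) is from the article;
    the other two bands are hypothetical placeholders.
    """
    pool = billable_hours * profit_at_risk_per_hour  # total profit at risk
    metric_share = pool * weight                     # job knowledge portion
    if 85 <= avg_score <= 89:
        payout_rate = 0.90   # band stated in the article
    elif avg_score >= 90:
        payout_rate = 1.00   # hypothetical: full award
    else:
        payout_rate = 0.00   # hypothetical: below threshold, no award
    return metric_share * payout_rate

# Worked example from the text: 1,000 billable hours, average score of 88.
print(job_knowledge_award(1000, 88))  # 225.0
```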
Developing the reward mechanism required the greatest extension of trust by both parties and the greatest depth of forethought. For example, the vendor and Gannett realized that the profit at risk had to be awarded after the quarterly review meetings for it to be perceived as a reward for performance. To do otherwise might have required the vendor to return money if the desired results had not been achieved, thus creating the impression that the system was penalty based. The profit at risk also had to be sufficient to drive and reinforce the desired behaviors without being so large as to dissuade the service provider from participating altogether.
From the vendor’s perspective, there was a natural concern that Gannett might require testing so rigorous as to be too difficult to pass, while on the flip side, Gannett had to be assured that vendor testing procedures would be stringent enough to eliminate the possibility of artificially inflated scores. Because a history of frank and open dialogue had already been established, however, both Gannett and the provider were confident that they were entering a partnership with the single vision of quality assurance.
Ultimately, both entities funded the incentive pool, with Gannett agreeing to pay the vendor 110 percent of the profit at risk when success was achieved in the categories of customer satisfaction, job knowledge, and customer service. The vendor supported the other incentives by agreeing to forgo some of its profit if performance requirements were not met. And to ensure that line staff profited from the fruits of their labor, a percentage of the award was made available for distribution to the officers, at the discretion of the contract account manager.
Implementation. Gannett and its service provider agreed to wait six months after security staff began operations at the new campus in April 2001 before bringing the performance metrics component on board; by that point, Gannett’s employees had started to move to the site. This allowed security staff a window of opportunity to become familiar with their new surroundings, equipment, and post orders. It also provided the account manager with time to identify areas needing improvement, train staff to correct these deficiencies, and subsequently test officers and supervisors on the material.
Testing. The creation and preparation of the test material proved to be a difficult task, with the parties working together to formulate and script questions that would appropriately assess each officer’s knowledge and abilities while ensuring that the goal of a passing score was reasonably achievable. In the end, practice testing during the first quarter required a last-minute push by the vendor to ensure that all criteria created jointly by the vendor and Gannett had been met—for instance, that 100 percent of supervisors and 80 percent of nonsupervisory staff be tested in job knowledge and customer service.
The fiber of the relationship was tested when the results of the first set of written tests in the job knowledge category yielded an average score of only 71 percent. Wanting to correct this situation before the scores were considered permanent, the vendor asked for an opportunity to retest the officers. This raised the philosophical question of whether the purpose of the testing was to create a snapshot view of strengths and weaknesses, which would determine the period’s results, or whether it was to serve as an ongoing barometer of achievement, which would be an impetus for improvement. If the latter were the case, the vendor would be given a chance to correct the deficiencies through additional training and testing prior to the end of the quarter.
The two parties took this opportunity to review their commitments to the design of the system and identify what, if any, future advantage might be gained for both options. Without hesitation or reservations, Gannett’s service provider agreed that the spirit of the agreement was more important than any immediate need to realize profit; it elected to let the scores stand and live with the results.
The approach paid off. The next round of tests yielded some very good results. For innovation, the vendor scored 100 percent; in job knowledge, 90 percent; in customer service, 96 percent; in building stewardship, 100 percent; and in turnover, 23 percent. (Customer satisfaction was not included in the first quarter metrics because it is measured annually.)
Over time, the process of generating test material became easier, even as the design of the tests and the content itself grew more demanding. Since the implementation of the program, testing formats have migrated from multiple choice to fill-in-the-blank and then to questions requiring short narrative answers. The tests now require more effort and probe for more thorough knowledge. In some instances, important multiple-part questions, such as identifying the action steps for the complete management of a building fire alarm, were graded as “all or nothing” responses.
As rank-and-file staff became more experienced and more knowledgeable about the purpose and benefits of quarterly testing, they began to submit their own questions and ideas. Soon, such efforts spawned friendly competition among the officers, as they vied to see who could submit the most difficult question or innovative idea. There has also been some competition between shifts (scores are presented by shift), particularly because some of the bonus funds are awarded by shift. (They are divided among line officers by the shift captain.)
Results. By and large, feedback from staff has shown that once employees understand the metrics program and why it exists, they appreciate that it keeps the job interesting and offers a quarterly bonus when their performance is up to snuff. This gives staff an incentive to improve, and it has also kept turnover low.
Gannett has also discovered that as security officers have longer tenures at the company, the vendor has been able to make a substantive shift in the way training is being conducted. Typical training programs in the security guard industry consist of a series of building-block lessons designed to provide an officer with increasing competency in a specific skill set or area of knowledge. Often, however, officers learn the “how” without the “why.”
Security officers who can understand and articulate not just what they do but also why they do it will exercise better judgment when faced with unforeseen events and circumstances that force them to act without a script. Training for Gannett officers now focuses on the why as well as the how. This approach allows them to be better problem solvers.
For instance, consider Gannett’s emergency evacuation procedures. These dictate that security officers move to designated exit points from floors into stairwells; once there, officers obtain headcounts from employee safety coordinators as to how many people are left on the floor. Then they are to report those figures to the supervisor at the security console.
If a security officer knows that two people remain on the floor but waits until after the drill is complete or the incident is investigated before reporting it to the supervisor, that would technically be correct according to procedure. Officers who understand why they are collecting the information, however (it must be relayed to emergency responders, who need to know that the building is not clear), are more likely to complete the task with dispatch.
Since those first days of the program, metrics scores have increased as the officers’ knowledge base has grown. In June 2003, Gannett’s security provider achieved a high score in all categories for the first time. Especially noteworthy was the 94.9 percent approval rating on the annual customer satisfaction survey.
Both parties are pleased with the results, but both also know they cannot rest on their laurels. The success of the metrics program lies in its requirement for continual improvement, and the next step in the evolution is always just over the horizon.
Through hard work, communication, and trust, the metrics system has paid off big for the vendor and Gannett. The provider has earned more money than it would have under a standard contract, and employee confidence in the security staff has risen to an unprecedented level. Now that’s news.
Glenn W. Sandford, CPP, is the manager of Corporate Security and Safety at the Gannett Company in McLean, Virginia. He is a member of ASIS International and the Programs Director of the National Capital Chapter. For an example of the measurements used, go to www.securitymanagement.com.