
Illustration by Taylor Callery

Management Trends

Security managers already know that culture is key, that understanding generational differences can reduce conflict, and that effective leadership can pave the way to the C-suite. The next trend in the management field, behavioral economics, can help security design programs that get buy-in from employees.

What is the underlying theory of your security program? It may be about punishing bad behavior, with employees written up by managers and then referred to counseling. Or, it may be about rewarding good behavior, such as praise and performance awards for security compliance. 

Chances are it’s some combination of the two, using both carrots and sticks. But there’s another, perhaps deeper, question that is often telling—why do people make choices to either comply, or not comply, with your security program?

All around us, there are small clues guiding those choices. It’s time security leaders started shaping those clues to protect employees, customers, property, and other assets. They can do so by applying one of the latest trends in social science—behavioral economics.

BEHAVIORAL ECONOMICS

Behavioral economics is the scientific examination of why people and organizations make the decisions they do, in an economic context. Its scientific pedigree has its origins in the 1970s, when technology was driving major improvements in brain research. At that time, new computing tools designed to assist in modeling, in tandem with Daniel Kahneman’s Nobel Prize–winning research on prospect theory (an economic theory that seeks to explain how people make decisions based on risk), provided a new research framework to explore how economic choices are made. Today, behavioral economics combines the practice of economics, neurobiology, and psychology to gain insight into why human beings act, or fail to act, in predictable ways.

At some level, most of us realize that our decision making is influenced by a variety of factors outside of our control, such as organizational norms, peer pressure, emotions, accepted stereotypes, and mental shortcuts. By closely analyzing these factors, behavioral economists can gain a sophisticated understanding of why people, and organizations, make the decisions they do—which factors take precedence over others, how different factors interact, and so on. They can also develop cues designed to steer a person or organization to a desired outcome. Such cues have been termed nudges; the people that help frame those decisions are called choice architects. 

Public awareness of behavioral economics has slowly been gaining ground since the development of “nudge theory,” an offshoot of the science, by two academics, University of Chicago economist Richard Thaler and Harvard legal scholar Cass Sunstein. In their 2008 book Nudge: Improving Decisions about Health, Wealth, and Happiness, the two scholars postulate that there are subtle and blatant clues everywhere that influence behavior. (In the wake of the book’s success, Sunstein went on to serve as administrator of the White House Office of Information and Regulatory Affairs from 2009 to 2012.) Those clues may be accidental, but they can greatly impact the decisions we make, and there are scientific reasons for why they work or fail.

The authors argue that behaviors are guided just as much by on-the-spot decisions based on these clues, and the context these clues are found in, as they are by deeply held ethical or moral codes. Under the authors’ definition, a clue can be considered a nudge if two criteria are satisfied: the individual is free to choose it or not, and there is very little or no cost in choosing to go with the nudge as opposed to other options. In this way, nudges are meant to be subtle, not overtly coercive.  

The nudge concept isn’t entirely new. We’ve been nudged in many ways since birth. It only takes a trip to the grocery store to notice that the sugary sweet cereals are stocked at exactly the eye level of a seven-year-old, while bran flakes occupy the upper shelves. Consumers’ decisions about what action to take are influenced largely by what is put into their path. At any given time, our brains are processing a mountain of information and sensory input, so easy choices, which require less effort than searching for another option, are often viewed by the mind as the correct ones. This is especially true if the clues and context surrounding those choices don’t make them seem especially important.​

SECURITY NUDGES

Imagine having the ability to use nudges and clues as a designer and enforcer of a security program. The secret is that you do. As a security manager, you have the ability to help make the correct choice for security the simplest choice for the user. In other words, you are a choice architect.

However, one concept must be understood before security managers can become effective choice architects. Thaler and Sunstein describe the concept as the difference between econs and humans. Econs are imaginary constructs developed by the writers of economics textbooks. They are people with the brilliance of Einstein, the self-control of Gandhi, and the logical prowess of a Vulcan, able to predict reactions in a variety of environments. All econs do the same thing—and almost always, the correct thing—in any given situation.

In case you hadn’t noticed, we don’t work with econs. We work with humans. Humans are generally smart and well-meaning, but they are far from perfect in on-the-spot decision making. Further, humans are barraged every day with factors that drive them to do exactly the opposite of what their infinitely wise, long-range-thinking econ-selves would do.       

Unfortunately, the idea that econs and humans are interchangeable continues to stick around in the world of security. The overwhelming majority of security policies today treat employees as econs, not as the humans they truly are. Econs don’t need assistance complying with our complex security policies; humans do. So the idea is to help nudge the humans in the right direction—toward security compliance.

Following are several examples of how nudge theory, and choice architecture, can be used in a security context.

GAMING SPEED   

An interesting example of a security nudge comes from law enforcement in the form of a speed camera that rewards speed compliance. In 2008, the city of Stockholm, Sweden, introduced a speed camera along a problematic stretch of road in a town center. Initially the camera was placed to record the speed and license plates of violators, but later it was made the focus of an experiment in nudging. The camera would record not only the speed and license tag numbers of speeders, but also the speed and license tags of those who were respecting the 30 kilometer-per-hour (kph) speed limit. 

At the end of the experiment, all drivers who were photographed driving at or below the speed limit were entered into a raffle, with the winner awarded a check for 20,000 kronor (roughly $3,000), partially paid by the fines of speeders. This spurred a dramatic change in average speed. Prior to the experiment, the average speed on that stretch of roadway was 32 kph. After the introduction of the “speed lottery,” the average speed dropped 22 percent, to 25 kph.
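
A quick back-of-the-envelope check of those figures, written as a minimal Python sketch (the only inputs are the before-and-after speeds cited above):

```python
# Check the reported drop in average speed from the speed-lottery example.
before_kph = 32   # average speed before the lottery
after_kph = 25    # average speed after the lottery

drop_pct = (before_kph - after_kph) / before_kph * 100
print(f"Average speed fell about {drop_pct:.0f} percent")  # ~22 percent, as reported
```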

Besides being a successful nudge, the speed example is also an excellent example of gamification. It encouraged people to comply with speed limits and improve public safety, while also giving them entry into a larger game to win a tangible, but not budget-busting, prize. 

OUT OF POCKET

Security nudges have also been employed to increase security efficiency and compliance at airports. One of the first took place at Tribhuvan International Airport in Nepal, where officials noticed a marked increase in graft among airport customs inspectors.

Nepal was hit hard by the economic slowdown of 2008, and many Nepalese sought employment outside of the country to support family members. When these expatriates returned to Nepal carrying foreign currency, crooked customs inspectors preyed upon them, insisting on bribes in exchange for quick facilitation through customs; refusing to pay could mean a delayed entry.

Nepalese anticorruption authorities fought back by redesigning the uniforms of airport customs workers to remove all the pockets. Collecting payola becomes much more complicated without a convenient pocket to quickly stash the loot. The lack of pockets also served as a reminder for the customs workers to adjust their behavior and avoid illegal activity. Every time employees reached for their pockets, they were reminded about corruption and management’s refusal to condone it. Although no formal study has been performed to assess the effectiveness of bribe-resistant trousers, news reports have found that graft and bribe-taking have been reduced at Tribhuvan airport.

Creative nudges also help the flow of lines at U.S. airport security checkpoints. By and large, passengers choose the shortest available line to proceed through security screening. However, each passenger situation is different, so the shortest line may not necessarily turn out to be the fastest—six frequent business travelers familiar with the airport security routine might proceed much faster than a vacationing family of four that flies infrequently.

So, airports near ski resorts have taken to designing self-selection lines marked according to a ski slope theme: Green Circles for families and those needing special assistance, Blue Squares for frequent travelers somewhat familiar with TSA procedures, and Black Diamonds for the expert travelers.  

Under this system, there is no enforcement of lanes; passengers are free to choose whichever line they wish. However, by encouraging people to make proper line choices through color coding, security personnel are able to channel passengers toward the type of security screening they would be best served by, and increase the overall efficiency and security of the entire system. In nudge theory terms, this is a good example of placing a “designed decision” in front of a security customer.​

ENGAGE TO NUDGE

The National Retail Federation estimated 2014 retail losses due to inventory shrinkage at $44 billion. Facing such challenges, the field of loss prevention is one of the most dynamic in security today, and is also a discipline full of nudges.  

Most retail stores have some form of CCTV monitoring for the prevention and investigation of theft, and this technology can be used to nudge customer behavior. The most visible nudge is conveyed through the placement of a live CCTV video feed at the store entrance.  This provides an immediate environmental reminder to would-be thieves that they are being watched and the store is on the lookout for shoplifters. 

Another frequent nudge is conveyed through employee engagement with customers. According to the ASIS Retail Loss Prevention Council, a staff that greets customers and maintains active engagement with them can significantly reduce retail theft. 

There are actually two nudges here. The first is the interaction between the employee and shopper; the customer is reminded that the employee is committed to the job, and consequently of the risk of getting caught if the shopper decides to shoplift. The second is the employer nudging the employee to habitually engage customers. This is usually accomplished when the employer sets default rules; it becomes the expected norm of all employees through training, feedback, and evaluations. The added benefit is that it allows security and customer service to be on the same side of an issue, and that’s an increasingly rare opportunity.  

Other possible nudge cues to deter shoplifting are explored in the paper Nudge, Don’t Judge: Using Nudge Theory to Deter Shoplifters, by Dhruv Sharma and Myles Scott of Lancaster University. They include signs that offer to donate profits not lost to shoplifting to charity; attention-grabbing events such as music or videos when customers interact with certain products; and applying the general premise of crime prevention through environmental design (CPTED) to store layouts to increase visibility and surveillance coverage.

NUDGE TRAINING

Security nudges have also been incorporated into awareness training. In 2014, the XL Group, a global insurance provider, sponsored an employee challenge. Each time an employee viewed one of the company’s security videos, XL would donate a dollar to charity. The videos were short (usually about a minute long), and focused on helping the employee secure not only vital company information, but personal information as well. The donations also appealed to an employee’s sense of social responsibility by involving a charity. The campaign managed to amass over 10,000 views of security videos, and a hefty charity donation.

Some U.S. government agencies are also using nudge theory practices in security training. In an effort to train employees on the proper ways to respond to email phishing attacks, one agency offered the following incentive: everyone who correctly followed procedure in a phishing attack exercise was made eligible for a small “Phishing Derby” prize. The cost of the prize was minimal (less than $50), but offering it greatly increased participation compared with previous exercises.

Another agency took a different approach. When the agency sent out reminder notices to employees to complete mandatory security training, it made sure that the notices included the percentage of other employees who had already completed the training. This approach harnessed peer pressure to conform in a nudge aimed at achieving the desired result: a higher completion rate, achieved in less time than in previous years.
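
A minimal sketch of how such a social-proof reminder might be generated; the function name, message wording, and figures below are hypothetical and not drawn from the agency’s actual system:

```python
# Hypothetical sketch: build a training reminder that uses peer completion
# rates as a social-proof nudge.

def build_reminder(employee_name: str, completed: int, total: int) -> str:
    """Return a reminder message that leans on peer pressure."""
    pct = completed / total * 100
    return (
        f"Hi {employee_name}, {pct:.0f} percent of your colleagues have already "
        "completed this year's mandatory security training. "
        "Please complete yours by Friday."
    )

# Example with placeholder numbers: 412 of 500 employees have finished.
print(build_reminder("Alex", completed=412, total=500))
```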

DEVELOPING SECURITY NUDGES

Nudges can be used anywhere a user is offered a choice to do the correct thing versus the incorrect thing. The keys are understanding your security policy, understanding your users, and sustaining a willingness to experiment.   

The best place to start is with your own security metrics, especially those that are the most problematic. What areas, processes, or programs have been the most troublesome in terms of compliance? A brainstorming session with a good cross section of security personnel (who in this context are serving as choice architects) often results in useful data and ideas for developing nudges. This cross section should include not only program leaders but program users, who are often the source of the most valuable insights—they provide the “ground truth” on how effective existing security measures really are, and on the parts of the program that are most at risk of noncompliance.

It’s also important to recognize what kind of decision we’re trying to influence, in the terms sketched out by Thaler and Sunstein:

  • A complex decision: A decision with many variables
  • An overwhelming decision: A decision with many options
  • An infrequent decision: A decision that comes up very rarely
  • A low feedback decision: No obvious feedback from the decision
  • A delayed consequences decision: Where the feedback comes much later

Then, according to Thaler and Sunstein, we need to figure out what flavor of nudge to use:

  • Default rules: Change the rule for everybody to a compliant default
  • Environmental reminders: Posters, checklists
  • Commitment reminders: Constant reminders to steer behavior, like wearing a fitness band as a  reminder to take the stairs
  • Designed decisions: Placing the correct decision in front of the customer at the instant the decision needs to be made
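
One way to put the two lists above to work is to keep a simple catalog that tags each troublesome behavior with the decision type involved and the nudge flavors worth piloting first. The following Python sketch is purely illustrative; the behaviors and pairings are assumptions, not prescriptions from Thaler and Sunstein:

```python
# Illustrative catalog pairing problem behaviors with decision types and
# candidate nudge flavors to pilot.
NUDGE_CANDIDATES = {
    "unattended printouts": {
        "decision_type": "delayed consequences",
        "nudge_flavors": ["default rules", "commitment reminders"],
    },
    "phishing link clicks": {
        "decision_type": "low feedback",
        "nudge_flavors": ["designed decisions", "environmental reminders"],
    },
    "unreported lost access badges": {
        "decision_type": "infrequent",
        "nudge_flavors": ["environmental reminders", "designed decisions"],
    },
}

for behavior, plan in NUDGE_CANDIDATES.items():
    flavors = ", ".join(plan["nudge_flavors"])
    print(f"{behavior}: {plan['decision_type']} decision -> try {flavors}")
```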

When implementing nudges, it’s always important to keep two things in mind: ethics and metrics. Ethical nudges don’t compromise the autonomy or the integrity of employees and customers. They simply nudge them into making the correct decision regarding policies they have already agreed to.

Metrics are necessary both to ensure that the nudges are effective and to justify the resources needed to implement them. Few things in business are free; even things that seem small normally have some kind of cost attached to them. The best way to address management on these issues is the cost-benefit approach: have a story to tell, explain the financial and reputational costs of noncompliance, and come prepared with a full cost accounting of the nudge and a plan for implementation. Make approving your plan the “easy” thing to do. If you haven’t caught on by now, you’re nudging your management.
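
A minimal sketch of the cost-benefit arithmetic such a pitch might include; every figure below is a placeholder assumption, not data from the article:

```python
# Hypothetical cost-benefit sketch for pitching a nudge to management.
def annual_net_benefit(incidents_per_year: float,
                       cost_per_incident: float,
                       expected_reduction: float,
                       nudge_cost_per_year: float) -> float:
    """Expected yearly savings from the nudge minus its yearly cost."""
    avoided_losses = incidents_per_year * cost_per_incident * expected_reduction
    return avoided_losses - nudge_cost_per_year

# Placeholder figures: 40 incidents a year at $2,500 each, a 30 percent
# expected reduction, and $8,000 a year to run the nudge.
print(f"Estimated net benefit: ${annual_net_benefit(40, 2500, 0.30, 8000):,.0f}")
```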

SAMPLE SECURITY NUDGE

Here’s an example case of how security nudges can be developed. Nudgella, the security manager at Company X, has noticed an increase in security incidents involving sensitive company information left unattended in the copy room. So Nudgella sets a meeting with the head of the guard force, along with representatives of human resources and IT, to determine the causes and seek solutions. 

In the meeting, it is determined that the issue with the copy room is that employees are printing sensitive documents to the community printer and then failing to retrieve them. Thaler and Sunstein would call this a “delayed consequences decision.” The person printing the document suffers no immediate consequences for failing to retrieve it; the fallout, if any, comes much later.

Those attending the meeting brainstorm solutions, and three rise to the top for possible implementation: an environmental reminder in the form of signs placed around the office reminding employees of their responsibility to safeguard sensitive information; a default rule that would switch all employees to a “secure print” mode, requiring them to input a code at the printer to retrieve their documents; and a commitment reminder in the form of a pop-up window reminding employees to retrieve their printouts every time the print button is clicked.
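
As a rough illustration of the default-rule option, here is a minimal Python sketch of a print-job handler that holds every job for secure release unless the sender deliberately opts out; the class and function names are hypothetical and not tied to any particular print-management product:

```python
# Hypothetical "secure print by default" rule: every job is held at the
# printer until the sender enters a release code, unless the sender
# explicitly opts out for non-sensitive material.
import secrets
from dataclasses import dataclass, field

@dataclass
class PrintJob:
    sender: str
    document: str
    secure_release: bool = True              # the default rule
    release_code: str = field(default="", init=False)

def submit(job: PrintJob) -> PrintJob:
    if job.secure_release:
        job.release_code = f"{secrets.randbelow(10_000):04d}"  # 4-digit release code
        print(f"{job.sender}: enter code {job.release_code} at the printer "
              f"to release '{job.document}'.")
    else:
        print(f"{job.sender}: '{job.document}' printed immediately (opted out).")
    return job

submit(PrintJob("nudgella", "Q3 incident report"))
submit(PrintJob("nudgella", "cafeteria menu", secure_release=False))
```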

Now, the managers need to convince the C-suite. They arrange a meeting, and the security manager brings in a well-developed plan that can be implemented at minimal cost. Since the IT folks were brought in at the beginning, the technical solutions of secure printing and pop-up banners are well thought out. Since HR was part of the process, any concerns about ethics and privacy were addressed early on. The guard force has already agreed to make periodic rounds of the copy room to assess compliance and provide metrics reporting.  

The CEO and CIO couldn’t be happier with the effort. Nudge accomplished.  

EMBRACE CHOICE, EMBRACE CHANGE

Here’s the big picture question for security managers: Is it easier for an employee to comply with specific security policies and procedures, or not comply? If the answer is not comply, some nudges may be in order.

Given its importance, security compliance can be seen as a high-value, all-encompassing moral imperative. But managers should also view it as a series of choices made every minute of every day by every individual. Thus, it is the job of the security professional to enable every individual to make the correct choice by making those choices the easiest and least painful ones. Security managers are not just compliance enforcers. They should also embrace their role as choice architects, which will lead them to become change architects as well. 

--

Sean Benson, CPP, is a program security specialist at ISS Action, Inc. He is currently leading technology protection efforts on NASA’s Space Launch System. He is the chairman of the ASIS North Alabama Chapter.
