Progress Report on Canine Training
Print Issue: September 2013
Detection dogs have been used since World War II and can be trained to detect anything from bed bugs to blood to drugs and explosives. Although scientists have studied dogs’ super-sensitive snouts for ages, there is still no clear answer as to how they can sniff out trace amounts of a certain chemical, even when it is masked by other smells. One thing is clear, though: they have to be properly trained to be useful in security operations. But what does it mean to be properly trained?
Currently, there are a variety of government programs—including those run by the U.S. Transportation Security Administration (TSA), the U.S. Department of Defense, and U.S. Customs and Border Protection—as well as programs run by private companies specializing in certain fields of detection. In many programs, especially law enforcement and federal programs, handler/dog teams must complete a certain number of training hours and pass tests to be certified to enter the workforce.
However, there is no standard that the tests or certifications must meet, which can make the requirements behind certifications seem arbitrary, says Ken Furton, dean of the arts and sciences program and director emeritus of the International Forensic Research Institute at Florida International University.
For example, one certification might require 100 hours of training with a 75-percent pass rate on a double-blind detection test, where neither the handler nor the dog knows where the substances are placed. Another might require 240 hours of training and a 90-percent pass rate on a test where the handler does know where the targets are.
The Scientific Working Group on Dog and Orthogonal Detector Guidelines (SWGDOG), founded in 2005, has been trying to address the problem by putting forth a set of best practices based on scientific research and feedback from experts in the field, says Furton, who also serves as chairman of SWGDOG.
“We’ve developed best practice guidelines that are based on the current state of the scientific literature and what the experts in the field believe are best practices,” Furton says. “We talk to scientists, academics, attorneys, administrators, federal agencies, canine handlers, veterinary professors, and so forth. We get everyone together and develop recommendations on what kind of protocols should be followed and try to get certification organizations to adopt these best practices.”
The organization issues guidelines for training, certification, and documentation of canines and handlers in everything from detecting human remains and insects to tracking down people lost in the wilderness or buried in avalanches. Each field has a specific set of best practices that emphasizes rigorous training and certification.
For example, the best practices issued for explosives-detection canines state that dogs must detect certain explosive substances during a double-blind test with distractors present. They also recommend regular, varied maintenance training.
Hank Nolin, the founder and CEO of Sun State Specialty K9s, a Florida-based canine detection program that deploys explosives-detection dogs at ports, supports that type of standard. He says that anyone searching for a canine detection training program should ask trainers how the dogs were certified. The dogs should be able to pass a double-blind test conducted by a third party—not the person or company who trained the dog originally. Neither the tester nor handler should know where the substances were placed, he says.
SWGDOG is having an impact. Furton says he’s seen a trend in the industry towards more standardized and rigorous protocols in training. As many as 50 local, federal, and international organizations and agencies he’s visited with have incorporated SWGDOG’s best practices into their programs, he says. Organizations such as the North American Police Work Dog Association, a member of SWGDOG, often offer certification programs based on SWGDOG’s guidelines. Handler/dog teams can sign up for the program to become certified. But there are still many programs that are not doing that. SWGDOG’s goal is to have the best practices applied to every program across the board, including all federal agencies, which currently are not consistent in their training.
The adoption of standards is one issue. Audits or reviews to determine whether standards are followed are another. That applies even to internally set standards, since there are no industrywide standards at this stage. For example, earlier this year, the Government Accountability Office (GAO) issued a report on the TSA's canine detection program. It discussed two programs: teams that screen airport cargo and teams that screen passengers.
The two federally funded programs are used in airports around the nation to detect explosives. The canine detection programs were funded with $101 million in 2012, up from $52 million in 2010. The increase was attributed to a rise in the stipend the TSA provides to law enforcement agencies that participate in the programs.
The report found that although the agency tracked the number of training minutes canine teams conducted on a monthly basis, as well as the types of explosives and search areas used when training—in accordance with SWGDOG best practices—it did not analyze the data. An analysis of the data by the GAO revealed that a number of the TSA's canine teams did not meet the TSA's own training requirements for at least six months in 2012. What's more, "this information was previously unknown to TSA until we brought it to its attention," the GAO wrote.
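The check the GAO ran amounts to a simple aggregation of records the TSA was already keeping. A minimal sketch of that kind of analysis might look like the following, where the team names, months, and the 480-minute monthly minimum are invented for illustration and are not TSA figures:

```python
from collections import defaultdict

# Hypothetical training log: one entry per session,
# recorded as (team_id, month, minutes trained).
sessions = [
    ("K9-01", "2012-03", 300),
    ("K9-01", "2012-03", 250),
    ("K9-02", "2012-03", 200),
    ("K9-02", "2012-04", 480),
    ("K9-01", "2012-04", 100),
]

# Illustrative monthly training minimum (an assumption, not a TSA requirement).
REQUIRED_MINUTES_PER_MONTH = 480

def flag_shortfalls(sessions, required=REQUIRED_MINUTES_PER_MONTH):
    """Sum minutes per (team, month) and return the pairs below the minimum."""
    totals = defaultdict(int)
    for team, month, minutes in sessions:
        totals[(team, month)] += minutes
    return sorted(key for key, total in totals.items() if total < required)

print(flag_shortfalls(sessions))
# [('K9-01', '2012-04'), ('K9-02', '2012-03')]
```

The point of the GAO's criticism was not that such analysis is hard, but that the collected records were never examined at all.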
The report also found that the TSA deployed the new passenger screening program at low-risk airports without studying its effectiveness. “TSA began deploying [passenger screening canine] teams in April 2011 prior to determining the teams’ operational effectiveness and before identifying where within the airport environment these teams would be most effectively utilized,” the report noted.
GAO spokesman Stephen Lord notes that the TSA does a good job of keeping records but doesn't know how to use the data it collects. "When you get so busy deploying the dogs, managing it on a day-to-day basis, sometimes you forget," Lord says. "And that was our point. Here you have these nuggets of information, why don't you use them?"
The GAO recommended that the TSA regularly analyze program data and determine the effectiveness of its screening program before training or deploying any more passenger screening canines.
Furton has some sympathy for the TSA's position. He points out that some of the problems the agency's canine programs face are similar to those confronting the canine detection industry as a whole: plenty of data is collected on how the dogs should be trained and used, but not enough people are analyzing it. TSA, at least, is collecting the information in a standard way, he notes.
“Programs like the TSA program—the fact that they have moved toward standardization of how they do things by collecting information, that’s a positive,” Furton says. “I think it’s a sign that the industry in general is moving in the right direction, moving towards the continuous improvement process. It needs to take that last step towards closing the loop, using the information and making sure that if there are certification rates that are not what they should be, that they use the data to find out how to improve that.”
In a response to the GAO report, the TSA sent a statement to Security Management saying that the agency was taking corrective action. Beginning in March 2013, the TSA said, it planned to improve functionality and reporting capabilities, addressing a GAO recommendation.
The statement notes, “The National Canine Program is executing a new training and assessment initiative designed to identify optimal passenger screening canine working zones for the 120 teams that were authorized for deployment by the end of Fiscal Year 2013. By February 2013, TSA will [have completed] passenger screening canine effectiveness assessments at Miami, Oklahoma City, and Washington Dulles airports...”