Will Your Vote Count?
THE SECURITY OF electronic voting machines has concerned computer scientists and voting-accuracy advocates for years. They've warned that these machines (especially the ones with no paper trails) have the potential to be tapped into and tampered with. Such an attack could be hard to trace or even detect, according to David L. Dill, Stanford University computer science professor and founder of the Verified Voting Foundation.
A 2006 study by the Brennan Center for Justice at New York University School of Law looked at three types of voting machines and found "significant security and reliability vulnerabilities" in all three systems it examined. These were precinct count optical scan (PCOS) machines; direct recording electronic machines (DREs, which include touch screens and others); and DREs with a voter verified paper trail, which is normally a paper record of the vote that sits behind a glass or plastic screen and that the voter is supposed to check for accuracy before the vote is recorded.
The study stated that software attack programs might not be difficult to employ against these machines. To guard against a hack attack or to ensure that one would be detected, the study recommended that there be audits comparing machine counts with voter verified paper records, as well as random testing of paperless machines on election days. In addition, the task force called for decentralizing programming and administration of machines so that hack attacks against multiple jurisdictions and statewide elections would be more difficult.
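The audit the study recommends amounts to sampling precincts at random and comparing each machine tally against a hand count of its paper records. A minimal sketch of that comparison, using entirely hypothetical precinct names and counts (real audits would draw these figures from election officials), might look like this:

```python
import random

# Hypothetical tallies: machine counts and hand counts of the paper
# records, keyed by precinct. In a real audit these numbers would come
# from election officials, not hard-coded data.
machine_counts = {"P1": 412, "P2": 388, "P3": 501, "P4": 275}
paper_counts   = {"P1": 412, "P2": 389, "P3": 501, "P4": 275}

def audit(machine, paper, sample_size, seed=None):
    """Randomly sample precincts and flag any machine/paper mismatch."""
    rng = random.Random(seed)
    sampled = rng.sample(sorted(machine), sample_size)
    return [p for p in sampled if machine[p] != paper[p]]

# Any precinct returned here would trigger a full hand recount.
discrepancies = audit(machine_counts, paper_counts, sample_size=4, seed=1)
print(discrepancies)
```

The random sample is the point: because a would-be attacker cannot know in advance which precincts will be checked, even a small sample gives a meaningful chance of catching widespread tampering.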
Several states have run well-publicized studies on their own voting machines. For example, California released a “red team” evaluation in 2007. In that test, state-sanctioned hackers attacked the state’s voting systems and found that machines from all three of the state’s top vendors (Diebold, Sequoia, and Hart InterCivic) were vulnerable to compromise.
Peter Lichtenheld, director of election operations for Austin, Texas-based Hart InterCivic, says the company has several security controls to protect against an incursion in their no-paper-trail DRE voting machines. First, the votes are saved in three places: in a secure flash memory card that is in the voting unit; on the actual voting unit; and in the centralized controller, which can connect to up to 12 machines. Lichtenheld stresses the extremely low likelihood of being able to attack all three sites (although auditing is ultimately needed to make sure that the votes match up).
Lichtenheld cites the physical security of the machines as another barrier to compromise. For example, the presence of port protectors on the memory card area and tamper-evident seals on machines are additional deterrents.
The California red team got through to only one of the vote recording sites, says Sujeet Shenoi, a University of Tulsa computer science professor who was part of the red team that looked at Hart InterCivic’s eSlate and other vendors’ machines.
While Shenoi says that he thinks it’s possible to break into the other sites as well, Lichtenheld maintains that it would be “nearly impossible” during an election situation to tamper with all three vote recording sites at once. Any tampering with the votes at only one of the sites would easily be caught, because the number of ballots would not match the number of voters checked in.
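The consistency check Lichtenheld describes can be reduced to two comparisons: the three redundant vote records must agree with one another, and the agreed count must match the number of voters checked in. A hypothetical sketch of that reconciliation logic (the counts and the function are illustrative, not Hart InterCivic's actual procedure):

```python
# Hypothetical ballot counts reported by the three storage sites
# Lichtenheld describes, plus the poll book's check-in count.
flash_card_count  = 523
voting_unit_count = 523
controller_count  = 523
voters_checked_in = 523

def reconcile(counts, checked_in):
    """All redundant copies must agree with each other and with check-ins."""
    if len(set(counts)) != 1:
        return "copies disagree: possible tampering or fault"
    if counts[0] != checked_in:
        return "ballot count != check-ins: investigate"
    return "consistent"

print(reconcile([flash_card_count, voting_unit_count, controller_count],
                voters_checked_in))  # prints "consistent"
```

Under this scheme, altering only one of the three copies produces a disagreement among the copies, while altering all three consistently would still have to match the independent check-in count.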
Lichtenheld adds that although Hart InterCivic has not yet permanently changed the software and hardware on its machines (that would entail releasing a new product), the company has been working with the California Secretary of State’s office “to make certain that use conditions (people and processes) that are in place make the systems as secure as possible.”
Along those lines, it should be noted that the California study found that many security risks could be mitigated by adequate physical security at the polling site, and Lichtenheld agrees. “Poll workers are a pretty great security police force.”
Another security tool that Hart InterCivic and several other companies employ is cryptography. Hart InterCivic’s eSlate machines have cryptographic keys that must match up against the polling databases.
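Hart InterCivic's actual key scheme is not detailed here, but the general idea of keys that "must match up" against a database can be illustrated with a message authentication code: results records are signed with a per-election secret, and the central database rejects any record whose tag fails to verify. A hypothetical sketch, assuming an invented record format and key:

```python
import hashlib
import hmac

# Hypothetical illustration only: a shared per-election secret is used to
# authenticate each machine's results record, so the central tally can
# reject records whose tag does not verify.
ELECTION_KEY = b"per-election secret distributed to certified machines"

def sign_results(results: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over a results record."""
    return hmac.new(ELECTION_KEY, results, hashlib.sha256).digest()

def verify_results(results: bytes, tag: bytes) -> bool:
    """Constant-time check that the record matches its tag."""
    return hmac.compare_digest(sign_results(results), tag)

record = b"precinct=P7;candidateA=210;candidateB=198"
tag = sign_results(record)
print(verify_results(record, tag))         # True: untouched record accepted
print(verify_results(record + b"0", tag))  # False: altered record rejected
```

Note that this only authenticates the record in transit; it cannot, by itself, detect a machine that recorded the wrong votes in the first place, which is why auditing remains necessary.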
Dill says the additional complexity for poll workers and administrators is an argument against cryptography. “With cryptographic voting, it’s not clear that average people have the capacity or maybe, patience, to understand the system well enough,” says Dill, adding, “if something becomes sufficiently complicated or mathematically difficult that it’s like black magic, then, I don’t think it can be transparent.”
Supporting that view is a 2007 evaluation of Diebold software released by the Florida Secretary of State and conducted at Florida State University. It found that the company itself was not applying cryptography effectively before the software got to the poll workers and the election administrators.
Although cryptography is a technical attempt to ensure that no one can hack into machines to alter vote counts, Lichtenheld says that many states are going back to emphasizing paper trails.
A paper trail is one way to verify voting accuracy and combat potential software attacks. However, paper trails have problems of their own, according to a recent study for the new book Voting Technology by a team of experts from the Universities of Maryland, Rochester, and Michigan. The team found many problems with the voter verified paper trail, ranging from the low quality of the printing technology to the fact that many voters did not even notice the paper while voting.
Instead of voter verified paper trails, Dill supports the use of PCOS machines, where voters fill out ballots themselves with pencil and scan them into the machine. Although there are software vulnerabilities and glitches possible with PCOS, the ballot is filled out by the voter, rather than a computer.
At press time, the Election Assistance Commission (EAC), an independent agency created by the Help America Vote Act of 2002, was accepting public comment on updated voluntary voting system standards. The EAC establishes generic standards for voting machines, and any manufacturer whose machines meet those standards can become EAC certified. Ultimately, however, it is up to the states to select machines, and states may or may not require manufacturers to have their machines EAC certified, according to EAC spokesman Bryan Whitener.
Meanwhile, it appears that some states are beginning to agree with Dill’s support of PCOS machines. For example, Maryland recently chose to replace its touch screens with optical scanners. Additionally, Florida plans to have optical scanners in place before November’s elections, and California and Ohio have placed restrictions on the use of touch screens following reviews that questioned their security and reliability.
Dill reiterates the Brennan Center’s finding that use of these or any voter verification systems must be accompanied by random audits throughout the process to ensure that the machine count matches up with the ballots and that no other traditional concerns, such as ballot stuffing and stolen ballots, come into play.
“Electronic voting is a technology that stops transparency in the election process,” says Dill, adding, “Paper ballots are not a magical cure, but they at least are compatible with a set of processes and checks and balances that can give us trustworthy election results.”