Voting’s Hash Problem: When the System for Verifying the Integrity of Voting Software Lacks Integrity Itself

Problems uncovered in the hash-verification process for voting machines made by Election Systems and Software highlight issues with the trustworthiness of voting software and voting machine vendors


In September 2020, just weeks before voters went to the polls in one of the nation’s most critical and contentious presidential elections, state officials in Texas learned of a disturbing problem with election software used widely across their state and the country: a component of software provided by Election Systems and Software (ES&S) — the top voting machine maker in the country — didn’t work the way it was supposed to.

The component wasn’t involved in tabulating votes; instead it was a software tool provided by ES&S to help officials verify that the voting software installed on election equipment was the version of ES&S software certified by a federal lab, and that it hadn’t been altered by the vendor or anyone else since certification.

But Texas officials learned that the tool — known as a hash-verification tool — would indicate that ES&S software matched the certified version of code even when no match had been performed. This meant election officials had been relying on an integrity check that had questionable integrity.

When voters or security experts express concern that elections can be hacked, officials often cite the hash-verification process as one reason to trust election results. Hash verification involves running software through an algorithm to produce a cryptographic value, or hash, of the code. The hash — a string of letters and numbers — serves as a fingerprint of the program. If the software is altered and then run through the same hashing algorithm again, the hash that’s created won’t match the original hash.
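
The concept is simple enough to sketch in a few lines of Python. The example below uses SHA-256 purely for illustration, and the file name is hypothetical; it isn’t ES&S’s tool, just the underlying idea:

```python
import hashlib

def fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest (the "fingerprint") of a file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large software images don't have to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Altering the file in any way, even flipping a single bit, produces a
# completely different digest, which is why a matching hash is treated as
# evidence that the installed software is the certified version.
print(fingerprint("voting_software.bin"))  # hypothetical file name
```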

But Brian Mechler, an engineering scientist at Applied Research Laboratories at the University of Texas at Austin, discovered, while testing ES&S software for the Texas secretary of state’s office last year, that the company’s hash-verification tool didn’t always work correctly.


This wasn’t the only problem: ES&S’s own employees were sometimes conducting hash checks instead of election officials — a conflict of interest that undermined a process meant to provide officials with an independent integrity check of software installed on their machines.

Texas officials were furious with the findings, and in emails exchanged among themselves and with ES&S — obtained by election integrity activist Jennifer Cohn through a public records request — they expressed alarm that the vendor had been operating without oversight. Letting the vendor conduct verification checks of its own software was the equivalent of the fox guarding the hen house, one voting systems examiner for the state said.

“If the hash validation process is performed by the same vendor technician who performed the [software] installation,” the secretary of state’s office wrote in an email to ES&S, “then that validation process loses one of its major purposes, which is to keep the vendor honest and ensure that the vendor has complied with the certification requirements imposed by the state.”

The conflict of interest was even more concerning because a number of hash checks conducted on ES&S software failed verification, yet ES&S withheld this information from the secretary of state’s office, deeming the failures insignificant rather than true failures.

Jacob Stauffer, vice president of operations for Coherent Cyber, who conducted voting-machine security assessments for California’s secretary of state for a decade, said vendors often do hash checks for election officials because the process can be daunting to manage.

“[N]ot to be saying anything bad about election officials, but ES&S hash verification process may be over the head of some of these election officials, and it’s not intuitive whatsoever,” he told Zero Day.

An ES&S spokeswoman told Zero Day that in the wake of the Texas findings, it’s “streamlining” the hash validation process to make it more intuitive for local jurisdictions to perform. “The new process will reduce the number of steps which must be performed to help ensure the process is completed accurately,” Katina Granger wrote in an email.

Susan Greenhalgh, senior advisor on election security to Free Speech For People, an election integrity organization, says the issues in Texas illustrate how the legitimacy of election integrity checks can dissolve quickly when inspected closely.

“Though it has its limitations, hash verification is an important tool for checking voting system software versions. Unfortunately, because of the fatal flaws in [ES&S’s] hash verification script… ES&S hash verification has little to no meaning,” she told Zero Day.

Hashes to Hashes

The need for voting software verification and oversight of voting machine vendors first became urgent back in 2003 when Diebold Election Systems was caught installing uncertified software on machines in four California counties and was accused of lying to state and county officials about it.

Since 2005, federal voting system guidelines have required voting machine vendors to provide election officials with a reliable method for verifying that the correct versions of software are installed on their voting machines, that the software hasn’t been altered since installation, and that no unauthorized software is present on the systems.


To do this verification, the federal Election Assistance Commission and the voting system testing labs, which examine and certify voting software and hardware, create a “trusted” hash of each election software program, once it’s certified for use in U.S. elections.

Then when election officials receive new election software or updates to existing software to install on their machines, they create a hash of that software and use the vendor’s verification tool to compare that hash to the “trusted” EAC hash. Officials can also do a hash check before and after each election, though this isn’t done in most jurisdictions. [Maricopa County, Arizona had an independent lab do hash checks on a sample of its Dominion voting machines after Donald Trump and supporters disputed the presidential election results in that county last year.]
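
In code, that comparison step amounts to little more than an equality check against the published value. A rough Python sketch, with a made-up trusted hash and a hypothetical file name (in practice, as described below, the ES&S process compares files containing many hashes):

```python
import hashlib

# Hypothetical "trusted" hash, standing in for the value recorded at certification.
TRUSTED_EAC_HASH = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

def verify_install(installed_image: str, trusted_hash: str) -> bool:
    """Hash the newly installed software and compare it to the trusted hash."""
    with open(installed_image, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    if actual != trusted_hash:
        print(f"MISMATCH: installed software hashes to {actual}, expected {trusted_hash}")
        return False
    print("Match: installed software is identical to the certified version.")
    return True

verify_install("ds200_firmware.bin", TRUSTED_EAC_HASH)  # hypothetical file name
```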

Jurisdictions vary in how they conduct hash checks. With ES&S software in Texas:

  • Software extracted from election equipment is loaded onto a USB drive.
  • The trusted EAC hash is loaded onto a second USB drive, along with three scripts: one for hashing the software extracted from the election equipment; one that compares the hashes of that software to the EAC hash; and a third script that reports the results of that hash check.
  • The USB drives are inserted into a standalone system that isn’t used for elections, where the hash check is conducted.

The script that conducts the hash check uses “diff,” a standard open-source file-comparison utility; the script that reports the results was written by ES&S. The problem Mechler found was that the ES&S reporting script would indicate that hashes matched even when no two hashes had actually been compared. Mechler discovered this by accident one day when he forgot to load the trusted EAC hash onto the system doing the hash check.

Although the “diff” script produced an error message indicating it couldn’t find the EAC hash to do a check, the ES&S script reported that the hash check was successful, even though no verification was done. It was the same with every ES&S voting machine model Mechler tested — the company’s DS200 optical-scan system (one of the most popular voting systems used across the country), the DS450 and DS850 optical-scan machines and the company’s line of ballot-marking systems: the ExpressTouch, ExpressVote, and ExpressVote XL.
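
ES&S’s actual scripts aren’t public, but the failure mode Mechler describes is a familiar one: a wrapper that reports success without ever checking whether the comparison step ran. A hypothetical Python sketch of that pattern, with invented file names (the real process, as described above, is built around “diff”):

```python
import subprocess

def report_hash_check(machine_hashes: str, eac_hashes: str) -> None:
    # diff exits 0 when the files match, 1 when they differ, and 2 (with an
    # error message) when one of the files can't be found at all.
    subprocess.run(["diff", machine_hashes, eac_hashes])

    # The flaw: the exit status is never examined, so the script falls through
    # to "success" even when diff never compared anything -- for instance,
    # when the trusted EAC hash file was never loaded.
    print("Hash verification successful.")

report_hash_check("machine_hashes.txt", "eac_trusted_hashes.txt")  # hypothetical names
```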

In a report he wrote for the secretary of state’s office, Mechler criticized ES&S for producing a poorly written verification script.

“[The script] should have performed explicit checks on the existence of the two files being compared; failing loudly if either does not exist,” he wrote. “[T]his bug … indicates that ES&S has not developed their hash verification process with sufficient care, quality assurance, and concern for usability.”
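
A version that does what Mechler recommends, checking that both files exist and failing loudly otherwise, is only a few lines longer. Again, this is a hypothetical sketch rather than ES&S’s code:

```python
import os
import subprocess
import sys

def report_hash_check(machine_hashes: str, eac_hashes: str) -> None:
    # Fail loudly if either input file is missing, rather than reporting success.
    for path in (machine_hashes, eac_hashes):
        if not os.path.isfile(path):
            sys.exit(f"ERROR: {path} not found; no hash comparison was performed.")

    result = subprocess.run(["diff", machine_hashes, eac_hashes])
    if result.returncode == 0:
        print("Hash verification successful: hashes match the trusted EAC hash.")
    else:
        sys.exit("ERROR: hash verification failed or could not be completed.")

report_hash_check("machine_hashes.txt", "eac_trusted_hashes.txt")  # hypothetical names
```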

But it wasn’t just ES&S that failed to verify the script’s accuracy. Vendors are required to include their verification method or tool with their voting software when they submit the software to federal labs for testing and certification. But there’s no indication that the labs verify that the vendor verification methods and tools work; they simply check that the vendor submitted a tool.


The Government Accountability Office recognized this problem more than a decade ago in a report it published in 2008. The GAO advised the EAC to create a repository of certified voting machine software, establish procedures for conducting hash checks of that software, and create a protocol for testing vendor hash-check tools and making sure they work. Four years later, however, the GAO noted that the EAC had ignored its advice and had no plans to develop standards for hash-verification tools or a testing protocol to verify that they work properly. This meant, the GAO wrote, that state and local jurisdictions would lack “the means to effectively and efficiently verify that voting systems used in federal elections are [using] the same [software] as those certified by EAC.”

The EAC did not respond to questions from Zero Day.

“The GAO predicted, almost ten years ago, that without reliable testing of the hash verifications by the EAC, states would not have [the reliable] tools they needed to verify voting system software,” Greenhalgh says. “And that’s exactly what happened.” She believes that a failed verification tool should disqualify a system from being certified.

Additional Problems

The problem with ES&S’s hash verification tool is exacerbated by the fact that ES&S’s own employees, rather than election officials, sometimes conduct hash checks of the company’s software, at least according to Mechler’s report and the correspondence among Texas state officials that Jennifer Cohn obtained. ES&S, however, denied this in an email to Zero Day.

“It has always been standard procedure for jurisdictions to perform the hash validation process,” spokeswoman Granger wrote.

She provided an excerpt of a letter the company sent the Texas secretary of state’s office last fall, which says that “all software and firmware upgrades are performed by ES&S Technicians, and … those Technicians then verify that the software and/or firmware loaded is, in fact, the correct certified version.” But “ES&S’ technicians do not complete hash verifications [emphasis added] once the voting system has been upgraded,” the excerpt states. ES&S technicians run a “system readiness comparison to confirm that the version that was installed is the correct certified version that was approved by the Secretary of State,” and that Texas customers are then “encouraged to perform the hash verification procedures on the software and/or firmware that was installed.” In essence, the letter states, “verification procedures are performed twice” — once by ES&S’ technicians, and once by election officials.

Granger did not provide the entire letter for context, nor did she say how ES&S technicians verify the software if not with a hash check, but the letter seems to suggest they use a method other than hashing.

Although the ES&S letter states that election officials are then encouraged to do a hash check of the software, it’s not clear how many customers actually do this — especially if they believe the verification that ES&S technicians do satisfies this requirement. Granger did not respond to follow-up questions from Zero Day, and the Texas secretary of state’s office didn’t respond to multiple inquiries. Mechler also did not respond to emails.

Notably, Greenhalgh’s group found an ES&S contract with Collin County, Texas, which stipulates that the company’s employees must conduct “acceptance testing” of its own software and equipment, or else officials risk voiding their machine warranties. Acceptance testing occurs when a state or county receives new voting equipment or software, or updates to existing software. Officials check that all components are present and working properly. And in Texas, the acceptance testing also includes a hash check of the new software.

Given that the Collin County contract stipulates that ES&S does the acceptance testing of its own equipment, instead of election officials, this suggests that the company does the hash check of the software as well. But Granger insists that ES&S technicians stop short of conducting the hash check.

In any case, Greenhalgh and others say that even giving ES&S contractual authority to do acceptance testing of its own hardware and software is a bad practice that provides opportunity for a rogue insider to subvert the systems and cover their tracks.

This is a particular concern because Texas officials discovered that hash checks conducted on ES&S software sometimes failed — meaning the hash of the installed software didn’t match the EAC trusted hash. And when this occurred, ES&S told county election officials it didn’t matter and that they should ignore the failures.

ES&S claimed the mismatches were caused by a small bitmap file that displays a copyright date when the voting machines boot. The issue only cropped up with the company’s ExpressVote machines, and only if the machines received a software update that didn’t also include an update to the bitmap file already installed on them. If the old bitmap file had a different copyright date than the bitmap file in the certified version of the software, then a hash of the newly updated machine software wouldn’t match the EAC hash. ES&S insisted to the secretary of state’s office that a non-match in this case is actually a match, as long as the only difference between the two sets of hashes is the bitmap file: the mismatch is expected and thus is “considered a match.”
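
Because the check compares lists of per-file hashes, a single differing file (here, the bitmap) is enough to make it fail, and nothing in the hashes themselves says why the file differs. A hypothetical illustration, with invented file names and truncated hash values:

```python
def compare_manifests(installed: dict[str, str], trusted: dict[str, str]) -> list[str]:
    """Return the names of files whose hashes don't match the trusted manifest."""
    return [name for name, digest in installed.items() if trusted.get(name) != digest]

# Every file matches the certified version except the boot-screen bitmap.
trusted   = {"firmware.bin": "aa11...", "ballot.cfg": "bb22...", "splash.bmp": "cc33..."}
installed = {"firmware.bin": "aa11...", "ballot.cfg": "bb22...", "splash.bmp": "dd44..."}

# The check only reports *that* splash.bmp differs, not *why*: a stale copyright
# date and a maliciously altered bitmap produce exactly the same failure.
print("Hash verification FAILED for:", compare_manifests(installed, trusted))
```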

Texas officials called ES&S’s stance on the issue “very troubling.” The company seemed to be missing the fact that bitmap files, however small and innocuous they may seem, can harbor malicious code; the NSA, in fact, has written about how bitmap files can be used to exploit systems.

Mechler wrote in his report to state officials that the fact that the hash failures occurred because of just a single file “is of no comfort because it still opens a vulnerability to an insider threat.” What’s more, an insider would know exactly which file to compromise, and which file election officials were being told to ignore when the files produced a hash-verification failure.

Texas officials wrote ES&S that its attitude raised “doubts about our ability to trust your team to report and address these issues with us.” In emails exchanged between Mechler and others in the secretary of state’s office, they raised concerns that ES&S was training election officials to overlook or dismiss hash verification failures, when any failure should signal that the software shouldn’t be trusted or used.

“The only thing that the jurisdiction has to go on here is your word that this mismatch is the expected result,” Christina Adkins, legal director for the secretary of state’s office wrote ES&S in an email. “They have no way of knowing whether the mismatch occurred because it is the expected mismatch, or because the mismatched file was somehow altered or manipulated. The hash verification process does not distinguish between ‘expected’ mismatches and malicious mismatches, it simply identifies that a mismatch occurred.”


Texas officials asked the EAC to weigh in on the matter. The EAC in turn contacted the voting machine testing labs to get their advice. After reviewing the bitmap file that caused the hash problem, the labs agreed with ES&S that the mismatched hashes weren’t a concern. Then, with the presidential election just weeks away, the labs, the EAC, and Texas state officials all agreed that jurisdictions could use the ES&S software.
