This presentation examines the vulnerabilities, concerns, and issues surrounding the use of voting machines in South Carolina, Florida, and Ohio. It discusses why vote-counting is hard, common misconceptions about voting machines, and the recent history of electronic voting problems. It also covers discredited voting systems, the lack of computer-security testing, and the security, software-quality, and human-factors issues with the Election Systems & Software (ES&S) iVotronic DRE system.
Unsafe for any Ballot Count: South Carolina's voting machines and their analysis in Ohio (2007), Florida (2006), and South Carolina (2011)
Duncan A. Buell, Eleanor Hare, Frank Heindel, Chip Moore
For the League of Women Voters of South Carolina
Source Material
• SC (Buell/Hare/Heindel/Moore/LWVSC), 2011: analysis of the November 2, 2010 election data
• Ohio, Dec 2007: study for the Sec'y of State
• Florida, 2007: study after the 2006 election
• Ohio, Nov 2003: study for the (previous) SoS
• California, 2007: study for the SoS
• Burr, Rivest, et al., for NIST
Follow links from www.lwvsc.org
Why is Vote-Counting Hard?
• An election is a one-time event: no do-overs
• Hard to test the scaling-up to full size
• Highly distributed, largely independent, using volunteer workers
• Vulnerable to corruption
• Vulnerable to disruption
• Highly vulnerable to error
Issues and Concerns
• Should voters get a receipt?
• Are ballots indeed secret?
• How do we accommodate persons with disabilities?
• How do we handle overvotes and undervotes?
• Are ballots voter-verifiable?
• Are ballots recountable and auditable?
• Can we audit the results?
A Common Misconception
A voting machine is NOT like an ATM. (There are laws, you have rights, there are receipts, and your money is somewhere.)
A voting machine is much more like a slot machine. (What is your guarantee that the machine EVER pays out?)
(Recent) History
• Florida's hanging chads and butterfly ballot, 2000
• HAVA (Help America Vote Act), 2002
• Florida 13th congressional district election, 2006
• Lots of complaints, some of which are known to be justified (Horry County, 19 January 2008) and many of which are probably not justified.
Electronic Voting Machines
• South Carolina: Election Systems & Software (ES&S) iVotronic DRE (Direct Recording Electronic) and Unity software/system for counting votes
• Operative study: EVEREST, submitted December 7, 2007, to the SoS of Ohio, done by UPenn and UC Santa Barbara
• EVEREST: the ES&S iVotronic systems "lack the fundamental technical controls necessary to guarantee a trustworthy election under operational conditions … from several pervasive, critical failures"
Other Discredited Systems
• Diebold/Premier (RABA, Avi Rubin/JHU)
• Sequoia (A. Appel and Ed Felten/Princeton)
• Nedap (Rop Gonggrijp)
There are no machines that have been tested by computer experts and have not been discredited.
Voting Machine Testing
• All machines are tested by "Independent Testing Authorities" (ITAs)
• But there are only a few ITAs
• And one was decertified for falsifying tests
• And none test for "computer security" issues
• And the paper trail shows that the same problem can occur multiple times without being fixed, but with ITA certification
The Issues
• Security: can the system be corrupted?
• Quality: can the system be trusted to be correct?
• Human factors: can the system function as it should under normal conditions?
Security (pages 29-30)
"lack the fundamental technical controls necessary to guarantee a trustworthy election under operational conditions … from several pervasive, critical failures"
"…we attempted to identify practical procedural safeguards that might substantially increase the security of the ES&S system in practice. We regret that we ultimately failed to find any such procedures that we could recommend with any degree of confidence."
Security (pages 29-30)
"The security failings of the ES&S system are severe and pervasive. There are exploitable weaknesses in virtually every election device and software module, and we found practical attacks that can be mounted by almost any participant in an election. For this reason, the team feels strongly that any prudent approach to securing ES&S-based elections must include a substantial re-engineering of the software and firmware to make it 'secure by design'."
Security Through Obscurity?
The Palm Pilot emulates a PEB (Personalized Electronic Ballot, the cartridge used to open and close each machine and to collect its totals) and can reset all passwords. (page 66)
Security Through Obscurity? (page 52)
"The mechanical locks supplied … were uniformly of very low-security designs that can easily be picked …"
"For the first weeks of the project, we did not have the correct keys for much of the equipment; we frequently had to pick the locks in order to conduct our analysis."
Software Quality
• Writing bad, confusing, un-maintainable, and sloppy code is not that hard.
• Writing clean, professional, maintainable, and secure code that does exactly and only what it is intended to do is very hard.
• What we would simply mark off in a freshman's work would be unacceptable from a senior.
Software Quality
"a visible lack of sound software … practices"
"a buggy, unstable, and exploitable system"
The ES&S System (page 84)
• 515,000 lines of code
• Nine programming languages
• Four hardware platforms
A large and complicated computer system by any standard.
Code Analysis (pp. 53ff, 83ff)
• All code modules have buffer overflow bugs. "Avoiding buffer overflow bugs in input processing is regarded as one of the most basic defenses a system must have."
• About 63% of the code is in memory-unsafe programming languages.
• Compilation on Visual Studio 2005 fails unless one turns off modern security standards.
Code Analysis (pp. 53ff, 83ff)
• Fortify (a standard code analysis program) finds hundreds of vulnerabilities in the source code, which indicates "that the vendor did not sufficiently validate their code."
• In grading CSCE 240 undergraduate homework, I take off 20% for EACH use of a memory-unsafe function.
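To make the point concrete, here is a minimal C sketch of the class of defect the report describes. It is not ES&S code; the function names, buffer size, and input are hypothetical. The unsafe version copies input with no length check, which is exactly the kind of memory-unsafe call that loses points in an undergraduate assignment.

#include <stdio.h>
#include <string.h>

/* Hypothetical example only -- not ES&S code.
 * copy_name_unsafe() writes past its 16-byte buffer whenever the input
 * is longer than 15 characters: the classic input-processing buffer
 * overflow the EVEREST report found in every code module. */
void copy_name_unsafe(const char *input)
{
    char name[16];
    strcpy(name, input);                        /* no length check: overflow */
    printf("unsafe copy:  %s\n", name);
}

/* The bounded version truncates instead of overflowing. */
void copy_name_bounded(const char *input)
{
    char name[16];
    snprintf(name, sizeof name, "%s", input);   /* never writes past 'name' */
    printf("bounded copy: %s\n", name);
}

int main(void)
{
    const char *long_input = "a voter name far longer than sixteen characters";
    /* copy_name_unsafe(long_input);  <-- undefined behavior, likely a crash */
    copy_name_bounded(long_input);
    return 0;
}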
Passwords (Florida excerpt)
• Passwords are hard-coded in the firmware, identical in every machine.
• An undocumented back door exists.
• "This represents poor practice"
• "These passwords provide very little security."
• "poorly conceived and poorly implemented"
• Passwords are coded in the clear in devices.
• Crypto keys are stored in the clear.
Passwords (Florida excerpt) “The Service Menu password, Clear and Test password, ECA password, and Upload Firmware password are three-letter case-insensitive passwords. Each one is chosen to be mnemonic and easy to remember. The problem is that they are also likely to be fairly easy to guess. They follow a memorable pattern. Someone who knows one of these passwords can probably guess what the other ones are without too much difficulty.”
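For scale, a three-letter case-insensitive password offers only 26^3 = 17,576 possibilities. The short C sketch below is a generic illustration of how quickly such a keyspace is exhausted; check_password() and its made-up target value are hypothetical stand-ins, not the actual iVotronic interface or passwords.

#include <stdio.h>
#include <string.h>

/* Generic illustration of how small a three-letter, case-insensitive
 * keyspace is; check_password() is a stand-in, not the real firmware. */
static int check_password(const char *guess)
{
    return strcmp(guess, "xyz") == 0;   /* made-up target password */
}

int main(void)
{
    char guess[4] = { 0 };
    long tried = 0;

    for (char a = 'a'; a <= 'z'; a++)
        for (char b = 'a'; b <= 'z'; b++)
            for (char c = 'a'; c <= 'z'; c++) {
                guess[0] = a;  guess[1] = b;  guess[2] = c;
                tried++;
                if (check_password(guess)) {
                    printf("found \"%s\" after %ld guesses "
                           "(keyspace is 26^3 = 17,576)\n", guess, tried);
                    return 0;
                }
            }
    printf("not found in %ld guesses\n", tried);
    return 1;
}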
Ballot Image Randomization (page 73)
• The iVotronic "uses a weak randomization procedure" that "does not properly randomize voter selections in its audit logs".
• Random number generation is a well-established mathematical and computational science. NIST even publishes a testing document and test suite (Special Publication 800-22).
• Failing to use proper, tested random-number generators is simply unprofessional and sloppy.
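For contrast, here is a minimal sketch of what a properly seeded shuffle of ballot-image records might look like: a standard Fisher-Yates shuffle drawing its randomness from the operating system (/dev/urandom on Unix-like systems) rather than from a predictable generator such as rand() seeded with the time of day. This is a generic illustration under those assumptions, not the iVotronic's actual procedure, and a production system would also have to remove the small modulo bias noted in the comments.

#include <stdio.h>

/* Generic sketch, not iVotronic code: shuffle an array of ballot-image
 * record numbers with Fisher-Yates, using operating-system randomness. */
static unsigned int os_random(void)
{
    unsigned int r = 0;
    FILE *f = fopen("/dev/urandom", "rb");
    if (f != NULL) {
        if (fread(&r, sizeof r, 1, f) != 1)
            r = 0;              /* sketch only: real code must handle this failure */
        fclose(f);
    }
    return r;
}

static void shuffle(int *a, int n)
{
    for (int i = n - 1; i > 0; i--) {
        /* NOTE: the modulo introduces a tiny bias; a production system
         * would use rejection sampling or a vetted library instead. */
        int j = (int)(os_random() % (unsigned int)(i + 1));
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}

int main(void)
{
    int ballots[10];
    for (int i = 0; i < 10; i++)
        ballots[i] = i;         /* stand-ins for ballot-image record numbers */

    shuffle(ballots, 10);

    for (int i = 0; i < 10; i++)
        printf("%d ", ballots[i]);
    printf("\n");
    return 0;
}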
Software Quality Summary
• These software problems are common in the code written by first-year students.
• A first-year student's A grade (for submitting code that ostensibly worked) would probably drop to a C for these errors.
• A senior student's A grade (for submitting code that ostensibly worked) would probably drop to an F.
Human Factors
Duncan Buell, Eleanor Hare, Frank Heindel, Chip Moore
FOIA'd data from several counties, including Richland, Charleston, Colleton, Lancaster, Berkeley, Lexington, Sumter, and Florence.
We have tried to reconcile the certified official counts with the counts that are supported by the data. We have yet to find a county whose numbers add up properly.
LWVSC Press Release, 14 Feb 2011 http://www.lwvsc.org
What Actually Happened?
Two paths to "the truth" of the count:
• PEBs collect totals from machines, and these totals go into a master file (acting like slot machine tapes).
• Individual vote data is collected from memory cards and goes into a vote image file (acting like the cash drawer).
We tried to verify that these two truths were the same.
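A minimal sketch of that reconciliation, in C with fabricated precinct names and totals (the code we actually used was written in Perl and Java and read the PEB master file and vote image file from disk): compare each precinct's PEB-reported total against the number of ballot images actually present, and flag every mismatch.

#include <stdio.h>

/* Illustrative sketch with fabricated precinct names and totals --
 * not actual election data.  peb_total is what the PEB master file
 * reports (the "slot machine tape"); image_total is the number of
 * individual ballot images found in the vote image file (the "cash
 * drawer") for the same precinct. */
struct precinct {
    const char *name;
    int peb_total;
    int image_total;
};

int main(void)
{
    struct precinct p[] = {
        { "Precinct A", 412, 412 },
        { "Precinct B", 655, 380 },     /* fabricated mismatch */
        { "Precinct C", 290,   0 },     /* fabricated: no images collected */
    };
    int n = (int)(sizeof p / sizeof p[0]);
    int mismatches = 0;

    for (int i = 0; i < n; i++) {
        if (p[i].peb_total != p[i].image_total) {
            printf("MISMATCH  %-10s  PEB total %4d  vs  ballot images %4d\n",
                   p[i].name, p[i].peb_total, p[i].image_total);
            mismatches++;
        }
    }
    printf("%d of %d precincts fail to reconcile\n", mismatches, n);
    return mismatches != 0;
}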
What Actually Happened?
We found four different kinds of errors:
• Memory cards not collected, so individual votes were not in the vote image file.
• Two entire precincts were missing from the vote image file.
• TWO PEBs (not one) were used to collect data in Ward 21, but only one had its totals uploaded.
• SIX machines were not closed in Bluff, and their data were not collected until 11/9/2010.
In all: 1,127 votes not counted and 2,800 votes without support in the data.
What Next?
Hmmm …
What Next?
Chip Moore and I are donating our code (Perl and Java), and I meet with R County on Thursday.
LWVSC proposes a statewide mandate for this kind of audit:
• Each machine used should be verified to be closed and its data collected.
• Each PEB used should be accounted for and its data collected.
• The PEB totals should match the vote image totals.
Not rocket science: mine was an afternoon of coding and a minute of run time on my laptop for all of Richland County.
LWVUS Positions
SARAT: Voting systems must be Secure, Accurate, Reliable, Accessible, and Transparent, and voting systems must provide a paper ballot or record of the voter's intent that the voter can verify during the voting process and that can be used for random audits and recounts. (LWV, Impact on Issues 2006-2008, p. 11)
LWVSC Positions
Voting machines must
• include a paper audit trail that allows the voter to verify his/her vote and provides a reliable basis for a recount if required,
• be randomly tested during every election, and
• use source code that is open for inspection.
LWVSC believes that SC's iVotronics do not meet these criteria.