A Vision for the Testing of Election Systems in a HAVA World Eric Lazarus EricLLazarus@yahoo.com
How to Rate a Testing Capability? • Transparent • Identification • Recommendation • Cost effective • Broad coverage: reliability, accessibility, usability, security • Encourage high-value innovation • Pick the correct structure given success
This is a tough problem • 1983 Turing Award lecture ("Reflections on Trusting Trust") – Ken Thompson showed that conventional methods will fail • A Trojan horse can live in a compiler, linker, loader, interpreter, microcode, BIOS, hardware… • Testing is hard and limited
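To make Thompson's point concrete, here is a minimal toy sketch (my illustration, not from the talk) of the trusting-trust attack: the Trojan horse lives in the compiler, so inspecting the login program's source never reveals it. The names (`evil_compile`, `check_password`, the `'backdoor'` string) are all hypothetical.

```python
# A toy "compiler" that returns source as the "binary" but injects a
# Trojan horse when it recognizes certain inputs. (Illustrative only.)

def evil_compile(source: str) -> str:
    if "def check_password" in source:
        # Trigger 1: compiling the login program -> insert a backdoor.
        return source.replace(
            "return entered == stored",
            "return entered == stored or entered == 'backdoor'",
        )
    if "def evil_compile" not in source and "def compile" in source:
        # Trigger 2: compiling a clean compiler -> re-insert this very
        # injection logic, so the attack survives a source-level audit.
        return source + "\n# (injection logic re-inserted here)"
    return source

login_source = """
def check_password(entered, stored):
    return entered == stored
"""

compiled = evil_compile(login_source)
print(compiled)  # the "binary" now accepts the secret 'backdoor' string
```

The second trigger is what makes the attack so resistant to testing: recompiling a clean compiler with the infected one reproduces the infection, so reviewing every line of source still misses it.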
Types of Testing • Acceptance/Qualification testing • Code inspections/code walk-throughs • Concurrency testing • Data table testing • Disability access testing (variation of usability testing) • Installation testing • Integration testing • Legal validation/verification (validate legal requirements, then verify legal requirements are met) • Load/stress testing • Performance testing (test response times) • Recovery testing • Regression testing • Reliability testing • Scalability testing (variation of load/stress testing) • Security testing/penetration testing • Spike testing (variation of load/stress testing) • Uninstallation testing (variation of installation testing) • Unit testing • Upgrade/patch testing (variation of installation testing) • Usability testing
Applied Common Sense • Vision is not hard to come by • Create a vision • What are the questions? • What are common-sense answers? • Bring together smart people to think about the obvious vision
Q: States Testing Independently? • Go it alone, or… • A voluntary consortium of states?
Q: States Testing Independently? • A voluntary consortium of states could: • Hire more and/or better people • Save money on duplicated effort • Share knowledge gained in product evaluation and in use
Q: Who should pay for it? • Not vendor-funded as with the ITA system • Conflict of interest • Barrier to new entrants • Pooled state election money • What about others, including • Political parties • Good-government groups • Civil rights groups • Academic institutions
Q: Big-Bang or Continuous? • Continuous testing is like getting regular checkups • The Nevada Gaming Control Board takes machines out of service for random field inspection
Q: White Box or Black Box? • Why handicap our testers by withholding the source? • We want to find bugs – source-code review is good for this • Every branch must be run – far too many paths to cover realistically in voting-system software (see the sketch below)
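A minimal sketch (my illustration, not from the talk) of why exhaustive black-box path coverage is hopeless: n independent branches multiply into 2^n execution paths, so the count explodes long before reaching the size of real voting software.

```python
# Path explosion: each independent if/else doubles the number of
# distinct execution paths, so full path coverage needs 2**n runs.
for n in (10, 20, 50):
    print(f"{n} independent branches -> {2**n:,} execution paths")

# 10 independent branches -> 1,024 execution paths
# 20 independent branches -> 1,048,576 execution paths
# 50 independent branches -> 1,125,899,906,842,624 execution paths
```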
Q: Partisans Included? • Brennan Center for Justice projects have worked both ways • Working with people on both sides of debates has brought out insights • Smart and knowledgeable people are important – and such people often have opinions
Q: Team must have… • Understanding of election processes • Understanding of computer security techniques • Testing experience in other domains • Background from other industries, including gaming • International perspective • Heterogeneity – a heterogeneous team is how you find problems
Q: Product Roadmap • Can election officials impact product direction via a consortium?
Q: Consortium Services? • What can it offer?
Q: Develop Risk Models • Testing should be driven by a clear view of the risks it is attempting to address • “We might buy a machine that is not as accessible as we are told.” • “…not as secure.” • “…not as reliable.” • “…not as easy to administer.” • Good to develop and maintain these jointly (a sketch follows)
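A minimal sketch (my illustration, not from the talk) of how a shared risk model could drive test priorities: score each risk the slide lists by likelihood and impact and rank the corresponding tests. All scores and mappings here are made-up placeholders.

```python
# Risk register: (risk statement, likelihood 1-5, impact 1-5, test type).
# Every value below is a hypothetical placeholder, not real data.
risks = [
    ("Machine less accessible than claimed", 3, 4, "disability access testing"),
    ("Machine less secure than claimed",     2, 5, "security/penetration testing"),
    ("Machine less reliable than claimed",   3, 4, "reliability testing"),
    ("Machine harder to administer",         4, 3, "usability testing"),
]

# Prioritize by expected harm (likelihood * impact), highest first.
for risk, likelihood, impact, test in sorted(
    risks, key=lambda r: r[1] * r[2], reverse=True
):
    print(f"score {likelihood * impact:2d}: {risk} -> {test}")
```

A consortium maintaining one such register jointly is what lets the member states agree on *why* each test is run, not just *that* it is run.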
Q: Shared Repository of Knowledge? • What was learned under testing? • What was learned in use? • What procedures work well with this technology? • Model: Information Sharing and Analysis Centers (ISACs), e.g., the Financial Services Information Sharing and Analysis Center, www.fsisac.com/about.htm
Q: Evaluating Election Procedures? • Could this same team evaluate procedure manuals? • It should be able to evaluate procedures against best practices
Q: Testing When? • Product Evaluation • Certification • Acceptance • Logic & Accuracy • Continuous
Q: Other services? • Negotiate joint purchasing agreements (like the GSA Schedule) • Products • Services • Transparency: arrange for systems to be purchasable by responsible organizations • Encourage innovation by adhering to open standards
Q: Make policy? • Should such consortia of states do testing and provide testing information, or should they take on a policy-making role? • I’ve been assuming that this staff would make no policy but only provide the results of their tests. They would not, for example, certify or decertify machines, but would report on the results of testing.
So one vision emerges • Multiple states group into a consortium (or two) • Has its own staff and/or consultants, small contractors, academics • Performs testing for: usability, security • Evaluates: procedures, new technology, cost
Does this make sense? • Very interested in collaborating around a proposal to create a consortium • How can we improve this vision? • Please contact me if you want to work on this
Testing is not an end in itself • GOAL = improved elections • [Diagram: testing, together with authority, commitment, skills, the current state, and resources, feeds the goal of improved elections]
Illustration: Gaming – What’s Different? • Ladder of trust with signed firmware at the bottom • Multiple people with different keys • Field trials as part of certification • Hash compare in the field, randomly, every two years (see the sketch below) • Auditing the auditors • Certification done by government employees willing to share/discuss their methods • Post-employment restrictions on working for vendors • Penalties for messing up • Assumption of cheating
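A minimal sketch (my illustration, not the actual gaming-board procedure) of a random field hash check: recompute the deployed firmware image's digest and compare it to the value recorded at certification time. The machine ID, file path, manifest format, and hash value are all hypothetical.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a firmware image in chunks so large files stay memory-safe."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Known-good hashes recorded at certification time (placeholder value).
certified_manifest = {
    "machine-0042": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def field_check(machine_id: str, firmware_path: str) -> bool:
    """Return True if the deployed firmware still matches certification."""
    return sha256_of(firmware_path) == certified_manifest.get(machine_id)

# Example: field_check("machine-0042", "/media/inspection/firmware.bin")
```

Randomizing which machines get pulled, as the gaming board does, means a vendor cannot predict which units must carry genuine firmware on inspection day.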