VVSG 1.1 Test Suite Status Mary Brady National Institute of Standards and Technology http://vote.nist.gov
Background
Status quo: labs have been testing to VVSG 1.0 (2005) using proprietary, custom tooling and review processes. In 2007-08, NIST developed a set of public test suites for VVSG 2.0, to be used as part of the EAC Testing and Certification Program. In 2009, to support VVSG 1.1, test methods for new and changed material were back-ported from the 2.0 test suites; the status quo prevails for everything else.
Why Public Test Suites? To achieve consistency across testing labs and promote transparency of the testing process To review the VVSG for ambiguities, completeness, and correctness To assist manufacturers by providing precise test specifications To assist testing labs by lowering the overall cost of testing
Test Development Timeline December 2011: Integrated into Certification Process
VVSG 1.1 Test Suite VVSG 1.1 test suite is based on VVSG 2.0 test methods associated with back-ported requirements Accessibility and usability Operational temperature and humidity Electronic records, security specifications, and VVPAT Core functionality, reliability and accuracy New test method developed for updated software setup validation requirement
Accessibility & Usability
• System-independent test narratives with pass/fail criteria
• Highly structured process surrounding the usability test protocols for the performance-based testing with test participants
• ISO Common Industry Format (CIF) for reporting usability test results
• CIF templates and how-to’s for manufacturers and test labs
Accidental Activation: Input mechanisms SHALL be designed to minimize accidental activation
Covers requirements:
3.2.6c Accidental Activation: Input mechanisms SHALL be designed to minimize accidental activation.
3.2.6c.i Size and Separation of Touch Areas: On touch screens, the sensitive touch areas SHALL have a minimum height of 0.5 inches and minimum width of 0.7 inches. The vertical distance between the centers of adjacent areas SHALL be at least 0.6 inches, and the horizontal distance at least 0.8 inches.
3.2.6c.ii No Repeating Keys: No key or control on a voting system SHALL have a repetitive effect as a result of being held in its active position.
EXAMPLE TEST CASE
Accidental Activation, cont. • Test method includes 7 test requirements, covering 2 pages • Excerpt: • For touchscreen systems, the tester shall examine the touch areas for at least contests #4 (Governor) and #9 (County Commissioners). Using a ruler to measure distance and a stylus to perform the touching, the tester shall determine first that the touch areas used to vote for at least the first t candidates in each contest are separated as required. • F => If any vertical distance between centers of adjacent touch areas for voting is less than 0.6 inches, then, for requirement "Size and Separation of Touch Areas", the system fails. EXAMPLE TEST CASE
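The size and separation checks in the excerpt above reduce to comparing ruler measurements against fixed thresholds. A minimal sketch in Python, assuming a hypothetical (center_y, width, height) representation for one vertical column of touch targets; the function name and data format are illustrative, not part of the published test suite:

```python
# Thresholds from requirement 3.2.6c.i, in inches.
MIN_HEIGHT, MIN_WIDTH = 0.5, 0.7
MIN_V_SPACING = 0.6  # vertical center-to-center spacing

def check_touch_column(areas):
    """areas: measured (center_y, width, height) tuples, in inches,
    for one vertical column of touch targets, listed top to bottom."""
    failures = []
    for i, (cy, w, h) in enumerate(areas):
        if w < MIN_WIDTH or h < MIN_HEIGHT:
            failures.append(f"area {i}: {w} x {h} in below 0.7 x 0.5 in minimum")
    for i, ((y1, _, _), (y2, _, _)) in enumerate(zip(areas, areas[1:])):
        if abs(y2 - y1) < MIN_V_SPACING:
            failures.append(f"areas {i} and {i+1}: centers closer than 0.6 in")
    return failures

# Two 0.8 x 0.5 in targets with centers 0.6 in apart satisfy the rule:
result = check_touch_column([(1.0, 0.8, 0.5), (1.6, 0.8, 0.5)])
```

Any non-empty failure list corresponds to the F => fail outcome in the excerpt.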
Hardware
Operational Temperature and Humidity
Operating Temperature and Humidity Covers requirements: Volume I, Section 4.1.2.13 Environmental Control - Operating Environment. Voting systems shall be capable of operation in temperatures ranging from 41 °F to 104 °F (5 °C to 40 °C) and relative humidity from 5% to 85%, non-condensing. For testing information, see Volume II, section 4.7.1. Volume II, Section 4.7.1 Operating Temperature and Humidity Tests. All voting systems shall be tested in accordance with the appropriate procedures of MIL-STD-810D, "Environmental Test Methods and Engineering Guidelines". EXAMPLE TEST CASE
Operating Temperature and Humidity • Covers requirements, cont.: • Operating Temperature • All voting systems shall be tested according to the low temperature and high temperature testing specified by MIL-STD-810D: Method 502.2, Procedure II – Operation and Method 501.2, Procedure II – Operation, with test conditions that simulate system operation. • Operating Humidity • All voting systems shall be tested according to the humidity testing specified by MIL-STD-810D: Method 507.2, Procedure II – Natural (Hot–Humid), with test conditions that simulate system operation. EXAMPLE TEST CASE
Operating Temperature and Humidity, cont. Test method includes 15 steps, covering 3 pages Excerpt: Step 8: Set the chamber to 104 degrees Fahrenheit and 85% relative humidity (see Comment 1), observing precautions against thermal shock and condensation (see Comment 2). Allow relative humidity and VSUT temperature to stabilize. All paper, including ballots, used by the system must be stabilized at the specified testing temperature and humidity levels prior to testing (see Comment 4). Step 9: Perform an operational status check. If the VSUT shows evidence of damage, or any examined function or feature is not working correctly, then record that the VSUT fails the Operating Temperature and Humidity test. End the test. EXAMPLE TEST CASE
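The control flow of Steps 8-9 can be sketched as a small harness. Here `set_chamber`, `stabilized`, and `status_check` stand in for real chamber and VSUT interfaces; they are assumptions for illustration, not part of the actual test suite:

```python
HIGH_TEMP_F = 104      # upper operating limit (41 F to 104 F, per 4.1.2.13)
HIGH_HUMIDITY = 85     # percent relative humidity, non-condensing

def run_high_temp_step(set_chamber, stabilized, status_check):
    """Sketch of Steps 8-9: raise the chamber to the high setpoint,
    wait for temperature/humidity stabilization, then run an
    operational status check on the VSUT."""
    set_chamber(temp_f=HIGH_TEMP_F, humidity=HIGH_HUMIDITY)
    if not stabilized():
        return "inconclusive: chamber and VSUT did not stabilize"
    if not status_check():
        return "fail: the VSUT fails the Operating Temperature and Humidity test"
    return "pass"
```

In a real run, the stabilization predicate would also cover the paper stock (Comment 4 in the excerpt), since ballots must reach the chamber conditions before testing.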
Security
• Electronic records
• Security specifications
• VVPAT
• New test method developed for updated software setup validation requirement
Electronic and Paper Record Structure Covers Requirement 7.9.3c Electronic ballot images shall be digitally signed by the voting system. The digital signature shall be generated using a NIST-approved digital signature algorithm with a security strength of at least 112 bits implemented within a FIPS 140-2 validated cryptographic module operating in FIPS mode. Discussion: NIST approved is "An algorithm or technique that meets at least one of the following: 1) is specified in a FIPS or NIST Recommendation, 2) is adopted in a FIPS or NIST Recommendation or 3) is specified in a list of NIST approved security functions (e.g., specified as approved in the annexes of FIPS 140-2/3)". The security strengths of cryptographic algorithms can be found in NIST Special Publication 800-57: Recommendation for Key Management - Part 1 General. EXAMPLE TEST CASE
Procedure Step 1: Obtain five electronic ballot images from the VSUT. Step 2: Verify the digital signature on each of the ballot images individually. Step 3: If any of the digital signature verifications fails, record “The VSUT fails the Cryptographic Protection of Records test.” End the test. Step 4: Execute the Sections 6.1.3 and 6.2.3 cryptographic tests for the digital signature cryptographic module used to sign the electronic ballot images. Step 5: If any one of the above tests fails, record “The VSUT fails the Cryptographic Protection of Records test.” End the test. Step 6: Record “The VSUT passes the Cryptographic Protection of Records test.” End the test. EXAMPLE TEST CASE
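The procedure's control flow can be sketched as follows. The `verify_signature` callable is a stand-in for a verification routine backed by a FIPS 140-2 validated module implementing a NIST-approved algorithm; its name and the record format are illustrative assumptions:

```python
def cryptographic_protection_test(ballot_images, verify_signature):
    """ballot_images: (record_bytes, signature) pairs obtained from the
    VSUT (Step 1). verify_signature(record, signature) -> bool is assumed
    to wrap a FIPS-validated digital signature verification."""
    for record, signature in ballot_images:            # Steps 2-3
        if not verify_signature(record, signature):
            return "The VSUT fails the Cryptographic Protection of Records test."
    # Steps 4-5: the Sections 6.1.3/6.2.3 module-level tests would run here.
    return "The VSUT passes the Cryptographic Protection of Records test."
```

Note the short-circuit structure: a single bad signature ends the test immediately, matching the "End the test." wording in Steps 3 and 5.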
Core Functionality
Votetest: basic, essential voting system logic
• Ability to define elections
• Capture, count, and report votes
• Voting variations
92 tests formalized as SQL scripts
• Tests are intentionally simple: 89 use about 10 ballots, 3 use 100 ballots
• A volume test (mock election) is a significant test of all supported functions together
• Though simple, the tests exercise the complete elections and voting process
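The SQL-script style of these tests can be illustrated with sqlite3. The single-table cast-vote-record schema below is a simplified assumption, not the suite's actual schema; "Car Tay Fower" is borrowed from the printed-report excerpt elsewhere in this deck, and the other fixture values are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cvr (ballot_id INTEGER, contest TEXT, candidate TEXT);
    -- one row per ballot per contest; NULL candidate records an undervote
    INSERT INTO cvr VALUES
        (1, 'President', 'Car Tay Fower'),
        (2, 'President', 'Car Tay Fower'),
        (3, 'President', NULL),
        (4, 'President', 'Candidate B');
""")

# COUNT(candidate) skips NULLs, so undervotes fall out as the
# difference between ballots seen and votes recorded.
votes, undervotes = conn.execute("""
    SELECT COUNT(candidate), COUNT(*) - COUNT(candidate)
    FROM cvr WHERE contest = 'President'
""").fetchone()
```

An overvote count would additionally need a per-ballot comparison of selections against the contest's vote-for limit, which this sketch omits.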
Printed Report: Counted report of contest, including votes, undervotes, overvotes Covers Requirement I.2.4.3.d: All systems shall provide capabilities to produce a consolidated printed report of the results for each contest of all votes cast (including the count of ballots from other sources supported by the system as specified by the manufacturer) that includes the votes cast for each selection, the count of undervotes, and the count of overvotes. EXAMPLE TEST CASE
Printed Report, cont. : Counted report of contest, including votes, undervotes, overvotes • General Procedure • Establish initial state (clean out data from previous tests, verify resident software/firmware); • Program election and prepare ballots and/or ballot styles; • Generate pre-election audit reports; • Configure voting devices; • Run system readiness tests; • Generate system readiness audit reports; • Precinct count only: • Open poll; • Run precinct count test ballots; and • Close poll. • Run central count test ballots (central count / absentee ballots only); • Generate in-process audit reports; • Generate data reports for the specified reporting contexts; • Inspect ballot counters; and • Inspect reports. EXAMPLE TEST CASE
Printed Report, cont. • Test Method includes 38 steps • Excerpt: • Step 26: Compute the absolute value of the difference between the reported number of votes for “Car Tay Fower” in the “President, vote for at most 1” contest in Precinct 1 and the value 4, and add it to the Report Error. If the needed value does not appear in the report, increment Report Error by one (1). • ... • Step 34: For each spurious ballot count or vote total reported by the VSUT (e.g., ascribing votes to a candidate that did not run in a particular contest or reporting one or more overvotes on a VSUT that prevents overvoting), increase the Report Error by one (1). • Step 35: Record the Report Error. EXAMPLE TEST CASE
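The Report Error bookkeeping in Steps 26 and 34 is plain arithmetic: accumulate |reported - expected| for each expected total, plus one for each missing or spurious entry. A sketch, with the dictionary key format as an illustrative assumption:

```python
def report_error(expected, reported):
    """expected/reported: mappings from (precinct, contest, candidate)
    keys to vote totals. The key format is an assumption for this sketch."""
    error = 0
    for key, want in expected.items():
        if key not in reported:
            error += 1                       # missing value: +1 (Step 26)
        else:
            error += abs(reported[key] - want)
    for key in reported:
        if key not in expected:
            error += 1                       # spurious entry: +1 (Step 34)
    return error

# Step 26's example: 4 votes expected for "Car Tay Fower" in Precinct 1.
expected = {("Precinct 1", "President", "Car Tay Fower"): 4}
```

A perfect report yields a Report Error of zero; each discrepancy or fabricated line raises it, and Step 35 simply records the final value.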
Reliability, Accuracy, Misfeed Rate
• Improved test method replaces material that was historically included in the VSS/VVSG; hence, it is included in drafts
• Now evaluated using data collected during all tests, rather than a single, isolated test
Test Validation
• Rigorous traceability to VVSG requirements
• Reviewed by independent parties: VSTL labs, experts, the public
• EAC: updated to consistent nomenclature and traceability
• Procedural validation (ongoing):
• Operating temperature and humidity: complete
• Other VVSG 1.1 test suite components are under consideration and will be conducted in 2011
Next Steps
• Continue procedural validation
• Round-trip with testing laboratories to discuss methods for integrating the test methods into their workflows; success here will pave the way for the rest of the VVSG 2.0 test suites
• Continue to work with all parties to improve the VVSG, manufacturer implementations, testing practices, and the test suites
Discussion