Fast and Thorough: Quality Assurance for Agile Data Warehousing Projects
Test Types (1 of 4)
• Unit
  • Evaluate the quality of a single developer story
  • Perhaps a single ETL mapping or a session
  • Mostly functional, some system metadata
• Component
  • Unit-test-style verification of an assembly of units, perhaps a workflow
• Integration
  • Evaluate the coherence of the full application, as far as it exists to date
  • Add data scenarios, e.g., nominal, dirty data, missing data, end-of-period (EOP) processing
• System
  • All of the above, but most likely a subset conducted formally as a final certification
  • Add very technical tests of operational topics
Test Types (2 of 4)
• Functional: Does an object meet its business requirements?
  • Example: Does it transform a particular set of input as predicted by an example from an SME?
• Story Tests: Did the product owner accept a user story during the user demo at an iteration's end?
• Simulations: Does it transform an entire set of data as predicted by the project's lead roles?
• Alpha Tests: Can the team get the app to behave when they take the role of users?
Test Types (3 of 4)
• Scenarios: Consider a compound business situation: can the app support all needs at once?
• Exploratory Testing: Consider the edges of the business requirements: is other functionality needed?
• Usability Testing: Will the app sustain the business functions of each "user persona" we intend to support?
• UAT (User Acceptance Testing): Does the app deliver on every case of a structured, business-run system appraisal?
• Beta Tests: Does the system perform when real users interact with it and when it's loaded with realistic data?
Test Types (4 of 4)
• Performance Tests: Does the app have acceptable response times under a normal load?
• Load Tests: …under the highest conceivable load?
• Security Tests: Is the system sufficiently resistant to a wide range of techniques to reveal and/or compromise its data?
• Non-Functional Requirements Tests: Does the application meet a wide range of criteria for long-term inclusion in the department's IT platform and low total cost of ownership?
Techniques to Choose From
• Unit testing: the developer's standard testing technique
• Systems-analyst inspection: manual data inspection by a systems analyst, especially on matters concerning business rules and source-to-target mappings
• Actual-data analytics: scripts, usually of SQL commands, run against actual results
• Actual-to-expected result comparisons: usually scripts employing SQL "minus" commands
• UAT subset: the appropriate items from the evolving user-acceptance script that the solution designer and product owner are accumulating
• Full UAT: execution of the full user-acceptance test script for the release
Data-Based Testing Scenarios
• Nominal ("happy path")
• Dirty Data (human-visible syntax flaws)
• Corrupted Data (machine-visible syntax flaws)
• Missing Rows / Tables (e.g., no customer record or file)
• Incoherent Data (skipped or mis-sequenced files)
• Duplicate Data (e.g., overlap between extracts)
• End-of-Period (e.g., month, quarter, year)
• Archiving (purging of data that is too old)
• Catch-Up (usually three days per run day)
• Restart (test the operation instructions)
• High- or Full-Volume (performance and one-in-a-million errors)
• Resource Outage (e.g., an FTP node goes down)