Automation @ Squirrel Systems
Our automated solution at a glance
By: Justin Hollander
About the Presenter • Employed at Squirrel Systems since August 2007 as a QA Developer/Analyst • Volunteered for one year in Guatemala to help build an online system for the Central American district; trained staff and volunteers to use the system and prepared manuals and other documentation • BCIT alumnus, currently attending BCIT
About the Presentation • Benefits of Automation • Description of automation needs at Squirrel Systems • Design goals vs. cost considerations • Previous attempts at solving the automation / autoretest problem • The current design of our autoretest solution • Problems we overcame • Future Considerations • Questions and Answers
Our Automation Needs • Reduce the overhead of manual certification checklists • Free testers to do more functional / exploratory testing on parts of the system that cannot be tested in an automated fashion • Consistent testing, day in and day out
Previous Attempts • Attempt 1: Batch processing, relied on playback utilities, with an Access database (DB) to manually track successes and failures of the running scripts • Attempt 2: Improved batch processing (more readable scripts), still relied on playback utilities, but without the Access DB to track the successes and failures of the running scripts
Problems with Attempt 1 • Script information and state were tracked manually • Batch scripting is not as flexible or powerful as a programmatic solution, so it could not provide the options needed for a comprehensive automated solution • Results were not archived and therefore could not be linked to historical runs • Output was in a raw text-only format and very cumbersome to process manually • Too many false positives: comparison was blind rather than intelligent, making the results nearly impossible to trust
Problems with Attempt 2 • Same as Attempt 1, except that without the Access DB there was no way to track the running scripts at all • The solution was scaled back due to resource constraints, and the project had a hard time getting traction and support • Ultimately, the project was suspended
What We Learned • Automation/autoretest is more complex than we thought • There is no “easy” solution • Externalizing our test input data while still using our playback utilities to retest the system is impossible • At a minimum, output data must be analyzed more intelligently to eliminate false positives • Batch scripting isn’t working • A more dedicated approach is required
Autoretest, TAKE 3! • We decided to try this again • Decision to scrap the previous work and start again from the design phase • The requirements were well understood; however, the right skill-set was missing until now
High-level Design vs. Cost

Design goals:
• Use existing playback tools within a framework to retest recorded ‘tracks’
• Use intelligent comparison algorithms to rule out false positives
• Output results in a human-friendly format to ensure quicker analysis of potential problems
• Externalize the test input data from the system
• Re-usable parts to extend the framework to all parts of the POS system

Cost considerations:
• Limited to 40 man-hours a week (basically me)
• Development team on a tight schedule, with limited resources for this project
• No money for an off-the-shelf solution
• Management wanted a solution rather quickly (6 months)
• Money already invested; two previous attempts at solving this in the last 5 years have failed
The Current Design • Client/server model • Written in Java with an easy-to-use administrative GUI • Data externalized as much as possible • Intelligent output comparison algorithms • One DB configuration supports one or more test scripts • Supports XML processing • Comparison output formatted in HTML • Automatically detects whether the software under test is the current build and updates it if needed (a sketch of such a check follows)
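The build check mentioned above could look something like the following. This is a minimal sketch, assuming the installed and latest build numbers are published as one-line version files; the class name, file layout, and version format are my assumptions, not Squirrel’s actual code.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class BuildChecker {

        // Reads the first line of a version file (assumed to hold the build number).
        private static String readVersion(String path) throws IOException {
            BufferedReader in = new BufferedReader(new FileReader(path));
            try {
                return in.readLine().trim();
            } finally {
                in.close();
            }
        }

        // Returns true when the installed build differs from the latest build,
        // meaning the client should update before running any test scripts.
        public static boolean updateNeeded(String installedFile, String latestFile)
                throws IOException {
            return !readVersion(installedFile).equals(readVersion(latestFile));
        }
    }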
The Current Design (cont.) • Test outputs and comparison results are archived so that historical data can be linked to present results • Script information stored in a relational SQL DB • Script information linked to the GUI using a custom-written SQL driver • Script states are linked to scripts via the GUI and are not static • Client has a headless mode supported by a multi-threaded timer mechanism (sketched below) • Extensible due to its polymorphic nature; can easily support additional forms/levels of testing
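As a rough illustration of the headless client mode, a java.util.Timer can drive a polling loop on its own thread. The class and method names below (HeadlessClient, runPendingScripts, POLL_INTERVAL_MS) are illustrative assumptions, not the actual client code.

    import java.util.Timer;
    import java.util.TimerTask;

    public class HeadlessClient {
        private static final long POLL_INTERVAL_MS = 60 * 1000; // check every minute

        public static void main(String[] args) {
            // Non-daemon timer thread keeps the headless client alive.
            Timer timer = new Timer("autoretest-poller", false);
            timer.scheduleAtFixedRate(new TimerTask() {
                public void run() {
                    runPendingScripts(); // pull scheduled scripts from the server
                }
            }, 0, POLL_INTERVAL_MS);
        }

        private static void runPendingScripts() {
            // Placeholder: in the real client this would fetch script state from
            // the SQL DB and drive the playback utilities.
            System.out.println("Polling server for scheduled test scripts...");
        }
    }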
Code Sample – DB Compare

    // Compares a baseline table (vB) against a retest result table (vR).
    // Cells that still match are tagged "VALID", so only true mismatches
    // remain for further processing.
    private static void processRows(Vector<String> vB, Vector<String> vR) {
        if (vB.toString().equals(vR.toString())) { // table contents equal?
            vB.removeAllElements();
            vR.removeAllElements();
        } else { // not equal: probable bug or code change in Squirrel
            if (vB.size() == vR.size()) { // same size, so just mismatches?
                for (int i = 0; i < vB.size(); ++i) {
                    if (vB.get(i).equals(vR.get(i))) {
                        // The only cells not tagged are the mismatches; they can
                        // be filtered out while still preserving the row number.
                        vB.set(i, "VALID");
                        vR.set(i, "VALID");
                    }
                }
            } else if (vB.size() > vR.size()) { // missing data?
                validateDifferentSizeTable(vR, vB);
            } else if (vB.size() < vR.size()) { // extra data; this should be rare
                validateDifferentSizeTable(vB, vR);
            }
        }
        // That should leave us with any differences for the HTML Manager
        // to process further.
    }
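The validateDifferentSizeTable helper called above is not shown on the slide; the sketch below is one plausible implementation (a greedy alignment of the smaller table against the larger one), offered as a guess rather than the actual code.

    // Guess at the helper's behaviour: walk the smaller table against the
    // larger one, tagging rows that still match as "VALID" so only the
    // inserted or missing rows remain untagged.
    private static void validateDifferentSizeTable(Vector<String> smaller,
                                                   Vector<String> larger) {
        int j = 0; // cursor into the larger table
        for (int i = 0; i < smaller.size(); ++i) {
            // Advance through the larger table until the current row matches,
            // leaving any skipped (extra or missing) rows untagged.
            while (j < larger.size() && !smaller.get(i).equals(larger.get(j))) {
                ++j;
            }
            if (j < larger.size()) {
                smaller.set(i, "VALID");
                larger.set(j, "VALID");
                ++j;
            }
        }
    }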
Problems Solved with This Implementation • All DB false positives have been eliminated • We can use our scripts longer, as output with so few false positives is easier to trust • Output is easy on the eyes thanks to colour-coded HTML (illustrated below) • Playback utilities have been made more robust • The system is much more configurable than before
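To illustrate the colour-coded HTML output, a row renderer might look like the following; the method name, table layout, and colour choices are assumptions for illustration only.

    // Renders one comparison row: green when both sides were tagged "VALID"
    // by the DB compare, red when a mismatch survived the filtering.
    private static String toHtmlRow(int rowNum, String baseline, String result) {
        boolean match = "VALID".equals(baseline) && "VALID".equals(result);
        String colour = match ? "#ccffcc" : "#ffcccc"; // green = match, red = mismatch
        return "<tr bgcolor=\"" + colour + "\">"
             + "<td>" + rowNum + "</td>"
             + "<td>" + baseline + "</td>"
             + "<td>" + result + "</td></tr>";
    }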
Considerations for the Future • Remote web administration using JSP and Java servlets • Make better use of RMI technology and evolve from a client/server model into a distributed system (see the interface sketch below) • Completely externalized data • Build robots to do the maintenance on the system
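A minimal sketch of what an RMI-based distributed version might expose is below; the interface name and methods are assumptions about a possible future design, not existing code.

    import java.rmi.Remote;
    import java.rmi.RemoteException;

    public interface AutoretestService extends Remote {
        // List the scripts scheduled for a given client machine.
        String[] getScheduledScripts(String clientName) throws RemoteException;

        // Report a finished run so results can be archived on the server
        // and linked to historical data.
        void reportResult(String scriptName, boolean passed, String htmlReport)
                throws RemoteException;
    }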