Test Automation for Verifying Software’s Detectability for Rule Violations Name: Zhishuai Yao Supervisor: Prof. Jukka Manner Place: Varian Medical Systems Finland Oy
Outline • Overview and background • Objectives of the thesis • Design and implementation • Results and conclusions • Q & A
Overview and Background • This thesis was done at a company that develops software for radiation therapy in cancer treatment • Automated tests were created to verify the error-detecting mechanism (“checking functions”) in the software
Overview and Background • Radiation therapy • Uses a radiation beam to irradiate the tumor • Requires high accuracy in positioning the tumor and dosing the treatment
Overview and Background • Treatment planning system (TPS) • A computerized application used to simulate the dose distribution on CT images • The variety of inputs to the TPS increases the risk in radiation therapy
Objectives of the Thesis The tests are implemented to: • High level: reduce the risk in radiation therapy • Low level: eliminate errors in the TPS by verifying the “checking functions” in the application
Design and Implementation • Testing target: the “checking functions” • For each specific rule violation, a checking function throws an error or warning message to notify the user • Testing method: black-box testing • Faulty cases are generated to violate each predefined rule, and the test checks whether the correct error or warning message is thrown by the “checking function” (see the sketch below)
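The following is a minimal Python sketch of this black-box idea, not the thesis code: each rule is paired with a function that corrupts otherwise valid plan data and with the message the checking function is expected to raise. All names (remove_prescription_dose, RULES, run_checking_function) are hypothetical illustrations.

```python
# Hypothetical sketch: map each rule to a faulty-case generator and the
# expected error/warning message from the checking function under test.

def remove_prescription_dose(plan):
    plan["prescription_dose"] = None      # violate "prescription dose must be defined"
    return plan

RULES = {
    "RULE-001": {
        "make_faulty": remove_prescription_dose,
        "expected_message": "Prescription dose is not defined",
    },
    # ... one entry per predefined rule
}

def check_rule(rule_id, valid_plan, run_checking_function):
    """Black-box check: violate one rule, then verify the reported message."""
    rule = RULES[rule_id]
    faulty_plan = rule["make_faulty"](dict(valid_plan))   # corrupt a copy of the data
    messages = run_checking_function(faulty_plan)         # system under test
    return rule["expected_message"] in messages
```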
Design and Implementation • Challenges • Understanding each rule (requirement) and finding the proper parameter to violate it • Setting the criteria for the test • Keeping execution time short and the tests reusable (e.g., for regression testing)
Design and Implementation • Test procedure (a sketch follows): • Import prerequisite data • Run the checking function on the original data • No error or warning should be thrown • Modify a specific parameter • Run the checking function again • The expected error or warning should be thrown • Log the result
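A minimal Python sketch of this procedure, assuming hypothetical hooks (import_data, run_checking_function, modify_parameter, log_result) that are passed in by the caller; the real test framework and TPS API are not shown in the presentation.

```python
# Hypothetical test harness following the six steps listed above.

def run_test_case(rule_id, expected_message,
                  import_data, run_checking_function, modify_parameter, log_result):
    data = import_data("prerequisite_dataset")        # 1. import prerequisite data

    baseline = run_checking_function(data)            # 2. run checks on original data
    assert expected_message not in baseline, \
        "clean data must not trigger the message"     #    no error/warning expected

    faulty = modify_parameter(data)                   # 3. modify one specific parameter

    result = run_checking_function(faulty)            # 4. run the checks again
    passed = expected_message in result               # 5. expected message must appear

    log_result(rule_id, passed)                       # 6. log the verdict
    return passed
```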
Results and Conclusions • The automated tests covered 93 rules (requirements) by the time this thesis was finalized (currently more than 120)
Results and Conclusions • Issues found by the tests include: • The associated warning or error is not shown • An unrelated warning or error is shown in addition to the correct warning or error message • Corruption in a data-model dependency rule • Some of the mandatory attributes are not correctly configured in the system