Keyword-Driven Automated Performance Testing of User Interfaces: A Case Study for the Open Element Management System Suite
Thesis presentation, 25.03.2009
Author: Ni Na
Supervisor: Professor Jörg Ott
Instructor: Jori Aro
Content
• Background
• Research problem
• Research methods
• Desktop Robot Test in a Virtual Machine
• Desktop Robot Test in Multiple Virtual Machines and Hosts
• Use Cases
• User keyword library
• Collected Performance Indicators
• Conclusion
• Future research
Background
• The thesis was conducted for the OES product in the OSS department at NSN
• OES is a next-generation element management system platform product that provides out-of-the-box element management system functionality and an operation and maintenance (O&M) interface solution
• Identify bottlenecks and defects in OES
• Verify that the performance results meet the customers' requirements
Research problem
How to implement multi-user GUI performance testing on OES
Objectives:
• Tests are executed by instantiating the necessary number of test clients and by configuring the testing tool runtime
• Different types of users are simulated by executing different patterns of use cases
• HW resource consumption on the client and server side is recorded
• Response times of the clients are recorded
• Resource consumption profiles are collected for the GUI, the J2EE server-side SW, and the relational database
• Dimensioning rules are created
• The needed HW capacity can be defined for a given amount of load
• Using the collected profiles, bottlenecks of the system are identified
Research methods
• Literature study
  • Software performance testing
  • Test automation
  • Keyword-driven test automation framework
• Technology:
  • Uses Robot Framework with the Desktop keyword library
  • Executes Robot in KVM (Kernel-based Virtual Machine), one VM per simulated user (a launcher sketch follows below)
• Implementation:
  • User keyword library
  • User test cases
• Executed test cases:
  • Server-side load: 20-100 FM events/sec
  • Use cases: Analyze Event, Customer Complaint
  • Number of simultaneous users: 20 -> 30
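The slides do not show how the per-user Robot runs are started. The sketch below is one hypothetical way to do it from a controller machine, assuming each KVM guest is reachable over ssh and has the jybot launcher (Robot Framework on Jython) and the test data file installed; the guest names, paths, and the USER_ID variable are illustrative assumptions, not the thesis setup.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical launcher: starts one Robot Framework run per simulated
 * user, each inside its own KVM guest reached over ssh. Guest names,
 * file paths, and the USER_ID variable are assumptions for illustration.
 */
public class RobotTestLauncher {

    public static void main(String[] args) throws IOException, InterruptedException {
        int users = 20;                       // number of simultaneous users
        List<Process> runs = new ArrayList<>();

        for (int i = 1; i <= users; i++) {
            // Each guest (guest1, guest2, ...) runs one Robot instance;
            // --variable passes the user id into the test data.
            ProcessBuilder pb = new ProcessBuilder(
                    "ssh", "guest" + i,
                    "jybot", "--variable", "USER_ID:" + i,
                    "--outputdir", "/tmp/results/user" + i,
                    "/opt/tests/oes_gui_tests.html");
            pb.inheritIO();                   // show Robot output on the console
            runs.add(pb.start());
        }

        // Wait for all test clients to finish before collecting results.
        for (Process p : runs) {
            p.waitFor();
        }
    }
}
```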
Desktop Robot Test in a Virtual Machine
[Architecture diagram: the Robot tests (HTML test data) drive the OES desktop client inside a client virtual machine; the client connects over RMI (port 10099), JMS, and HTTPS to the OES server running on the physical OES host.]
Desktop Robot Test in Multiple Virtual Machines and Hosts
[Architecture diagram: test host 1 and test host 2 each run several guest OS instances, each guest running one desktop test (DT) client; all DT clients connect over RMI, JMS, and HTTPS to the SUT (OES), consisting of APPSERV and BACKEND tiers.]
Use Cases
• Analyze Event
• Customer Complaint
• Repeated 40 times per hour for each user, i.e., one execution every 90 seconds (a pacing sketch follows below)
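The pacing implied by 40 executions per hour is 3600 s / 40 = one use-case run every 90 seconds. A minimal sketch of how a test client could enforce that rate is below; runUseCase() is a hypothetical stand-in for one Analyze Event or Customer Complaint round driven through the keyword library.

```java
/**
 * Minimal pacing sketch: 40 use-case executions per hour means one
 * execution every 3600 s / 40 = 90 s. runUseCase() is a hypothetical
 * stand-in for one Analyze Event or Customer Complaint round.
 */
public class UseCasePacer {

    private static final long PACING_MS = 3_600_000L / 40;   // 90 000 ms

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 40; i++) {                        // one hour of load
            long start = System.currentTimeMillis();
            runUseCase();
            // Sleep for whatever is left of the 90 s slot so the rate
            // stays at 40/hour regardless of response times.
            long remaining = PACING_MS - (System.currentTimeMillis() - start);
            if (remaining > 0) {
                Thread.sleep(remaining);
            }
        }
    }

    private static void runUseCase() {
        // Placeholder: in the real setup the use case is driven by
        // Robot Framework test cases calling the user keyword library.
    }
}
```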
User keyword library
• CallPopupOnCompositeView.java
• ChangePropertyInTable.java
• ClickToTableHeader.java
• ExpandTreeNodeInTable.java
• FindTableColumnByName.java
• GetCellContentInOneColumnByContentInAnotherColumn.java
• IfDialogHasBeenOpened.java
• LeftClickToTableHeader.java
• RightClickToTableHeader.java
• SelectCompositePanelView.java
• SelectFromOneRowPopupMenuForAcknowledge.java
• SelectFromOneRowPopupMultipleMenus.java
• SelectFromOneRowPopupOneMenu.java
• SelectFromTableCellPopupMenu.java
• SelectOneRowFromEventList.java
• WaitForJTreeNode.java
• WaitTableCell.java
• Naming rule: each keyword is named with clear, descriptive words, so that everyone involved in the testing process can tell from the name what the keyword does (an implementation sketch follows below)
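The slides list only the file names, not their contents. As an illustration, here is a minimal sketch of what a keyword such as WaitTableCell.java could look like, assuming the library follows Robot Framework's Java/Jython library convention (public methods become keywords) and drives a Swing JTable; the findTable lookup helper is a hypothetical assumption, since the slides do not show how GUI components are located.

```java
import javax.swing.JTable;

/**
 * Sketch of one user keyword in Robot Framework's Java/Jython library
 * convention: each public method becomes a keyword, so this class would
 * expose the keyword "Wait Table Cell". How the JTable is located
 * (findTable) is a hypothetical assumption.
 */
public class WaitTableCell {

    /**
     * Polls the given table cell until it contains the expected text
     * or the timeout (in seconds) expires.
     */
    public void waitTableCell(String tableName, int row, int column,
                              String expected, int timeoutSeconds)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutSeconds * 1000L;
        while (System.currentTimeMillis() < deadline) {
            JTable table = findTable(tableName);
            if (table != null
                    && row < table.getRowCount()
                    && column < table.getColumnCount()) {
                Object value = table.getValueAt(row, column);
                if (value != null && expected.equals(value.toString())) {
                    return;                   // cell reached expected content
                }
            }
            Thread.sleep(500);                // poll twice per second
        }
        throw new RuntimeException("Cell (" + row + ", " + column
                + ") of table '" + tableName + "' did not show '"
                + expected + "' within " + timeoutSeconds + " s");
    }

    private JTable findTable(String tableName) {
        // Hypothetical: locate the JTable in the OES client component
        // tree, e.g. by its accessible name; the real library's lookup
        // mechanism is not shown in the slides.
        return null;
    }
}
```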
Collected Performance Indicators
• Server-side resource usage
  • Appserver CPU usage -> max number of users per appserver
  • DB server CPU usage -> max number of users per DB server CPU
  • DB server disk I/O usage -> max number of users per DB server disk
  => max number of users per cluster
• Client-side resource usage
  • Client-side CPU usage -> max number of users per client CPU core
• GUI response time
  • Average and maximum response time for each GUI action -> compared against the system requirements
(A dimensioning sketch based on these indicators follows below.)
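The dimensioning rules follow from these indicators: each resource's measured per-user consumption bounds the number of users it can serve, and the cluster-wide limit is the minimum of those bounds. A minimal sketch of that arithmetic is below; all figures are illustrative assumptions, not measured results from the thesis.

```java
/**
 * Dimensioning-rule sketch: each resource's per-user consumption bounds
 * the number of users it can serve; the cluster supports the minimum of
 * those bounds. All figures below are illustrative assumptions, not
 * measurements from the thesis.
 */
public class DimensioningRules {

    public static void main(String[] args) {
        // Assumed measured consumption per simulated user (illustrative).
        double appservCpuPerUserPct = 2.5;   // % of one appserver CPU
        double dbCpuPerUserPct      = 1.2;   // % of DB server CPU
        double dbDiskIoPerUser      = 0.8;   // MB/s of DB disk I/O

        // Assumed capacity budget available for the GUI load (illustrative).
        double appservCpuBudgetPct = 80.0;
        double dbCpuBudgetPct      = 70.0;
        double dbDiskIoBudget      = 40.0;   // MB/s

        int maxUsersAppserv = (int) (appservCpuBudgetPct / appservCpuPerUserPct);
        int maxUsersDbCpu   = (int) (dbCpuBudgetPct / dbCpuPerUserPct);
        int maxUsersDbDisk  = (int) (dbDiskIoBudget / dbDiskIoPerUser);

        // The tightest resource determines the cluster-wide limit.
        int maxUsersCluster = Math.min(maxUsersAppserv,
                              Math.min(maxUsersDbCpu, maxUsersDbDisk));

        System.out.println("Max users per appserver CPU : " + maxUsersAppserv);
        System.out.println("Max users per DB server CPU : " + maxUsersDbCpu);
        System.out.println("Max users per DB server disk: " + maxUsersDbDisk);
        System.out.println("Max users per cluster       : " + maxUsersCluster);
    }
}
```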
Conclusion
• The statistics obtained so far show that OES has met the desired performance objectives; however, performance tuning and improvements will continue to be implemented in future releases of OES
Future research
• Utilize multiple client environments
• The configuration and setup of the test system should happen automatically, not only the test execution
• More actions should be executed during the GUI tests, which means that new user keywords and test cases need to be developed
• Performance tuning