Managing ERP System Maintenance Release Testing • Test Management Forum, January 2008
Graham Marcus • Reports to Global Technology • “Senior Applications Consultant” • Business Analyst • System Analyst • Functional Design • Test Management • Functional Testing • Test Automation • Project Manager • Certified Accountant • ISEB Foundation Certificate
System • PeopleSoft Finance • FAP • P2P • R2C • C2R • CM • Inbound Interfaces • Outbound to Essbase for Reporting • Vanilla and Customised • Single Global Instance • Housed in Atlanta • Users in ASIA, EMEA, AMER
Team • Project Manager – Atlanta • Test Lead – UK • Developers – Atlanta • Technical – Atlanta • Analysts / Testers – Atlanta, UK, Sydney • QA Testers – Delhi • System Support – Global and regional • SME – Global and regional
Test Environments • Development • Developers and Unit Testing • Stage • System Testing • QA • Regression Testing • User Acceptance
Test Management Tools • Mercury TestDirector for Quality Center • Releases • Test Library • Test Plan • Test Recording • Defect Management • Mercury QuickTest Professional (QTP) • Functional Automation • WinRunner • Conference Calls • Test Strategy document
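As a minimal sketch of the defect-management side of this toolset (not the team's actual script), the snippet below logs a defect into Quality Center through the OTA COM interface via pywin32. It assumes a Windows client with the QC OTA components registered; the server URL, domain, project, credentials and field values are placeholders.

```python
# Hedged sketch: log a regression defect into Quality Center via the OTA COM API.
# Requires Windows, pywin32 and the QC client components; all names below are
# placeholders, not the real environment details.
import win32com.client

def log_regression_defect(summary, detected_by):
    td = win32com.client.Dispatch("TDApiOle80.TDConnection")
    td.InitConnectionEx("http://qc-server/qcbin")                 # placeholder URL
    td.ConnectProjectEx("FINANCE", "PSFT_MAINT", "tester", "secret")  # placeholders
    try:
        bug = td.BugFactory.AddItem(None)   # create a new, empty defect record
        bug.Summary = summary
        bug.Status = "New"
        bug.DetectedBy = detected_by
        bug.Post()                          # commit the defect to the project
        return bug.ID
    finally:
        td.DisconnectProject()
        td.ReleaseConnection()
```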
What Constitutes a ‘Release’? • Project Portfolio • Vendor delivered patches and bundles of fixes • Retrofitting of existing customisations over patches • Underlying Technology Changes • Further customisation enhancements • Internal infrastructure changes • Internal process changes
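To make the idea of a release "portfolio" concrete, here is an illustrative sketch (not taken from the presentation) of one way to represent it so that each change item can later carry its own test scope. The change types mirror the categories listed above; the item references and descriptions are invented examples.

```python
# Illustrative data model for a maintenance release portfolio.
from dataclasses import dataclass, field
from enum import Enum

class ChangeType(Enum):
    VENDOR_PATCH = "Vendor delivered patch / fix bundle"
    RETROFIT = "Retrofit of existing customisation over patch"
    TECHNOLOGY = "Underlying technology change"
    ENHANCEMENT = "Customisation enhancement"
    INFRASTRUCTURE = "Internal infrastructure change"
    PROCESS = "Internal process change"

@dataclass
class ChangeItem:
    reference: str
    change_type: ChangeType
    description: str
    modules: list = field(default_factory=list)   # e.g. ["P2P", "CM"]

@dataclass
class Release:
    name: str
    items: list

release = Release(
    name="2008-R1",                                # invented release name
    items=[
        ChangeItem("BND-123", ChangeType.VENDOR_PATCH, "Vendor fix bundle", ["P2P"]),
        ChangeItem("ENH-045", ChangeType.ENHANCEMENT, "New approval workflow", ["R2C"]),
    ],
)
```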
Different from a Project
• Project: defined objectives • business requirements • design documentation • control interactions • emphasis on System testing and UAT
• Maintenance Release: starting with a system that the business is already using • multiple, varied objectives • poor documentation for vendor fixes (better for enhancements) • cross-issue impacts • primary requirement: don't break anything • emphasis on Regression testing and UAT
Challenges • Geography and time zones • Portfolio on different time tracks • Environment Status compared with Production • Snowballing – find an issue and search out other occurrences • Cross module ‘end to end’ testing • Defect fix and retesting • Defects requiring vendor intervention • Upstream data acquisition (stubs) • Downstream data validation (e.g. banks) • Interpretation of effect and impact of a patch
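One of the challenges above, upstream data acquisition, is usually handled with stubs. The sketch below shows a hypothetical stub that generates a small, repeatable inbound interface file for regression runs instead of waiting on a real upstream feed; the pipe-delimited layout and field names are invented for illustration and are not the actual PeopleSoft interface definition.

```python
# Hypothetical upstream stub: write a small inbound interface file for testing.
import csv
from datetime import date

def write_inbound_stub(path, rows):
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle, delimiter="|")
        writer.writerow(["BUSINESS_UNIT", "INVOICE_ID", "AMOUNT", "CURRENCY", "INVOICE_DT"])
        writer.writerows(rows)

write_inbound_stub(
    "ap_invoices_stub.txt",
    [
        ["GB001", "INV-0001", "125.00", "GBP", date.today().isoformat()],
        ["US001", "INV-0002", "980.50", "USD", date.today().isoformat()],
    ],
)
```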
Test Approach
• Vendor delivered patches and bundles of fixes
  – Where the impact can be identified: system and regression test
  – Where a logged business issue is addressed: system, regression and UA test
  – Where a customisation is replaced: system, regression and UA test
  – Where the impact cannot be identified: group by process and regression test in both test environments
• Retrofitting of existing customisations over patches
  – Regression test in both test environments; consider UAT
• Underlying technology changes
  – For identifiable issues, search out all occurrences, then system and regression test
  – Regression test again
• Further customisation enhancements
  – Treat as a mini project: system test, regression test around it if required, UA test
• Internal infrastructure changes
  – As for technology changes
• Internal process changes
  – As for technology changes
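The approach above is essentially a decision table, so a condensed sketch of it is given below. The category and test-level names paraphrase the slide rather than quote any real configuration, and the mapping is a simplification (for example, vendor patches only reach UAT when they address a logged business issue or replace a customisation).

```python
# Condensed, illustrative mapping from change category to the test levels it
# normally triggers; names are paraphrased from the slide, not a real config.
TEST_APPROACH = {
    "vendor_patch_known_impact":    ["system", "regression"],
    "vendor_patch_business_issue":  ["system", "regression", "uat"],
    "vendor_patch_replaces_custom": ["system", "regression", "uat"],
    "vendor_patch_unknown_impact":  ["regression in both environments, grouped by process"],
    "retrofit_customisation":       ["regression in both environments", "uat (consider)"],
    "technology_change":            ["system (identified issues)", "regression"],
    "customisation_enhancement":    ["system", "regression", "uat"],   # treated as a mini project
    "infrastructure_change":        ["system (identified issues)", "regression"],
    "process_change":               ["system (identified issues)", "regression"],
}

def required_tests(change_category):
    """Return the test levels for a change category, defaulting to regression only."""
    return TEST_APPROACH.get(change_category, ["regression"])

print(required_tests("retrofit_customisation"))
```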
Discussion • Maintenance Testing • The Release ‘Portfolio’ • Mapping Test Theory to Real World • Our ‘hybrid’ role