Performance Testing in the Real World: How to Plan for a Successful Performance Test
Sept 24th, 2010
Mohit Verma, Lead Performance Architect, Tufts Health Plan
Agenda
• Background
• Market State
• Why Performance Test?
• Technical Environment
• Performance Testing
• Performance Testing Benefits
• Performance Testing CSFs
• Performance Testing Synergies
• Questions?
Background
• Tufts Health Plan is a managed care provider (HMO); our applications typically support health care providers (hospitals, etc.), employers, and our members.
• Active channels:
• EDI
• Legacy
• Web
Market State
• Forrester recently reported that among companies with revenue of more than $1 billion, nearly 85% had experienced incidents of significant application performance degradation. Respondents identified application architecture and deployment as the primary root causes of application performance problems.
Example 1: Amazon.com – June 29th Outage*
• Amazon.com experienced a widespread outage on Tuesday that lasted, at least for many customers, more than three hours, displaying blank or partial pages instead of product listings.
• By mid-afternoon, Amazon's home page was devoid of any product photographs and showed only a list of categories on the left of the screen. Searching for items often didn't work, and customers' shopping carts and saved-item lists were temporarily displayed as empty.
• With annual revenue of nearly $27 billion, Amazon faces a potential loss of an average of $51,400 a minute when its site is offline. Amazon shares closed down 7.8 percent, a sharper fall than the Nasdaq index.
• A post on an Amazon seller community forum at 12:47 p.m. PDT said: "We are currently experiencing an issue that is impacting customers' ability to place orders on the Amazon.com website." A follow-up announcement an hour later said the problem had not been resolved.
Example 2: Chase Outage, Sept 14, 2010
Example 3: Dell.com gave shoppers the fastest high-broadband access time among large web retailers in April 2010, according to Gomez.
Why Performance Test?
• Software engineers often build software components and products without knowing the target load, environment requirements, or service level agreements
• The complexity and highly distributed nature of the hardware and web hosting servers makes optimal configuration of applications challenging
• Globalization of users adds further complexity
• Virtualization of business-critical applications demands performance testing
Recommendation: performance test proactively and early in the software development lifecycle
Technical Environment: N-tier Diagram - Simple
Typical Technical Environment
Technologies used (a complex and diverse environment):
• Web application servers: WebLogic, WebSphere, JBoss, AquaLogic
• Infrastructure security: CA SiteMinder, IBM Tivoli Access Manager
• Web servers: Apache, IIS
• Middleware: TIBCO BusinessWorks and BusinessConnect
• Reporting: Siebel, Lawson, Actuate
• Midrange/mainframe/legacy: HP/IBM
• Hardware: Itanium and Windows environments
• Databases: Oracle and SQL Server
• EDI interfaces: HIPAA compliance
Technical Environment: N-tier Environment
Test environments are most often not replicas of production, so we often have to do some extrapolation or accept the risk of performance testing in that environment.
Performance Testing
What is performance testing?
• Testing that measures a performance goal (response time)
• Testing that measures application performance under user load
• Testing that measures system performance under user load across all system variables in the deployment environment
• Testing that stresses the application/system to find its limits
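To make the definition concrete, here is a minimal sketch of the core idea: drive a workflow with N concurrent users and record response times. The URL and user counts are hypothetical placeholders; a real test would use a dedicated load tool, as discussed later in this deck.

```python
# Minimal sketch: measure response time under concurrent user load.
# The URL and user counts are hypothetical placeholders.
import concurrent.futures
import statistics
import time
import urllib.request

URL = "https://example.com/portal/login"  # hypothetical workflow endpoint

def one_request() -> float:
    """Issue one request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

def run_load(users: int, requests_per_user: int) -> None:
    """Simulate `users` concurrent users, each issuing several requests."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(one_request)
                   for _ in range(users * requests_per_user)]
        times = [f.result() for f in futures]
    print(f"{users} users: median={statistics.median(times):.2f}s "
          f"max={max(times):.2f}s")

if __name__ == "__main__":
    for users in (22, 44, 66):  # ramp pattern like the results shown later
        run_load(users, 5)
```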
Performance Testing
Key variables measured:
• End-user response time
• Resource utilization (CPU, memory, disk, etc.)
• Network utilization and latency
• Throughput (bytes/sec, etc.)
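As an illustration of capturing the resource-side variables, the sketch below samples CPU, memory, and network counters on a test host. It assumes the third-party psutil package; production-grade monitoring would use the tooling discussed later.

```python
# Sketch: sample CPU, memory, and network utilization on a test host.
# Assumes the third-party psutil package (pip install psutil).
import time

import psutil

def sample(interval: float = 5.0) -> None:
    """Print CPU %, memory %, and network throughput every `interval` seconds."""
    prev = psutil.net_io_counters()
    while True:
        time.sleep(interval)
        cpu = psutil.cpu_percent(interval=None)  # % of CPU since last call
        mem = psutil.virtual_memory().percent
        net = psutil.net_io_counters()
        bytes_per_sec = (net.bytes_sent + net.bytes_recv
                         - prev.bytes_sent - prev.bytes_recv) / interval
        prev = net
        print(f"cpu={cpu:.0f}%  mem={mem:.0f}%  net={bytes_per_sec / 1024:.1f} KB/s")

if __name__ == "__main__":
    sample()
```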
Performance Testing – Benefits
• Measure response time for applications and enforce SLAs
• Improve the end-user experience
• Proactive load/stress testing of mission-critical applications lets us benchmark applications by concurrent-user support, response times, etc.
• Capacity planning: save costs ($$) by sizing production and non-production environments more accurately
• Help build proven, scalable applications
• Failover capabilities*
Performance Testing: Critical Success Factors
• Understand the drivers and triggers for performance testing (NFRs)
• Build or identify a production workload model
• Well-defined success criteria (SLAs)
• Identify the business-critical workflows of the application
• Identify/create test data
• Build a test environment that models production*
• Support of all teams – performance testing is a TEAM effort!
• Workflow automation tool (load test tool)
• Load generation environment
• Performance test analysis/reporting
• Management that values performance testing
• Keep control of the performance test environment
• Never let development teams run the performance test for you
Successful Performance Test Lifecycle
Performance Testing CSFs: Drivers and Triggers
• SLA change
• Hardware change (upgrade/downgrade, virtualization)
• Application software upgrade (new features/enhancements)
• Infrastructure software upgrade/patch (security, database, systems, etc.)
• Compliance patch (DOD)
• Java/.NET version upgrade
• Unexpected growth in the number of users
• Database retention policy change
Typically, the non-functional requirements (NFRs) should dictate the need for performance testing.
Performance Testing CSFs: Production Workload Model
What is the existing usage of the application/system?
• Transaction throughput (per hour, per day)
• Number of concurrent users in the average hour and the peak hour
• Most used transactions
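One way to derive these numbers is from production access logs. The sketch below assumes a simplified one-record-per-line format (hour, user, transaction); a real log would need real parsing.

```python
# Sketch: derive a workload model from a production access log.
# The log path and "hour user transaction" line layout are assumptions
# for illustration, not a real log format.
from collections import Counter

LOG = "access.log"  # hypothetical export of production traffic

hourly = Counter()        # transactions per hour
users_per_hour = {}       # distinct users per hour
transactions = Counter()  # volume per transaction type

with open(LOG) as f:
    for line in f:
        parts = line.split()
        if len(parts) < 3:
            continue
        hour, user, txn = parts[:3]
        hourly[hour] += 1
        users_per_hour.setdefault(hour, set()).add(user)
        transactions[txn] += 1

peak = max(hourly, key=hourly.get)
print(f"peak hour: {peak} with {hourly[peak]} transactions "
      f"and {len(users_per_hour[peak])} distinct users")
print("most used transactions:", transactions.most_common(5))
```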
Performance Testing CSFs: Well-Defined Success Criteria
• How do we know if the test was a success?
• Document SLAs (response time, CPU/memory usage thresholds)
• Meets customer goals
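Success criteria are easiest to enforce when the SLAs are written down as data and checked mechanically. A sketch follows; the transaction names echo the reports later in this deck, and the thresholds are illustrative, not actual Tufts Health Plan targets.

```python
# Sketch: encode SLAs as data and check test results against them.
# Thresholds are illustrative only; "MP_Login" echoes a report shown later.
SLAS = {
    "MP_Login":    {"p90_response_s": 3.0, "max_cpu_pct": 75},
    "ClaimSearch": {"p90_response_s": 5.0, "max_cpu_pct": 75},
}

def check(txn: str, p90_response_s: float, max_cpu_pct: float) -> bool:
    """Return True and print PASS if the measured values meet the SLA."""
    sla = SLAS[txn]
    ok = (p90_response_s <= sla["p90_response_s"]
          and max_cpu_pct <= sla["max_cpu_pct"])
    print(f"{txn}: {'PASS' if ok else 'FAIL'} "
          f"(p90 {p90_response_s}s vs {sla['p90_response_s']}s, "
          f"cpu {max_cpu_pct}% vs {sla['max_cpu_pct']}%)")
    return ok

check("MP_Login", 2.4, 68)     # meets both thresholds
check("ClaimSearch", 6.1, 70)  # fails on response time
```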
Performance Testing CSFs: Define Business-Critical Workflows
• Identify the business-critical workflows of the application
• Use the 80/20 rule (Pareto's principle): 20% of the transactions cause 80% of the defects in production. Performance testing is not typically a full regression test; 20% of the total test cases gives you 80% coverage.
• Include resource-intensive transactions (CPU, database, memory, network)
• Include highly used transactions
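One concrete way to apply the 80/20 rule is to pick the smallest set of transactions that covers roughly 80% of production volume. The transaction names and counts below are invented for illustration.

```python
# Sketch: pick the smallest set of workflows covering ~80% of volume.
# Transaction names and counts are invented for illustration.
volume = {
    "Login": 40_000,
    "ClaimSearch": 25_000,
    "EligibilityCheck": 15_000,
    "ProviderSearch": 8_000,
    "ReportExport": 2_000,
}

total = sum(volume.values())
covered, selected = 0, []
# Greedily take the highest-volume transactions until 80% is covered.
for txn, count in sorted(volume.items(), key=lambda kv: kv[1], reverse=True):
    selected.append(txn)
    covered += count
    if covered / total >= 0.80:
        break

print(f"workflows to test: {selected} ({covered / total:.0%} of volume)")
```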
Performance Testing CSFs: Test Data Identification
• Performance testing is data-driven testing
• Choose your test data carefully, in consultation with production workload models or business analysts
• Represent boundary-value conditions (for example, large result sets)
• Represent the required security roles when creating test IDs
• Test with a production-sized database
• Test with the same data setup at least twice for consistency
• Test with a randomized data setup at least once
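The repeatable-versus-randomized point can be handled with a seeded generator: the same seed reproduces an identical data file for consistency runs, while no seed randomizes it. Field names, ID format, and pool sizes below are illustrative assumptions.

```python
# Sketch: build a test-data file mixing boundary values with random records.
# Field names, ID format, and pool sizes are illustrative assumptions.
import csv
import random

MEMBER_IDS = [f"M{n:07d}" for n in range(1, 50_001)]  # production-sized pool
BOUNDARY_TERMS = ["A", "Z" * 50, "Smith"]             # tiny, huge, common result sets

def build_dataset(path, rows, seed=None):
    """Write (member_id, search_term) rows for the load tool to consume.

    The same seed reproduces an identical file (consistency runs);
    seed=None gives a randomized setup."""
    rng = random.Random(seed)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["member_id", "search_term"])
        for _ in range(rows):
            term = rng.choice(BOUNDARY_TERMS + [rng.choice("ABCDE") * 3])
            writer.writerow([rng.choice(MEMBER_IDS), term])

build_dataset("testdata_repeat.csv", 1000, seed=42)  # identical across runs
build_dataset("testdata_random.csv", 1000)           # randomized run
```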
Performance Testing CSFs: Test Environment Considerations*
• Develop and enforce a test-readiness checklist
• Pristine performance test environment:
• Monitoring tools set up – historical data is mandatory
• Locked-down environment (including disabling virus scans)
• Production-sized in all respects, if possible
• Document and communicate any deviations from production to stakeholders
• If the environment is shared:
• Disable builds and deployments during test times
• Build and communicate a test schedule
• Communicate, communicate, communicate
• Shut down environments that are not needed
• Monitor, monitor, monitor
Performance Testing CSFs: Team Support Needed
Performance testing is a TEAM effort!
• Developers
• DBAs
• Network engineers
• System engineers
• Business (involve them to run UAT during performance testing execution)
Performance engineers typically do the first/second line of analysis*
A root-cause-analysis tool may eliminate the need for a total team effort
Performance Testing CSFs: Load Test Tool
• Efficient performance testing needs an automation tool (industry standard or open source) for:
• Quick scripting
• Correlation and replay of scripts
• Building test models/scenarios
• Executing test scenarios
• Analysis
• Monitoring
• Home-grown tools may suffice where the technology platform is less varied, or for proprietary applications
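As one open-source example of such a tool, the sketch below scripts two weighted workflows in Locust, a current Python load tool (it postdates this talk, so treat it purely as an illustration). The paths, weights, and host are hypothetical.

```python
# Sketch: a scripted, weighted user workflow in Locust, one open-source
# load tool of the kind described above. Paths and weights are hypothetical.
from locust import HttpUser, task, between

class PortalUser(HttpUser):
    wait_time = between(1, 5)  # think time between transactions

    @task(4)                   # weight 4: a highly used transaction
    def view_claims(self):
        self.client.get("/claims")

    @task(1)                   # weight 1: a less frequent transaction
    def search_provider(self):
        self.client.get("/providers", params={"zip": "02111"})

# Example run (headless, ramping to 44 concurrent users):
#   locust -f portal_test.py --host https://test.example.com \
#          --users 44 --spawn-rate 2 --run-time 30m --headless
```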
Performance Testing CSFs: Load Generation Environment
Mimic production if possible:
• Firewalls
• Several network locations, or use a WAN emulator
Performance Testing CSFs: Performance Test Tools
Load/Performance Test Tool – Benefits
• Identify and resolve performance bottlenecks quickly
• Repeatable tests can be scripted and run quickly
• Real-world user scenarios can be modeled by the tools
• Helps improve the quality and stability of applications
• Provides server-monitoring capability for non-production environments
• Provides correlated performance analysis reports with drill-down capability
• Integrates with existing production monitoring tools
Performance Testing CSFs: Performance Test Analysis/Reporting
The tool's analysis module provides:
• Real-time monitoring graphs
• Transaction response time reports
• User ramp-up graphs
• Transaction response summary graphs
• Drill-down for root cause analysis
• Correlated graphs and results
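A first line of analysis often reduces raw per-transaction timings to percentiles. The sketch below assumes a simple two-column CSV export (transaction, response_s); real tools export richer formats.

```python
# Sketch: first-line analysis of raw timings exported from a load tool.
# The two-column CSV layout (transaction,response_s) is an assumption.
import csv
from collections import defaultdict

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an ascending-sorted list."""
    k = max(0, round(p / 100 * len(sorted_vals)) - 1)
    return sorted_vals[k]

times = defaultdict(list)
with open("results.csv") as f:
    for row in csv.DictReader(f):
        times[row["transaction"]].append(float(row["response_s"]))

for txn, vals in sorted(times.items()):
    vals.sort()
    print(f"{txn}: n={len(vals)} p50={percentile(vals, 50):.2f}s "
          f"p90={percentile(vals, 90):.2f}s max={vals[-1]:.2f}s")
```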
Performance Testing – Analysis/Reporting: Production Profiles
• Provider Portal – July 1, 2007 to June 30, 2008
• Public Portal – July 1, 2007 to June 30, 2008
Performance Testing – Analysis/Reporting: Sample Report 1
Performance Testing – Analysis/Reporting: Sample Report 2
Performance Testing – Analysis/Reporting: Error Rate Graph
[Graph: running Vusers against error rate, annotated with two error rate jumps and a portal crash]
Performance Testing – Analysis/Reporting: Non-Compliant SLA Report (MP_Login)
Performance Testing – Analysis/Reporting: SLA Report After Enhancements
Performance Testing – Analysis/Reporting: Login Test Results (22 and 44 Concurrent Users)
[Graphs: response time and running Vusers for the 22- and 44-concurrent-user runs]
Performance Testing – Analysis/Reporting: Test Results (66, 88 and 100 Concurrent Users)
[Graphs: response time for the 66-, 88-, and 100-concurrent-user runs]
Performance Testing – Analysis/Reporting: Database Monitoring Report
Performance Testing – Synergies
• Performance testing and application performance management (APM) go hand in hand
• Performance testing proactively identifies and resolves issues before production; metrics captured during performance testing can help build and monitor production systems more accurately
• Performance testing scripts can be reused for synthetic transaction monitoring in production for SLA enforcement
• Performance testing tools can be used for root cause analysis and to replicate production issues
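As a sketch of the script-reuse idea, the loop below replays one monitored transaction every five minutes and flags SLA breaches. The URL, threshold, and alert hook are hypothetical stand-ins for a real monitoring setup.

```python
# Sketch: reuse a test workflow as a production synthetic monitor.
# URL, SLA threshold, and the alert hook are hypothetical stand-ins.
import time
import urllib.request

URL = "https://portal.example.com/login"  # the monitored transaction
SLA_SECONDS = 3.0

def alert(message: str) -> None:
    print("ALERT:", message)  # stand-in for a real notification channel

def probe() -> None:
    """Replay the transaction once and alert on failure or SLA breach."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            ok = resp.status == 200
    except OSError:
        ok = False
    elapsed = time.perf_counter() - start
    if not ok or elapsed > SLA_SECONDS:
        alert(f"ok={ok} elapsed={elapsed:.2f}s (SLA {SLA_SECONDS}s)")

while True:
    probe()
    time.sleep(300)  # replay the workflow every five minutes
```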
Application Performance Testing/Monitoring – Magic Quadrant
Questions/Discussion?