Alfresco Benchmark Framework
Derek Hulley, Repository and Benchmark Team
Some History
• 2008:
  • Simple RMI-based remote loading of FileFolderService
  • Unisys 100M-document repository-specific test
• Then:
  • QA wrote JMeter scripts for specific scenarios
  • Customers, partners and field engineers provided tests for specific scenarios
  • Hardware shared with QA and dev, as required
• Mid 2011:
  • RackSpace benchmark environment commissioned
• Late 2011:
  • Failed attempt to simulate real Share-Repo interaction using JMeter (AJAX, etc.)
  • First major proof of the 4.0 architecture started (JMeter with sequential, heaviest calls); later called BM-0001
  • Client load driver was resource-intensive and results had to be collated from 3 servers
• Early 2012:
  • Benchmark Projects Lead role created
  • Evaluation of benchmarking technology
• Mid 2012:
  • Successful approximation of Share-Repo interaction using JMeter
  • Benchmark Projects formalized
  • BM-0002 executed (ongoing for regression testing)
  • BM-0009 started
(Some of the) Benchmark Framework Requirements
• Real browser interaction
  • JavaScript, asynchronous calls, resource caching, etc. (a minimal sketch follows this list)
• Scaling
  • Scale to thousands of active sessions
  • Elastic client load drivers
  • Shared thread pools
• Results
  • Durable and searchable results
  • Support real-time observation and analysis of results
  • Every logical action must be recorded
  • Every result (positive or negative) must be recorded
• Tests
  • Treated as real software (automated integration testing, major and minor versions, etc.)
  • Reusable code or components
  • Aware of server state
  • Different tests can share the same data set-up
• Execution
  • Remote control from the desktop
  • Override default test properties
  • Concurrent test execution
  • Start / Stop / Pause / Reload
  • Automated reloading
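The "real browser interaction" requirement is why the architecture slide below lists WebDriver among the common libraries on each benchmark server: a real browser exercises JavaScript, asynchronous calls and resource caching exactly as a user would. A minimal, hypothetical sketch of a browser-driven, timed action using Selenium WebDriver; the URL, form field names and event name are illustrative assumptions, not the framework's actual API:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class ShareLoginSketch {
    public static void main(String[] args) {
        // A real browser: JavaScript, asynchronous calls and resource caching
        // behave exactly as they would for a human user.
        WebDriver driver = new FirefoxDriver();
        try {
            long start = System.currentTimeMillis();

            // Hypothetical Share URL and form field names -- adjust to the target system.
            driver.get("http://share.example.com/share/page/");
            driver.findElement(By.name("username")).sendKeys("admin");
            driver.findElement(By.name("password")).sendKeys("admin");
            driver.findElement(By.name("submit")).click();

            long elapsed = System.currentTimeMillis() - start;
            // In the framework, every logical action is recorded as an event
            // (name, start time, duration, success/failure); here we just print it.
            System.out.println("share.login took " + elapsed + " ms");
        } finally {
            driver.quit();
        }
    }
}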
Benchmark Server Architecture
• Client: configuration and reporting
• MongoDB: test run event queues, test run results, data mirror collections
• ZooKeeper: server configuration, test definitions, test run definitions
• Benchmark Server 1 … N: thread pool, common libraries (e.g. WebDriver)
• Test Target
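Because all test run events and results live in MongoDB, a load driver only needs a MongoDB connection to make its results durable, searchable and observable while the run is still in progress. A rough sketch using the plain MongoDB Java driver rather than the framework's own classes; the host, database, collection and field names are illustrative assumptions:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class RecordEventResultSketch {
    public static void main(String[] args) {
        // Connect to the shared MongoDB instance holding the test run data.
        try (MongoClient client = MongoClients.create("mongodb://bm-mongo.example.com:27017")) {
            MongoCollection<Document> results =
                    client.getDatabase("bm-test-run").getCollection("results");

            // One document per executed event: every result, positive or negative,
            // is recorded and can be queried in real time during the run.
            Document result = new Document("event", "share.login")
                    .append("driver", "benchmark-server-1")
                    .append("startTime", System.currentTimeMillis())
                    .append("durationMs", 742L)
                    .append("success", true);
            results.insertOne(result);
        }
    }
}
```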
Modifying Test Parameters During a Run
(Chart annotations: HTTP connection pool refreshing, paused test, continued test, doubled workflow rate, halved workflow rate.)
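The chart (only its annotations survive here) shows a run being paused, continued, and its workflow rate doubled and then halved without a restart. One way this can work when run properties are kept in a shared store: the controller updates the property, and each driver picks up the new value on its next poll. A hypothetical sketch against MongoDB; the property, database and collection names are assumptions, not the framework's real schema:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.Updates;
import org.bson.Document;

public class OverrideRunPropertySketch {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://bm-mongo.example.com:27017")) {
            MongoCollection<Document> props =
                    client.getDatabase("bm-test-run").getCollection("runProperties");

            // Double the workflow creation rate of the running test.
            // Drivers re-read the run properties periodically, so the change
            // takes effect without stopping or reloading the test.
            props.updateOne(
                    Filters.eq("name", "workflow.createRatePerMinute"),
                    Updates.set("value", 120));
        }
    }
}
```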