Innovative Test Practices Paul Gerrard, Gerrard Consulting, 1 Old Forge Close, Maidenhead, Berkshire, SL6 2RD, UK e: paul@gerrardconsulting.com w: http://gerrardconsulting.com t: 01628 639173 Assurance with Intelligence
What’s Innovative? Assurance with Intelligence
What’s innovative? • New – just invented – new to everyone • New to me – never heard of it • Boss’s big idea (airline magazine syndrome) • My big idea – I want to shape the future • Innovative – I can see why no one did it before, but now I can see why it might work! Assurance with Intelligence
Innovative? • But ... it could be old hat • It’s something re-badged/renamed to look new • We tried it and do it all the time now! • We tried it and it didn’t work for us • One person’s ‘innovative’ is another person’s ‘old hat’ • Should we take a sceptical view? • YES! But we must always be open to new ideas. Assurance with Intelligence
Can innovations in testing be thought of in the same way? Assurance with Intelligence
Not all innovations make it across the “chasm” Assurance with Intelligence
The Hype Cycle Assurance with Intelligence
Emerging practices • Let’s take a look at emerging practices that • Look promising • Are being hyped and might (or might not) work • Are destined to become mainstream because they make it over the ‘chasm’ • Are they • Too good to be true? • Just hype? • A REAL opportunity to get better/faster/cheaper? Assurance with Intelligence
What’s on your list? Here’s mine: • Virtualisation and the cloud • Behaviour-Driven Development • Crowdsourced Testing Assurance with Intelligence
Virtualisation • Huge pressure on IT to rationalise environments • To reduce energy footprint • To migrate from legacy hardware • To simplify environment management and support • Tools to manage transition from legacy to virtual are here. Assurance with Intelligence
Opportunities • Creating environments will become routine and automated • Base test environments can be created as virtual appliances • New test environments copied and the SUT installed (a sketch of this workflow follows below) Assurance with Intelligence
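As an illustration only (not from the original slides), here is a minimal sketch of the "base environment as appliance" workflow using the Python Docker SDK; the image name and the install/test scripts are hypothetical, and any virtualisation API would serve equally well:

```python
import docker  # pip install docker

client = docker.from_env()

# A pre-built image plays the role of the baselined 'virtual appliance'.
BASE_IMAGE = "testlab/base-env:1.0"  # hypothetical appliance image


def fresh_environment(name):
    """Copy the base appliance into a brand-new, disposable environment."""
    return client.containers.run(BASE_IMAGE, name=name, detach=True, tty=True)


env = fresh_environment("sut-test-01")
try:
    # Install the system under test into the copy, then run the tests.
    env.exec_run("install-sut.sh")                    # hypothetical installer
    exit_code, output = env.exec_run("run-tests.sh")  # hypothetical test runner
    print(output.decode())
finally:
    # Environments are disposable: throw the copy away when done.
    env.remove(force=True)
```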
Consequences • A new emphasis on build and installation processes • Test the deployment processes early – so environments become ‘easy’ to deploy • Test environments as reusable appliances? • The cloud becomes a reality • The cloud is a solution looking for a problem. Assurance with Intelligence
The cloud (Wikipedia) Cloud computing is Internet- ("cloud-") based development and use of computer technology ("computing").[1] In concept, it is a paradigm shift whereby details are abstracted from the users who no longer need[says who?] knowledge of, expertise in, or control over[dubious – discuss] the technology infrastructure "in the cloud" that supports them[clarification needed].[2] Cloud computing describes a new supplement, consumption and delivery model for IT services based on Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet.[3][4] The term cloud is used as a metaphor for the Internet, based on the cloud drawing used to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.[5] Typical cloud computing providers deliver common business applications online which are accessed from a web browser, while the software and data are stored on servers. Assurance with Intelligence
What the cloud gives us • On demand access to: • Someone else’s spare capacity (infrastructure) • Your own spare capacity (you mean we have some?) • And the good news? • Use it, abuse it – you don’t need to worry about supporting it • It’s a relatively cheap commodity if you are an occasional user. Assurance with Intelligence
And how can testers benefit? • Client or server environments • Build, baseline, copy as many times as you like • Use once and throw away • Don’t attach a screen shot to an incident – attach your whole environment • Environments can be disposable! • Automation process (sketched below): • Copy a prepared environment • Run automated tests • Store results • Throw environment away or archive. • What else can you think of? Assurance with Intelligence
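A minimal sketch of that copy-test-dispose loop against AWS EC2 using boto3, assuming the baselined test environment has been captured as an AMI; the image ID is a placeholder and the remote test step is elided:

```python
import boto3  # pip install boto3

ec2 = boto3.resource("ec2")

# Launch a fresh copy of a baselined test environment (a prepared AMI).
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical baseline image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance = instances[0]
instance.wait_until_running()
instance.reload()
print("Test environment ready at", instance.public_ip_address)

# ... run the automated tests against the instance (e.g. over SSH)
# and store the results; elided here ...

# Throw the environment away (or stop it instead, to 'archive' it).
instance.terminate()
instance.wait_until_terminated()
```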
Behaviour-Driven Development (and testing) Assurance with Intelligence
What’s BDD • Dan North coined the term and defines it thus: • “BDD is a second-generation, outside-in, pull-based, multiple-stakeholder, multiple-scale, high-automation, agile methodology. It describes a cycle of interactions with well-defined outputs, resulting in the delivery of working, tested software that matters.” Assurance with Intelligence
Domain specific languages • A domain specific language (DSL) is a programming language or specification language dedicated to a particular problem domain, e.g. banking, retail, defence... • COBOL was an early attempt ... uh-oh... • (a tiny example of an ‘internal’ DSL follows below) Assurance with Intelligence
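To make the idea concrete (this example is mine, not from the slides): a DSL can also be embedded inside an ordinary language, so that statements read close to the domain. A tiny, hypothetical banking example in Python:

```python
class Account:
    """A toy 'internal DSL' for the banking domain: every operation
    returns the account, so calls chain into readable sentences."""

    def __init__(self, owner, balance=0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount
        return self

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self

    def should_have_balance(self, expected):
        assert self.balance == expected, (self.balance, expected)
        return self


# Reads almost like a domain sentence:
Account("Alice").deposit(100).withdraw(30).should_have_balance(70)
```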
The practices of BDD 1 • Establishing the goals of different stakeholders required for a vision to be implemented • Drawing out features which will achieve those goals using feature injection • Involving stakeholders in the implementation process through outside-in software development • Using examples to describe the behaviour of the application, or of units of code Assurance with Intelligence
The practices of BDD 2 • Automating those examples to provide quick feedback and regression testing • Using 'should' when describing the behaviour of software to help clarify responsibility and allow the software's functionality to be questioned • Using 'ensure' when describing responsibilities of software to differentiate outcomes in the scope of the code in question from side-effects of other elements of code • Using mocks to stand in for collaborating modules of code which have not yet been written (see the sketch below) Assurance with Intelligence
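A minimal sketch of several of these practices together, using behave (a Python analogue of Cucumber); the feature wording, the step definitions and the mocked payment gateway are all hypothetical:

```python
# The Gherkin example (would live in features/refund.feature):
#
#   Scenario: Refunding a cancelled order
#     Given a paid order for 50.00
#     When the customer cancels the order
#     Then the payment gateway should be asked to refund 50.00

# Step definitions (features/steps/refund_steps.py):
from unittest.mock import Mock

from behave import given, when, then


@given("a paid order for {amount:f}")
def step_paid_order(context, amount):
    # A mock stands in for a payment gateway that is not yet written.
    context.gateway = Mock()
    context.order = {"amount": amount, "status": "paid"}


@when("the customer cancels the order")
def step_cancel_order(context):
    # In real code this step would call the application, which in turn
    # uses the gateway; simulated directly here to keep the sketch short.
    context.order["status"] = "cancelled"
    context.gateway.refund(context.order["amount"])


@then("the payment gateway should be asked to refund {amount:f}")
def step_refund_requested(context, amount):
    context.gateway.refund.assert_called_once_with(amount)
```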
A new focus on requirements elicitation • Where do testers and test managers fit in Agile projects? • With BDD the door is open for us to take on some (all?) of the analyst role • Focus on stakeholder goals (great!) • We learn of features as they are thought of • Act as a conduit/facilitator between stakeholders and developers • A new title: Analyst/Tester? Assurance with Intelligence
Be involved early, involved often • Haven’t we been asking for this for years? (forever?) • A new focus on “test early, test often” at a system level • Example-driven development (cf. test cases) • Automated testing from day 1 (cf. TDD) • Know what code exists and what doesn’t (example below). Assurance with Intelligence
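In the example-driven spirit, each stakeholder example can be captured as an automated test from day one; a minimal pytest sketch (the discount rule is invented for illustration):

```python
import pytest


def discount(order_total):
    """Hypothetical rule agreed with stakeholders:
    10% off orders of 100.00 or more, otherwise no discount."""
    return order_total * 0.10 if order_total >= 100 else 0.0


# Stakeholder examples, captured verbatim as test data.
@pytest.mark.parametrize("total,expected", [
    (99.99, 0.0),    # just below the threshold: no discount
    (100.00, 10.0),  # at the threshold: 10% off
    (250.00, 25.0),  # well above the threshold: still 10% off
])
def test_discount_examples(total, expected):
    assert discount(total) == pytest.approx(expected)
```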
Things to look out for • BDD • Cucumber (an open-source tool to support BDD) • Still too developer-focused for my liking • But it’s early enough to influence its development • Maybe a variant product for the Analyst/Tester role? • Keep a close watch on the Agile movement • I sense there’s renewed momentum, realism, pragmatism and enthusiasm • Tester as mentor/coach is “transitional” Assurance with Intelligence
Crowdsourced Testing Assurance with Intelligence
Crowdsourcing Testing – Can it Work? • Quite a lot of hype surrounding distributed and remote resourcing • Clouds and… • Crowds • Remote, removed, responsible, cheap, disposable, unaccountable, irresponsible… • What’s the real agenda with crowds? Assurance with Intelligence
What is a crowd? • Any group of people who find themselves in a shared situation • Where does the “wisdom” come from? • Markets are crowds – able to determine the value of a stock, a product or a restaurant, for example • Crowds don’t need super-intelligent people • Crowds usually out-perform ‘experts’, but: • “Anyone taken as an individual is tolerably sensible and reasonable – as a member of a crowd, he at once becomes a blockhead” – Bernard Baruch • What is going on here? Assurance with Intelligence
Crowdsourcing • The White Paper Version: Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call. • The Soundbyte Version: The application of Open Source principles to fields outside of software. Assurance with Intelligence
Crowdsourcing is a buzzword – can testing benefit from it? • Not many companies are using a 'crowd' to test – yet • uTest.com appear to be the foremost promoters of this approach in the US, but it hasn't penetrated the UK much • Five main options for crowdsourced testing? Assurance with Intelligence
1. Crowded Scripting and Execution • If the deliverable from testers is information, how exactly does one reward one's crowd of testers? • Do we pay for documented tests and then a log of the tests being performed? • Probably not • We would need to manage the scope of documented tests – otherwise everyone would go for the low-hanging fruit and we'd end up with a load of useless, duplicate tests • We'd still get some useful test activity, but the chaotic nature of these tests (with duplicated tests and untested areas of the application in the mix) would make it impossible to manage, and the rewards to testers would be hard to distribute fairly. Assurance with Intelligence
2. Scripted Tests, Crowded Execution • Maybe we could distribute scripted tests to the crowd and say “get on with it” • In this case, we may still need to manage execution and assure ourselves that the tests, as documented, have actually been executed • We could trust the crowd to execute tests remotely and log the tests and incidents as appropriate • But we still need to manage their activity to avoid people wasting their time running tests that we know will fail or logging defects we already know about • We could retain a coordinating role – that's not too hard to do either • But if we're just hiring people to execute tests, and we want people we can trust, it's probably more effective (and possibly cheaper) to hire some junior testers to run our tests reliably under our supervision • Crowded execution of scripted tests doesn't leap out to me as a great opportunity. Assurance with Intelligence
3. Crowded Exploratory Testing • We could publish an initial set of exploratory test charters and manage those across the 'team' • Coordination and management of the feedback from testers could be done, but clearly we lose the benefits of close collaboration and communication • How do we reward exploratory test activity? • To be both prudent and fair to the testers involved we need to monitor and manage their useful activity. This is relatively easy to do with in-house resources, but with remote exploratory testers that's hard – unless we trust them • One or two exploratory testing support tools are emerging and maybe they could be used to collect feedback efficiently • But forcing the testers to use a logging tool might undermine their effectiveness • Tester A might thrash away and run hundreds of tests – achieving nothing • Tester B might spend time thinking through an anomaly and decide after due consideration that the software works fine, or perhaps they log a defect instead • The notes of these two testers could be dramatically different in scale and value. How could we reward that activity fairly without micro-managing them? Assurance with Intelligence
4. Crowded Informal Load/Performance Testing • If we hire the crowd to perform basic transactions in volume, the workload generated could be useful as a test, or as background load for other tests we might run internally • The crowd's contribution could easily be monitored, and testers could be paid by the hour or per valid transaction executed, as we could monitor this on our test environment. This could be an economic and practical proposition. • But… • If the transactions we ask the crowd to execute are simple enough that we don't need to monitor them too closely, would automation, using a cheap or free tool, do the job just as well? (see the sketch below) • Tools don't need global coordination across timezones; they are reliable, don't complain or get tired, and can be worked 24 hours a day. The economics may not work in the crowd's favour after all. Assurance with Intelligence
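For comparison, this is roughly all it takes to generate that kind of simple transaction load with a free tool such as Locust; the host and endpoints here are hypothetical:

```python
# locustfile.py – run with: locust -f locustfile.py --host https://test.example.com
from locust import HttpUser, task, between


class BasicShopper(HttpUser):
    """Simulates the simple transactions we might otherwise have asked
    the crowd to perform – 24 hours a day, with no coordination needed."""

    wait_time = between(1, 5)  # think time between transactions

    @task(3)
    def browse_catalogue(self):
        self.client.get("/catalogue")     # hypothetical endpoint

    @task(1)
    def view_product(self):
        self.client.get("/catalogue/42")  # hypothetical endpoint
```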
5. Crowded Configuration Testing • The crowd provide us with a ready-made, limitless collection of widely varying technology configurations • Potentially, the hardware out there provides the most obvious opportunity • Whether scripted or exploratory, incidents raised by the crowd have some validity • An agent installed on the testers' machines could provide reliable inventories of the hardware and software in use by the testers, and this could be posted back for scrutiny (a sketch of such an agent follows below) • This has real possibilities. Assurance with Intelligence
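A minimal sketch of such an inventory agent in Python, using only the standard library; the collection endpoint is hypothetical, and a real agent would need the tester's consent and a much richer software inventory:

```python
import json
import platform
import urllib.request


def collect_inventory():
    """Gather a basic hardware/software inventory of this machine."""
    return {
        "os": platform.system(),             # e.g. 'Windows', 'Linux'
        "os_version": platform.version(),
        "architecture": platform.machine(),  # e.g. 'x86_64'
        "processor": platform.processor(),
        "python": platform.python_version(),
    }


def post_inventory(inventory, url="https://example.com/api/inventory"):
    """Post the inventory back to the coordinator (hypothetical endpoint)."""
    request = urllib.request.Request(
        url,
        data=json.dumps(inventory).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    print(json.dumps(collect_inventory(), indent=2))
```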
Pay-per-Bug • We could reward testers for raising valid and distinct defect reports • Testers will tend to focus on trivial defects to earn as much as possible, but the same defects may be logged repeatedly by different testers • Perhaps the reward for defects could be weighted by severity, and testers paid more for critical defects? • But the temptation would be to underplay defects to pay testers less. Would testers ever believe they were being treated fairly? • The other 'problem' is that of software quality • If the crowd are testing high-quality software, there won't be many defects to detect and report • A full day's testing might go unrewarded – it's just too hard to make a buck. However, if the quality is poor, then the number of defect reports being logged and paid for might bankrupt the client! • Is there a middle ground where the fairness and economics even out? It's hard to see how "reward by defect" will work. Assurance with Intelligence
Other Practical Considerations • Remote testing of large, commercially sensitive systems across insecure connections is out of the question • Only ‘simple’ systems, or systems installable on home computers or handsets, are viable • Multi-user environments require complex test databases to be established and controlled for testing – a problem • Remote, autonomous testers will either need very close coordination and supervision, or they must be granted their own dedicated test environment, or both • This severely limits the types of application for which crowdsourced test execution is viable. Assurance with Intelligence
Crowdsourced testing – summary • The difficulties are significant • Crowdsourced configuration testing – maybe • Outsourcing all testing to the crowd – no way • It can work for tiny, trivial, toy systems • Perhaps the best use of the crowd is a “mopping up” strategy? • Alternatively, perhaps we could use the crowd to give us feedback on an unfinished or evolving system • But how different is this from a traditional beta test? • I can’t see this being significant until certain questions are answered Assurance with Intelligence
Questions to be answered • What test objectives (and associated techniques) are well served by a crowd? • What test activities (plan, spec, script, run, explore, report, decide etc.) best suit this model? • Where do automation and tools fit? (Or must a crowd always be manual?) • What commercial models work for companies offering crowdsourced testers? • Can a software company or user company recruit its own crowd? • How can we identify and recruit the best testers in a crowd? • How can we reward good testers in a crowd? • How do we control crowded testing to optimise the outcome? • How do we recruit a trusted crowd and maintain confidentiality? • How do we measure the effectiveness of crowds (and compare it with other approaches)? • How do we assure the work done by a crowd? • How do we report the output of crowds? Assurance with Intelligence
Innovative Test Practices Paul Gerrard, Gerrard Consulting, 1 Old Forge Close, Maidenhead, Berkshire, SL6 2RD, UK e: paul@gerrardconsulting.com w: http://gerrardconsulting.com t: 01628 639173 Assurance with Intelligence