
From 3 weeks to 30 minutes – a journey through the ups and downs of test automation





Presentation Transcript


  1. From 3 weeks to 30 minutes – a journey through the ups and downs of test automation

  2. Who am I? • Peter Thomas • Chief Software Engineer • Operations IT, UBS Investment Bank • Developer (mostly) • I do some architecture • I have done Testing • I talk a lot (Mentor/Coach) • From the dark side (consulting) but getting better

  3. Where did we start? • Existing mainframe legacy application • 3 week manual PPT testing cycle • 12 week delivery cycle

  4. What did we want to do? • Belief there was a better way to deliver software • Incremental development to deliver business value quickly • Address the rapidly changing business landscape with flexibility in delivery • Build quality into the solutions • Deliver the software rapidly, but in a cost effective manner • Put the fun back into software delivery

  5. New York, London, Kiev, Hyderabad, Hong Kong • 2M trades per day • 100 billion settling per day in all major currencies • 50+ exchanges across EMEA and APAC • 15 scrum teams / 120 people • 9 applications • Production releases every 2 weeks

  6. 200 commits per day • 1,000 artefacts updated per day • 1 commit every 5 minutes at peak

  7. 24 build targets • 60+ test targets • 800 automated functional tests • 10,000 unit/integration tests • 7,000 behavioural tests

  8. But…

  9. Our tests were… • Complicated • Obscure • Failing randomly • Slow to run • Difficult to fix

  10. “The TDD rut” • Complicated • Obscure • Failing randomly • Slow to run • Difficult to fix

  11. Test the Right Thing and Test the Thing Right. When all you have is a hammer, everything looks like a nail.

  12. Why do you test?

  13. Why do you test? • Because TDD tells me so? • Because (insert favourite method here) says I should? • So I meet the 80% coverage metric?

  14. Why do you test? • To accept the solution • To understand and document the solution • To prove it’s not broken • To find the unknown unknowns • To help us design and build new features • To help us explore what is really needed • To show it won’t crash under load, to show it is secure (to test the ‘ilities) …?

  15. Why do you test? The Agile Testing Quadrants – Lisa Crispin, Janet Gregory

  16. Testing Purposefully

  17. The Right Thing At The Right Level Unit Component System

  18. The Right Thing At The Right Level: Unit • Tests a single class with no dependencies • If dependencies such as a Spring context or a database are used, it is really a unit-integration test • Tests technical correctness and robustness • Very specific: a failing test indicates an issue in one specific class • Difficult to perform on poor-quality code • Very fast to run; should run on the developer’s desktop in the IDE
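A true unit test in this sense exercises one class in isolation. A minimal sketch of the idea, using a hypothetical FeeCalculator (not a class from the talk):

```java
// Hypothetical pure-logic class: no Spring context, no database, no framework.
class FeeCalculator {
    long feeFor(long notional) {
        return notional < 1_000_000 ? 50 : 200;  // flat tiered fee
    }
}

class FeeCalculatorTest {
    public static void main(String[] args) {
        FeeCalculator calc = new FeeCalculator();
        // Each assertion pins one behaviour; a failure points at FeeCalculator alone.
        if (calc.feeFor(500_000) != 50) throw new AssertionError("small notional fee wrong");
        if (calc.feeFor(5_000_000) != 200) throw new AssertionError("large notional fee wrong");
        System.out.println("unit tests passed");
    }
}
```

Because the class under test has no collaborators, the test needs no setup beyond `new FeeCalculator()` and runs in milliseconds.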

  19. The Right Thing At The Right Level: Component • Tests a group of components integrated to perform a business-relevant function • Can test technical or business correctness, but should be expressed in domain concepts • Specific: a failing test indicates a problem in that component • Easier to perform on poor-quality code, provided component boundaries are clear • Can be quick to run; doesn’t need the full application and should run on the developer’s desktop
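A component test wires a few real collaborators together and asserts in domain language. A sketch with hypothetical classes (TradeReporter and EligibilityRule are illustrative, not from the talk):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical business rule, used for real (not mocked) inside the component.
class EligibilityRule {
    boolean isReportable(String trade) { return trade.startsWith("IRS"); }
}

// Hypothetical component under test: reporter plus its rule, no full application.
class TradeReporter {
    private final EligibilityRule rule = new EligibilityRule();
    private final List<String> reported = new ArrayList<>();
    void submit(String trade) { if (rule.isReportable(trade)) reported.add(trade); }
    List<String> reported() { return reported; }
}

class TradeReporterComponentTest {
    public static void main(String[] args) {
        TradeReporter reporter = new TradeReporter();
        reporter.submit("IRS-123");
        reporter.submit("FX-456");
        // Domain-language assertion: only the eligible trade is reported.
        if (!reporter.reported().equals(List.of("IRS-123")))
            throw new AssertionError("expected only IRS-123, got " + reporter.reported());
        System.out.println("component test passed");
    }
}
```

The assertion talks about trades being reported, not about getters and setters, yet a failure still points at one component.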

  20. The Right Thing At The Right Level: System • Tests a system at its boundaries as a ‘black box’ • Primarily tests business correctness • Not specific: a failing test could be caused anywhere in the system flow • Easy to perform on legacy applications; requires little code modification • Slow to run, can be fragile, and may not run on the developer’s desktop

  21. What We Wanted

  22. What We Had • Unit tests which weren’t really unit tests • End-to-end tests where unit tests would have been sufficient • Duplicate and redundant end-to-end tests

  23. The TDD Cycle

  24. TDD?

  @Test
  public void shouldBeEmptyAfterCreation() {
      ReportObject aTrm = new ReportObject();
      assertNull(aTrm.getEligibleTrm());
      assertNull(aTrm.getProcessedEvent());
      assertNull(aTrm.getPayloadXml());
  }

  @Test
  public void shouldCorrectlySetAttributesViaConstructor() {
      ReportObject aTrm = new ReportObject(eligibleObject, REPORTABLE_XML);
      assertEquals(eligibleObject, aTrm.getEligibleTrm());
      assertEquals(REPORTABLE_XML, aTrm.getPayloadXml());
  }

  @Test
  public void shouldCorrectlySetFieldsViaSetters() {
      ReportObject aTrm = new ReportObject();
      aTrm.setEligibleTrm(eligibleObject);
      aTrm.setProcessedEvent(child);
      aTrm.setPayloadXml(REPORTABLE_XML);
      assertEquals(eligibleObject, aTrm.getEligibleTrm());
      assertEquals(child, aTrm.getProcessedEvent());
      assertEquals(REPORTABLE_XML, aTrm.getPayloadXml());
  }

  25. The Hollow Egg

  26. The Hollow Egg • 98 tests, 2.5K LOC • 30 tests, 200 LOC

  27. RSpec model

  28. Outside In - The TDD Spiral

  29. Make the Intent Clear How to achieve acceptance without showing your IDE or log file to the users

  30. Unit Test Naming? • testProcessError() • whenWorkItemIsManuallyAssignedThenClientRuleShouldBeSetToManualOverride() • shouldAllowAnActioningWorkItemToBeUpdated()
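The second and third names read as behaviour specifications rather than method labels. A sketch of that style, with a hypothetical WorkItem class invented for illustration:

```java
// Hypothetical domain class; the real WorkItem from the talk is not shown.
class WorkItem {
    private String clientRule = "AUTO";
    void assignManually() { clientRule = "MANUAL_OVERRIDE"; }
    String clientRule() { return clientRule; }
}

class WorkItemTest {
    // The name states the behaviour, so a failure reads as a broken
    // requirement, not just a broken method.
    static void whenWorkItemIsManuallyAssignedThenClientRuleShouldBeSetToManualOverride() {
        WorkItem item = new WorkItem();
        item.assignManually();
        if (!"MANUAL_OVERRIDE".equals(item.clientRule()))
            throw new AssertionError("client rule was " + item.clientRule());
    }

    public static void main(String[] args) {
        whenWorkItemIsManuallyAssignedThenClientRuleShouldBeSetToManualOverride();
        System.out.println("behaviour-named test passed");
    }
}
```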

  31. Test Data Nightmare

  32. What Do You Demo?

  33. What Do You Demo?

  34. Executable Specification

  35. Improve Testing Stability Avoiding the Broken Windows syndrome

  36. Separate Progress & Regression Tests

  37. Speed-up Through Parallelism

  38. Identify Unstable Tests

  39. Quarantine Unstable Tests

  40. Avoid External Dependencies

  41. Introduce Fakes
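A fake is a lightweight, in-process stand-in for an external dependency. A minimal sketch, assuming a hypothetical RateService dependency (the interface and class names are illustrative, not from the talk):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical interface for an external rate feed the production code calls.
interface RateService {
    double rateFor(String currency);
}

// In-memory fake: deterministic and fast, no network, no shared environment.
class FakeRateService implements RateService {
    private final Map<String, Double> rates = new HashMap<>();
    void stub(String currency, double rate) { rates.put(currency, rate); }
    public double rateFor(String currency) {
        Double r = rates.get(currency);
        if (r == null) throw new IllegalArgumentException("no rate stubbed for " + currency);
        return r;
    }
}

// Production code under test, taking the dependency through the interface.
class TradeValuer {
    private final RateService rates;
    TradeValuer(RateService rates) { this.rates = rates; }
    double valueInUsd(String currency, double amount) {
        return amount * rates.rateFor(currency);
    }
}

class FakeExample {
    public static void main(String[] args) {
        FakeRateService fake = new FakeRateService();
        fake.stub("CHF", 1.25);                       // test controls the data
        double value = new TradeValuer(fake).valueInUsd("CHF", 200.0);
        if (value != 250.0) throw new AssertionError("expected 250.0, got " + value);
        System.out.println("fake-backed test passed");
    }
}
```

Because the fake lives in the test's process and memory, a failure can only come from the code under test, never from a flaky downstream system.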

  42. Avoid Time-Dependent Tests
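One common way to remove time dependence is to inject a clock instead of calling "now" directly. A sketch using `java.time.Clock` with a hypothetical SettlementWindow class:

```java
import java.time.Clock;
import java.time.Duration;
import java.time.Instant;
import java.time.ZoneOffset;

// Hypothetical class that depends on "now" only through an injected Clock.
class SettlementWindow {
    private final Clock clock;
    SettlementWindow(Clock clock) { this.clock = clock; }
    boolean isOpen(Instant cutoff) {
        return clock.instant().isBefore(cutoff);
    }
}

class ClockExample {
    public static void main(String[] args) {
        Instant cutoff = Instant.parse("2012-06-01T17:00:00Z");
        // Fixed clocks: the test controls time, so it passes at any hour on any day.
        Clock beforeCutoff = Clock.fixed(cutoff.minus(Duration.ofMinutes(5)), ZoneOffset.UTC);
        Clock afterCutoff  = Clock.fixed(cutoff.plus(Duration.ofMinutes(5)), ZoneOffset.UTC);
        if (!new SettlementWindow(beforeCutoff).isOpen(cutoff))
            throw new AssertionError("window should be open before the cutoff");
        if (new SettlementWindow(afterCutoff).isOpen(cutoff))
            throw new AssertionError("window should be closed after the cutoff");
        System.out.println("time-independent tests passed");
    }
}
```

In production the same class is constructed with `Clock.systemUTC()`; only the tests use fixed clocks.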

  43. Test Isolation
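Isolation means each test builds its own fresh fixture rather than sharing mutable state, so tests behave the same in any order. A trivial sketch with a hypothetical WorkQueue:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical mutable fixture; sharing one instance across tests causes
// order-dependent failures.
class WorkQueue {
    private final List<String> items = new ArrayList<>();
    void add(String item) { items.add(item); }
    int size() { return items.size(); }
}

class IsolationExample {
    // Per-test setup: every test gets its own instance (the role of @Before in JUnit).
    static WorkQueue freshQueue() { return new WorkQueue(); }

    public static void main(String[] args) {
        WorkQueue first = freshQueue();
        first.add("a");
        if (first.size() != 1) throw new AssertionError();

        WorkQueue second = freshQueue();   // sees no leftovers from the first test
        if (second.size() != 0) throw new AssertionError("state leaked between tests");
        System.out.println("isolated tests passed");
    }
}
```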

  44. Asynchronous Testing Headache

  45. Don’t! • Does your test really need to be asynchronous? • Remember the 80/20 rule • Create a synchronous test-runner harness
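A synchronous harness invokes the message handler directly on the test thread, bypassing the real broker, so most behaviour can be tested with no waiting at all. A sketch with a hypothetical TradeHandler:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical handler that in production is invoked by a message-listener thread.
class TradeHandler {
    private final List<String> booked = new ArrayList<>();
    void onMessage(String payload) { booked.add("BOOKED:" + payload); }
    List<String> booked() { return booked; }
}

class SynchronousHarness {
    public static void main(String[] args) {
        TradeHandler handler = new TradeHandler();
        // Synchronous harness: call the handler directly on the test thread,
        // so the test needs no sleeps, polling, or timeouts.
        handler.onMessage("trade-42");
        if (!handler.booked().contains("BOOKED:trade-42"))
            throw new AssertionError("trade was not booked: " + handler.booked());
        System.out.println("synchronous harness test passed");
    }
}
```

The queue's delivery mechanics still need a few genuinely asynchronous tests, but the business logic inside the handler does not.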

  46. Asynchronous Testing using Events
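When a test genuinely must cross threads, let the system signal the test through an event instead of the test sleeping and polling. A sketch using `CountDownLatch`, with hypothetical TradeListener/AsyncProcessor names:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Hypothetical callback the system fires when a trade has been processed.
interface TradeListener {
    void onProcessed(String tradeId);
}

// Simulated asynchronous component: notifies the listener from another thread.
class AsyncProcessor {
    void process(String tradeId, TradeListener listener) {
        new Thread(() -> listener.onProcessed(tradeId)).start();
    }
}

class EventDrivenAsyncTest {
    public static void main(String[] args) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        String[] result = new String[1];
        new AsyncProcessor().process("T-1", tradeId -> {
            result[0] = tradeId;
            done.countDown();   // the event wakes the test; no Thread.sleep()
        });
        // Bounded wait: fails fast with a clear message instead of hanging.
        if (!done.await(5, TimeUnit.SECONDS)) throw new AssertionError("timed out");
        if (!"T-1".equals(result[0])) throw new AssertionError("wrong trade: " + result[0]);
        System.out.println("event-driven async test passed");
    }
}
```

The latch makes the test as fast as the system itself: it proceeds the instant the event fires, rather than after a fixed, pessimistic sleep.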

  47. So…?

  48. Treat your Tests Like you Treat your Code • “It’s just a test class” is not an excuse • Clean Code applies to tests too

  49. Think about Why You are Testing • Specification tests for internal quality • Business tests for external quality

  50. Think about Who You are Testing For More people are interested in your tests than you may think
