A little Software Engineering: Agile Software Development, Practices through Values C Sc 335 Rick Mercer
Goal and Outline • Main Goal: • Suggest practices, values, and some process for completing a final project on time that is better than anything one person could do alone in four times the time • Outline • Distinguish Waterfall (plan driven) from Agile • 11 Practices of quality software development • Four values of Extreme Programming (XP)
Waterfall Model • Waterfall was described by 1970 • Understood as • finish each phase • don’t proceed until it is done • W. W. Royce criticized this • proposed an iterative approach
Became Popular • Management (often software-ignorant) likes phases • they make it easy to set deadlines • Customers provide all requirements • Analysts translate requirements into a specification • Coders implement the specification • Reviews ensure the specification is met • Testing is performed by testers (QA), not the devs • Maintenance means modifying as little as possible • old code is good code • Change is hard (and costly)
A Spiral Approach • Dr Barry Boehm proposed a spiral approach
To waterfall or not • It became popular • This process is still used a lot • Craig Larman's book [1] presents evidence that waterfall is a terrible way to develop software • In his study, 87% of all projects failed • The waterfall process was the "single largest contributing factor for failure, being cited in 81% of the projects as the number one problem." [1] Agile and Iterative Development: A Manager's Guide, Addison-Wesley Professional, 2003
eXtreme Programming (XP) to Agile • 60% of 2011 survey respondents said up to half of their company’s projects are Agile http://www.versionone.com/state_of_agile_development_survey/11/ • A set of SE practices that produce high-quality software with limited effort • Many books; the first were on Extreme Programming • Kent Beck: • Extreme Programming Explained: Embrace Change • http://www.extremeprogramming.org/
Extreme Programming • XP is • a disciplined approach to software development • code centric: no reckless coding, test-first • successful because it emphasizes customer involvement and promotes teamwork • not a solution looking for a problem • One of several "agile" (can adapt to change) software development processes http://www.agilealliance.org/
Essence of XP • Four variables in software development: • Cost, Time, Quality, Scope (# features) • Four Values • Communication, Simplicity, Feedback, Courage • Five Principles • Provide feedback, assume simplicity, make incremental changes, embrace change, quality work • Practices • Planning game, small releases, simple design, automated testing, continuous integration, refactoring, pair programming, collective ownership, on-site customer, coding standard • Kent says do all and turn up the dials to 10
Figure: cost of change vs. time. Under Waterfall the cost of change grows steeply over time; under XP it stays relatively flat.
Cost of the Project • Paraphrasing from two companies I've talked to (and probably many more): • When we bid projects, we charge $X for doing it Waterfall and $X/2 for doing it Agile
Other Agile Practices: The Planning Game • The planning game involves user stories • Short descriptions of desired features • Provide value to customer • Testable (do we have that feature 2 weeks in?) • Clients write stories and prioritize them • In 335, SLs wrote the requirements, you prioritize • Developers estimate how long a story takes • If you are so inclined, estimate each requirement
Practices: The planning game • Business decisions (customer) • Scope: which “stories” should be developed • Priority of stories (requirements) • Release dates • Technical decisions (developers) • Time estimates for features/stories • Elaborate consequences of business decisions • Team organization and process • Scheduling • You do the organization and schedule
Practices: Estimation • Based on similar stories from the past • Team effort • We get good at estimation simply by doing it • Ideal Engineering Time (IET) • could be points • We have used person-hours to estimate; Agile teams use points • Velocity = IET / Calendar Time • Say we can do 20 points each week • Which 20 points should we do this week?
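The velocity arithmetic above can be sketched in a few lines of Java. This is a minimal illustration, not course code: the class name, method names, and the numbers are all made up.

```java
// Minimal sketch of velocity-based iteration planning.
public class Velocity {
    // Velocity = ideal engineering time (points) completed / calendar time spent.
    static double velocity(int pointsCompleted, int weeks) {
        return (double) pointsCompleted / weeks;
    }

    // How many points can the team reasonably commit to next iteration?
    static int pointsForNextIteration(double velocity, int weeks) {
        return (int) (velocity * weeks);
    }

    public static void main(String[] args) {
        double v = velocity(60, 3);        // finished 60 points in 3 weeks
        System.out.println(v);             // prints 20.0 (points per week)
        System.out.println(pointsForNextIteration(v, 1)); // prints 20
    }
}
```

The point is only that yesterday's measured pace, not wishful thinking, drives what gets scheduled this week.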
Practices: Small Releases • Releases should be as small as possible • Should make sense as a whole • Put system into production ASAP • Fast feedback • Deliver valuable features first • Short cycle time • Planning 1-2 months rather than 6-12 months • Or in our case, every ~2-3 weeks rather than 5 weeks • We have one "release": • The last day of lecture at 3:00 pm
Practices: Simple design • The “right” design • Runs all tests • No code duplication • Fewest possible classes • Short methods • Fulfills all current requirements • Design for today, not the future • But design so the system can change • Design patterns help
Practices: Testing • Software should be tested, but testing is often spotty or overlooked • Automated testing (JUnit, for example) helps us know that a feature works and will continue to work after refactoring, additional code, and other changes • Provides confidence in the program
Testing • Write tests at the same time as production code • Unit tests: developer • Feature/acceptance tests: customer (not in 335) • Don't need a test for every method (but why not?) • Testing can be used to drive development and design of code • But it helps to have an overall architecture first (see your UML class diagram, which is subject to change) • Allows for regression testing • Did my change break previously working code? • If well-tested, you see the red bar
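A minimal sketch of such a regression test, using a hypothetical Fraction class. In 335 you would write this as a JUnit test; plain Java with an explicit check keeps the sketch self-contained.

```java
// Regression-style check for a tiny hypothetical Fraction class.
public class FractionTest {
    static class Fraction {
        final int num, den;
        Fraction(int num, int den) { this.num = num; this.den = den; }
        // a/b + c/d = (a*d + c*b) / (b*d), left unreduced for simplicity
        Fraction plus(Fraction other) {
            return new Fraction(num * other.den + other.num * den, den * other.den);
        }
    }

    public static void main(String[] args) {
        Fraction sum = new Fraction(1, 2).plus(new Fraction(1, 3));
        // Run after every change: if a refactoring breaks plus(), this fails.
        if (sum.num != 5 || sum.den != 6)
            throw new AssertionError("1/2 + 1/3 should be 5/6");
        System.out.println("all tests pass"); // prints "all tests pass"
    }
}
```

Rerunning the same checks after every change is exactly what the regression testing bullets below describe; JUnit just automates collecting and reporting them.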
SIM/SQS http://www.simgroup.com/Consultancy/regression.html • Regression Testing • re-testing of a previously tested program following modification, to ensure that faults have not been introduced or uncovered as a result of changes • Regression tests are designed for repeatability, and are often used when testing a second or later version of the system under test • Regression testing can be carried out on all applications, including e-Commerce and web-based systems
Testing • Strong emphasis on regression testing • Unit tests need to execute all the time • Unit tests pass 100% • Acceptance tests (we haven't seen these) show progress on which user stories are working • Other testing frameworks include • SUnit, JMeter, HttpUnit, JProbe, OptimizeIt, CPPUnit, PyUnit (used in ISTA 130 Fall 11)
Can't always unit test • Won’t have unit tests for • GUIs: there are testing frameworks that simulate and test user interaction, but not in this course • Just added to WebCat • Network: use visual inspection while running • Views, animation, drawing: visually inspect • this is system verification too • Randomness: some strategies might have randomness, which can be hard to test • Could have tournaments to see which AI wins
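One common workaround for the randomness problem is to inject a seeded java.util.Random so the "random" behavior becomes repeatable in tests. The SeededStrategy class and pickMove method below are hypothetical names for illustration.

```java
import java.util.Random;

// Sketch: a "random" strategy made testable by injecting its Random.
public class SeededStrategy {
    // The strategy takes its randomness as a parameter instead of
    // constructing its own Random internally.
    static int pickMove(Random rng, int numMoves) {
        return rng.nextInt(numMoves); // a move in 0 .. numMoves-1
    }

    public static void main(String[] args) {
        // Same seed => same sequence, so a test can assert exact behavior.
        int a = pickMove(new Random(42), 10);
        int b = pickMove(new Random(42), 10);
        System.out.println(a == b); // prints true
    }
}
```

In production the caller passes `new Random()`; only the tests pass a fixed seed.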
Practices: Refactoring • Restructure code without changing the functionality • Goal: Keep design simple • Change bad design when you find it • Remove dead code • Examples at Martin Fowler's Web site: http://www.refactoring.com/ (see the online catalog)
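A tiny sketch of one such refactoring, Extract Method: two near-duplicate summing loops collapsed into one helper, with the program's output unchanged. The class and method names are made up for illustration.

```java
// After refactoring: both callers share one extracted method.
// Before, the same summing loop appeared twice, once for item
// prices and once for fees (amounts kept in integer cents).
public class RefactorDemo {
    static int total(int[] cents) {
        int sum = 0;
        for (int c : cents) sum += c;
        return sum;
    }

    public static void main(String[] args) {
        int items = total(new int[] {999, 450}); // $9.99 + $4.50
        int fees  = total(new int[] {125});      // $1.25
        System.out.println(items + fees);        // prints 1574, same as before
    }
}
```

Behavior is identical before and after; only the duplication is gone, which is what makes it a refactoring rather than a change.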
Practices: Pair programming • Write production code with 2 people at one machine • Person 1: Implements the method • Person 2: Thinks strategically about potential improvements, test cases, issues • Pairs change all the time. Has advantages such as • No single expert on any part of the system • Continuous code reviews, fewer defects • Cheaper in the long run, and more fun • Can you form a team of 4 and have everyone see all code? • Problems: • Not all people like it • Pairs need to be able to work together, which requires tolerance
Practices: Collective ownership • All code can be changed by anybody on the team • Everybody is required to improve any portion of bad code s/he sees • Everyone has responsibility for the system • Individual code ownership tends to create "experts", the "experts" tend to create difficult team situations • Every year in 335...
Practices: Continuous integration • Integration happens after a few hours of development • Check out the build with your changes, make sure all tests pass (green bar) • In case of errors: • Do not put changes into the build • Fix problems • Check in the system to the common repository • Repeat • You could be using CVS, Subversion, Mercurial, or Git to store a common repository of your code
Continuous Integration • Find problems early • Can see if a change breaks the system more quickly -- while you remember the details • Add to the build in small increments
Practices: Coding standards • Coding Standard • Naming conventions and style • Least amount of work possible: Code should exist once and only once • Team has to adopt a coding standard • Makes it easier to understand other people’s code • Avoids code changes due to syntactic preferences • You get to fight over curly brace placement
Practices: On-site customer • Many software projects fail because they do not deliver software that meets needs • A real client should be part of the team • Defines / decides the needs • Answers questions and resolves issues • Prioritizes features • Helps prevent devs from making decisions like: "They probably wanted us to ....” • We will simulate this at the end of lecture • Get together by teams …
Values: Communication • Communication • Client-centric (clients write "stories") • Pair programming • Task estimation • Iteration planning • What to do in the next cycle • Design sessions
Values: Simplicity • Simplicity • Choose the simplest thing that will work • Choose the simplest design, technology, algorithm, technique
Values: Feedback • Feedback is very important • Small iterations • Frequent deliveries • Pair programming • Constant code review • Continuous integration (add often to the build—sync your code with the system) • Check out, run all tests including your new tests and code; if all pass, check in the updated system • Automated unit tests (JUnit, for example)
Values: Feedback • Compiler feedback: seconds • Pair programming feedback: half-minutes • Unit test feedback: a few minutes • Acceptance testing: half-hours • Customers write these; we are not doing this in 335 • Customer feedback: daily (or several times/week in our case) • Iteration feedback: weekly • Project manager meetings: weekly