Release 3 Classroom Session – Day 5 November 18, 2016
Day 5 Agenda • Preview of Release 4 materials and Demo Day
Release 4: Awarding & Administering Digital Service Contracts – Introduction • Making the Award, Managing the Contract, and Incentives
Introduction to Release 4 Awarding and Managing Contracts Iteration 4.A: Awarding Digital Service Contracts • The source selection and award process • Preparing for post-award management, kick-off, expectations, schedule, and so forth Iteration 4.B: Digital Services Delivery • Performance evaluation – How do you know when it is going badly? • Using an exit strategy • Capstone and Classroom Preview: • Final Capstone Assessment • Live Digital Assignment final presentations • What’s next?
Iteration 4.A: Awarding Digital Service Contracts Performance Objectives • Assess the readiness of the technical evaluation team. • Understand when and how to negotiate during the solicitation process. • Describe how to run an effective evaluation to get the best solution. • Determine the next steps that follow contract award. (Kickoff, Ramp-up, Baselining)
Iteration 4.B: Digital Services Delivery Performance Objectives • Identify software engineering practices for high-quality digital services like version control, continuous integration, and continuous delivery. • Explain which metrics can be used, how they are derived, and why they are used. • Determine how to execute an exit strategy and course correct. • Identify when failure actually occurs.
Capstone and Final Classroom Preview • At the end of Release 4, we will have a final capstone assessment • Covers all course content • You’ll need a passing grade in order to pass the course • There is also one final classroom session • Schedule: January 9th – January 11th • Final review • Guest Speakers • What’s next • Presentation of final Live Digital Assignment Projects • Graduation!!
Iteration 4.A Awarding Digital Service Contracts Introductions
Iteration 4.A: Awarding Digital Service Contracts Iteration 4.A schedule: December 5th through December 16th How do you select and prepare your technical evaluation team?
Iteration 4.A: Awarding Digital Service Contracts When and how should you negotiate during a solicitation? • How are we getting the best value solution? • Do you open negotiations or not? • What are some tactics that can help you adjust to the room and personalities to get your desired outcome?
Iteration 4.A: Awarding Digital Service Contracts • How do you run an effective evaluation and get the best solution? • What are the tradeoffs the team needs to prepare to discuss? • How do you get the best value? • Do you understand what you are evaluating?
Iteration 4.A: Awarding Digital Service Contracts What comes after a contract is awarded? • Debriefing • Administration tasks • Kickoff • Project planning • What do these tasks look like for an agile project? • Contract Financing
Iteration 4.A: Awarding Digital Service Contracts Acquisition Blogging Opportunity • Think about the acquisition package you worked on during this Classroom experience • Write to an audience whose support you want for your acquisition • How do you gain their buy-in? • Why is it important to them? • Consider what social media you could use to promote your post
Hart and Holmström • Winners of the 2016 Nobel Prize for Economics • Identify two factors • Conflicts of interest • Measurement • Implications for government contracting • Limit over-incentivizing for cost reduction • Promote accurate, meaningful measurement
Agile Metrics • “Amplify learning” • Fail fast
Core Agile Metrics • Lead/cycle time • Throughput • Cumulative flow • Bugs (number, type, severity) • Scrum • Metrics we already know • Story points • Burndown • Velocity • Potential flaws • Relative • Administrative burden • #NoEstimates
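To make the first two of these concrete, here is a minimal Python sketch of how lead/cycle time and throughput could be computed from work-item timestamps; the items, dates, and observation window are invented for illustration.

```python
from datetime import date

# Hypothetical completed work items with start and finish dates.
items = [
    {"id": "STORY-1", "started": date(2016, 11, 1), "finished": date(2016, 11, 4)},
    {"id": "STORY-2", "started": date(2016, 11, 2), "finished": date(2016, 11, 9)},
    {"id": "STORY-3", "started": date(2016, 11, 7), "finished": date(2016, 11, 10)},
]

# Cycle time: elapsed days from work start to completion, per item.
cycle_times = [(i["finished"] - i["started"]).days for i in items]
print("Average cycle time (days):", sum(cycle_times) / len(cycle_times))

# Throughput: completed items per unit of time (here, per week).
weeks_observed = 2
print("Throughput (items/week):", len(items) / weeks_observed)
```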
Sources for Metrics • Project tracking system • Source control • Build pipeline • System monitoring
Project Tracking System Metrics • Estimates • Tags • Bugs • Recidivism • Assignees
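Most tracking systems expose these numbers through an API. As one hedged illustration, the third-party `jira` Python client can pull a reopened-bug ("recidivism") count; the server URL, credentials, and JQL query below are placeholders, not a real instance.

```python
from jira import JIRA  # third-party client: pip install jira

# Placeholder connection details -- substitute your own instance.
jira = JIRA(server="https://tracker.example.gov",
            basic_auth=("user", "api-token"))

# Count bugs that were ever reopened ("recidivism") in a hypothetical project.
reopened = jira.search_issues(
    'project = DEMO AND issuetype = Bug AND status WAS Reopened',
    maxResults=100)
print(f"Reopened bugs: {len(reopened)}")
```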
Sample Project Tracking Systems • JIRA • Trello • OnTime • LeanKit • RealTimeBoard • TeamPulse • PlanBox • FogBugz
Source Control Metrics • Who, what, when, where of code changes • Distributed source control • Accepted and denied pull requests • Reviews
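A quick illustration: the "who" of code changes can be tallied straight from git history. This sketch assumes it runs inside a git working copy.

```python
import subprocess
from collections import Counter

# Pull the author of every commit from git history.
log = subprocess.run(
    ["git", "log", "--format=%an"],
    capture_output=True, text=True, check=True)

commits_by_author = Counter(log.stdout.splitlines())
for author, count in commits_by_author.most_common(5):
    print(f"{author}: {count} commits")
```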
Sample Version Control Systems • Git • Subversion • CVS • Mercurial • Team Foundation Server
An Important Lesson Focus on idle work, not idle workers.
Build Pipeline • Automated software builds • Automated artifact versioning • Automated documentation generation • Automated testing • Automated static analysis • Automated deployment • Sample tools • Jenkins • Hudson • Travis CI • Bamboo • GoCD
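Each of these CI servers has its own configuration format, so rather than pick one, here is a language-neutral Python sketch of the "gate" a pipeline automates: run each stage in order and fail fast. The stage commands (pytest, flake8, python -m build) are examples only; substitute your project's real build, test, and analysis steps.

```python
import subprocess
import sys

# Hypothetical pipeline stages, in order. Each is a shell command.
stages = [
    ("unit tests", ["pytest", "--quiet"]),
    ("static analysis", ["flake8", "src/"]),
    ("build artifact", ["python", "-m", "build"]),
]

for name, cmd in stages:
    print(f"Running stage: {name}")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        # Fail fast: stop the pipeline at the first broken stage.
        sys.exit(f"Stage '{name}' failed -- stopping the pipeline.")

print("All stages passed; artifact ready to deploy.")
```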
Automated Testing • Types of tests • Developer tests • Unit tests • Integration tests • Functional acceptance tests • Non-functional tests • Scalability • Security • Accessibility • Metrics • Tests passed/failed • Time to run • Number of tests total • Number of tests run • Can make for an interesting review • Quality from the start
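For a flavor of the developer-test level, a minimal pytest example; the late-fee business rule is invented for illustration.

```python
# test_fees.py -- run with: pytest test_fees.py
# A developer-level unit test: small, fast, and fully automated.

def late_fee(days_overdue: int) -> float:
    """Hypothetical business rule: $0.50/day, capped at $10."""
    return min(days_overdue * 0.50, 10.0)

def test_fee_accrues_per_day():
    assert late_fee(4) == 2.0

def test_fee_is_capped():
    assert late_fee(60) == 10.0

def test_no_fee_when_on_time():
    assert late_fee(0) == 0.0
```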
Functional Acceptance Testing • Should use Behavior-Driven Development (BDD) • Tests defined as specifications • Reasonably plain-English scenarios mapped to executable code • Created by the Product Owner with the vendor • Sample tools • Cucumber • Numerous language- and framework-specific tools
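Cucumber targets several language stacks; as a Python-flavored sketch of the same idea, here is a step file for behave, a Gherkin runner in the Cucumber family. The plain-English scenario (shown in the comment) maps to executable step code; the scenario and logic are hypothetical.

```python
# steps/search_steps.py -- behave step definitions.
# The matching feature file might read:
#
#   Scenario: Citizen finds a form by keyword
#     Given the forms catalog contains "Passport Renewal"
#     When I search for "passport"
#     Then I should see "Passport Renewal" in the results
#
from behave import given, when, then

@given('the forms catalog contains "{form}"')
def step_catalog(context, form):
    context.catalog = [form]

@when('I search for "{keyword}"')
def step_search(context, keyword):
    context.results = [f for f in context.catalog
                       if keyword.lower() in f.lower()]

@then('I should see "{form}" in the results')
def step_verify(context, form):
    assert form in context.results
```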
Scalability Testing • Ensures application continues to perform with greater demands on bandwidth, memory, processing power, I/O, etc. • Different kinds of load • Users • Data • Static assets • Sample tools • Gatling (reports) • JMeter
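As a sketch of what a scripted load test looks like, here is a minimal locustfile for Locust, a Python-based load-testing tool (an alternative to the Gatling/JMeter examples above); the simulated paths and task weights are invented.

```python
# locustfile.py -- run with: locust -f locustfile.py
from locust import HttpUser, task, between

class CitizenUser(HttpUser):
    wait_time = between(1, 5)  # simulated "think time" in seconds

    @task(3)  # weighted: browsing happens 3x as often as searching
    def view_homepage(self):
        self.client.get("/")

    @task(1)
    def search_forms(self):
        self.client.get("/search?q=passport")
```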
Security Testing • Ensures the application remains secure against known threat vectors • Multiple fronts • Functional tests from the user interface • Functional tests from elsewhere • Tests against common vulnerabilities • BDD-Security • Gauntlt • Mittn • Infrastructure scanning • OWASP ZAP
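One hedged example of automating the scanning step: driving OWASP ZAP from its Python client (pip install python-owasp-zap-v2.4). This assumes a ZAP daemon is already running locally; the target URL and API key are placeholders.

```python
import time
from zapv2 import ZAPv2

target = "https://staging.example.gov"  # placeholder target
zap = ZAPv2(apikey="changeme",
            proxies={"http": "http://127.0.0.1:8080",
                     "https": "http://127.0.0.1:8080"})

# Spider the site so ZAP knows the attack surface, then wait for it.
scan_id = zap.spider.scan(target)
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

# Report any alerts raised against common vulnerabilities.
for alert in zap.core.alerts(baseurl=target):
    print(alert["risk"], "-", alert["alert"])
```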
Accessibility Testing • Particularly critical for web interfaces produced by government • Tooling is not nearly as mature, so you may have to resort to periodic manual testing • Sample tools • Pa11y • AATT
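Pa11y is a Node.js command-line tool; one way to fold it into a Python-centric pipeline is to shell out and parse its JSON report, as in this sketch. The URL is a placeholder, and the flag usage reflects Pa11y's documented CLI.

```python
import json
import subprocess

# Run Pa11y against a placeholder URL and ask for machine-readable output.
# Pa11y exits non-zero when issues are found, so we don't use check=True.
result = subprocess.run(
    ["pa11y", "--reporter", "json", "https://www.example.gov"],
    capture_output=True, text=True)

issues = json.loads(result.stdout or "[]")
print(f"{len(issues)} accessibility issues found")
for issue in issues[:5]:
    print("-", issue.get("code"), ":", issue.get("message"))
```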
An Aside: Testing As Documentation • Guaranteed to be current • Baseline with existing metrics • Essential for a good transition plan
Static Analysis • Measures code quality • Provides hints into functionality, scalability, security • Technical debt metrics • Code coverage • Duplication • Cyclomatic complexity • Adherence to standards • Lines of code • Sample tools • SonarQube • Numerous COTS
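SonarQube computes these measures at scale; for a hands-on sense of one of them, here is a small sketch using radon, a Python static-analysis library (pip install radon), to measure the cyclomatic complexity of an invented function.

```python
from radon.complexity import cc_visit

# Invented source code to analyze: three branches -> complexity of 3.
source = '''
def triage(severity, reopened):
    if severity == "critical":
        return "fix now"
    elif reopened:
        return "investigate"
    else:
        return "backlog"
'''

for block in cc_visit(source):
    print(f"{block.name}: cyclomatic complexity {block.complexity}")
```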
Technical Debt • Typically refers to shortcuts developers may take to get stuff done fast • Types • Naïve • Necessary • Strategic • Common causes • Deadlines • Faking velocity • Insufficient code coverage • Consequences • Increased time to market • Bugs • Schedule and budget sink • Poor morale • A bad product
Continuous Delivery • Automated deployment to production • Accomplished with DevOps • Metrics • Deployment frequency • Lead time • Failure rate • Mean Time To Recover (MTTR)
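A minimal sketch of how these four delivery metrics might be derived from a deployment log; the log entries and observation window are invented, and in practice the data would come from the build pipeline itself.

```python
from datetime import datetime, timedelta

# Hypothetical deployment log:
# (deployed_at, commit_ready_at, failed, recovered_at)
deploys = [
    (datetime(2016, 11, 1, 10), datetime(2016, 10, 31, 15), False, None),
    (datetime(2016, 11, 3, 9),  datetime(2016, 11, 2, 11),  True,
     datetime(2016, 11, 3, 10, 30)),
    (datetime(2016, 11, 4, 16), datetime(2016, 11, 4, 9),   False, None),
]

days_observed = 7
print("Deployment frequency:", len(deploys) / days_observed, "per day")

# Lead time: commit ready -> running in production.
lead_times = [d[0] - d[1] for d in deploys]
print("Avg lead time:", sum(lead_times, timedelta()) / len(lead_times))

failures = [d for d in deploys if d[2]]
print("Failure rate:", len(failures) / len(deploys))

# MTTR: failure deploy -> recovery, averaged over failures.
mttr = sum((d[3] - d[0] for d in failures), timedelta()) / len(failures)
print("MTTR:", mttr)
```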
System Monitoring In Production • Operations • System uptime • Resource utilization • Anomalies • Availability of external resources • Usability • Web analytics (e.g. bounce, conversion, response, error rates) • Social media • Ratings
Sample System Monitoring Tools • New Relic • Splunk • Hyperspin • StatsD with Graphite • Gnip • Modern application technologies • Good logging and analytics
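As a taste of the StatsD-with-Graphite pattern, a sketch using the `statsd` Python client (pip install statsd) to emit the counters, gauges, and timers that Graphite then charts; the host, port, and metric names are placeholders.

```python
import statsd

stats = statsd.StatsClient("localhost", 8125, prefix="benefits-portal")

stats.incr("form.submitted")       # count an event
stats.gauge("queue.depth", 42)     # record a point-in-time value
stats.timing("db.lookup_ms", 320)  # record a duration in milliseconds

# Time a block of code automatically.
with stats.timer("render.homepage"):
    _ = sum(range(10000))  # stand-in for real page-rendering work
```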
Mashups • Combining data > Data in isolation • API data visualized with dashboard tools
Why It Matters • Digital Services Playbook plays 4, 7, 9-12 • Immediate understanding of project health • Pre-award evaluation of vendor maturity • Metrics and automation awareness • Willingness for scrutiny • Distinguishing bugs and avoiding recidivism • Delivery speed • Post-award • Identifying barriers to productivity • Stakeholder pressure • Absence of Product Owner • Staffing issues • Fail fast • Building trust through transparency and continuous delivery • Transition plan • Process metrics vs. Product metrics
Epilogue: Engineer Motivation • Sense of mission • Technical challenge • Cool technology
Why This Activity Is Important • Shows the importance of tracking and monitoring digital and agile projects • COs don’t need to run the data themselves, but they do need to know the questions to ask to uncover potential issues or risks to the project • Gives you a sense of the types of challenges that arise when managing digital and agile projects • Presents a total of 5 scenarios you could encounter when managing agile, Kanban, and cloud projects • Recognize that the same data can tell different stories • Work with your group to generate questions and answers about the data and identify what is, or should be, happening in each scenario • Learn from others’ experiences managing these projects
Activity Overview and Instructions Activity Overview • Each table has a set of cards presenting a situation involving metrics (each table’s set is different) • Cards include 3-4 questions that become more involved or introduce new details – take about 4-5 minutes per question; you will have 15 minutes per rotation Activity Instructions • Turn over the first card, review the situation, and respond as a group • Turn over the next card in your stack, respond to it, and so on • Organize the stack for the next group and rotate; you’ll rotate through all 5 situations
Debrief – Reading a Burndown Chart from a 3-Week Sprint • What should you “zero in on” when reviewing these two burndown charts? What are some reasons that 15-5 could be three weeks long while 15-8 is only two weeks long? • How can user stories, scope, and resources impact the trends you observe in the charts (both positively and negatively)?
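To ground the discussion, here is a sketch of how a burndown chart is constructed from daily remaining story points, plotted against the ideal straight-line burn; the numbers are invented and do not reproduce charts 15-5 or 15-8.

```python
import matplotlib.pyplot as plt

# Invented daily "remaining story points" for a 3-week (15-workday)
# sprint, alongside the ideal straight-line burn to zero.
days = list(range(16))
actual = [40, 40, 38, 35, 35, 34, 30, 28, 28, 27, 22, 18, 15, 9, 4, 0]
ideal = [40 - 40 * d / 15 for d in days]

plt.plot(days, actual, marker="o", label="Actual remaining")
plt.plot(days, ideal, linestyle="--", label="Ideal burn")
plt.xlabel("Sprint day")
plt.ylabel("Story points remaining")
plt.title("Sample 3-week sprint burndown")
plt.legend()
plt.show()
```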
Debrief – Scoping Stories to Prevent a Bottleneck • How does the INVEST test help you scope the stories in this scenario? • What are the tradeoffs of restructuring, and how do you keep quality as a constant deliverable? INVEST: Independent, Negotiable, Valuable, Estimate-able, Small, Testable
Debrief – Cloud Provisioning • What can you conclude about the data in the following chart? • If you see a prolonged spike in site traffic, what should you do? • What is your sense of how the contract will perform in comparison to the ceiling? • What are your recommendations for the following year?
Debrief – Preparing for a Weekly Meeting Using Burndown Charts • What key points help you understand the overall status of multiple projects? • When running the contract administration meetings, what types of topics should you bring up? • How do you keep team momentum and engagement no matter what the project status is?
Debrief – Managing Workflow Using Kanban (board snapshots from August 2013 and May 2015) • In May 2015, is your team doing well? How can you tell? • What happens if you do not group user stories into small, medium, and large groups? • If you have lots of large user stories, what can you do to help them take less time? • What other metrics should be tracked to know you’re getting a quality product?