
Maximizing Performance in Agile IT Operations

Explore the key capabilities and strategies for high-performing IT organizations to achieve faster delivery, increased reliability, cost-effectiveness, innovation, and competitiveness.


Presentation Transcript


  1. The Agile Imperative. Lt Col Jeremiah Sanders, Deputy Commander, AFLCMC Det 12

  2. The Agile Imperative: Automate, Automate, Automate! High-performing IT organizations (2017/18 State of DevOps Report):
  • Meet mission faster: 440x faster lead time from commit to deploy; 46x more frequent code deployments
  • More reliable: 60x fewer failures; 170x faster recovery (MTTR)
  • Better software: 5x lower change failure rate; 22% less time on unplanned work and rework; 50% less time remediating security issues
  • More cost-effective: overall development costs reduced ~40%; costs per program down 78%
  • More innovation: 44% more time spent on new work vs. sustaining legacy
  • Improved morale: employees 2.2x more likely to recommend their org
  • More competitive: 2x greater profitability, market share, productivity, customer satisfaction, and quality of products or services

  3. How to Innovate for the Future: 24 key capabilities of high-performing IT organizations (Accelerate, by Forsgren, Humble, and Kim)
  • Lean management and monitoring capabilities: lightweight change approval process; app and infrastructure monitoring that informs business decisions; proactive system health checks (see the sketch after this list); work-in-process limits; visualize work to monitor quality and communicate throughout the team
  • Cultural capabilities: Westrum-style generative culture; encourage and support learning; support and facilitate collaboration across teams; provide resources and tools that make work meaningful; support or embody transformational leadership
  • Continuous delivery capabilities: version control all production artifacts; automate the deployment process; continuous integration; trunk-based development; test automation; manage test data; shift left on security; continuous delivery
  • Architecture capabilities: loosely coupled architecture; architect for empowered teams
  • Product and process capabilities: customer (end-user) feedback; workflow visible through the value stream; small batch sizes; enable team experimentation
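A minimal sketch of the "proactive system health checks" capability above, using only the Python standard library. The service names, URLs, and latency budget are hypothetical placeholders, not Kessel Run's actual monitoring stack.

```python
# Proactive health-check sketch: probe each service endpoint, record latency,
# and flag anything unhealthy before users notice. All endpoints are made up.
import json
import time
import urllib.request

SERVICES = {
    "targeting-app": "https://example.mil/targeting/healthz",  # hypothetical
    "misrep-api": "https://example.mil/misrep/healthz",        # hypothetical
}
LATENCY_BUDGET_SECONDS = 2.0  # illustrative service-level objective


def check(name: str, url: str) -> dict:
    """Probe one endpoint and report health plus response time."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            ok = resp.status == 200
    except Exception as exc:  # timeout, DNS failure, HTTP error, etc.
        return {"service": name, "healthy": False, "error": str(exc)}
    elapsed = time.monotonic() - start
    return {
        "service": name,
        "healthy": ok and elapsed <= LATENCY_BUDGET_SECONDS,
        "latency_s": round(elapsed, 3),
    }


if __name__ == "__main__":
    print(json.dumps([check(n, u) for n, u in SERVICES.items()], indent=2))
```

Run on a schedule (cron or a CI job), results like these can feed the monitoring dashboards that inform business decisions, as the slide suggests.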

  4. Value Creation: Kessel Run Results
  • 15 capabilities in ops saving ~1,100 AOC man-hours per month across AFCENT target development, ABP development, and mission reporting; reduced CPD & COD daily ATO plan and execution workflows by 70%; reduced the deliberate targeting workflow by 85%; decreased MISREP reporting system downtime by 99%
  • Concept to operations in ~5 months
  • Lead time: 1 week, down from ~5 years
  • Acceptance-environment deployment frequency: 1,330/month
  • Continuous Authority to Operate; able to push to SIPR in <1 hour
  • Production deployment frequency: 38/month; on average, KR delivers new capability to warfighters in ops every day
  • ~90% automated/repeatable test code coverage
  • 84% reduction in path to production; 97.9% faster provisioning; 70% of developer time spent on new feature delivery
  • 96% faster OS patching; 99.1% faster app patching

  5. But I Thought We Were Already "Agile"…
  • "94% of federal IT projects are over budget or behind schedule... 40% of those never end up seeing the light of day; they are completely scrapped or abandoned." – Haley Van Dyck, Deputy Administrator, U.S. Digital Service
  • From Accelerate, Lean Enterprise, and The Startup Way:
  • 2/3 of capability built using waterfall methods has no, or even negative, value when replacing a legacy capability; 90% when building something for the first time
  • 40% of the time and resources spent on program planning is completely wasted
  • ~75% of industry IT resources are spent sustaining legacy capabilities instead of innovating; most are tech-stack capabilities now provided as commercially managed solutions (IaaS/PaaS/SaaS)
  • Traditional vs. agile optimization: cost of product vs. capacity-driven VALUE creation; schedule vs. dynamic, prioritized resource allocation; spec conformance vs. delivery performance; process vs. product; outputs vs. outcomes; optimized for certainty/stability vs. optimized for change

  6. Not Agile: "Water-Scrum-Fall" (@jezhumble)
  • The fuzzy front end (water): PMO + finance, design & planning, study & approval
  • "Agile" teams (scrum): analysis, development, test & showcase across iterations 0, 1, 2, 3, 4
  • The last mile (fall): centralized QA, integration & QA, IT operations, release & operation
  • Agile litmus tests: When was the last time you deployed software into ops? What did you learn? How do you know? What is your cycle time?

  7. Agile Development = Experimentation in Ops: develop hypotheses based on user research; validate solutions based on user testing
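A minimal sketch of that hypothesize-then-validate loop. The hypothesis wording, metric name, and threshold are hypothetical illustrations, not content from the briefing.

```python
# Hypothesis-driven experimentation sketch: state a measurable hypothesis from
# user research, then accept or reject it with data from user testing in ops.
# Every name and number below is an illustrative placeholder.
from dataclasses import dataclass


@dataclass
class Hypothesis:
    statement: str   # what we believe the change will do for users
    metric: str      # the signal we will measure during user testing
    target: float    # observed value that would validate the hypothesis


def validated(hypothesis: Hypothesis, observed: float) -> bool:
    """True if the observed metric meets or beats the target."""
    return observed >= hypothesis.target


if __name__ == "__main__":
    h = Hypothesis(
        statement="Auto-populating the daily plan cuts planner task time",
        metric="median_minutes_saved_per_task",
        target=10.0,
    )
    print("validated" if validated(h, observed=14.5) else "rejected -> pivot or re-frame")
```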

  8. User's Hierarchy of Needs + Lean Startup
  • User's hierarchy of needs, bottom to top: feasible, usable, useful, joyful
  • Competencies driving innovation:
  • Lean Startup (entrepreneurial management): are we creating value on the next most important thing?
  • User-centered design: are we solving real user pain?
  • Optimize engineering and architecture for change: is it shippable, safe, and secure... repeatedly? Extreme Programming, including TDD (a small sketch follows this list); layered abstraction; bounded context; automation and disaggregation
  • Measuring software delivery: lead time, deployment frequency, mean time to restore (MTTR), change failure rate
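A minimal illustration of the test-driven development rhythm mentioned above: the tests are written first and fail until the function satisfies them. The function under test is a made-up example, not Kessel Run code.

```python
# TDD sketch: tests drive the implementation of a tiny lead-time helper.
# Run with `pytest this_file.py`; the feature itself is illustrative only.

def lead_time_days(commit_day: int, deploy_day: int) -> int:
    """Days elapsed from code commit to deployment into operations."""
    if deploy_day < commit_day:
        raise ValueError("deployment cannot precede the commit")
    return deploy_day - commit_day


# The tests below were written first (red), then the function above was
# implemented until they passed (green).
def test_same_day_deploy_has_zero_lead_time():
    assert lead_time_days(commit_day=100, deploy_day=100) == 0


def test_one_week_lead_time():
    assert lead_time_days(commit_day=100, deploy_day=107) == 7
```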

  9. Agility within Acquisitions
  • Automated DT/OT
  • Continuous, incremental manual OT as needed (UAT)
  • Recurring post-deployment test events
  • Automated security/ATO
  • Automated fielding
  • Value stream mapping
  • Impact mapping
  • FAR Parts 8, 12, and 13 for any contracted support
  • Production OT for the cloud platform
  • Modular contracting for toolchains/COTS
  • Estimate development capacity required
  • Size platform needs
  • PPB&E on product teams and platforms
  • Mitigate risk through metered funding
  • Allocate funding via growth boards
  • Growth boards hold product teams accountable for achieving JCIDS-defined impacts

  10. Contracting Mindset
  • Reject massive, all-in-one contracts
  • Understand that cycle time runs from new need to value delivered
  • Say YES to smarter, leaner, cleaner, more concise documentation
  • Just say NO to FAR subpart 15.3 source selections
  • Leverage big, slow authorities for small, fast action
  • Dominate the regulatory gray area; own the narrative
  • Never cross the line
  When someone says, "this is how we've always done it," I hear "this is how we lose the next war."

  11. Contracting Outcomes • Kessel Run Agile Acquisitions team awarded 18 contracts over the last 20 months in an average of 70 days, leveraging a variety of contract types and contract vehicles. These contracts support SW Engineering, SW Design, SW Environment, SW Toolchain, Platform Engineering, SW Development Bootcamps, and Business SW Tools.

  12. Contracting Concepts
  • Using FAR Parts 8, 12, 13, and 16
  • Large requirements: delivery/task orders from best-in-class IDIQs (Part 16)
  • Medium: multiple-award BPA on GSA Schedule 70 (Part 8)
  • Small: simplified-acquisition commercial contracts (Parts 12, 13.5)
  • Small/medium: 8(a) sole source for socioeconomic and programmatic impact
  • Accepting volume: more contracts is not a bad thing; volume drives efficiency, and doing something once every 5 years does not
  • Defending simplicity: when asked to "add this" or "combine that," say no, prove it; throw out documents written during the 2-year award process
  • OTA: the Bitcoin of contracting

  13. The Cultural Imperative: Innovating Culture for the Future
  • Generative culture (what we engender and value in our people):
  • Passionate and transformational people with a bias for action
  • Leadership support: empowerment, encouraging innovation, delegation, and trust
  • Willingness to work hard; a collaborative, learning organization
  • Values simplicity (does the simplest thing) and feedback (learns by doing)
  • Exhibits intellectual humility, courage, respect for others, and empathy
  • Freedom to disrupt the status quo; innovation is encouraged
  • Failure = learning opportunity; psychological safety
  • Leverage industry best practices and tech; building, not planning
  • Workplace environment:
  • Relaxed dress code; the "aircrew model"... ideas, not rank
  • Small, balanced teams working at a sustainable pace
  • Amenities on par with commercial industry
  • Location, location, location... and colocation!
  • Unfettered access to actual end users and the tools to reach them
  • Attracting talent and retaining it:
  • Mission value
  • Branding
  • Compensation driven by commercial industry
  • Personality, competencies, and ability to learn vs. degrees and experience
  • Measure Employee Net Promoter Score (eNPS) to gauge org health (see the sketch below)
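eNPS, mentioned in the last item above, has a simple formula: the percentage of promoters (scores of 9-10 on a 0-10 "would you recommend this organization?" question) minus the percentage of detractors (scores of 0-6). A quick sketch with made-up survey responses:

```python
# Employee Net Promoter Score sketch: % promoters (9-10) minus % detractors (0-6).
# The survey scores below are fabricated sample data for illustration.

def enps(scores: list[int]) -> float:
    """Return eNPS on the usual -100..+100 scale."""
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)


if __name__ == "__main__":
    sample = [10, 9, 9, 8, 7, 10, 6, 9, 10, 5]   # illustrative responses
    print(f"eNPS: {enps(sample):+.0f}")           # -> +40 for this sample
```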

  14. Questions? Code. Deploy. Win.

  15. KESSEL RUN VISION Build a “software company” that can sense and respond to conflict in any domain, anytime, anywhere.

  16. KESSEL RUN MISSION Continuously deliver war-winning software our Airmen love.

  17. IMPLIED TASK Revolutionize the way the Air Force builds and delivers software.

  18. THE KESSEL RUN ORGANIZATION: Built to focus on value
  • Detachment Commander: Col Enrique Oti
  • Deputy Commander: Lt Col Jeremiah Sanders
  • KR Xccelerate: Matt Brown
  • Agile Acq Branch: Maj Matt Nelson
  • Sustainment & I20 Branch: Lt Col Matt Ross
  • Air Operations Branch: Mr. Adam Furtado
  • Wing Operations Branch: Lt Col Aaron Capizzi
  • Enterprise Services Branch: Ms. Erynn Petersen
  • AOC, T&G, ALIS, PEX, UC2, DIB, CENTAUR, JTT

  19. Typical Architecture vs. Kessel Run Architecture: Platform as Abstraction
  • Typical stack: you manage everything, from applications and data down through runtime, middleware, O/S, virtualization, servers, storage, and networking
  • Kessel Run stack: you manage the applications and data; the PCF PaaS and commercial cloud infrastructure provide the runtime, middleware, O/S, virtualization, servers, storage, and networking
  • Use commercial cloud infrastructure: public, private, or hybrid... it doesn't matter; it depends on operational requirements
  • Use a leading commercial platform: in-housing the platform layer will cost the government more money and produce inferior capability; it needs to have portability across infrastructure choices
  • Pipeline of processes from PM tracking, to code base, to testing, to build, to binaries, to deploy (see the sketch below)
  • Don't build when you can buy; don't buy when you can rent!
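A minimal orchestration sketch of that pipeline idea: run each stage in order and stop at the first failure so a broken build never reaches deployment. The stage commands and the deploy script are hypothetical placeholders for a real toolchain (CI server, artifact build, PaaS push), not the actual Kessel Run tooling.

```python
# Pipeline-orchestration sketch: chain test -> build -> deploy stages, failing fast.
# The commands below are placeholders; substitute your own toolchain.
import subprocess
import sys

PIPELINE = [
    ("unit tests", ["pytest", "-q"]),
    ("build artifact", ["python", "-m", "build"]),
    ("deploy to acceptance", ["./deploy.sh", "acceptance"]),  # placeholder script
]


def run_pipeline() -> int:
    for stage_name, command in PIPELINE:
        print(f"--> {stage_name}: {' '.join(command)}")
        try:
            result = subprocess.run(command)
        except FileNotFoundError:
            print(f"pipeline stopped: '{stage_name}' command not found")
            return 1
        if result.returncode != 0:
            print(f"pipeline stopped: '{stage_name}' failed (exit {result.returncode})")
            return result.returncode
    print("pipeline complete: artifact deployed to acceptance")
    return 0


if __name__ == "__main__":
    sys.exit(run_pipeline())
```

In practice a commercial CI/CD product runs this sequence; the point is the fail-fast chain from code base to deploy, not this particular script.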

  20. Notional Continuous Delivery + Parallel Test Construct
  • Discovery & framing: project kickoff, pain point discovery, pain point prioritization, solution generation, solution prioritization, then dev starts
  • Continuous delivery iterations with continuous user feedback loops; iterations go on forever…
  • After MVP: independent org assessments on a 3-6 month cadence, including periodic/parallel OT, cyber test, MCO scenarios, exercises, and training events
  • Periodic test event intent: a snapshot assessment to inform backlog priority, not a pass/fail grade… software is never done!

  21. Defense Innovation Board: “Detecting Agile BS” There is surely nothing quite so useless as doing with great efficiency what should not be done at all. – Peter Drucker

  22. Agile – a working definition: Leveraging a generative culture of empowered small teams, hypothesis-driven experimentation, commercially managed technologies, user-centered design, Lean Startup management, build-to-adapt engineering methodologies, and feedback from actual end users in operations to validate learning and dynamically drive decisions and resource allocation, in order to continuously deliver valuable outcomes, optimized for change, with lead times measured in days, safely and sustainably, forever.

  23. Kessel Run Organic Manpower Actions
  • Conducted hiring event 23-24 January (AFLCMC & KR): direct-cite request approved by ACC; $2M BTR worked through SAF/AQX; targeted 50 vacancies (19 funded and 31 direct-cite positions), most external but working some internal positions; 212 applicants, 57 pre-screened and invited to the hiring event, 12 accepted offers on the spot, expect final onboarding of ~30-35; set up as broad software business and technical tracks leveraging 2210 flexibility as a pilot in KR
  • Priority NH-04 hires (AFLCMC/DP, AFMC/A1): 1st hire on board faster than our goals (approved by AFLCMC/CA 14 Dec, in place before the holidays); 2nd and 3rd in work
  • Map the onboarding process and eliminate constraints: negotiations and final offer-acceptance process (AFMC/A1), prioritize hiring actions, AFMC/OL-WPAFB personnel team; in-processing, computer, CAC, etc. (work with 66 ABG/CPS to fast-track; waive drug testing before onboarding)
  • Allocate 3D0X4 enlisted coder billets into KR: SAF/A1 KR task force established and met 15 January; AFLCMC working AFMC POM initiative with HB and BES
  • 16S and 8S secondary AFSCs for agile software development: cyber talent management design sprint at USAFA next week; KR building UMD to annotate requirements
  • Pilot broad software business and technical tracks for civilians (AFMC, AFLCMC, HB): initial discussions; further development during 25 Jan meeting at KR
  • Pilot WCF construct or alternative approach in KR (AFLCMC/FM, HB FM OSF): initial discussions; further development during 25 Jan meeting at KR
  • Training and certification of SW leaders (AFMC, AFLCMC, HB): proposal in development; work with AFLCMC/AQ & CA

  24. Challenges at Scale
  • 1. Manpower. If Kessel Run does not receive additional manpower to support platform, product, and business functions, THEN scaling outside of the existing enterprise will be delayed. Mitigation: work with A1 for additional billets and mitigate staffing shortfalls with LTTDY personnel.
  • 2. Acquisition tooling and process. If business tools and processes are not developed to support an enterprise model, THEN developing, reporting on, and planning an enterprise sharing model will be manual and inefficient. Mitigation: create a KR team that focuses on this pain point; pilot an enterprise acquisition construct.
  • 3. Networks. If the Government does not update its network infrastructure (especially on SIPR), THEN operations leveraging cloud-based technologies will be negatively impacted. Mitigation: KR to establish a network experiment for a commercial classified network solution.
  • The original slide plots these three risks on a likelihood-versus-consequence matrix.
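A small sketch of that likelihood-times-consequence scoring scheme. The 1-5 scales and the scores assigned to each risk are assumptions for illustration; the briefing does not give the actual matrix placements.

```python
# Risk-scoring sketch on assumed 1-5 likelihood and consequence scales.
# The individual scores are illustrative, not the briefing's actual placements.

RISKS = {
    "1. Manpower":                        {"likelihood": 4, "consequence": 4},
    "2. Acquisition tooling and process": {"likelihood": 3, "consequence": 3},
    "3. Networks (SIPR infrastructure)":  {"likelihood": 3, "consequence": 5},
}


def score(risk: dict) -> int:
    """Simple multiplicative risk score: likelihood times consequence."""
    return risk["likelihood"] * risk["consequence"]


if __name__ == "__main__":
    for name, risk in sorted(RISKS.items(), key=lambda kv: score(kv[1]), reverse=True):
        print(f"{score(risk):>2}  {name}")
```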

  25. Contracting Toolkit
  • Modular, "focused" contracting and acquisition strategy
  • Meaningful, transparent, continual market research
  • Targeted competition (rule of 3, simplified acquisition)
  • Class D&F for T&M and labor-hour contracts on agile development efforts
  • Class Commercial Item Determination (CID)
  • 8(a) Program, when and where it makes sense
  • Best-in-class IDIQs: Army ACCENT (cloud), pending JEDI; 18F Agile BPA
  • Streamlined evaluations: oral and video presentations, coding challenges

  26. Fastest Time Through the Kessel Run = 88 Days
  • Value stream stages (ACC & Kessel Run): impact mapping, value stream mapping, product scoping, opportunity backlog, discovery & framing, inception, iterations, VaDER sprint; stage durations on the original chart range from 2-3 hours to 4-6 weeks and under 30 days, with several ongoing activities
  • Governance gates: kickoff growth board, inception growth board, first value growth board, launch growth board, and VaDER sprint review, mapped against JCIDS, DAS, and PPBE
  • Quality progression: testable, usable, useful, joyful
  • Key outputs along the way: de-risking analysis to the scoping growth board; prioritized backlog; identify solution hypothesis; identify target condition; key performance indicators (KPIs); prioritized, validated backlog; product team resource allocation; D&F review; development kickoff; first "push to SIPR"; initial user adoption; beta test; user adoption; legacy sunset

  27. Lessons Learned in Achieving Agility - Metrics
  • Measuring the outcome (the One Metric That Matters, OMTM): "What combat capability are you delivering?" - CSAF; drives dynamic resource decisions; can and will change over time depending on the problem area
  • Measuring the process (software delivery performance metrics): lead time, deployment frequency, mean time to restore (MTTR), change failure rate
  • Per-product indicators tracked: last deployment in production, last release of code, average release frequency, average user value lead time
  • User value lead time: the time between a feature being released and that feature being delivered into the hands of the user
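A minimal sketch of computing the four software delivery performance metrics from simple deployment and incident records. The record shapes and sample data are hypothetical; real tooling would pull these from version control, the CI/CD pipeline, and the incident tracker.

```python
# Compute lead time, deployment frequency, change failure rate, and MTTR
# from made-up deployment/incident records.
from datetime import datetime
from statistics import mean

deployments = [  # (commit_time, deploy_time, caused_failure) -- sample data
    (datetime(2019, 1, 7, 9), datetime(2019, 1, 8, 15), False),
    (datetime(2019, 1, 9, 10), datetime(2019, 1, 9, 18), True),
    (datetime(2019, 1, 14, 8), datetime(2019, 1, 15, 12), False),
]
incidents = [  # (failure_time, restore_time) -- sample data
    (datetime(2019, 1, 9, 19), datetime(2019, 1, 9, 21)),
]

lead_time_h = mean((deploy - commit).total_seconds() / 3600
                   for commit, deploy, _ in deployments)
window_days = (max(d for _, d, _ in deployments)
               - min(d for _, d, _ in deployments)).days or 1
deploys_per_day = len(deployments) / window_days
change_failure_rate = sum(1 for _, _, failed in deployments if failed) / len(deployments)
mttr_h = mean((restore - fail).total_seconds() / 3600 for fail, restore in incidents)

print(f"Lead time (avg):      {lead_time_h:.1f} h")
print(f"Deployment frequency: {deploys_per_day:.2f} / day")
print(f"Change failure rate:  {change_failure_rate:.0%}")
print(f"MTTR:                 {mttr_h:.1f} h")
```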

  28. Air Tasking Cycle Evolution
