Webinar: Track Your Progress Toward High-Performance EA By Creating And Using An EA Maturity Model. Tim DeGennaro, Analyst. September 12, 2012. Please call in at 10:55 a.m. Eastern time.
It is important to track EA's progress and performance. But creating and running an EA assessment takes time, is difficult, and must compete with other EA priorities. This presentation provides a pragmatic approach to creating or customizing your assessment.
Agenda • Starting an EA assessment initiative • Developing the questionnaire • Deciding on the scale • Examples to guide your approach • Running an assessment
Where does EA assessment fit in? AN EA ASSESSMENT RESTARTS AND INFORMS THE EA CYCLE EA assessment
Which questions can it answer? AN EA ASSESSMENT CAN PROVIDE A FEW TYPES OF BENEFITS “Prove to me that you’ve made progress with EA.” “Where do we need to get up to speed?” “What kinds of skills should we develop?” “Are we aligned with our organization’s vision for EA?” “Have we delivered?”
Five components make up a good assessment model:
• Dimensions: a list of the dimensions the assessment addresses.
• Questions: a questionnaire with clear questions.
• Scoring scale: a simple scoring scale/methodology that improves transparency.
• Interview targets: a list of potential interviewees for outside-in EA assessments.
• Explanation of results: an explanation of how the framework communicates results.
Developing the questionnaire and criteria
Determine the dimensions STEP 1: PICK YOUR STRUCTURING PRINCIPLES Does the organization dissect EA as: A set of EA domains? A set of major activities? A group of bodies? . . . ?
Five types of criteria to include
STEP 2: MATCH CRITERIA TYPES TO STRUCTURING PRINCIPLES
Choose your criteria areas pragmatically to save time:
• Processes: the methods EA applies to accomplish tasks/goals. Always include.
• Content: the tangible resources EA keeps to inform its processes. Always include.
• Organization: the internal people or collections of people that perform EA tasks. Include if skill development/hiring is needed.
• Relationships: the interactions EA has with external roles. Include to show the need for collaboration.
• Management: the leadership applied internally to EA. Include if EA has gone through this cycle a few times already.
Criteria types applied to structuring principles (example 1: EA domains)
• Tech arch. Process: standards management. Content: standards repository. Organization: infrastructure architect. Relationships: technology SMEs.
• Info arch. Process: data governance. Content: information models. Organization: data architects. Relationships: process owners.
• App arch. Process: application arch. Content: reference architecture repository. Organization: application architects. Relationships: business analysts.
• Bus arch. Process: strategic planning. Content: capability map. Organization: business architects. Relationships: LOB managers.
Criteria types applied to structuring principles (example 2: major EA activities)
• Planning and road mapping. Process: strategic planning. Content: tech life cycles. Organization: business architecture leads. Relationships: IT and business management.
• Tech selection. Process: vendor analysis. Content: standards repository. Organization: ARB. Relationships: LOB managers.
• Design. Process: application arch. Content: process model repository. Organization: business architects. Relationships: business analysts.
• Consolidation. Process: APM. Content: application data. Organization: portfolio manager role. Relationships: LOB managers.
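The two example mappings above amount to a small matrix of dimensions by criteria types. As a sketch, that matrix can be held in a plain dictionary; all names below are taken from the illustrative mapping on this slide, not from any real assessment:

```python
# Sketch: an assessment model as dimensions x criteria types.
# Dimension and criterion names are illustrative examples only.
ASSESSMENT_MODEL = {
    "Planning and road mapping": {
        "Process": "strategic planning",
        "Content": "tech life cycles",
        "Organization": "business architecture leads",
        "Relationships": "IT and business management",
    },
    "Tech selection": {
        "Process": "vendor analysis",
        "Content": "standards repository",
        "Organization": "architecture review board (ARB)",
        "Relationships": "LOB managers",
    },
}

def criteria_list(model):
    """Flatten the model into (dimension, criteria type, criterion) rows,
    e.g. for pasting into a questionnaire or spreadsheet."""
    return [(dim, ctype, crit)
            for dim, criteria in model.items()
            for ctype, crit in criteria.items()]

for row in criteria_list(ASSESSMENT_MODEL):
    print(row)
```

Keeping the model in one structure like this makes it easy to generate the questionnaire and the scoring sheet from the same source, so the two never drift apart.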
Developing the scale
Best practices for developing a scale
A good scale:
• Follows a 0-to-5 numerical scale.
• Is consistent across criteria types: Process 1 and Process 2 are scored the same way; Artifact 1 and Artifact 2 are scored the same way.
• Describes the differences between scores.
• Is not 100% prescriptive (allows for some vagueness).
• Reflects your definition of progress.
What do you mean, progress?
STEP 3: DEFINE WHAT PROGRESS WILL MEAN
Choose from four main types of EA progress:
• Alignment to needs. Does EA do what is needed, and are stakeholders pleased?
• Repeatability/adoption. Are EA activities formalized and followed?
• A specific outcome. What kind of measurable changes has EA enacted?
• Depth/best practices. How descriptive or rigorous is our approach versus the most advanced examples?
Create a “skeleton” scale STEP 4: APPLY YOUR CONCEPT OF PROGRESS TO THE CRITERIA Source: June 21, 2012, “Track Progress Toward High-Performance EA” Forrester report
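A skeleton scale can be encoded as score descriptions keyed 0 through 5. The wording below is a hypothetical "repeatability/adoption" flavor of the scale, written for illustration; it is not the Forrester report's text:

```python
# Illustrative 0-5 scale for a process criterion, using the
# "repeatability/adoption" definition of progress. Descriptions
# are deliberately not 100% prescriptive.
SCALE = {
    0: "Not performed",
    1: "Performed ad hoc; results vary by individual",
    2: "Documented, but adoption is inconsistent",
    3: "Documented and followed by most teams",
    4: "Followed everywhere; compliance is measured",
    5: "Measured and continuously improved",
}

def validate_score(score):
    """Reject scores outside the agreed scale before recording them,
    keeping scoring consistent across criteria types."""
    if score not in SCALE:
        raise ValueError(f"Score must be 0-5, got {score!r}")
    return score
```

Because the scale is consistent across criteria, the same six descriptions (suitably reworded per criteria type) can be reused for every process, artifact, and relationship you assess.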
Examples for comparison
Use existing models for completeness
STEP 5: VALIDATE AND CLOSE GAPS
External models provide a good mix of perspectives:
• NASCIO: developed with input from 22 US states
• E2AMM: developed by the Institute For Enterprise Architecture Developments (IFEAD)
• EAAF: developed by the US OMB
• DoC: developed by the US Department of Commerce
• Forrester EA Maturity Assessment: developed by Forrester's EA team
Note: Links appear in the appendix.
Running the assessment
Assessment approach impacts results STEP 6: CHOOSE ASSESSMENT APPROACH BASED ON TIME AND SCALE
A complete outside-in assessment targets many perspectives Source: June 21, 2012, “Track Progress Toward High-Performance EA” Forrester report
Select the appropriate technique
STEP 7: CHOOSE AN APPROPRIATE COLLECTION TECHNIQUE
Collection techniques vary in effort:
• Discussion. Get everyone in a room and score together. (1-3 participants)
• Excel. Build an Excel model and send it out individually. (4-10 participants)
• Online survey. Build an online survey and tabulate the results. (10+ participants)
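For the Excel and online-survey routes, tabulation reduces to averaging each criterion's score across respondents. A minimal sketch with made-up participants and scores (it assumes every respondent answered every criterion):

```python
# Hypothetical survey responses: participant -> {criterion: score (0-5)}.
responses = {
    "architect_1": {"strategic planning": 3, "vendor analysis": 1},
    "architect_2": {"strategic planning": 4, "vendor analysis": 0},
    "lob_manager": {"strategic planning": 2, "vendor analysis": 1},
}

def tabulate(responses):
    """Average each criterion's score across all respondents,
    rounded to two decimals for the summary slide."""
    criteria = {c for scores in responses.values() for c in scores}
    return {
        c: round(sum(r[c] for r in responses.values()) / len(responses), 2)
        for c in sorted(criteria)
    }

print(tabulate(responses))
# prints {'strategic planning': 3.0, 'vendor analysis': 0.67}
```

For 10+ participants, the same averaging is what a survey tool's built-in reporting does for you; the point of scripting it is to keep the raw per-respondent scores for later drill-down.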
"What next?" requires some perspective
STEP 8: INTERPRET RESULTS
How you react depends on context:
• Broad architecture programs: focus on achieving "no zeroes."
• New programs: focus on areas of dissatisfaction.
• Stagnated programs: focus on stakeholder needs.
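The "no zeroes" reaction can be checked mechanically once scores are tabulated. The floor value and the scores below are illustrative, not prescribed by the source:

```python
def weakest_criteria(scores, floor=1):
    """Return criteria scoring below `floor` -- the candidates a broad
    program should raise first to achieve "no zeroes"."""
    return sorted(c for c, s in scores.items() if s < floor)

# Hypothetical tabulated scores per criterion.
scores = {"strategic planning": 3.0, "vendor analysis": 0.0, "APM": 2.0}
print(weakest_criteria(scores))
# prints ['vendor analysis']
```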
Key tenets of communicating results
STEP 9: COMMUNICATE RESULTS
Communications should cover:
• Assessment objectives (1 slide).
• Assessment process (1 slide).
• Criteria and scale/what progress means (1-2 slides).
• Summary results, good and bad (1-2 slides).
• Prioritized areas of improvement and rationale (as needed).
• EA's one-year development plan for process, content, relationships, and structures (1-2 slides).
• Impact on EA resources (1-2 slides).
Tim DeGennaro tdegennaro@forrester.com
Selected Forrester Research
• June 21, 2012, "Track Progress Toward High-Performance EA"
• July 28, 2011, "Forget EA Nirvana: Assessing EA Maturity"
• July 28, 2011, "Forrester's EA Maturity Assessment, Q3 2011"
• July 2, 2009, "Use EA Assessment And Maturity Models To Guide Your EA Program Next Steps"
Appendix: links to assessment frameworks
• NASCIO Maturity Assessment: http://www.nascio.org/publications/documents/nascio-eamm.pdf
• E2AMM Maturity Assessment: http://www.enterprise-architecture.info/Images/E2AF/E2AMMv2.PDF
• OMB's EAAF: http://www.cio.gov/Documents/OMB_EA_Assessment_Framework_2.pdf
• DoC Assessment: http://pubs.opengroup.org/architecture/togaf8-doc/arch/chap27.html#tag_28_03