Telling the Performance Story of the Real Choice Systems Change Grants
Melissa Hulbert
Division of Advocacy and Special Initiatives
Disabled and Elderly Health Programs Group
Centers for Medicaid and State Operations
Telling the Performance Story of the Real Choice Systems Change Grants
• Alert you to important changes CMS is making in the reporting system for FY04 RCSC grants.
• First in a series of communications between CMS and the FY04 grantees. Today:
• Overview of the “why and how” of the proposed changes;
• Next Steps; and
• An introduction to the Logic Model Framework.
Need for Change
• RCSC Grants for Community Living: an investment by Congress, CMS, and state partners in programs that will improve the lives of people with long-term illnesses and/or disabilities.
• Think strategically about how to apply pertinent lessons from grantee accomplishments and challenges to the formation of state and national agendas for the provision of long-term supports in the future.
• Demonstrate, in a measurable way, the value of the investments we have made. We must better “tell the performance story” of the impact of the RCSC grants.
Need for Change
• Adoption of various “Performance Measurement” techniques, e.g., GPRA and PART.
• CMS: logic modeling as a tool for strategic planning and program evaluation.
• What policies/programs are making improvements in the lives of individuals?
• How should outcomes be measured?
• How should these initiatives and their impact be described?
Existing Challenges
• Grant Diversity:
• Scope: nature of the expected change
• Target population
• Each grantee describes/characterizes the components of their grant differently.
• Existing reports tend to lack the “so what”: what difference is the grant making, or could it make, in the operation of the system or in the lives of beneficiaries?
What does this mean for you?
• Purpose of today’s webcast:
• Notify you of upcoming changes to the web-based reporting system.
• Introduce grantees to the context, terminology, and concepts of the refined web-based reporting system.
• Provide an initial introduction to the logic model approach and encourage its use in grant project planning and evaluation.
What does this mean for you?
• Refinements to the RCSC Reporting System based upon the logic modeling framework.
• Reordered/replaced existing reporting categories, such as Goals, Activities, and Accomplishments, to:
• Establish a standard and more informative definition of reporting categories;
• Provide a better description of each grant’s programs and expected outcomes;
• Create a sequence of related events that connects project planning with the grant’s desired results;
• Create measurable, quantitative measures for each grant and, in some cases, across grants.
What does this mean for you?
• New Reporting Framework and Data Fields:
• Goal/Impact
• Objectives/Activities
• Outputs
• Outcomes: Intermediate, System, and Beneficiary
• With support from RTI, for each grant:
• Incorporated the framework into the web-based system;
• Extracted information from grant applications and semi-annual reports and developed proposed text for each new data field.
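To make the new data fields concrete, here is a minimal sketch of how a single grant’s report might be represented as a data structure. The class and field names are illustrative assumptions for this summary, not the actual schema of the CMS web-based reporting system.

```python
# A minimal, hypothetical sketch of the new reporting fields.
# Names are assumptions, not the CMS system's actual schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class GrantReport:
    goal_impact: str                      # Goal/Impact statement
    objectives_activities: List[str]      # Objectives and their supporting activities
    outputs: List[str]                    # Products/services the grant will deliver
    intermediate_outcomes: List[str] = field(default_factory=list)
    system_outcomes: List[str] = field(default_factory=list)
    beneficiary_outcomes: List[str] = field(default_factory=list)

# Example entry for a hypothetical grant:
report = GrantReport(
    goal_impact="Improve access to community-based long-term supports",
    objectives_activities=["Convene a consumer task force",
                           "Design a single point of entry for services"],
    outputs=["Task force charter", "Point-of-entry design document"],
    system_outcomes=["State adopts the single point of entry"],
)
```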
What does this mean for you? Assumptions
• Each grant will directly achieve, at a minimum:
• Goal and Impact statements;
• Objectives and Activities; and
• Outputs.
• Given differences in scope, we recognize that “outcomes” for some grantees will be outside the grant’s “sphere of influence.”
• However, articulating and measuring possible and intended impacts is still important to telling our performance story.
Next Steps
• September 9th: “Report Cross-Walk” disseminated to each grantee:
• Description of the new framework;
• Summary of new data fields and definitions;
• Grant-specific proposed text for the new fields, developed by CMS/RTI.
• Proposed text will be automatically uploaded to the web-based report.
• September 27th: grantee access to the web-based reporting system.
• Conference calls to “walk through” the new web-based report:
• September 28th at 10:00 EST;
• September 29th at 6:00 EST.
Next Steps
• Annual Report: due November 1.
• RCSC Technical Assistance Provider:
• First responder to questions; and
• A resource to review your grant-specific cross-walk and determine any necessary refinements to grant activities and/or proposed cross-walk text.
• CMS Project Officer: contact to request alternative text for the web-based reporting fields.
Logic Modeling: A Tool to Guide Program Design & Evaluation
John A. McLaughlin
MACGROUPX@AOL.COM
My Aim Today
• Orient you to a new way to think about conceptualizing and telling the performance story of your project.
• Provide a simple tool for creating a functional picture of how your program works to achieve its aims.
• Offer some helpful hints for framing a useful evaluation strategy for your program.
Orientations for Performance Measurement & Evaluation
• PERFORMANCE MEASUREMENT
• Accountability, description
• What objectives/outcomes have been accomplished, and at what levels?
• PROGRAM EVALUATION
• Learning, program improvement, defense
• What factors, internal and/or external, influenced my performance? (Retrospective)
• What effect will this level of performance have on future performance if I don’t do something? (Prospective)
• What roles (+/-) did context play in my performance?
Benefits of Logic Modeling • Communicates the performance story of the program or project. • Focuses attention on the most important connections between actions and results. • Builds a common understanding among staff and with stakeholders. • Helps staff “manage for results” and informs program design. • Finds “gaps” in the logic of a program and works to resolve them.
Logic Models as Recipes
• Recipes have three essential components:
• A description of the entree to be produced;
• A list of specific ingredients in specific measures; and
• Specific steps for putting the ingredients together.
• A good cook follows the recipe; managers would do well to create and follow their recipe for success!
Simple Logic Model
[Diagram: a flow from Resources (1) through Activities (2), Outputs (3), and Customers (4) to 1st-order Outcomes (5), 2nd-order Outcomes (6), and Impact (7), all subject to Contextual Influences. The program’s sphere of influence covers the earlier stages; reading left to right shows HOW the program works, and reading right to left shows WHY each element is there.]
Elements of the Logic Model
• Resources/Inputs: programmatic investments available to support the program.
• Objectives/Activities: things you do; the activities you plan to conduct in your program.
• Outputs: product or service delivery/implementation targets you aim to produce.
• Customers: users of the products/services; the target audience the program is designed to reach.
• Outcomes: changes or benefits resulting from activities and outputs.
• Outcome Structure:
• Short-term (K, S, A): changes in learning, knowledge, attitudes, skills, understanding.
• Intermediate (behavior): changes in behavior, practice, or decisions.
• Long-term (condition): changes in condition.
• External Influences: factors that will influence change in the affected community.
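As a worked illustration of these elements, the sketch below encodes a hypothetical community-volunteering program (similar to the example used later in this deck) as a simple logic model structure. All names and values are illustrative assumptions, not any grantee’s actual plan.

```python
# Hypothetical logic model for a community-volunteering program.
# All values are illustrative assumptions.
logic_model = {
    "resources":  ["grant funds", "program staff", "community partners"],
    "activities": ["recruit residents", "deliver volunteer training"],
    "outputs":    ["# of training sessions held", "# of residents trained"],
    "customers":  ["community residents"],
    "outcomes": {
        "short_term":   "residents know how and where to volunteer",   # K, S, A
        "intermediate": "more residents volunteer in community life",  # behavior
        "long_term":    "stronger, more self-supporting community",    # condition
    },
    "external_influences": ["local economy", "competing demands on residents' time"],
}

# Left to right, the chain answers HOW the program works;
# right to left, it answers WHY each element is there.
for stage in ("resources", "activities", "outputs", "customers"):
    print(f"{stage}: {logic_model[stage]}")
```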
Outputs & Outcomes
[Diagram contrasting OUTPUTS with OUTCOMES.]
Outputs or Outcomes?
• Recruiting & training staff/volunteers
• #/% of clients served
• #/type of participants
• Participant satisfaction
• Surveys conducted
• Curricula developed
• Research generated
Volunteers
• If the program is addressing low volunteer involvement in community affairs, and its purpose is to increase volunteering among community residents as part of a larger community development initiative, then an increased number of residents volunteering in community life would be an outcome. The outcome is expressed as a behavioral change.
Number or type of participants who attend; number of clients served
• If the purpose of the program is to increase use of a service by an underserved group, then the number using the service would be an outcome. The outcome is not the number attending or served; the outcome is expressed as use that indicates behavioral change.
Participant Satisfaction
• For our purposes in education and outreach programming, client satisfaction may be necessary but is not sufficient. A participant may be satisfied with various aspects of the program (professionalism of staff, location, facility, timeliness, responsiveness of service, etc.), but this does not mean that the person learned, benefited, or saw his/her condition improve.
Training, Research, Producing
• These are outputs. They may be essential aspects that make it possible for a group or community to change, but they do not represent benefits to or changes in participants, and so are not outcomes. They lead to, or result in, outcomes; in and of themselves, they are outputs.
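Pulling the quiz and its answers together, here is a small sketch. The labels simply restate the classifications argued above (outputs are things delivered; outcomes are changes in participants or conditions); the measure names are illustrative.

```python
# Classifications from the discussion above; measure names are illustrative.
measures = {
    "staff/volunteers recruited and trained":   "output",
    "# of training sessions conducted":         "output",
    "curricula developed":                      "output",
    "research generated":                       "output",
    "participant satisfaction":                 "output",   # necessary, not sufficient
    "underserved residents using the service":  "outcome",  # behavior change
    "residents volunteering in community life": "outcome",  # behavior change
}

for measure, kind in measures.items():
    print(f"{kind.upper():<9}{measure}")
```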
Steps in the Logic Model Process
1. Establish a stakeholder work group and collect documents.
2. Define the problem and context for the program or project.
3. Define the elements of the program in a table.
4. Verify the logic table with stakeholders.
5. Develop a diagram and text describing logical relationships.
6. Verify the logic model with stakeholders.
7. Use the logic model to identify and confirm performance measures, and in planning, conducting, and reporting performance measurement and evaluation.
“Z” Logic: Unpacking the Program’s Logic
• In real life, programs achieve their strategic results through a series of actions, similar to a relay race:
• Action A produces a set of outcomes that become inputs to Action B.
• Action B produces a set of outcomes that become inputs to Action C.
• Action C produces a set of outcomes that lead to the final strategic goal of the program.
• These actions can be thought of as nested programs within the larger program.
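A minimal sketch of this relay-race chaining, assuming hypothetical actions and payloads: each function returns its outcomes, which become the next action’s inputs.

```python
# "Z" logic as function chaining; action names and payloads are hypothetical.
def action_a(inputs):
    # e.g., train staff -> outcome: trained staff
    return inputs + ["trained staff"]

def action_b(inputs):
    # e.g., trained staff deliver services -> outcome: residents served
    return inputs + ["residents served"]

def action_c(inputs):
    # e.g., served residents change behavior -> strategic result
    return inputs + ["improved community conditions"]

chain = action_c(action_b(action_a(["grant funds"])))
print(" -> ".join(chain))
# grant funds -> trained staff -> residents served -> improved community conditions
```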
“Z” Logic: Supplier-Customer Relationship
• Unpacking supports more focused performance measurement, and thus more useful evaluation, as well as better understanding and communication about how the “Program” is supposed to work!
Key Questions Grantees Need to Answer About Their Programs
Performance Measurement:
• What am I doing, with whom, to whom/what? (effort)
• How well am I doing it? (quality)
  • Customer feedback
  • Peer review for technical quality
  • User review for social validity
• Is anybody (anything) better off? (effect)
  • Short-term
  • Long-term
Program Evaluation:
• What role, if any, did my program play in the results?
• What role, if any, did the context play?
• Were there any unintended outcomes?
• What will happen if I don’t do something?
Performance Measurement Hierarchy
Each level of the program logic hierarchy pairs with a matching level in the hierarchy of performance measurement data:
7. End results: measures of impact on the overall problem, ultimate goals, side effects, social and economic consequences.
6. Practice and behavior change: measures of adoption of new practices and behavior over time.
5. Knowledge, attitude, and skill changes: measures of individual and group changes in knowledge, attitudes, and skills.
4. Reactions: what participants and clients say about the program; satisfaction, interest, strengths, weaknesses.
3. Participation: the characteristics of program participants and clients; numbers, nature of involvement, background.
2. Activities: implementation data on what the program actually offers or does.
1. Resources: resources expended; number and types of staff involved; time expended.
In the End, Logic Models Enable Grantees to:
• Develop a more convincing, plausible argument about how their program is supposed to work to achieve its outcomes, and communicate this to funding agencies and other stakeholders.
• Focus their performance measurement and evaluation on the right elements of performance, enabling program improvement and the estimation of causal relationships among elements.
• Be better positioned to present and defend claims about their program’s performance.