Program and Compliance Management, Virtually. Workshop: Common Measures NOT Management Tools
Outline
• Not Management Tools
• Intermediate Metrics
• Master Your MIS
• Share the Accountability
Not Management Tools …that is, common measures aren’t
• Common measures are accountability and comparison tools
• Performance is assessed annually; States/Locals must assess operations on an ongoing basis
• Common measures focus on outcomes, not operations or strategies
• Translation: common measures focus on ‘bottom line’ results, not the drivers of performance
Not Management Tools (2) …that is, common measures aren’t
• The timing of data availability precludes utility for day-to-day management
• State and local staff should be able to respond to issues, as opposed to having to react
• Local staff need to answer the question: what changes in service design or delivery would enhance performance (including common measure outcomes)?
Managing Performance in Lieu of Federal Measures
Managers and staff need measures that:
• Provide real-time information
• Deploy the agency’s strategic plan and focus/align agency activities and efforts
• Test cause/effect relationships among program activities
• Evaluate center and system performance (not just program performance)
Developing Intermediate Measures
INPUT → PROCESS → OUTPUT → OUTCOME → IMPACT
National (federal) measures focus on the outcome end of this chain. But many things happen before “the outcome” that can be measured, and some might even predict the ultimate outcome!
Developing Meaningful Metrics: Keep It in Mind!
“Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” (Harrington, The Improvement Process)
• Measure the right things, the right way, at the right time
• Many metrics are just waiting to be crafted based on your State’s strategic plan!
Examples
Input Measures
• Measures related to outreach and recruitment
• Enrollment rates
• Measures related to the percentage of accepted referrals from other partners
Process Measures
• % of individuals who don’t receive services for >30 days
• Extent of partnering/referrals for co-enrollments
• Timeliness of reports (internal or external)
• Employer repeat usage
Examples (2)
Output Measures
• Completion rates
• # of exits with positive outcomes by ‘x’ time period
• Percentage of referrals to registered apprenticeships
Outcome Measures
• Employment, Retention, Average Earnings, Earnings Change, Wage Replacement, Customer Satisfaction, Credential Attainment (program-specific or system-wide)
• Percentage of customers employed at exit, 30 days post-exit
Impact Measures
• Measures of self-sufficiency, etc.
(A brief sketch of computing a few of these from MIS records follows below.)
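Purely as an illustration (not part of the original workshop materials), here is a minimal Python sketch of how a couple of these intermediate measures might be computed from an exported batch of MIS participant records. The field names (enrolled_date, first_service_date, exit_date, outcome) are hypothetical and would need to match your own MIS extract.

```python
from datetime import date

# Hypothetical MIS extract: one dict per participant record.
participants = [
    {"enrolled_date": date(2012, 1, 9), "first_service_date": date(2012, 1, 20),
     "exit_date": date(2012, 6, 1), "outcome": "employed"},
    {"enrolled_date": date(2012, 2, 1), "first_service_date": date(2012, 3, 15),
     "exit_date": None, "outcome": None},
]

# Process measure: % of individuals who don't receive a service within 30 days of enrollment.
no_service_30 = [
    p for p in participants
    if (p["first_service_date"] - p["enrolled_date"]).days > 30
]
pct_no_service_30 = 100.0 * len(no_service_30) / len(participants)

# Output measure: # of exits with a positive outcome.
positive_exits = sum(
    1 for p in participants
    if p["exit_date"] is not None and p["outcome"] == "employed"
)

print(f"% with no service for >30 days: {pct_no_service_30:.1f}%")
print(f"Exits with positive outcomes: {positive_exits}")
```

The point of the sketch is simply that once the records are out of the MIS in a consistent form, intermediate measures like these can be refreshed as often as staff need them, rather than waiting on the annual federal reporting cycle.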
You Must Master Your MIS (No Kidding!)
• Whether it’s AJL or VOS, etc.
• Know what you have, what it can produce, how to get key information out of it, and understand any data issues (e.g., qualifications)
• Remember that “every ONE counts” (Session #1)
• Know your data-related policies (e.g., maximum timeframe for data entry); a sketch of checking one such policy follows below
• MIS training and retraining are both necessary
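As an illustration only (the field names and the 15-day policy below are hypothetical, not from the workshop), this sketch shows one way to flag records that violate a “maximum timeframe for data entry” policy once activity and entry dates are pulled from the MIS.

```python
from datetime import date

# Hypothetical data-entry policy: activity must be keyed into the MIS within 15 days.
MAX_DATA_ENTRY_DAYS = 15

# Hypothetical MIS extract: when the activity occurred vs. when it was entered.
records = [
    {"participant_id": "A-101", "activity_date": date(2012, 4, 2), "entered_date": date(2012, 4, 10)},
    {"participant_id": "B-202", "activity_date": date(2012, 4, 2), "entered_date": date(2012, 5, 1)},
]

# Flag records entered later than the policy allows.
late = [
    r for r in records
    if (r["entered_date"] - r["activity_date"]).days > MAX_DATA_ENTRY_DAYS
]

for r in late:
    lag = (r["entered_date"] - r["activity_date"]).days
    print(f"{r['participant_id']}: entered {lag} days after the activity (policy is {MAX_DATA_ENTRY_DAYS})")
```

A routine check like this is one way to make timeliness visible to staff instead of discovering data-entry lags only during a monitoring review.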
The ‘Best’ MIS Training
• Hands on
• Marries data entry and staff “interaction” with the MIS to case management and program management (keep it real!)
• Demonstrates policies “in action”
• Individual and group exercises (e.g., case studies, mock participants)
• Reference materials for post-training
Share the Accountability: Contribution vs. Attribution
• The intent is for various partners to contribute their resources and services in order to meet the needs of employer and job-seeking customers
• We can share the outcomes, so why not the accountability that goes along with it?
• Push the accountability downward to the extent you can! (We discussed this in Session #1)
Sharing Accountability… how?
• With system partners – through MOUs, for instance
• With One-Stop Operators – through the RFP and contracting process, and through local reviews
• With service providers – through the RFP and subsequent contract provisions
• Within centers – through public sharing of data about other offices within the same LWIA
• …These are but a few examples
Example of Not Sharing Accountability (Remember: Contract Management is part of Performance Management)
• Many contract vehicles lack appropriate protections, which obviously vary depending on the context (e.g., probationary provisions for declining performance?)
• A state workforce agency could no longer continue financing a certain youth provider for WIA. When the state took over operations, the youth case files were not returned, leaving the state without key information by the time of a DV review.
In an Ideal World . . . (?)
• State/Local staff already collect the necessary data (consistently) to develop meaningful metrics (e.g., completion info/updates)
• Data entry is timely and accurate, and staff understand the impact of timely/accurate reporting
• The data are part of the statewide MIS or another system that processes the data (not Hotel California)
• Management reports are readily produced and available to all staff (a minimal sketch of one such report follows below)
• Performance data are routinely discussed at staff meetings
• Data management is a priority
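To make the “management reports readily produced” point concrete, here is a minimal, purely illustrative Python sketch (the office names and fields are hypothetical, not drawn from the workshop) of an office-level comparison that could be shared at staff meetings or across an LWIA.

```python
from collections import defaultdict

# Hypothetical exit records tagged with the office that served the customer.
exits = [
    {"office": "Downtown Center", "employed_at_exit": True},
    {"office": "Downtown Center", "employed_at_exit": False},
    {"office": "Northside Center", "employed_at_exit": True},
]

# Tally exits and positive outcomes by office so results can be compared side by side.
totals = defaultdict(lambda: {"exits": 0, "employed": 0})
for record in exits:
    office = totals[record["office"]]
    office["exits"] += 1
    office["employed"] += int(record["employed_at_exit"])

for name, t in sorted(totals.items()):
    rate = 100.0 * t["employed"] / t["exits"]
    print(f"{name}: {t['exits']} exits, {rate:.0f}% employed at exit")
```

The report itself is trivial; the ideal-world part is that the underlying data are entered on time, live in the statewide MIS, and can be pulled by any staff member without a special request.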