NRS Data Monitoring for Program Improvement Unlocking Your Data M. Corley
Objectives—Day 1 • Describe the importance of getting involved with and using data; • Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model; • Determine when and how to adjust standards for local conditions; • Set policy for rewards and sanctions for local programs; • Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention. M. Corley
Agenda—Day 1 • Welcome, Introduction, Objectives, Agenda Review • The Power of Data • Why Get Engaged with Data? Exercise • The Data-driven Program Improvement Model • Setting Performance Standards • Adjusting Standards for Local Conditions • Establishing a Policy for Rewards and Sanctions • Getting Under the Data • Data Pyramids • Data Carousel • Evaluation and Wrap-up for Day 1 M. Corley
Objectives—Day 2 • Distinguish between the uses of desk reviews and on-site monitoring of local programs; • Identify steps for monitoring local programs; • Identify and apply key elements of a change model; and • Work with local programs to plan for and implement changes that will enhance program performance and quality. M. Corley
Agenda—Day 2 • Agenda Review • Planning for and Implementing Program Monitoring • Desk Reviews Versus On-site Reviews • Data Sources (small group work) • Steps and Guidelines for Monitoring Local Programs • Planning for and Implementing Program Improvement • A Model of the Program Improvement Process • State Action Planning • Closing and Evaluation M. Corley
STOP! Why Get Engaged with Data? M. Corley
Question for Consideration Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students? M. Corley
The Motivation Continuum Intrinsic ←→ Extrinsic Which is the more powerful force for change? M. Corley
NRS Data-driven Program Improvement (Cyclical Model) STEPS • Set performance standards • Examine program elements underlying the data • Monitor program data, policy, and procedures • Plan and implement program improvement • Evaluate progress and revise, as necessary, and recycle M. Corley
What’s Under Your Data? The Powerful Ps Performance (the data you see) rests on the other Ps: Program, Policies, Procedures, Processes, Products M. Corley
NRS Data-driven Program Improvement Model (cycle diagram) NRS DATA sits at the center of a cycle: Set Performance Standards → Examine Program Elements Underlying the Data → Monitor Program Data, Policy, Procedures → Plan and Implement Program Improvement; Evaluate Improvement → back to Set Performance Standards M. Corley
Educational Gains for ESL Levels and Performance Standards Exhibit 1-2 M. Corley
Questions Raised by Exhibit 1-2 • How were performance standards set? Based on past performance? • Are standards too low at the higher levels? • Is the performance pattern similar to that of previous years? If not, why not? • What are the program’s assessment and placement procedures? Are the same assessments used for high and low ESL levels? • How do curriculum and instruction differ by level? • What are student retention patterns by level? M. Corley
Essential Elements of Accountability Systems • Goals • Measures • Performance Standards • Sanctions and Rewards M. Corley
National Adult Education Goals Reflected in the NRS outcome measures of • educational gain, • GED credential attainment, • entry into postsecondary education, and • employment. M. Corley
Performance Standards • Similar to a “sales quota”: how well are you going to perform this year? • Should be realistic and attainable, but • Should stretch you toward improvement • Set by each state in collaboration with ED • Each state’s performance is a reflection of the aggregate performance of all the programs it funds M. Corley
Standards-setting Models • Continuous Improvement • Relative Ranking • External Criteria • Return on Investment (ROI) M. Corley
Continuous Improvement • Standard based on past performance • Designed to make all programs improve compared to themselves • Works well when there is stability and a history of performance on which to base the standard • Ceiling reached over time, resulting in little additional improvement M. Corley
Relative Ranking • Standard is mean or median performance of all programs • Programs ranked relative to each other • Works for stable systems where median performance is acceptable • Improvement focus mainly on low-performing programs • Little incentive for high-performing programs to improve M. Corley
External Criteria • Set by formula or external policy • Promotes a policy goal to achieve a higher standard • Used when large-scale improvements are called for over the long term • Takes no account of past performance, so standards may be unrealistic or unattainable M. Corley
Return on Investment • Ratio of the value of the program to its cost • A business model; answers the question: Are the services or program worth the investment? • Can be a powerful tool for garnering funding (high ROI) or for losing funding (low ROI) • May ignore other benefits of the program M. Corley
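A minimal sketch, not part of the original training materials, of how the four standard-setting models could be expressed as simple target-setting arithmetic; the program names, rates, values, and costs below are hypothetical:

```python
from statistics import median

# Hypothetical local program data: last year's educational-gain rate,
# an estimated dollar value of outcomes, and program cost.
programs = {
    "Program A": {"rate": 0.42, "value": 180_000, "cost": 150_000},
    "Program B": {"rate": 0.35, "value": 120_000, "cost": 140_000},
    "Program C": {"rate": 0.51, "value": 210_000, "cost": 160_000},
}

# 1. Continuous improvement: each program's target is its own past rate plus an increment.
increment = 0.03
continuous_targets = {name: p["rate"] + increment for name, p in programs.items()}

# 2. Relative ranking: a single statewide target equal to the median program rate.
relative_target = median(p["rate"] for p in programs.values())

# 3. External criteria: a target fixed by policy, regardless of past performance.
external_target = 0.55

# 4. Return on investment: value of the program divided by its cost.
roi = {name: p["value"] / p["cost"] for name, p in programs.items()}

print(continuous_targets)
print(relative_target, external_target)
print(roi)
```

Each calculation mirrors one model above; in practice a state would apply these to its NRS table data rather than a hard-coded dictionary.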
Decision Time for State Teams • Which model(s) do you favor for setting standards for/with locals? • Is it appropriate to use one statewide model or different models for different programs? • How will you involve the locals in setting the standards they will be held to? M. Corley
Question for Consideration How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs? M. Corley
Adjusting Standards for Local Conditions Research suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality. WHY IS THIS SO? M. Corley
Factors that May Require Adjustment of Standards • Student Characteristics • An especially challenging group • Students at lower end of level • Influx of different types of students • Local Program Elements • External Conditions M. Corley
Shared Accountability State and locals share responsibility to meet accountability requirements • State provides tools and environment for improved performance • Locals agree to work toward improving performance M. Corley
Locals should know… • The purpose of the performance standards; • The policy and programmatic goals the standards are meant to accomplish; • The standard-setting model that the state adopts; and • That state guidance and support are available to locals in effecting change. M. Corley
Shared Accountability • Which state-initiated efforts have been easy to implement at the local level? • Which have not? • What factors contributed to locals’ successfully and willingly embracing the effort? • What factors contributed to a failed effort? M. Corley
Shared Accountability (2 × 2 diagram) Axes: State Administrative Control (low to high) by Local Program Involvement (low to high) • High involvement, low control: “Locals out of control??” • High involvement, high control: “Hot dog!! We’re really moving!” • Low involvement, low control: “Anything happening out there??” • Low involvement, high control: “Get OFF our backs!!” M. Corley
What About Setting Rewards and Sanctions? • Which is the more powerful motivator: rewards or sanctions? • List all the different possible reward structures you can think of for local programs. • How might sanctioning be counter-productive? • List sanctioning methods that will not destroy locals’ motivation to improve or adversely affect relationships with the state office. M. Corley
Variations on a Theme Exercise • (Refer to H-10). Brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards. • Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards. • Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note. • When you have finished, wait for further instructions from the facilitator. M. Corley
Summary of Local Performance Standard-setting Process M. Corley
Getting Under the Data NRS data, as measured and reported by states, represent the product of underlying programmatic and instructional decisions and procedures. M. Corley
Four Sets of Measures • Educational gain • NRS follow-up measures: obtained a secondary credential; entered and retained employment; entered postsecondary education • Retention • Enrollment M. Corley
Educational Gain (diagram) Elements feeding into educational gain: assessment policies and approach; assessment procedures; instruction; goal setting and placement procedures; retention; class organization; professional development M. Corley
Follow-up Measures (diagram) Elements feeding into the GED, employment, and postsecondary outcomes: instruction; support services; tracking procedures; goal-setting; retention; professional development M. Corley
Retention (diagram) Elements feeding into retention: students; class schedules and locations; placement procedures; instruction; support services; retention support and policies; professional development M. Corley
Enrollment (diagram) Elements feeding into enrollment: community characteristics; class schedules and locations; recruitment; instruction; professional development M. Corley
Data Carousel M. Corley
Question for Consideration How might it benefit local programs if the State office were to initiate and maintain a regular monitoring schedule to compare local program performance against performance standards? M. Corley
Regular Monitoring of Performance Compared with Standards • Keeps locals focused on outcomes and processes; • Highlights issues of importance; • Increases staff involvement in the process; • Helps refine data collection processes and products; • Identifies areas for program improvement; • Identifies promising practices; • Yields information for decision-making; • Enhances program accountability. M. Corley
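As a minimal, hypothetical sketch, not drawn from the original materials, of what such a routine comparison might look like, the snippet below flags programs whose reported rates fall short of their standards; the program names, measures, and numbers are invented for illustration:

```python
# Hypothetical measures: (reported rate, performance standard) per program.
local_reports = {
    "Program A": {"educational_gain": (0.44, 0.40), "retention": (0.61, 0.65)},
    "Program B": {"educational_gain": (0.33, 0.40), "retention": (0.70, 0.65)},
}

def flag_shortfalls(reports):
    """Return, for each program, the measures that fall below standard."""
    flags = {}
    for program, measures in reports.items():
        below = [name for name, (actual, standard) in measures.items()
                 if actual < standard]
        if below:
            flags[program] = below
    return flags

print(flag_shortfalls(local_reports))
# {'Program A': ['retention'], 'Program B': ['educational_gain']}
```

A state might run a comparison like this against desk-review data each quarter to decide which programs warrant a closer look.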
BUT… • How can states possibly monitor performance of all local programs? • Don’t we have enough to do already?? • Where will we find staff to conduct the reviews? • You’re kidding, right?? M. Corley
Not! M. Corley
So… Let’s Find Some Answers • How can you monitor performance of locals without overburdening state staff? • What successful models are already out there? • How does your state office currently ensure local compliance with state requirements? • Can you build on existing structures? M. Corley
Approaches to Monitoring Desk Reviews • Ongoing process • Useful for quantitative data: proposals, performance measures, program improvement plans, staffing patterns, budgets On-site Reviews • Single event, lasting 1–3 days • Useful for qualitative data: review of processes & program quality, input from diverse stakeholders M. Corley
Data Collection Strategies for Monitoring • Program Self-Reviews (PSRs) • Document Reviews • Observations • Interviews M. Corley
Program Self-Reviews • Conducted by local program staff • Review indicators of program quality • Completed in advance of monitoring visit and can help focus the on-site review • Results can guide the program improvement process M. Corley