A State Staff Guide to Intervention Strategies for Low-Performing Programs
COABE State Staff Pre-Conference, developed for adult education state staff through the NAEPDC State Staff Workgroup
National Adult Education Professional Development Consortium
The Role of State Staff
• With low-performing programs, state staff don't need to know all of the answers.
• EVERY state staff member has a role.
• State staff need to provide:
  • Clear expectations for effective program performance
  • A structure and process for defining, identifying, and prioritizing low-performing programs
  • A technical assistance structure and resources to promote continuous improvement
Polis and McLendon, 2008
Training Objectives
You will:
• Examine ten state-level decision points for intervening with low-performing programs
• Determine the most viable options for your state
• Create a structure and process for identifying, prioritizing, monitoring, and assisting low-performing programs
Polis and McLendon, 2008
What questions should you ask?
Activity 1
• Scenario: You have been given the assignment to design a state intervention structure and process for low-performing programs.
• Make a list of some of the questions you would ask yourself as you begin that task.
Polis and McLendon, 2008
Introduction to Decision Points
• Have we set clear expectations for effective program practices?
• What are our criteria for defining low-performing programs?
• How should we prioritize low performance? What is our capacity to provide assistance?
• Who has the expertise to provide targeted technical assistance?
• How can we get low-performing programs to feel ownership in the program improvement process?
Pg. 4
Polis and McLendon, 2008
Introduction to Decision Points
• What approach or method will we use to help low-performing programs identify and prioritize needs?
• How do we match identified needs to best practices and appropriate resources?
• How will local programs pilot and monitor the impact of their program improvement efforts?
• How do we monitor program improvement and measure impact at the state level?
• What is our exit strategy?
Polis and McLendon, 2008
DP #1: Setting clear expectations
• What does an effective program look like?
• Setting clear expectations through program standards
• Indicators of program quality
• State samples on NAEPDC website: http://naepdc.org/resource_library/program%20planning%20library/QSProgram_Standards.html
• Sample in your packet
Pg. 11
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
• What criteria will you use?
  • Failure to meet core performance measures?
  • Unacceptable on-site review?
  • Results of annual desk monitoring?
  • Other developed state criteria?
• State samples
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
Michigan
• Meeting or exceeding the state performance targets for completion of individual Educational Functioning Levels (EFL) as determined by standardized assessments;
• Meeting or exceeding the state's overall EFL completion rate;
• Helping adult learners set realistic follow-up goals related to employment, postsecondary education/job training, and GED/high school diploma;
• Meeting or exceeding the state's overall attainment rate of follow-up goals;
• Meeting or exceeding the state's target for pre-testing and post-testing of students to determine level completion; and
• Meeting or exceeding the state's student attendance hour targets.
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
Michigan
• Uses a weighting system
• For example, student outcomes are worth more than some of the other measures.
• Total score determines 'rating' of:
  • Exemplary
  • Superior
  • Acceptable
  • Not acceptable
Polis and McLendon, 2008
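The guide does not spell out Michigan's actual weights or cut scores, but a minimal sketch can show how a weighted rating scheme of this kind works. The measure names, weights, and rating cut-offs below are illustrative assumptions only, not the state's real values.

```python
# Hypothetical weighted scoring scheme in the spirit of the Michigan example.
# Weights and cut-offs are illustrative assumptions, not Michigan's actual values.

MEASURE_WEIGHTS = {
    "efl_completion_rate": 3,        # student outcomes weighted more heavily
    "followup_goal_attainment": 3,
    "pre_post_testing_rate": 2,
    "attendance_hours": 1,
}

RATING_CUTOFFS = [                   # (minimum share of possible points, rating)
    (0.90, "Exemplary"),
    (0.75, "Superior"),
    (0.60, "Acceptable"),
    (0.00, "Not acceptable"),
]

def rate_program(scores: dict) -> str:
    """scores: share of each measure's target met, expressed as 0.0-1.0."""
    possible = sum(MEASURE_WEIGHTS.values())
    earned = sum(MEASURE_WEIGHTS[m] * min(s, 1.0) for m, s in scores.items())
    share = earned / possible
    for cutoff, rating in RATING_CUTOFFS:
        if share >= cutoff:
            return rating
    return "Not acceptable"

print(rate_program({
    "efl_completion_rate": 0.80,
    "followup_goal_attainment": 0.65,
    "pre_post_testing_rate": 0.90,
    "attendance_hours": 0.70,
}))  # -> Superior under these illustrative weights (6.85 of 9 possible points)
```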
DP #2: Criteria for low-performing programs
• What criteria will you use?
• Additional state samples
• Are your data accurate and reliable enough to identify programs against these criteria?
Pg. 17
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
Activity 2
• Think about the most reliable data you have on local programs.
• Review the list of data elements in your packet.
• Which of these data would be most appropriate for the initial identification of low-performing programs?
• When would you examine these data to identify low-performing programs?
• Who would do this?
Pg. 14
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
Initial data checks for red flags
• Was valid and reliable pre-testing and post-testing conducted?
• What percentage of students were actually post-tested?
  • If the percentage is low, why?
  • Did teachers just not post-test, or did students not remain in the program long enough to be post-tested?
• What percentage of students exited within the first 12–20 hours of instruction?
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
Initial data checks
• What level of confidence do you have in the follow-up data?
  • High school/GED completion
  • Entry into postsecondary education/job training
  • Employment
  • Job retention
• How confident are you that student outcomes were accurately entered into your state's data system?
Polis and McLendon, 2008
DP #2: Criteria for low-performing programs
Initial data checks
• How pervasive was the program's low performance? Did one or two classes affect the whole program, or did multiple classes have low performance?
• Was there a sufficient number of students enrolled in a particular functioning level, or did the low number of students negatively impact performance?
Polis and McLendon, 2008
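These initial checks are straightforward to run against student-level records. The sketch below is an illustration only: the field names and flag thresholds are assumptions, not values prescribed by the guide.

```python
# Minimal sketch of the initial red-flag checks: post-testing rate, early exits,
# and small enrollment within a functioning level. Field names and thresholds
# are illustrative assumptions.

def red_flags(students, min_level_n=5):
    """students: list of dicts with 'post_tested' (bool), 'hours' (float), 'efl' (str)."""
    total = len(students)
    flags = []

    post_test_rate = sum(s["post_tested"] for s in students) / total
    if post_test_rate < 0.60:                      # illustrative threshold
        flags.append(f"Only {post_test_rate:.0%} of students were post-tested")

    early_exit_rate = sum(s["hours"] <= 20 for s in students) / total
    if early_exit_rate > 0.30:                     # illustrative threshold
        flags.append(f"{early_exit_rate:.0%} of students exited within ~12-20 hours")

    # Small enrollment in a level can make level-completion rates look worse than they are.
    by_level = {}
    for s in students:
        by_level[s["efl"]] = by_level.get(s["efl"], 0) + 1
    for level, n in sorted(by_level.items()):
        if n < min_level_n:
            flags.append(f"Only {n} students enrolled in level {level}")

    return flags
```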
Initial Data Checks
Who will conduct the initial data checks on the identified low-performing programs?
Polis and McLendon, 2008
DP #3: Prioritizing low performance
• Are there different levels of low performance? How do you prioritize?
• NGA recommendation: Target more intensive technical assistance to identified programs that:
  • Are weaker performers,
  • Have low internal accountability, or
  • Have limited capacity to improve.
• Triage recommendation: Concentrate on programs that need only moderate assistance.
Polis and McLendon, 2008
DP #3: Prioritizing low performance
Example: West Virginia
• At-Risk: failure to meet at least 60% of performance measures in the prior program year
• Targeted Technical Assistance: failure to meet at least 60% of performance measures for two of the prior three years
• Low-Performing: failure to meet at least 60% of performance measures for three consecutive years
Pg. 16
Polis and McLendon, 2008
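A tiered definition like West Virginia's can be applied mechanically once you know, for each year, whether a program met at least 60% of its measures. The sketch below is only an illustration of that logic; the helper name, data shape, and the "Meeting expectations" label are assumptions, not part of the guide.

```python
# Sketch of West Virginia-style tiering: for each year, True means the program
# met at least 60% of its performance measures. Names and data shape are
# illustrative assumptions.

def classify_program(met_60pct_by_year):
    """met_60pct_by_year: booleans in chronological order, most recent year last."""
    misses = [not met for met in met_60pct_by_year[-3:]]

    if len(misses) == 3 and all(misses):
        return "Low-Performing"                    # below 60% for three consecutive years
    if sum(misses) >= 2:
        return "Targeted Technical Assistance"     # below 60% in two of the prior three years
    if misses and misses[-1]:
        return "At-Risk"                           # below 60% in the prior program year
    return "Meeting expectations"

print(classify_program([True, True, False]))    # -> At-Risk
print(classify_program([True, False, False]))   # -> Targeted Technical Assistance
print(classify_program([False, False, False]))  # -> Low-Performing
```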
DP #3: Prioritizing low performance
• Each tier has a different level of corrective action, technical assistance, and support.
• Important consideration:
  • What types of incentives, resources, and assistance do you have the capacity to provide to low-performing programs?
  • Answer may influence your tiered structure.
Polis and McLendon, 2008
DP #3: Prioritizing low performance
Incentives, resources, and support
• NGA: sanctions resulted in less program improvement than did increased, focused TA and support.
• TA and support require time, staff, and energy.
• What percentage of your resources (financial and human) can you dedicate to assisting low-performing programs?
• Who makes the initial contact with the program director?
Polis and McLendon, 2008
DP #3: Prioritizing low performance
Activity 3
• Should you prioritize low-performing programs?
• If so, what criteria would define each level?
• Do you want to target the programs most in need, which may require significant, prolonged assistance to improve, or the low-performing programs with the greatest chance of improving with only nominal assistance?
Polis and McLendon, 2008
DP #4: Expertise for technical assistance
• NGA: Provide extensive on-site follow-up support from expert educators to implement research-based instructional improvement strategies.
• NGA: Invest energy in training expert educators, instructional specialists, and assistance team members to work with programs.
Polis and McLendon, 2008
DP #4: Expertise for technical assistance
• Designate a lead state staff person to oversee the technical assistance process.
• His/her role:
  • Take the lead in identifying low-performing programs based on developed criteria
  • Coordinate scheduled TA visits and meetings
  • Match identified needs to TA sources
  • Monitor program improvement efforts
Polis and McLendon, 2008
DP #4: Expertise for technical assistance
Identifying your own expertise
• In what areas do state staff feel confident in providing direct technical assistance?
• In what areas would you prefer to use experts in the field – local directors and instructors with proven track records?
  • How will you identify them?
  • How will you train them for their new role?
  • Will they be compensated for their efforts?
Polis and McLendon, 2008
DP #5: Ownership in program improvement
• "People don't argue with what they help to create." (Ron Froman)
• Low-performing programs must feel ownership in the program improvement process.
Polis and McLendon, 2008
DP #5: Ownership in program improvement
Program improvement at the local level requires:
• Leadership
• Time
• Skills
• Will
Polis and McLendon, 2008
DP #5: Ownership in program improvement
• Programs don't get repaired unless questions are raised by those who know the program best.
• Create and nurture a culture of inquiry and continuous improvement.
Polis and McLendon, 2008
DP #5: Ownership in program improvement
Local program effectiveness teams (PETs)
• A group of people who work together to develop, lead, and coordinate the program improvement process
• Six to eight people
• Representative group
• Coordinated effort
• Commitment to the task
Polis and McLendon, 2008
DP #5: Ownership in program improvement
Local program effectiveness team responsibilities
• Obtain input from other staff and incorporate it into the program improvement process
• Collect data
• Meet regularly to discuss progress, draw preliminary conclusions, and reflect on what the data show
• Assist with documentation and evaluation of the process
Polis and McLendon, 2008
DP #5: Ownership in program improvement
State staff role:
• Facilitate a local meeting with all staff to provide an overview of the program improvement process
• Outline the role that each staff member plays in program improvement
• Facilitate the first meeting of the Program Effectiveness Team
Polis and McLendon, 2008
DP #6: Identifying and prioritizing needs
• Need a deliberate and strategic approach for identifying and prioritizing needs
• Two approaches:
  • Possible Causes, Probing Questions, and Strategies Chart
  • The Program Improvement Prioritization Process
Polis and McLendon, 2008
DP #6: Identifying and prioritizing needs
Possible Causes, Probing Questions, and Strategies Chart
• Aligns with criteria for identifying low-performing programs
• Possible causes and probing questions help programs isolate the root causes of low program performance
Pg. 26
Polis and McLendon, 2008
DP #6: Identifying and prioritizing needs
The Program Improvement Prioritization Process
• More global approach
• Refer to flowchart in your packet
• Prioritization charts
• Plotting charts
Pg. 25; Pgs. 17–22; Pg. 23
Polis and McLendon, 2008
DP #7: Matching identified needs to best practices
• Critical need – most difficult step
• Invest time and resources to match needs to strategies
• NCSALL, CAELA, TESOL, LINCS
• NAEPDC
• Sample Causes, Probing Questions, and Strategies chart
Pg. 26
Polis and McLendon, 2008
DP #8: Pilot testing local program improvement efforts
Local programs need to pick their best sites to:
• Determine whether the new strategy actually corrects the problem.
• If the impact is positive:
  • Build the professional development to implement it program-wide.
  • Recommend policy and procedure changes to support its use throughout the program.
  • Propose the financial resources needed to scale it up.
  • Identify the data that will need to be collected to monitor the impact program-wide.
• Scale it up!
Polis and McLendon, 2008
DP #9: Monitoring program improvement and measuring impact
• Make sure you collect the right data.
• Engage the Program Effectiveness Team in collecting and analyzing the data to monitor the impact program-wide.
• Report the results to the agency head and the state office.
Polis and McLendon, 2008
DP #10: The exit strategy
• Positive exit
  • How good is good enough?
• Negative exit
  • De-funding
Polis and McLendon, 2008
Always willing to help
Lennox McLendon, lmclendon@naepdc.org
Kathi Polis, klpolis@suddenlink.net
Polis and McLendon, 2008