It’s All About Program Improvement
Part I: Finding the Red Flags in Your Data
Part II: Student Retention – Sharing What Works
Michigan Administrative Leadership Institute, Building World Class Programs, March 2004
It is all about Program Improvement: Improving the Quality of our Services and the Success of our Students
Program Improvement: Three Starting Points • Assessing Current Capabilities • Programs • Staff • Using Data to pinpoint program improvement targets • Integrating Research
TODAY: Using Data to pinpoint program improvement targets
Warming Up to Data When I talk about data and graphs, I feel like (a) _______________________ because _________________________.
Warming Up to Data When I talk about data and graphs, “I feel like Marcia Clark because I’ve got all kinds of data and the jury (my teachers) just won’t believe it.” “I feel like I’m in the wrong room because I noticed the accountants meeting next door.”
Part I Training Objectives By the end of today’s workshop, you will be able to use a data analysis process to promote continuous improvement by: • Developing critical success factors for key program components, • Identifying types and sources of available data for those components, • Determining appropriate indicators for flagging potential problems/red flags, • Generating appropriate questions to identify possible causes of red flags, and • Developing a structure to apply what you learned.
OVERVIEW Part I: Types of data you have at hand: • Performance data • Local census data • Enrollment and attendance data • (Handout # 1A)
OVERVIEW Part II: Using data for decision making 1. What are the Key Program Components of my program? 2. How do I know we are successful with each? 3. What data do I have to determine if we are successful?
OVERVIEW Part II: Using data for decision making (continued) 4. Finding the Red Flags (What is not “good enough”?) 5. Isolating the problem 6. Finding and Pilot Testing possible alternatives 7. Integrating the new alternative throughout my program (Going to Scale)
OVERVIEW Part III: Making it Work • Structure: Creating a structure and process to carry out the seven steps above • Resources: Setting aside a few resources to support them
Data: A Carrot or a Stick? Data can be used… • To highlight, clarify, and explain what’s happening in your program, OR • To show what’s not happening in your program
Using Data to Your Advantage (i.e., as a carrot) is up to you.
Data Tells You… • What has happened, • What is happening now, and • Where to focus your program improvement energy
What kinds of data do you have? • Attendance/Enrollment Numbers • Student Test Scores and Learning Gains • Student Drop-out/Completion Rates • Teacher Characteristics • Student Demographics • Budgets • Census Data • Other
What can the data tell me? • Are students coming? (enrollment) • Are they staying? (retention) • Are they learning? (educational gains) • Are they meeting their goals? (student transitions) • Other?
Functions of Data • Accountability • Help us identify whether goals are being met • Program Management and Improvement • Help us isolate problems • Help us replace hunches and hypotheses (gut and guru) with facts concerning the changes that are needed • Marketing • Tell our funders, our boards, and our communities about the value of our programs and the return on their investments
Quote of the Day “People without information cannot act. People with information cannot help but act.” (Ken Blanchard)
Activity 1: Warming Up to Data • Data show that young adults in your program are dropping out at a significantly higher rate than older adults.
Activity 1: Warming Up to Data • What data would have told you this? • What drop-out rate would you consider a problem (e.g., 5%? 20%)? • Are there additional data you would want to look at to explore this issue? • What questions would you ask?
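A gap like this can be surfaced with a few lines of analysis once enrollment and exit records sit in a table. Below is a minimal sketch in Python/pandas; the column names (age, exited_early), the sample values, and the under-25 cutoff for "young adults" are all hypothetical, not part of the workshop materials.

```python
# Minimal sketch: drop-out rate by age cohort from an enrollment roster.
# Column names, sample values, and the age cutoff are hypothetical.
import pandas as pd

roster = pd.DataFrame({
    "age":          [19, 22, 24, 35, 41, 52, 20, 38],
    "exited_early": [True, True, False, False, True, False, True, False],
})

# Bucket students into young adults (under 25) and older adults.
roster["cohort"] = roster["age"].map(lambda a: "young" if a < 25 else "older")

# Drop-out rate per cohort: the share of students who exited early.
rates = roster.groupby("cohort")["exited_early"].mean()
print(rates)  # a large gap between cohorts is the red flag Activity 1 describes
```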
Program Improvement: Three Starting Points • Assessing Current Capabilities • Programs • Staff • Using Data to pinpoint program improvement targets • Integrating Research
Data Analysis Process 1. Identify your key program components. 2. Determine the critical success factors for each component. 3. Identify and collect the data you need. 4. Analyze and interpret the data to determine problems/red flags. 5. Develop probing questions to isolate possible causes. 6. Return to the Trident.
Step 1: Identify the Key Program Components • What is most important to you • For today’s workshop: • Enrollment: Are the students coming? • Retention: Are they staying? • Federal Performance Measures: Are they learning? Are they meeting their goals? • Learning Gains • High School Credential • Transition to Postsecondary/Job Training • Employment Goals
Step 2: Determining Critical Success Factors What criteria do you use to determine if your program components are successful?
Step 2: Determining Critical Success Factors Any other success factors?
Step 3: Identifying and Collecting the Data 1. What data do you already have that will answer your question? 2. What additional existing data, if any, will you need to answer your question? 3. Where are you going to get the additional data?
Activity 2: Data, Data Everywhere Small Groups: Group 1: Enrollment Group 2: Learning Gains Group 3: High School, Postsecondary, Employment
Activity 2: Data, Data Everywhere For your group’s designated program component(s) and critical success factors, determine: • What data do you already have that will answer your question? • What additional existing data, if any, will you need to answer your question? • Where are you going to get the additional data?
A Look at the Census • Making Sense of the Census • Types of Data Available • Data Tables • Data Maps
Census Levels • US • State • County • County Subdivisions • Census Tracts • Zip Codes
NRS Tables: Up Close and Personal A Look at Your MAERS Reports • What information do they contain? • What do they tell us? • Do you trust the data?
Step 4: Analyzing and Interpreting the Data: Finding the Red Flags • Determining what is good enough • Are there specific performance benchmarks you must meet? • Are there other program standards you have established? • Do you know the state average for a particular component (e.g., average cost per student, average number of instructional hours per student)?
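Once the benchmarks are pinned down, flagging is mechanical: compare each local result to its target and mark the misses. A minimal sketch, with hypothetical metric names, targets, and values:

```python
# Minimal sketch: marking red flags against benchmarks.
# Metric names, targets, and actual values are hypothetical.
benchmarks = {
    "learning_gain_rate": 0.40,   # state performance target (higher is better)
    "retention_rate":     0.60,   # local program standard (higher is better)
    "cost_per_student":   850.0,  # state average (lower is better)
}
actuals = {
    "learning_gain_rate": 0.33,
    "retention_rate":     0.64,
    "cost_per_student":   910.0,
}
lower_is_better = {"cost_per_student"}

for metric, target in benchmarks.items():
    value = actuals[metric]
    missed = value > target if metric in lower_is_better else value < target
    print(f"{metric}: {value} vs. {target} -> {'RED FLAG' if missed else 'ok'}")
```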
Activity 3: Red Light, Green Light For your group’s program component: • Examine the data you have • Were there additional data that would have been helpful? Which? • Determine if there are any red flags • Were the data consistent with the critical success factors? Were there gaps? • Graphically display your findings on the flip chart (graph, chart, etc.)
Focusing the Data If you know why, you can figure out how… (W. Edwards Deming)
Step 5: Developing probing questions to isolate possible causes. • What questions do I need to ask? • Resources: • Red Flag chart • 50 Questions handout • Super Duper program self-assessment
Developing Measurable Questions RED FLAG: Failed to meet state performance target for Low Intermediate Students • Poor Question: Does my program have good teachers? • Good Question: Does student learning differ by average number of instructional hours? • Better Question: What are the differences between the average number of instructional hours for low intermediate students and high intermediate students?
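The "better question" above is directly answerable from student records. A minimal sketch, assuming a hypothetical table with each student's educational functioning level (efl) and accumulated instructional hours:

```python
# Minimal sketch: average instructional hours by EFL group.
# Column names and sample values are hypothetical.
import pandas as pd

students = pd.DataFrame({
    "efl":   ["low_int", "low_int", "low_int", "high_int", "high_int"],
    "hours": [42, 55, 38, 80, 72],
})

avg_hours = students.groupby("efl")["hours"].mean()
print(avg_hours)
print("gap (high - low):", avg_hours["high_int"] - avg_hours["low_int"])
```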
Developing Measurable Questions RED FLAG: Failed to meet state performance target for students earning a high school credential. • Poor Question: Do teachers/directors know how to set realistic student goals for high school completion? • Good Question: What entry EFL produces the most high school/GED completions? • Better Question: How do the entry EFLs differ among students with high school/GED completion goals who actually earn a credential?
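This question, too, reduces to a grouped rate: restrict the data to students with a high school/GED completion goal, then compute the credential rate for each entry EFL. Column names and values below are hypothetical.

```python
# Minimal sketch: credential completion rate by entry EFL, restricted to
# students with a high school/GED completion goal. Data are hypothetical.
import pandas as pd

goal_students = pd.DataFrame({
    "entry_efl":         ["ABE_low", "ABE_high", "ASE_low", "ASE_low", "ASE_high"],
    "earned_credential": [False, False, True, False, True],
})

completion_by_efl = goal_students.groupby("entry_efl")["earned_credential"].mean()
print(completion_by_efl.sort_values(ascending=False))
```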
“Let’s do unto the data before the data gets done unto you.” (Edie Holcomb)
Sample Probing Questions HO #5
Sample Probing Questions HO #6
Sample Probing Questions Additional Questions for Class/Teacher-Specific Data • Is there a relationship between completion of EFLs and instructional setting (e.g., learning lab versus classroom)? • Is there a relationship between years of teacher experience and completion of EFLs? • Is there a relationship between teacher status (part-time versus full-time) and completion of EFLs?
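Each of these relationship questions can start as a simple cross-tabulation before any formal statistics. A minimal sketch for the first one, with hypothetical columns for instructional setting and EFL completion:

```python
# Minimal sketch: EFL completion rate by instructional setting.
# Column names and sample records are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "setting":       ["lab", "classroom", "lab", "classroom", "lab", "classroom"],
    "completed_efl": [True, True, False, True, False, True],
})

# Row-normalized cross-tab: completion rate within each setting.
table = pd.crosstab(records["setting"], records["completed_efl"], normalize="index")
print(table)  # a large gap between rows suggests a question worth probing further
```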