Evaluation Framework, Design and Data Collection (Part 1)
Evaluation Questions and Construction of Evaluation Framework
• Evaluation Questions—A Review
  • Formative
  • Summative
• Question: From WHERE do we derive our Evaluation Questions?
Evaluation Framework
• What are the key constructs that will be measured?
• What changes am I hoping to find?
• What do I need to know about the strategies and content used in professional development?
• Who has the information I need to collect?
• How will I gather the information I need from each source?
• What tools or processes will I use to gather the information I need?
Evaluation Framework Components
• Program Goals
• Measurable Objectives
• Information/Data Needed
• Data Source
• Data Collection Strategy
• Data Analysis Plan
• Timeline
(A sketch of how these components fit together as one framework row follows this list.)
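To make the components concrete, here is a minimal Python sketch of one framework row. The field names simply mirror the components listed above; the example values are hypothetical, apart from the program goal, which echoes the goal stated later in the deck.

```python
from dataclasses import dataclass


@dataclass
class FrameworkRow:
    """One row of an evaluation framework: a program goal traced through to its analysis plan."""
    program_goal: str
    measurable_objective: str
    information_needed: str
    data_source: str
    collection_strategy: str
    analysis_plan: str
    timeline: str


# Hypothetical example row; only the goal comes from the deck, the rest is illustrative.
example = FrameworkRow(
    program_goal="The Student Compass Wizards will increase the quality of IEPs developed by end users",
    measurable_objective="Increase in the percentage of users who agree the Wizards help IEP development",
    information_needed="User perceptions of Wizard helpfulness and frequency of use",
    data_source="IEP end users",
    collection_strategy="Annual online survey",
    analysis_plan="Compare agreement percentages across survey years",
    timeline="Spring of each program year",
)

# Print the row so each component lines up with its value.
for name, value in vars(example).items():
    print(f"{name:22} {value}")
```

Recording each goal as a structured row like this makes it easy to check that every objective has a data source, a collection strategy, and an analysis plan before data collection begins.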
Creation of Evaluation Framework
Question: What do we consult to develop the Evaluation Framework?
Putting the Pieces Together
• Theory of Change
• Logic Model
• Evaluation Framework
Questions to ask of your Theory of Change:
• What key concepts will be measured?
• How will those key concepts be measured?
• How will data from those measures be analyzed to construct the answer to your evaluation questions?
Evaluation Design
• Experimental
• Quasi-experimental
• Descriptive
• Naturalistic
• Case studies
• Mixed-method
(A sketch of a quasi-experimental comparison follows this list.)
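As one illustration of these designs, the sketch below shows the kind of quasi-experimental comparison implied later in the deck: comparing an outcome for trained versus untrained users, where group membership was not randomly assigned. The counts are invented, not the MD OIEP results, and the two-proportion z-test is one reasonable analysis choice, not one prescribed by the slides.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical survey counts (NOT the actual MD OIEP results):
# respondents who agree the Wizards are helpful, out of each group.
trained_agree, trained_n = 172, 210      # trained users
untrained_agree, untrained_n = 96, 180   # untrained users

p1 = trained_agree / trained_n
p2 = untrained_agree / untrained_n

# Pooled two-proportion z-test: did trained users agree at a higher rate?
pooled = (trained_agree + untrained_agree) / (trained_n + untrained_n)
se = sqrt(pooled * (1 - pooled) * (1 / trained_n + 1 / untrained_n))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"Trained agreement:   {p1:.0%}")
print(f"Untrained agreement: {p2:.0%}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

Because training was self-selected rather than randomly assigned, any difference found this way describes the groups as they exist; it does not by itself establish that training caused the difference.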
Communicating Results
• Allow your data to tell your story
• Guiding questions or goals provide the basis of your narrative (NOT the survey)
• Be clear about what was studied
• Be concise! (Always include an executive summary)
• Be objective!
Creating Tables
• Write table titles that report exactly what is in your table
• Label every column and every row
• Avoid using too many numbers in the table
• Report group sizes
• Report whole-number percentages
• Present data in some sort of order (e.g., low to high)
(A sketch applying these conventions follows this list.)
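A minimal Python sketch of these conventions in practice, using invented counts: the printed table has a descriptive title, labeled columns, group sizes, whole-number percentages, and rows ordered from low to high agreement.

```python
# Hypothetical data -- invented solely to illustrate the table conventions above.
rows = [
    ("Goal Wizard",           210, 0.82),
    ("Present Levels Wizard", 198, 0.61),
    ("Accommodations Wizard", 205, 0.57),
]

# Order rows from low to high agreement, as recommended above.
rows.sort(key=lambda r: r[2])

print("Table 1. Percentage of surveyed users who agree each Wizard is helpful (hypothetical data)")
print(f"{'Wizard':<24}{'n':>6}{'% Agree':>10}")
for name, n, pct in rows:
    print(f"{name:<24}{n:>6}{round(pct * 100):>9}%")
```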
Spring 2008 Survey of MD Online IEP Users
• Training Perceptions
• Trained vs. Untrained
• User Perceptions Related to the Student Compass Wizards
• Frequency of Use of the Student Compass Wizards
• User Perceptions Related to the Searchable VSC
Goal: The Student Compass Wizards within the Online IEP tool will increase the quality of IEPs developed by IEP End Users.
Summary of Findings: Spring 2007 to Spring 2008
• Chart: The percentage who agree or strongly agree that the content of the Wizards was helpful in the IEP development process
• Chart: The percentage indicating that they always or sometimes use the Goal Wizard when creating students’ goals
• *Wizard usage dropped for the Present Levels and Accommodations Wizards – these Wizards were released at a later date and were not accessible until mid-year.
Summary of Findings: Spring 2007 to Spring 2008
• Chart: The percentage who agree or strongly agree that the searchable VSC increases access to the general education curriculum for students with disabilities
• Chart: The percentage who agree or strongly agree that the searchable VSC promotes alignment of students’ goals to the general education curriculum
Summary
• There was a significant increase from 2007 to 2008 in the percentage of end users who see the value of the Student Compass Wizards.
• The majority of MD OIEP users see the value of the Student Compass Wizards, but may need further PD to ensure proper implementation and to increase both frequency of use and perceived value.
• Users' perceived level of training strongly affects how frequently they use the tool.
• The searchable VSC is a highly regarded component of MD OIEP that gives teachers an efficient way to scaffold instruction and provide students with disabilities better access to the general education curriculum.
Tamara Otto
Research and Evaluation
Center for Technology in Education
tamaraotto@jhu.edu