Are Data-Based Decisions Promoted Through SLDS Happening at the Teacher Level?

Mickey Garrison, PhD, Oregon Department of Education
Denise Airola, PhD, Karee Dunn, PhD, Sean Mulvenon, PhD, University of Arkansas & Next Level Evaluation, Inc.
Key Elements Addressed in SLDS Project
• What were the key elements in the design and implementation of the Oregon Direct Access to Achievement (DATA) Project?
  • Explosion of initiatives
  • Integration of initiative elements into the comprehensive Oregon DATA Project
  • Professional development
  • Building capacity and sustainability
• What were the key evaluation considerations?
  • Evaluation questions
  • Implementation and outcome measures
Key Element in Project Design: Responsiveness to an Explosion of Initiatives at the State and Local Level
• Response to Intervention (RTI)
• Positive Behavior Support (PBS)
• Oregon DATA Project / Data Teams
• Marzano / Effective Classroom Strategies
• DuFour / Professional Learning Communities (PLC)
• Effective Behavior and Instructional Support Systems (EBISS)
• Scaling Up Evidence-Based Practices (state level)
• Effective School Practices Network (ESD/county level)
• Coaching
Key Element in Project Design: Integration
Direct Access to Achievement: The Oregon DATA Project spans both curriculum and instruction:
• Data Teams
• Data-Driven Decision Making (CIP & SIP)
• Unwrapping Standards
• Lesson Design
• Effective Teaching Strategies
• Intervention Design
• Progress Monitoring
• System of Accountability
Key Element in Project Design: Professional Development
Living Likert: Read the statement below, then indicate your agreement level with a thumbs-up meter.
"Schools in my state/district are using data to inform their decisions in leadership and instruction."
(Scale: Strongly disagree to Strongly agree)
Key Element in Project Design: Capacity and Sustainability
Essential steps toward building a PD framework that is focused on capacity and sustainability.
How would the educators you work with complete these thoughts?
• I know what I'm doing is working in the classroom because…
• I know this program is helping students because…
• I know our improvement efforts have positively impacted student achievement because…
"A wise man proportions his belief to the evidence." (David Hume)
What evidence do you have to support your perceptions?
Key Considerations for Evaluation
• Evaluation questions
• Implementation
• Outcomes
Key Consideration: Evaluation Questions
• Teacher impact: Do professional development and support through a job-embedded approach change participating teachers' DDDM practice compared with non-participating teachers?
• Student impact: Do professional development and support through a job-embedded approach affect student achievement in participating teachers' classrooms compared with non-participating teachers' classrooms?
Key Consideration for Evaluation: Monitoring Implementation
How do we monitor implementation?
• Grant reporting requirements for participants
• Activity logs completed by trainers and data team leaders (a sketch of one log record follows this list)
  • Focus of the data team meeting
  • Use of PD content embedded in the data team meeting
  • Documented changes to PD content
  • Strengths and concerns
• Observation tool
  • Provides an additional measure of implementation fidelity
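For illustration only, here is a minimal sketch of what an activity-log record with the fields above might look like as a data structure. All field names are hypothetical; the deck does not specify the project's actual log format.

```python
# Hypothetical sketch of a data team activity-log record; field names
# are illustrative, not the Oregon DATA Project's actual schema.
from dataclasses import dataclass, field

@dataclass
class DataTeamLogEntry:
    meeting_date: str                    # e.g., "2010-11-04"
    team_focus: str                      # focus of the data team meeting
    pd_content_used: list[str] = field(default_factory=list)  # PD content embedded in the meeting
    content_changes: str = ""            # documented changes to PD content
    strengths: str = ""
    concerns: str = ""
```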
Key Consideration for Evaluation: Monitoring Outcomes
Teacher and student measures are based on our "theory of action": Strand 3 DDDM content, supported by DDDM work in data teams and PLCs, is expected to improve teacher assessment and DDDM knowledge, raise teacher DDDM efficacy, reduce teacher concerns about implementation, and change teachers' instructional practices, ultimately improving student achievement.
Measures Used to Monitor Change in Adults
• Concerns: Stages of Concern Questionnaire (SoCQ)
• Efficacy: Data-Driven Decision Making Efficacy Survey (3DME)
• Knowledge: Knowledge Measure (KM)
• Direct observation: supports what is learned from the three profiles with direct observation of teacher/leader behavior
  • Data Team Observation Protocol
Why Monitor Concerns?
• Concerns: an individual's set of thoughts and feelings related to an innovation (Hall & Hord, 2001)
• Concerns follow a developmental, predictable pattern in individuals faced with change (Conway & Clark, 2003; Fuller & Brown, 1975; Hall & Hord, 2001)
• Measured with the Stages of Concern Questionnaire (SoCQ) and its short form (SFSoCQ)
The Seven Stages of Concern
• Stage 0: Unconcerned
• Stage 1: Informational
• Stage 2: Personal
• Stage 3: Management
• Stage 4: Consequence
• Stage 5: Collaboration
• Stage 6: Refocusing
(A small illustration of reading a SoCQ profile follows.)
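SoCQ results are typically read as a profile of percentile scores across the seven stages. As a hypothetical illustration (not the project's scoring code), the sketch below finds a respondent's peak stage, a common first step in interpreting a profile:

```python
# Hypothetical sketch: the seven SoC stages (Hall & Hord, 2001) and a
# helper that returns a respondent's peak stage from percentile scores.
STAGES = {0: "Unconcerned", 1: "Informational", 2: "Personal",
          3: "Management", 4: "Consequence", 5: "Collaboration",
          6: "Refocusing"}

def peak_stage(percentiles: dict[int, float]) -> str:
    """Return the stage with the highest percentile score."""
    stage = max(percentiles, key=percentiles.get)
    return f"Stage {stage}: {STAGES[stage]}"

# Example profile dominated by early, self-focused concerns:
print(peak_stage({0: 40, 1: 85, 2: 80, 3: 55, 4: 30, 5: 35, 6: 20}))
# -> Stage 1: Informational
```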
Why Monitor DDDM Efficacy?
• Efficacy influences motivation: individuals with higher efficacy for a given task are more motivated to engage in and persevere at the task (Bandura & Schunk, 1981; Bouffard-Bouchard, 1990).
How might efficacy for using data to make instructional decisions affect teachers' instruction?
• What would you expect from teachers with high DDDM efficacy?
• What would you expect from teachers with low DDDM efficacy?
Brainstorm 2-3 ways efficacy could affect teachers' behavior or actions in planning and implementing classroom instruction or assessment.
Why measure knowledge?
We hypothesized that efficacy and concerns are related to educators' knowledge and skills for DDDM.
Oregon DATA Project Knowledge Measure
• Informs trainers about changes in educators' knowledge and skills related to the interpretation, evaluation, and application of data-related information, particularly state test results
• Provides insight into the relationship between efficacy in DDDM and actual knowledge/skills
Monitoring and Informing Implementation
• DATA Project leaders and trainers were provided with profile reports for the three measures, enabling them to respond to different training and support needs within each region, ESD, and district.
• Direct observations of data teams provided greater detail about implementation progress and challenges.
Pre- and Post-Survey Results for the 2010-2011 School Year
Results and recommendations for the SoCQ, 3DME, and KM were reported for:
• All participants
• Regions
• ESDs
• Districts
Change among participants was examined using all three measures (a sketch of one pre/post comparison follows this list):
• Changes in concerns: less resistance
• Changes in efficacy: relatively higher efficacy
• Changes in knowledge: relatively more knowledge
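As a hedged illustration of what such a pre/post comparison could look like (not the project's actual analysis; the file and column names are hypothetical), one might run a paired t-test on participants who completed both administrations of a measure such as the 3DME:

```python
# Hypothetical sketch of a pre/post comparison on one measure (e.g.,
# 3DME efficacy scores); the data file and columns are illustrative.
import pandas as pd
from scipy import stats

df = pd.read_csv("efficacy_prepost.csv")  # hypothetical: participant_id, pre, post
t, p = stats.ttest_rel(df["post"], df["pre"])  # paired comparison
gain = (df["post"] - df["pre"]).mean()
print(f"t = {t:.2f}, p = {p:.4f}, mean gain = {gain:.2f}")
```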
Each profile represented different considerations for implementation and support for an innovation over time
Disaggregated profiles showed differences in changes in implementation concerns within entities. Change in concerns was linked to fidelity of implementation.
There was more variation among districts; ESDs and districts were encouraged to use pre/post results to inform continued implementation.
Key Consideration for Evaluation: Student Outcomes
• A teacher-student connection was not available; the smallest unit of analysis was the school level.
• Caveat: variability in teacher-level participation in ODP training within schools.
• Analysis: split-plot repeated measures design with school as the unit of analysis (a minimal sketch follows this list).
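A split-plot repeated measures design with schools as the unit of analysis can be fit as a linear mixed model. The sketch below is a minimal illustration under assumed data (the file and the columns school_id, year, program, math_z are hypothetical), not the project's actual analysis code:

```python
# Minimal sketch of the split-plot repeated measures analysis:
# between-schools factor = program (ODP vs. non-ODP),
# within-schools factor = year (2010 vs. 2011),
# random intercept per school models the repeated measurement.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("school_math_z.csv")  # hypothetical: school_id, year, program, math_z

model = smf.mixedlm("math_z ~ C(year) * C(program)", data=df, groups="school_id")
result = model.fit()
print(result.summary())  # the C(year):C(program) term tests the time x program interaction
```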
• Significant increase in mean school math z scores from 2010 to 2011.
• Math: significant interaction between the effects of time and program, closing the gap between higher-performing non-ODP schools and ODP schools.
(A sketch of how school-level z scores can be computed follows.)
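The "school z scores" above are standardized school means. As a hypothetical sketch (the file and column names are assumptions), each school's mean math score can be standardized against that year's distribution of school means:

```python
# Hypothetical sketch: standardize each school's mean math score
# against that year's distribution of school means to get a z score.
import pandas as pd

schools = pd.read_csv("school_means.csv")  # hypothetical: school_id, year, mean_math
schools["math_z"] = schools.groupby("year")["mean_math"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=0)
)
```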
Student Outcomes: Percent Met/Exceeded
• Math cut scores were raised for 2010-11.
• Students in participating ODP schools increased the percent Met or Exceeded at a greater rate than students in non-ODP schools, consistent with the interaction above. (A sketch of this computation follows.)
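Percent Met/Exceeded is the share of students at or above the cut score, aggregated by school. Below is a hypothetical sketch; the cut score shown is purely illustrative, since actual OAKS cuts vary by grade and were raised for 2010-11:

```python
# Hypothetical sketch: percent of students meeting/exceeding a math
# cut score, by school. CUT_SCORE is illustrative, not an actual cut.
import pandas as pd

students = pd.read_csv("student_math.csv")  # hypothetical: school_id, scale_score
CUT_SCORE = 236  # illustrative only; real OAKS cuts vary by grade

met = students["scale_score"] >= CUT_SCORE
pct_met = met.groupby(students["school_id"]).mean() * 100
print(pct_met.head())
```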
Evaluation Challenges
• Survey completion rates varied; reluctance to complete surveys grew as funding came to a close.
• Opportunities for direct observation were limited due to scheduling.
• No teacher-to-student data link was available.
• Limited time to take ODP to full scale within participating schools.
Key Elements of Project Design, Implementation, and Evaluation
• Integrate your initiatives.
• Connect everything to student learning.
• Use NSDC guidelines when designing your professional development.
• Build capacity and sustainability from the outset.
• Evaluate the effectiveness of your PD on teachers and students. Student achievement is a critical indicator!
For more information:
Mickey Garrison, 541-580-1201, mickeyg@rmgarrison.com
Sean Mulvenon, 479-575-5593, seanm@uark.edu
Denise Airola, 479-575-7397, dairola@uark.edu
Karee Dunn, 479-575-5593, kedunn@uark.edu