Opening the Black Box of Individualization in Programs for Young Children Division of Early Childhood Conference San Francisco, CA October 18, 2013
The Black Box • How do we use progress monitoring to help us individualize?
Panelists • Judy Carta, Senior Scientist, University of Kansas • Sally Atkins-Burnett, Senior Researcher, Mathematica Policy Research • Charlie Greenwood, Senior Scientist, University of Kansas • Alisha Wackerle-Hollman, Research Associate, University of Minnesota • Jane Squires, Professor, University of Oregon
Questions we want to discuss • How do we do progress monitoring and how do we use it to individualize for young children in our programs? • How do we know when progress monitoring and individualizing are being done well? What are our quality indicators? • Do we have evidence that progress monitoring and individualizing leads to better outcomes? • What do programs need to know to determine when progress monitoring is done well?
Format for today’s session • Sally will describe a recently completed review of literature describing what we know about progress monitoring and how it’s used for individualization • Charlie will describe a general outcomes approach to progress monitoring with infants and toddlers, describe how it’s used for individualization, and ways to determine the quality of implementation of progress monitoring • Alisha will describe a general outcomes approach to progress monitoring with preschool-aged children • Jane will describe a curriculum-based approach to progress monitoring and individualization • We will have time to hear from YOU.
Assessing Early Childhood Teachers’ Use of Child Progress Monitoring to Individualize Teaching Practices DEC Conference, October 18, 2013 Sally Atkins-Burnett, Lauren Akers, Patricia Del Grosso, Shannon Monahan, Judith Carta, Barbara A. Wasik, Kimberly Boller
Project Aims • Evaluate the existing evidence • Are there existing measures of teachers’ use of assessment for individualization? • What is important to measure? • Develop evidence-informed conceptual model of ongoing child assessment for individualizing instruction • Develop a plan to efficiently assess implementation of the model • What methods and modes of data collection are needed? • What is feasible? • Develop and pretest a measure that could inform early childhood research and practice
Purpose of the Literature Review • Identify the key areas to include when evaluating the activities involved in the process of monitoring child progress and using that information for instruction and individualization • Find examples of how others have measured teachers’ implementation of ongoing assessment and progress monitoring • Identify gaps in the literature • Inform the project’s conceptual framework
Methods and Results • To identify studies we: • Conducted a structured library search of last 10 years of research • Solicited recommendations from the project team and expert consultant group • We identified 1,322 studies; of those 198 met relevance criteria and were screened into the review • Studies screened out for being off-topic, not an eligible target population, not a relevant document type, duplicate studies, or not published in English
Teachers’ Perceptions of Progress Monitoring • 10 studies reported on teachers’ perceptions or experiences with progress monitoring and using data to inform instruction • Commonly cited barriers to implementation: • skill needed to use data for individualization and • knowledge of the subject matter • Teachers reported the need for additional training
Activities Involved in Using Progress Monitoring • No studies focused on how teachers actually select observation and assessment targets and methods • Over one-third of studies described methods for documenting children’s progress and systems for organizing information • Most only briefly mentioned the types of methods used • Studies frequently discussed web-based or technology-enhanced systems • 8 studies explored implementation experiences
Activities Involved in Using Progress Monitoring (cont’d) • Few studies described how teachers interpret data and/or apply it to instruction and individualization • 17 studies discussed ways to engage families in progress monitoring • Studies described using it to provide regular feedback to parents on children’s progress or engaging families in collecting data and using data for goal-setting • One study focused on teaching primary caregivers to conduct formative assessments
Measures Used to Assess Implementation • Studies most often measured what teachers do • 14 of 18 studies measured whether teachers implemented a specific progress monitoring tool with fidelity or reliability • 5 studies examined the instructional modifications teachers made • 4 studies measured what teachers think • 1 study used teacher written reports; another study used a series of teacher interviews, including a think-aloud data analysis scenario • 2 studies assessed what teachers know about child development, assessment, and/or instruction
Features of Successful Implementation • Web- or computer-based systems can assist teachers in documenting, organizing, interpreting, and planning how to use data • Implementation issues can hinder the utility of these systems • Ongoing professional development and support for teachers may assist teachers with using data to individualize instruction • Families are essential partners
Gaps in the Literature • Very limited research exists about the use of progress monitoring in domains other than language/literacy, social-emotional, and math • Minimal research has focused on • Using progress monitoring in home visiting programs • Supporting families in collecting assessment information and observing their children • Research points to the importance of ongoing support for teachers • Although much of this research has been conducted with teachers in K-3
Gaps in the Literature (cont’d) • Few studies have assessed implementation of progress monitoring and the individualization process • No studies have assessed implementation across a range of progress monitoring tools • Studies typically only looked at one or two of the activities involved in the process of using child progress monitoring for instruction and individualization
General Outcomes Approaches • Continuous, frequent, and standard assessment of child progress toward a long-term goal or outcome • Repeated measurement of a set of indicators predictive of a later outcome • Increasing proficiency indicated by rate of growth • Trend line compares expected versus actual rates of learning • Brief and quick to administer
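To make the trend-line comparison concrete, here is a minimal sketch in Python, assuming hypothetical monthly indicator scores and a hypothetical expected (benchmark) growth rate; it illustrates the general outcomes idea rather than any actual IGDI scoring routine.

```python
# Minimal sketch of the general outcomes measurement (GOM) idea:
# fit a trend line to a child's repeated indicator scores and compare
# the actual rate of growth to an expected (benchmark) rate.
# The scores and the expected rate below are hypothetical.
import numpy as np

months = np.array([0, 1, 2, 3, 4, 5])        # assessment occasions
scores = np.array([8, 9, 11, 12, 14, 15])    # hypothetical indicator scores
expected_rate = 2.0                          # hypothetical benchmark gain per month

actual_rate, intercept = np.polyfit(months, scores, 1)   # least-squares trend line
print(f"Actual rate of growth:   {actual_rate:.2f} points/month")
print(f"Expected rate of growth: {expected_rate:.2f} points/month")
if actual_rate < expected_rate:
    print("Actual growth is below the expected trend; consider adjusting the intervention.")
else:
    print("Actual growth meets or exceeds the expected trend.")
```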
Conceptual Model for the GOM Approach
Curriculum-Embedded Approaches • Information often collected within the context of the delivery of the curriculum • Assessments closely aligned to the curriculum • Assessments intended to be authentic in context
Conceptual Model for the Curriculum-Embedded Approach
For More Information • Please contact: • Sally Atkins-Burnett • SAtkins-Burnett@mathematica-mpr.com • Kim Boller • kboller@mathematica-mpr.com
A general outcomes approach to progress monitoring and individualizing for infants and toddlers Charles Greenwood, Division of Early Childhood Conference San Francisco, CA October 18, 2013
My Topics for Today • How is individualization defined? • What is my approach to progress monitoring? • What framework guides data-based individualization? • How is progress monitoring data used to track a child’s response to intervention? • How is implementation quality addressed? • Is it working to improve children’s outcomes?
How is Individualization Defined? • A long held tenet of educational and behavioral psychology and early intervention is that instruction (intervention) should be “adjusted” based on its observed effects on the learner • If effective, continue and improve the intervention • If not effective, change the intervention and try something else • Individualization is occurring when this kind of dynamic “adjusting” is happening for all children in a program when needed
What is My Approach to Progress Monitoring? • General outcome measurement in the form of infant/toddler Indicators of Individual Growth and Development (IGDIs) • Seasonal universal screening (quarterly) • Monthly progress monitoring for children receiving intervention • IGDI benchmarks and trends over time (decision points) are used for making intervention decisions
Individual Child’s Growth Chart with Benchmark (plotted: normative trajectories and the child’s scores)
What Framework Guides Data-Based Individualization? • A problem-solving flow (Tilly, 2002) driven by quarterly ECI assessments: Is there a problem? If no, continue screening; if yes, ask what is causing the problem and what intervention strategies should be used • Is the intervention being implemented? Is the intervention working? If no, adjust the intervention and recheck • Tilly, W. D. (2002). Best practices in school psychology as a problem solving enterprise. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (Vol. 1, pp. 21-36). Washington, DC: National Association of School Psychologists.
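A minimal sketch of this problem-solving flow follows; the benchmark value, the expected-gain value, and the simple average-gain comparison are illustrative assumptions, not the actual ECI/IGDI decision rules.

```python
# Sketch of the problem-solving flow (after Tilly, 2002): quarterly screening asks
# "is there a problem?"; if so, an intervention is selected and monthly monitoring
# asks "is the intervention working?". All numbers below are illustrative.

def is_there_a_problem(screening_score: float, benchmark: float) -> bool:
    """Quarterly screening: flag a problem if the score falls below the benchmark."""
    return screening_score < benchmark

def is_intervention_working(monthly_scores: list[float], expected_gain: float) -> bool:
    """Monthly monitoring: compare the average gain per month to an expected gain."""
    gains = [b - a for a, b in zip(monthly_scores, monthly_scores[1:])]
    return sum(gains) / len(gains) >= expected_gain

# Illustrative values only
if is_there_a_problem(screening_score=10.0, benchmark=14.0):
    print("Problem indicated: identify likely causes and select intervention strategies.")
    if is_intervention_working(monthly_scores=[10.0, 11.0, 13.0, 15.0], expected_gain=1.5):
        print("Intervention appears to be working: continue and refine it.")
    else:
        print("Intervention not working: check implementation, then change the intervention.")
else:
    print("No problem indicated: continue quarterly screening.")
```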
How is Implementation Quality Addressed? • The infant/toddler IGDI website provides supports: • information • training • implementation • data handling (entry, charting) • reporting (child and program levels) • making intervention decisions (MOD)
How is Implementation Quality Addressed (Continued)? • Programs can monitor their own implementation quality in two ways: • Monitoring the quality of IGDI data collection within a program and managing issues • Tracking the fidelity of intervention provided by home visitors and parents in the home
Program-Level Reports: Data Collection Indicators
Two Fidelity Indicators: 1. Certified Assessors, 2. Percentage Outliers
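As one illustration of the percentage-outliers idea, here is a minimal sketch that flags scores far from the program mean; the z-score rule, the cutoff, and the example scores are assumptions, not the actual IGDI outlier definition.

```python
# Sketch of a "percentage outliers" style data-quality check: flag scores more than
# an assumed number of standard deviations from the program mean.
from statistics import mean, stdev

def percent_outliers(scores: list[float], z_cutoff: float = 2.0) -> float:
    """Return the percentage of scores beyond z_cutoff standard deviations from the mean."""
    m, s = mean(scores), stdev(scores)
    flagged = [x for x in scores if s > 0 and abs(x - m) / s > z_cutoff]
    return 100.0 * len(flagged) / len(scores)

# Hypothetical program-level scores from one quarter (one implausibly high entry)
scores = [12, 14, 13, 15, 11, 48, 13, 12, 14, 13]
print(f"Percentage of outlier scores: {percent_outliers(scores):.1f}%")
```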
How is Progress Monitoring Data Used to Track a Child’s Response to Intervention? • A child’s individual growth trajectory, interrupted at a point in time, is examined for evidence of change in level and slope • A positive effect of intervention is signaled by a change in level, a change in slope, or both • A null effect of intervention is signaled by no change in level or slope (a minimal sketch of this comparison follows below)
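A minimal sketch of the before/after comparison, using hypothetical monthly scores and an assumed intervention start point:

```python
# Sketch of examining an interrupted growth trajectory for a change in level and
# slope: fit separate trend lines before and after the intervention start.
# The monthly scores and the intervention start point are hypothetical.
import numpy as np

months = np.array([0, 1, 2, 3, 4, 5, 6, 7])
scores = np.array([6, 7, 7, 8, 10, 13, 15, 18])
intervention_start = 4                      # intervention begins at month 4

pre_m, pre_s = months[:intervention_start], scores[:intervention_start]
post_m, post_s = months[intervention_start:], scores[intervention_start:]

pre_slope, pre_int = np.polyfit(pre_m, pre_s, 1)
post_slope, post_int = np.polyfit(post_m, post_s, 1)

# Change in level: gap at the intervention point between the two fitted lines
x0 = months[intervention_start]
level_change = (post_slope * x0 + post_int) - (pre_slope * x0 + pre_int)

print(f"Slope before: {pre_slope:.2f}  Slope after: {post_slope:.2f}")
print(f"Change in slope: {post_slope - pre_slope:.2f}  Change in level: {level_change:.2f}")
```

A positive change in slope, level, or both would signal an intervention effect, as described above.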
Is the Intervention Working? – Child Level (slopes before and after the intervention; change in slope)
Is It Working To Improve Children’s Outcomes? • Programs can monitor: • Child-level progress over time • Program-level progress over time
Is it Working to Improve Children’s Outcomes? – Program Level
References • Buzhardt, J., Greenwood, C. R., Walker, D., Anderson, R., Howard, W. J., & Carta, J. J. (2011). Effects of web-based support on Early Head Start home visitors’ use of evidence-based intervention decision making and growth in children’s expressive communication. NHSA Dialog: A Research-to-Practice Journal for the Early Childhood Field, 14(3), 121-146. • Buzhardt, J., Walker, D., Greenwood, C. R., & Carta, J. J. (2011). A study of an online tool to support evidence-based practices with infants and toddlers. NHSA Dialog: A Research-to-Practice Journal for the Early Childhood Field, 14(3), 151-156. • Greenwood, C. R., Buzhardt, J., Walker, D., Howard, W. J., & Anderson, R. (2011). Program-level influences on the measurement of early communication for infants and toddlers in Early Head Start. Journal of Early Intervention, 33(2), 110-134. • Greenwood, C. R., Walker, D., & Buzhardt, J. (2010). The Early Communication Indicator (ECI) for infants and toddlers: Early Head Start growth norms from two states. Journal of Early Intervention, 32(5), 310-334. • Greenwood, C. R., Walker, D., Buzhardt, J., Howard, W. J., McCune, L., & Anderson, R. (2013). Evidence of a continuum in foundational expressive communication skills. Early Childhood Research Quarterly, 28, 540-554. • Greenwood, C. R., Walker, D., Buzhardt, J., McCune, L., & Howard, W. J. (2013). Advancing the construct validity of the Early Communication Indicator (ECI) for infants and toddlers: Equivalence of growth trajectories across two Early Head Start samples. Early Childhood Research Quarterly, 28, 743-758.
Intentional Individualization: Using IGDI 2.0 progress monitoring data to inform instruction Alisha Wackerle-Hollman, Ph.D. Division of Early Childhood Conference October 18, 2013
“Our best teachers today are using real time data in ways that would have been unimaginable just five years ago. They need to know how well their students are performing. They want to know exactly what they need to do to teach and how to teach it.” (Duncan, 2009)
Individual Growth and Development Indicators (IGDIs 2.0) – Design Principles • General Outcome Measure Approach • Treatment independent (Slavin & Madden, 2011) • Construct-Aligned and Validated for Specific Purposes • Validity as an argument (Kane, 2013) • Psychometrically Robust • Rasch modeling • Potential to match test material to student ability “window” • Increased opportunities for sensitivity to growth
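As an illustration of the Rasch modeling and ability-window matching noted above, here is a minimal sketch; the ability estimate and item difficulties are hypothetical, not actual IGDI 2.0 parameters.

```python
# Sketch of the Rasch model idea behind matching test material to a child's
# ability "window": an item is most informative when its difficulty is close
# to the child's ability estimate (p(correct) near 0.5).
import math

def rasch_p_correct(ability: float, difficulty: float) -> float:
    """Rasch model: probability of a correct response given ability and item difficulty (in logits)."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

child_ability = 0.5                                   # hypothetical ability estimate (logits)
item_difficulties = {"item_a": -1.2, "item_b": 0.4, "item_c": 1.8}   # hypothetical items

for name, b in item_difficulties.items():
    print(f"{name}: difficulty {b:+.1f}, p(correct) = {rasch_p_correct(child_ability, b):.2f}")
```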
Individual Growth and Development Indicators (IGDIs 2.0) – Content & Tasks • Four domains of early language and literacy • Oral Language • Phonological Awareness • Alphabet Knowledge • Comprehension • Five IGDI 2.0 tasks: Picture Naming, Rhyming, First Sounds, Sound Identification, and Which One Does Not Belong?
Progress Monitoring: Making Decisions • 30-item sets selected from a performance range characteristic of Tier 2/Tier 3 candidacy • Designed to be used every three weeks • Efficiency in data-based decision making and practical utility (Jenkins & Terjeson, 2011) • However, K-8 data suggest monitoring may need to be MORE frequent (Christ, Zopluoglu, Monaghen, & Van Norman, 2012); see the sketch below • Addressing responsiveness: the balancing act in the year before kindergarten • Waiting for a skill to emerge • Waiting to make instructional changes • Waiting for robust trends in performance
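The sketch below illustrates why monitoring frequency matters for trend-based decisions: under an assumed level of measurement noise, the standard error of an estimated slope shrinks as assessment occasions accumulate. The noise level and occasion counts are illustrative, not IGDI 2.0 values.

```python
# Sketch of how trend-line precision improves with more assessment occasions:
# SE(slope) = sigma / sqrt(sum((x - mean(x))^2)) for a fixed residual sd sigma.
import numpy as np

measurement_sd = 2.0   # hypothetical measurement noise (residual standard deviation)

for n_occasions in (4, 8, 16):
    x = np.arange(n_occasions, dtype=float)
    se = measurement_sd / np.sqrt(np.sum((x - x.mean()) ** 2))
    print(f"{n_occasions:2d} occasions: slope standard error ≈ {se:.2f}")
```

More occasions (or a longer monitoring window) yield a more trustworthy trend, which is the tension behind "waiting for robust trends in performance" in the year before kindergarten.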