Scaling Progress in Early Childhood Settings (SPECS)
STEPHEN J. BAGNATO, Ed.D., NCSP
Professor of Pediatrics & Psychology
Director, Early Childhood Partnerships
Director, SPECS (PAPREKA/PEIOS)
Children's Hospital of Pittsburgh/UCLID Center
University of Pittsburgh School of Medicine
steve.bagnato@chp.edu
www.uclid.org
SPECS Program Evaluation Research Team Leaders: PAPREKA and PEIOS
Stephen J. Bagnato, Ed.D., NCSP, Director
Candace Hawthorne, Ph.D., OTR/L, Coordinator
Ilene Greenstone, MA, Coordinator
Pip Campbell, Ph.D., OTR/L, Coordinator
Assisted by Western and Eastern PA Research Teams
What is the Authentic Assessment Alternative to Conventional Testing in Early Childhood?
John T. Neisworth
Stephen J. Bagnato
John Salvia
Frances M. Hunt
Inauthentic Measurement in Early Childhood
"Much of developmental psychology [early childhood assessment] as it now exists is the science of the strange behavior of children with strange adults in strange settings for the briefest possible periods of time." (Bronfenbrenner, 1979, p. 19)
Authentic Assessment in Early Childhood
• Natural observations of ongoing child behavior in everyday settings and routines vs. contrived arrangements
• Reliance on informed caregivers (teachers, parents, team) to collect convergent, multi-source data across settings
• Curriculum-based measures linked to program goals, content, standards, & expected outcomes
• Universal design; equitable assessment content and methods
• Intra-individual child progress supplemented by inter-individual normative comparisons
• NAEYC/DEC/HS & PA DAP Assessment Standards & Practices
Are There Professional and Pennsylvania Standards for Authentic Assessment in Early Childhood/Early Intervention?
Selected Professional Standards for Early Childhood Assessment (DEC, 2004; NAEYC, 1997; HS, 2000)
• Reliance on developmental observations: ongoing observational assessments over time
• Performance on "authentic, not contrived, activities"
• Integration of assessment and curriculum
• Child progress judged against the child's own past performance, not group norms
• Choose materials that accommodate the child's special functional needs
• Use only measures that have high treatment validity
• Rely on curriculum-based measures as the foundation or "mutual language" for team assessments
• Defer a diagnosis until evaluation of a child's response to a tailored set of interventions
• Use scales with sufficient item density to detect even small increments of progress
6 "Best Practice" Criteria for Authentic Assessment in Pennsylvania's ECE Programs
• Purpose: Assess for program planning, not diagnosis or exclusion; eliminate "readiness" testing practices
• Method: "No tabletop testing"; deemphasize scores; observe and assess functional skills that link to the curriculum standards
• Context: Observe and record evidence of natural, ongoing child development and behavior in typical, everyday routines, not contrived settings
• Process: Rely on teachers/caregivers to observe and record child progress 2-3 times each year
• Standards-based: Align all assessments and their item content with ELS and curricula; link assessment with expected outcomes
• Parent Partnerships: Enable parents to have a central role in providing observational data on progress
Measures for PEIOS/PAPREKA Research: Balance of Attributes
There is a tension between research rigor and practical utility; measures were chosen on the following elements:
• Simplicity
• Authenticity
• Utility
• Evidence base
• Standards-referenced
• Functional content
• Sensitivity
The Pennsylvania Early Intervention Outcomes Study (PEIOS)
Documenting the Benefits of Early Intervention Supports in PA to Fulfill State and Federal Mandates
What are the missions, research questions and authentic measurement design for PEIOS?
PEIOS "Fast Facts"
• Aim: Document early intervention outcomes for state & federal mandates
• County agencies and MAWAs in 6 PA regions mapped to PQP
• Random selection of children (an illustrative sampling sketch follows this list)
• Collect data on entry-level functioning compared to performance at follow-up
• Use both/either an existing measure and a common functional measure across programs: ABAS II
• File reviews to code program and service intensity
• Multiple research strategies to analyze interrelationships among program intensity and child/family outcomes
• Classify outcomes by OSEP/ECO categories
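The random-selection step above is a sampling procedure; the sketch below shows one generic way to draw a reproducible within-region random sample. The roster contents, per-region sample size, and seed are hypothetical placeholders, not the actual PEIOS sampling plan.

```python
# Illustrative sketch only: stratified random selection of children,
# assuming a hypothetical roster keyed by PA region.
import random

roster = {
    "Region 1": ["child_001", "child_002", "child_003", "child_004"],
    "Region 2": ["child_101", "child_102", "child_103"],
    "Region 3": ["child_201", "child_202", "child_203", "child_204"],
}

def select_sample(roster, per_region, seed=2006):
    """Draw a simple random sample of children within each region."""
    rng = random.Random(seed)   # fixed seed so the draw is reproducible
    sample = {}
    for region, children in roster.items():
        k = min(per_region, len(children))
        sample[region] = rng.sample(children, k)
    return sample

print(select_sample(roster, per_region=2))
```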
PEIOS Outcome and Research Measures
• Child Measure [Caregiver]: Adaptive Behavior Assessment System II (ABAS; Harrison & Oakland, 2004)
• Child Measure [Caregiver]: Program-Identified Measure (BDI; DC; COR)
• Program Measure [PEIOS Team]: Program Specs (Bagnato, 2005)
• Program Measure [PEIOS Team]: Developmental Specs (Bagnato, 2005)
Adaptive Behavior Assessment System II (ABAS, 2003)
• Multi-dimensional observational & judgment-based rating scale (3-point) of functional competencies
• Ages: 0-89 years
• Early childhood forms: Parent & Teacher/Provider
• Nationally standardized: ages 0-5, n = 2,100; all forms and all ages, n = 5,270
• Norm-referenced scores: General Adaptive Composite (mean 100, SD 15); subskills (mean 10, SD 3); a metric sketch follows below
• Excellent technical research base; disability studies; aligns with DSM-IV and AAMR categories
• Psychological Corporation
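The two norm-referenced metrics named above (composite: mean 100, SD 15; subskill scaled scores: mean 10, SD 3) sit on a common z-score continuum. The sketch below only illustrates that metric relationship; actual ABAS II scores are obtained from the publisher's norm tables, and the function name is ours.

```python
# Illustrative only: converting between the two score metrics via the z-score.
# This is not how ABAS II scores are derived; those come from norm tables.

def composite_from_scaled(scaled_score, scaled_mean=10.0, scaled_sd=3.0,
                          comp_mean=100.0, comp_sd=15.0):
    """Map a scaled-score value onto the composite metric via its z-score."""
    z = (scaled_score - scaled_mean) / scaled_sd
    return comp_mean + comp_sd * z

# A subskill scaled score of 7 (1 SD below the mean) sits at the same point
# of the distribution as a composite score of 85.
print(composite_from_scaled(7))   # -> 85.0
```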
ABAS Rating Format
0……..Is not able [can't; too young; physical limits]
1……..Never/almost never when needed [prompts]
2……..Sometimes when needed [with/out help]
3……..Always/almost always when needed [before]
G……..Check if you guessed
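A minimal sketch of how the 0-3 rating format above could be encoded for data entry, assuming a hypothetical item record of (rating, guessed) pairs; real scoring follows the ABAS II manual, and this only totals ratings and counts guessed responses.

```python
# Hypothetical data-entry encoding of the 0-3 ABAS rating format shown above.

RATING_LABELS = {
    0: "Is not able",
    1: "Never/almost never when needed",
    2: "Sometimes when needed",
    3: "Always/almost always when needed",
}

def raw_domain_score(item_ratings):
    """Sum 0-3 ratings for one domain and count items the rater guessed on."""
    total, guessed = 0, 0
    for rating, was_guessed in item_ratings:
        if rating not in RATING_LABELS:
            raise ValueError(f"rating must be 0-3, got {rating}")
        total += rating
        guessed += int(was_guessed)
    return total, guessed

# Example: five Self-Care items, one of them guessed.
print(raw_domain_score([(3, False), (2, False), (2, True), (1, False), (3, False)]))
```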
ABAS Domains: Parent Form (241); Teacher (216) • Communication • Community Use • School Living/Home Living • Functional Pre-Academics • Health & Safety • Leisure • Self-Care • Self-Direction • Social • Motor
ABAS Disability Research: Clinical and Matched Control Samples • Mental Retardation • Developmental Delay • Biological Risk Factors (e.g., prematurity; drugs) • Motor and Physical Impairments • Language Disorders • Autism and PDD • Learning Disability • ADHD • Alzheimer’s Disease • Neuropsychological Disorders • Behavior and Emotional Disorders • Deaf and Hard of Hearing
PEIOS Program Evaluation Research & Measurement Model
• January 2006: Identify and train PEIOS program evaluation liaisons
• January-February 2006: Random selection of children
• January-February 2006: Train liaisons, teachers, and others on use of the ABAS II (if chosen)
• January-April 2006: Collect EI-entry child data
• January-April 2006: Conduct file reviews to document program intensity
• May 2006: Collect child progress data using the ABAS II or program-chosen measure (each September and May)
[Conceptual model diagram relating Time in Intervention, Program Intensity, and Child & Family Progress Outcomes]
PEIOS Longitudinal Repeated-Measures Regression Research Design and Timeline
[Design chart: Regions 1-3, with September and May administrations of the ABAS II, Program Scale, D-SPECS, and P-SPECS]
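The design chart implies repeated fall and spring measurements of the same children across regions. The sketch below shows one generic way such a repeated-measures regression could be fit, with a random intercept per child and fixed effects for time in intervention and program intensity; the data file and column names are hypothetical, and this is not the project's actual analysis code.

```python
# Hypothetical long-format analysis sketch (one row per child per occasion),
# with assumed columns: child_id, months_in_program, intensity, gac.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("peios_long.csv")   # hypothetical file name

# Random intercept for each child; fixed effects for time in intervention,
# program intensity, and their interaction.
model = smf.mixedlm("gac ~ months_in_program * intensity",
                    data=df, groups=df["child_id"])
result = model.fit()
print(result.summary())
```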
What are the federal OSEP/ECO outcome indicators and reporting timelines for PEIOS?
OSEP/ECO Child Outcome Indicator Domains
• Positive social-emotional skills (including social relationships)
• Acquisition and use of knowledge and skills (including early language/communication; early literacy)
• Use of appropriate behaviors to meet one's needs
Domains, sub-domains, and item content of measures are mapped to these integrated functional areas (an illustrative crosswalk follows below).
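The mapping mentioned above can be represented as a simple crosswalk from measure domains to the three indicators. The pairings below are illustrative placeholders for that idea only, not the alignment actually used by PEIOS/PAPREKA.

```python
# Illustrative domain-to-indicator crosswalk; the pairings are assumptions.
CROSSWALK = {
    "Positive social-emotional skills": ["Social", "Leisure"],
    "Acquisition and use of knowledge and skills": [
        "Communication", "Functional Pre-Academics"],
    "Use of appropriate behaviors to meet needs": [
        "Self-Care", "Health & Safety", "Self-Direction"],
}

def indicators_for(domain):
    """Return the OSEP/ECO indicators a given measure domain is mapped to."""
    return [ind for ind, domains in CROSSWALK.items() if domain in domains]

print(indicators_for("Communication"))
```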
OSEP/ECO Child Outcome Indicator Metrics
• % of children who reach or maintain a functioning level comparable to same-age peers
• % of children who improve functioning toward same-age levels
• % of children who did not improve functioning
• % of children maintaining their own rate and preventing regression
• % of children showing specific curricular skill improvements compared to their own previous skill levels
• % of children whose developmental progress profiles exceed their own pre-intervention (maturational) expectations and those of their local EI peer group (IEI; CEI; PCI; HLM-EAPS)
A sketch of computing these percentage metrics follows below.
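A minimal sketch of how entry and exit scores could be turned into the percentage metrics listed above. The 85-point cutoff for "comparable to same-age peers" (1 SD below a 100, 15 composite mean) and the category rules are simplified assumptions for illustration, not OSEP's official definitions.

```python
# Illustrative classification of children into progress categories from
# hypothetical entry/exit composite scores.

PEER_CUTOFF = 85   # assumed cutoff for "comparable to same-age peers"

def classify(entry_score, exit_score):
    """Assign one child to an illustrative progress category."""
    if exit_score >= PEER_CUTOFF:
        return "reached/maintained same-age functioning"
    if exit_score > entry_score:
        return "improved toward same-age levels"
    if exit_score == entry_score:
        return "maintained own rate"
    return "did not improve"

def percentages(score_pairs):
    """Percentage of children falling in each category."""
    counts = {}
    for entry_score, exit_score in score_pairs:
        cat = classify(entry_score, exit_score)
        counts[cat] = counts.get(cat, 0) + 1
    n = len(score_pairs)
    return {cat: round(100 * c / n, 1) for cat, c in counts.items()}

print(percentages([(70, 88), (72, 80), (90, 92), (65, 65), (75, 70)]))
```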
Statistical Impact of ECI on Child Progress Exceeding Maturation after 31 Months of Programming
[Pooled HR/DD groups; n = 104; p < .001; gain from the 48th to the 68th percentile; 95% confidence interval]
OSEP/ECO Family Outcome Indicators [Draft Reconciliation]
Based on ratings on the Family Outcomes Survey (in development), families:
• Understand their child's strengths, abilities, and special needs
• Know their rights and advocate effectively for their children
• Help their children develop and learn
• Have support systems
• Access desired services, programs, and activities in their community
Timelines for State Reporting of Child Outcome Data to OSEP (Recent Report from OSEP/ECO National Meeting, Washington, DC, 1/12-13/06)
• December 2005 SPP: Measurement plan submitted
• February 2007 APR: Report on EI-entry child data only; no progress data required
• February 2008 APR: 1st progress report
• February 2009 APR: 2nd progress report
• February 2010 APR: 3rd progress report
What is the collaborative model for training and implementation in PEIOS and PAPREKA?
[Organizational chart: Director; PAPREKA Coordinators (Western PA, Eastern PA); Research Systems Data Manager; Evaluation Assistants; Statistician; Programmer; Regional PA Consultant]
[Partnership chart: Coordinators; County MAWA/School District Partnership administrators; Liaison; SPECS Team; Regional Consultant; Teachers/Caregivers; Evaluation Assistant]
The Pennsylvania Pre-Kindergarten Analysis (PAPREKA)
A 4-Year Independent Program Evaluation Research Collaborative to Document the Impact and Outcomes of Partnership for Quality Pre-Kindergarten (PQP)
SPECS Pennsylvania Early Childhood Intervention Outcome Studies
www.uclid.org; Early Childhood Partnerships: SPECS
• Heinz Pennsylvania Early Childhood Initiatives (ECI) (1997-present) (Bagnato et al., 2002; 2005)
• Pennsylvania Pre-Kindergarten Analysis (PAPREKA; 2005-2009) (Bagnato et al., 2005)
• Pennsylvania Early Intervention Outcomes Study (PEIOS; 2005-2008) (Bagnato et al., 2005)
• TRACE Center of Excellence for Early Childhood Assessment (2002-2007; Dunst, Trivette, Bagnato)
• The Efficacy of a Direct Instruction Add-On to a DAP Curriculum in 4KIDS at Braddock (2005-2007)
• Pennsylvania Preschool Integration Initiative (PAPII; 1989-1993) (Bagnato & Neisworth, 1993)