The SIPP Event History Calendar Field Test: Analysis Plans and Preliminary Report • Jeff Moore • Statistical Research Division, U.S. Census Bureau • Jason Fields • Housing and Household Economic Statistics Division, U.S. Census Bureau • ASA/SRM SIPP Working Group Meeting • September 16, 2008
Overview • Background: • - SIPP “re-engineering” • - event history calendar (EHC) methods • Goals and Design of the EHC Field Test • Evaluation Plans • Preliminary Results [not yet available]
SIPP Re-Engineering • Implement Improvements to SIPP • - reduce costs • - reduce burden • - improve processing system • - modernize instrument • - expand/enhance use of admin records • Key Design Change: Annual Interviewing
EHC Interviewing (1) • Human Memory • - structured/organized • - links and associations • EHC Exploits Memory Structure • - links between to-be-recalled events • - coherence, consistency, sequence • EHC Encourages Active Assistance to Rs
EHC Interviewing (2) • Evaluation: EHC vs. Q-List Comparisons • - various methods • - in general: positive data quality results • BUT, Important Research Gaps • - data quality for need-based programs? • - extended reference period?
Field Test Goals & Design • Basic Goal: • Can an EHC interview collect data of comparable (or better) quality than standard SIPP? • - month-level data • - one 12-month ref pd interview vs. three 4-month ref pd interviews • - especially for need-based programs • Basic Design: • EHC re-interview of SIPP sample HHs
Design Details (1) • Main Sample: • SIPP Wave 10-11-12 Interview Cases • - reported on CY-2007 via SIPP [Fig. 1] • Supplemental Sample: • SIPP Wave 8 Sample Cut Cases • - dropped from SIPP in 2006; “unprimed” • EHC Re-Interview in 2008, about CY-2007
Design Details (2) • Two Sites • - Illinois (all) • - Texas (4 metro areas) • N = 1,945 Addresses • - cooperating HHs in SIPP • Sample Distribution:
Design Details (3) • Administrative Records • (for some characteristics, and with R approval) • - Medicare • - Social Security retirement, disability • - SSI • - TANF • - Food Stamps • - [Medicaid?]
Design Details (4) • EHC Questionnaire [handout] • - paper-and-pencil • - 12-month, CY-2007 reference period • - selected SIPP topics (“domains”) • - start with landmark events • - within domains, anchor on “now” • - month-level (at least) detail • Sample of Addresses, Not People • - post-interview clerical match to SIPP
Design Details (5) • $40 Incentive, Non-Contingent • Same Response Rules as SIPP • - EHC interview for all adults (15+) • - self-response preferred (proxy permitted) • Field Staff: Census Bureau FRs • - most with some interview experience • - ~1/3 with SIPP experience • - 3-day training on EHC methods
Design Details (6a) • Field Period: Mid-April thru Late June 2008 • Outcomes: • - 1,627 HH interviews • - 3,318 individual EHC interviews • - 2,747 EHC Rs matched to SIPP
Evaluation Plans (1) • Compare SIPP and EHC Survey Reports • - same people • - same time period • - same characteristics • Data Quality Comparison using Admin Records • (later) • Evaluation of “Priming” Bias
Evaluation Plans (2) • Other Evaluations • - R debriefing form • - FR “case report” debriefing form • - FR debriefing focus groups • - interview observations • Focus on EHC Interview Process
Compare SIPP/EHC Reports (1a) • 2x2 Consistency Table for “Participation” (Employed? Enrolled? Insured? etc.) • - for each characteristic • - for each month of CY-2007 • - unweighted / unedited data
Compare SIPP/EHC Reports (1b) • [2x2 cells: a = SIPP yes/EHC yes; b = SIPP yes/EHC no; c = SIPP no/EHC yes; d = SIPP no/EHC no] • b=c → equivalent data quality (high if (b+c)/N ~ 0; low if (b+c)/N is large) • b>c → EHC “underreporting” (rel. to SIPP) • b<c → SIPP “underreporting” (rel. to EHC)
Compare SIPP/EHC Reports (1c) • Patterns of Consistency/Inconsistency • - b>c for most months? b<c? mixed? • - early months vs. late months?
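The monthly 2x2 tabulation on the preceding slides lends itself to a simple computation. Below is a minimal sketch (not the field test's actual evaluation code), assuming a matched SIPP/EHC person-month file with hypothetical columns person_id, month, sipp_yes, and ehc_yes (1 = participation reported, 0 = not).

```python
# Minimal sketch: monthly 2x2 SIPP-vs-EHC consistency counts for one
# characteristic, plus the off-diagonal (inconsistency) rate (b+c)/N.
import pandas as pd

def monthly_consistency(df: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for month, grp in df.groupby("month"):
        a = int(((grp["sipp_yes"] == 1) & (grp["ehc_yes"] == 1)).sum())  # both report
        b = int(((grp["sipp_yes"] == 1) & (grp["ehc_yes"] == 0)).sum())  # SIPP only
        c = int(((grp["sipp_yes"] == 0) & (grp["ehc_yes"] == 1)).sum())  # EHC only
        d = int(((grp["sipp_yes"] == 0) & (grp["ehc_yes"] == 0)).sum())  # neither
        n = a + b + c + d
        rows.append({"month": month, "a": a, "b": b, "c": c, "d": d,
                     "off_diag_rate": (b + c) / n if n else float("nan")})
    return pd.DataFrame(rows)
```

For a given month, b > c would suggest EHC underreporting relative to SIPP, b < c the reverse, and (b+c)/N the overall level of inconsistency.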
Compare SIPP/EHC Reports (2a) • Total Reported Months of “Participation” • - by Qtr / combined Qtrs / whole year
Compare SIPP/EHC Reports (2b) • Patterns of Off-Diag Clustering Across Time • - above for most Qtrs? below? mixed? • - early Qtrs vs. late Qtrs?
Compare SIPP/EHC Reports (2c) • # Reporting At Least 1 Month of Participation
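A hypothetical sketch of the per-person totals comparison, using the same assumed matched person-month file as above; the share of matched respondents reporting more, equal, or fewer participation months in the EHC summarizes off-diagonal clustering in the totals cross-tab.

```python
# Hypothetical sketch: per-person totals of reported participation months,
# SIPP vs. EHC, summarized as shares of matched respondents.
import pandas as pd

def total_months_comparison(df: pd.DataFrame) -> pd.Series:
    totals = df.groupby("person_id")[["sipp_yes", "ehc_yes"]].sum()
    diff = totals["ehc_yes"] - totals["sipp_yes"]
    return pd.Series({
        "ehc_reports_more_months": (diff > 0).mean(),
        "equal_months": (diff == 0).mean(),
        "sipp_reports_more_months": (diff < 0).mean(),
    })
```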
Compare SIPP/EHC Reports (3) • Other “Participation” Comparisons: • - ANY need-based program participation? (by month / Qtr / combined Qtrs / year) • - or ANY health insurance coverage [etc.] • - alignment/sequencing across domains (e.g., moves & jobs, employment & health insurance, etc.)
Compare SIPP/EHC Reports (4a) • Month-to-Month Transitions (yes→no; no→yes) • SIPP’s Staggered Interview Design: • - each month-pair is a “seam” for ¼ sample • - each month-pair is off-seam for ¾ sample • Compare Reporting of Transitions
Compare SIPP/EHC Reports (4b) • Seam Bias: • - too much Δ across interview “seams” • - too little Δ within a single interview • EHC Δ rates below SIPP’s (seam) and above SIPP’s (off-seam) → Improved Quality
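One way to operationalize the seam comparison is sketched below, under stated assumptions: each person-month row carries a hypothetical is_seam flag marking whether the pair (month, month+1) straddles a SIPP interview seam for that respondent's rotation group.

```python
# Sketch only: month-to-month transition rates (yes->no or no->yes) for a
# given report column ("sipp_yes" or "ehc_yes"), split by the hypothetical
# is_seam flag attached to the first month of each month pair.
import pandas as pd

def transition_rates(df: pd.DataFrame, col: str) -> pd.Series:
    df = df.sort_values(["person_id", "month"]).copy()
    df["next_status"] = df.groupby("person_id")[col].shift(-1)
    pairs = df.dropna(subset=["next_status"]).copy()
    pairs["transition"] = (pairs[col] != pairs["next_status"]).astype(int)
    return pairs.groupby("is_seam")["transition"].mean()

# Comparing transition_rates(matched, "sipp_yes") with
# transition_rates(matched, "ehc_yes"): EHC change rates below SIPP's at the
# seam and above SIPP's off-seam would point to reduced seam bias.
```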
Compare SIPP/EHC Reports (5a) • Income Amount Report Comparisons • - unemployment benefits • - disability income ($) • - workers’ comp • - Social Security ($) • - Medicare Part B deduction ($) • - TANF ($) • - Food Stamps ($) • - SSI ($) • [($) = admin records available]
Compare SIPP/EHC Reports (5b) • $$ Comparison is Less Straightforward • Continuous $$ Variable • - arbitrary definition(s) of “agreement” • - disagreements are directional • Limited to “Yes/Yes” Cases
Compare SIPP/EHC Reports (5c) • $$ Reporting Comparisons • - mean amount (EHC; SIPP; difference) • - levels of correspondence (e.g., ±5%; ±5-10%; ±10-25%; ±25-50%; >±50%) • - direction of differences ($EHC > $SIPP; $EHC = $SIPP (±1%); $EHC < $SIPP) • - timing of amount changes
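As an illustration of the correspondence-band tabulation, here is a sketch only, with hypothetical columns sipp_amt and ehc_amt holding the reported amounts for the yes/yes cases of one income source.

```python
# Illustrative sketch: bucket the relative SIPP/EHC amount difference into
# the correspondence bands listed above, for cases where both report amounts.
# The sign of `rel` gives the direction of the difference.
import pandas as pd

def correspondence_bands(df: pd.DataFrame) -> pd.Series:
    yes_yes = df[df["sipp_amt"].notna() & df["ehc_amt"].notna()]
    rel = (yes_yes["ehc_amt"] - yes_yes["sipp_amt"]) / yes_yes["sipp_amt"]
    bands = pd.cut(rel.abs(),
                   bins=[0, 0.05, 0.10, 0.25, 0.50, float("inf")],
                   labels=["<=5%", "5-10%", "10-25%", "25-50%", ">50%"],
                   include_lowest=True)
    return bands.value_counts(normalize=True).sort_index()
```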
Assessment of “Priming” (6a) • W-10-11-12 Rs Provide CY-2007 Data Twice • - first SIPP, then EHC • Are Their EHC Reports Biased? • - e.g., more accurate EHC response • - could bias field test interpretation • Control Group: W-8 Sample Cut • - last SIPP response in Jun-Sep 2006 • - “unprimed” re: CY-2007 (though not re: SIPP content)
Assessment of “Priming” (6b) • Compare Distributions for Key Characteristics • - e.g., monthly “participation” reports • - weighted (sub-sampling; attrition) • Similarity of Profiles → Extent/Nature of Priming Bias • Admin Records for Some Characteristics • - meaning of distribution differences • - may also reveal hidden quality diffs
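A rough sketch of the priming comparison, under assumptions: weighted monthly EHC participation rates for the "primed" Wave 10-11-12 respondents vs. the "unprimed" Wave 8 sample-cut respondents, with hypothetical columns group, weight, month, and ehc_yes.

```python
# Sketch only: weighted monthly EHC participation rates by priming group.
# Group labels "primed" / "unprimed" are assumptions, not the test's codes.
import pandas as pd

def weighted_monthly_rates(df: pd.DataFrame) -> pd.DataFrame:
    tmp = df.assign(w_yes=df["ehc_yes"] * df["weight"])
    sums = tmp.groupby(["month", "group"])[["w_yes", "weight"]].sum()
    rates = (sums["w_yes"] / sums["weight"]).unstack("group")
    rates["difference"] = rates["primed"] - rates["unprimed"]
    return rates
```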
Guidance, Questions, Advice… • Questions? • Thoughts/Comments...? • - on the evaluation approach? • - about additional analyses? • - about how to weigh evidence from the field test in deciding whether or not to adopt a 12-month EHC?
Thank you very much! • jeffrey.c.moore@census.gov • 301-763-4975