Overview of 2010 EHC-CAPI Field Test and Objectives
• Jason Fields
• Housing and Household Economic Statistics Division
• US Census Bureau
• Presentation to the ASA/SRM SIPP Working Group
• November 17, 2009
“Re-SIPP” Development
• Following successful completion of the EHC Paper Field Test
• Develop the 2010 plan to test an electronic EHC instrument
• Broad involvement across Census Bureau
  - DID, FLD, TMO, DSD, HHES, DSMD, SRD
Primary Goals of 2010 Test
(1) Strong evidence of comparable data quality
  - How well do the calendar year 2009 data from the 2010 EHC-CAPI Field Test match data from the 2008 SIPP panel?
  - Especially for income transfer programs
(2) Strong evidence to guide development and refinement before implementation in 2013 as the production SIPP instrument
Basic Design Features (1)
• 8,000 Sample Addresses
  - could have been larger!
  - enough sample and budget to support research and field activities
• “High Poverty” Sample Stratum
  - to evaluate how well income transfer program data are collected
• State-Based Design
  - likely (possible?) access to administrative records
Basic Design Features (2)
• Field Period: Early Jan - mid March 2010
  - collect data about calendar year 2009
• Field Representative training in Dec/Jan
  - goal: minimize # of FRs with post-training “down-time”
  - evaluation and improvement of training
• Use FRs with a wide range of experience
• Expand RO involvement
Research Agenda
1. Quantify likely cost savings
2. Test the data processing system
3. Evaluate data quality
4. Evaluate “field support” materials
5. Evaluate FR training
6. Identify & document instrument “bugs”
7. Identify “interview process” issues
8. Identify usability issues (esp. EHC)
HOW CAN WE IMPROVE FOR 2013?
Special Methods
1. Quantify likely cost savings
  - new cost code(s) established
  - timing of interview length
  - trade-off between 12-month recall and 3 interviews per year
Special Methods
2. Test the data processing system
  - The data collected in this test will be used to develop and test a new data processing system.
Special Methods
3. Evaluate data quality
  - administrative records