
Initial Plans for the Re-engineered SIPP 2010 Electronic Prototype Field Test


Presentation Transcript


  1. Initial Plans for the Re-engineered SIPP 2010 Electronic Prototype Field Test Jason M. Fields, Housing and Household Economic Statistics Division, US Census Bureau Presented at the ASA/SRM SIPP Working Group September 16, 2008 Alexandria, VA

  2. Timeline for SIPP Development
  [Timeline chart, September 2007 through January 2013; milestones shown on the chart:]
  • SIPP 2004 Panel: data gap; SIPP 2004 Panel data release
  • SIPP 2008 Panel: Waves 1-10 collection, then Waves 11-13; SIPP 2008 Panel Waves 1-13 data release
  • 2008 paper EHC evaluation analysis
  • 2009 SIPP re-engineering: instrument development, systems tests and preparation, field activities, processing and evaluation
  • 2009 re-engineered SIPP automated prototype reference period and field activities; second automated prototype reference period
  • 2012/13 SIPP re-engineering: instrument refinement, systems tests and preparation, field activities
  • 2013 re-engineered SIPP reference period

  3. Re-engineered SIPP Instrument
  • Survey instrument
     • Designed for annual administration
     • Plan to continue to follow movers
     • Significant reduction in feedback compared with 2004
     • Programmed in Blaise and C#
  • Calendar
     • Learning from the experience of past designs, integrating the calendar more closely with Blaise, and using a single Blaise database to store the data (illustrative sketch below)
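A minimal C# sketch of the "single data store" idea on this slide: the C# event history calendar writes its spells into the same per-person record that the Blaise items populate, rather than into a separate file. All type and member names below are illustrative assumptions, not actual instrument or Blaise code.

    using System;
    using System.Collections.Generic;

    // One spell collected by the calendar (domain, dates, value) - hypothetical shape.
    public record Spell(string Domain, DateTime Start, DateTime? End, string Value);

    public class PersonRecord
    {
        // Responses captured by the Blaise items.
        public Dictionary<string, string> BlaiseItems { get; } = new();

        // Spells captured by the C# event history calendar, stored alongside them.
        public List<Spell> CalendarSpells { get; } = new();
    }

    public static class SingleStoreDemo
    {
        public static void Main()
        {
            var person = new PersonRecord();
            person.BlaiseItems["EDUC_ATTAIN"] = "High school graduate";   // invented item name
            person.CalendarSpells.Add(
                new Spell("Residence", new DateTime(2009, 1, 1), null, "January address"));
            Console.WriteLine($"{person.CalendarSpells.Count} spell(s) stored with the Blaise data.");
        }
    }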

  4. Stakeholders and Content Review
  • Responding users indicated a broad need for most of the SIPP core content; about 40 stakeholders completed the content matrix
  • Select areas were added based on lost topical module content
  • Held five subject-area meetings to discuss specific content (health, general income/government programs, assets and wealth, labor force, demographics and other items)

  5. Content Crosswalk
  • What was gained and lost between SIPP 2004 and Re-SIPP?
  • Frequency of topical content – 2004 vs. Re-SIPP
  • Blaise demographics
     • Time-of-interview information
     • Collected for the whole household at once, similar to the 2004 panel
  • EHC
     • Launches directly from Blaise once the interviewer chooses which household member will be interviewed next
  • Post-EHC Blaise items
     • Person level: the person responding to the EHC section continues through the Blaise items before the next person is selected, returning to the EHC launch (a sketch of this flow follows below)
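A minimal C# sketch of the interview flow described on this slide: household demographics first, then for each selected member the EHC is launched from Blaise, followed by that person's post-EHC Blaise items, before returning to select the next person. The method names are invented for illustration and do not correspond to actual Blaise calls.

    using System;
    using System.Collections.Generic;

    public static class InterviewFlow
    {
        public static void Run(List<string> householdMembers)
        {
            CollectHouseholdDemographics(householdMembers);     // whole household at once

            var remaining = new Queue<string>(householdMembers);
            while (remaining.Count > 0)
            {
                // The interviewer chooses who is next; here we simply dequeue.
                string person = remaining.Dequeue();
                LaunchEventHistoryCalendar(person);   // C# EHC launched from Blaise
                CollectPostEhcBlaiseItems(person);    // person-level Blaise items
            }                                         // then back to the EHC launch point
        }

        static void CollectHouseholdDemographics(List<string> members) =>
            Console.WriteLine($"Demographics for {members.Count} household members");

        static void LaunchEventHistoryCalendar(string person) =>
            Console.WriteLine($"EHC for {person}");

        static void CollectPostEhcBlaiseItems(string person) =>
            Console.WriteLine($"Post-EHC Blaise items for {person}");

        public static void Main() => Run(new List<string> { "Person 1", "Person 2" });
    }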

  6. EHC Sections
  • Landmark events
     • NEW – Significant life events collected to assist recall across the 12-month reference period
  • Residence history – Migration History
     • Detailed address information for up to 5 spells per person; tenure status; living quarters type; public housing subsidy/voucher information for each address not owned; month/year began living in the residence recorded for the January residence; migration information from the previous residence to the January address (illustrative spell structure sketched below)
  • Marriage and cohabitation – Marital History TM
     • Marital status collected in spell format (up to 3 spells for each household member 15 and over); spouse pointer; year married for the marriage identified for January; number of times married; year of first marriage; presence of a cohabiting partner for those not married and not living with a spouse; partner pointer
     • Added – Marital status collected in spells, marital history items, cohabitation pointers
     • Removed – Month/year widowed/divorced
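A minimal C# sketch of a residence-history spell as listed above, with the 5-spell cap per person. The field names and the cap check are illustrative assumptions; the presentation does not show the actual Blaise/EHC field layout.

    using System;
    using System.Collections.Generic;

    public class ResidenceSpell
    {
        public string Address { get; set; } = "";
        public string TenureStatus { get; set; } = "";        // owned, rented, etc.
        public string LivingQuartersType { get; set; } = "";
        public bool? PublicHousingOrVoucher { get; set; }     // asked only if not owned
        public int? MoveInMonth { get; set; }                 // recorded for the January residence
        public int? MoveInYear { get; set; }
    }

    public class ResidenceHistory
    {
        public const int MaxSpells = 5;                       // up to 5 spells per person
        public List<ResidenceSpell> Spells { get; } = new();

        public void Add(ResidenceSpell spell)
        {
            if (Spells.Count >= MaxSpells)
                throw new InvalidOperationException("Only 5 residence spells are collected.");
            Spells.Add(spell);
        }
    }

    public static class ResidenceDemo
    {
        public static void Main()
        {
            var history = new ResidenceHistory();
            history.Add(new ResidenceSpell { Address = "123 Main St", TenureStatus = "Rented",
                                             LivingQuartersType = "Apartment",
                                             PublicHousingOrVoucher = false,
                                             MoveInMonth = 1, MoveInYear = 2008 });
            Console.WriteLine($"{history.Spells.Count} residence spell(s) recorded.");
        }
    }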

  7. EHC Sections
  • Presence of mom and dad
     • Identification of the respondent's parents and whether or not they live in the household; type of parent/child relationship
     • Added – This is in addition to the relationships recorded as of the interview date
  • School enrollment – Education
     • Spells of education enrollment; type and grade level of the spell; full-time or part-time enrollment
     • Added – Type of school (public, private, or home)
     • Removed – Working-towards-a-degree question, educational expenses, educational assistance

  8. EHC Sections
  • Labor force – Labor Force 1 and Labor Force 2
     • Employment spells for up to 5 employers (3 spells allowed for each employer within the reference year)
     • Summary information for any additional employers/contingent work (in line 6)
     • Added – Allows 5 employers with up to 3 employment spells each; up to 6 no-job spells, with up to 4 stretches identifiable as time looking for work; summary information for all other employers (in line 6); address of employers (illustrative structure sketched below)
     • Moved to another section – Disability, retirement, severance pay, rollovers, lump-sum payments
     • Removed – Years-in-line-of-work question; questions on employment assistance from a state or local welfare office
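A minimal C# sketch of the labor-force spell limits listed on this slide: up to 5 employers with up to 3 employment spells each, up to 6 no-job spells of which up to 4 stretches can be marked as looking for work, plus a summary line (line 6) for any additional employers. The class structure is an illustrative assumption, not the instrument's data model.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class EmployerLine
    {
        public string EmployerName { get; set; } = "";
        public List<(int StartMonth, int EndMonth)> Spells { get; } = new();   // up to 3 per employer
    }

    public class LaborForceSection
    {
        public List<EmployerLine> Employers { get; } = new();                  // lines 1-5
        public string? AdditionalEmployerSummary { get; set; }                 // line 6 summary
        public List<(int StartMonth, int EndMonth, bool LookingForWork)> NoJobSpells { get; } = new();

        public bool IsWithinLimits() =>
            Employers.Count <= 5 &&
            Employers.All(e => e.Spells.Count <= 3) &&
            NoJobSpells.Count <= 6 &&
            NoJobSpells.Count(s => s.LookingForWork) <= 4;
    }

    public static class LaborForceDemo
    {
        public static void Main()
        {
            var section = new LaborForceSection();
            var job = new EmployerLine { EmployerName = "Employer A" };
            job.Spells.Add((1, 6));                     // one employment spell, January-June
            section.Employers.Add(job);
            section.NoJobSpells.Add((7, 12, true));     // looking for work July-December
            Console.WriteLine($"Within collection limits: {section.IsWithinLimits()}");
        }
    }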

  9. EHC Sections
  • Programs – General Income 1 and General Income 2
     • Up to 3 spells of receipt for SSI, Food Stamps, TANF (pass-through child support), general assistance, and WIC
     • For each program: start year if outside the reference year, reason(s) for receipt, owner/beneficiaries, reasons for stopping, state/federal receipt, amount and amount changes
     • Added – Multiple spells of receipt
  • Health insurance – Health Insurance
     • Two private health insurance providers (up to 3 spells each); Medicare (up to 3 spells); Medicaid (up to 3 spells); military health insurance (up to 3 spells); other health insurance (up to 3 spells); up to 6 spells of no coverage; type of private, military, and other coverage; owner; other household members covered; cost (for private coverage); reason for no coverage (a sketch of deriving monthly coverage from spells follows below)
     • Added – Spells of no coverage
     • Removed – Year of coverage start, coverage of non-household members, specific employer providing coverage
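A minimal C# sketch of how health-insurance spells like those above can be turned into a month-by-month coverage indicator for a 12-month reference year. The real processing system is not described at this level of detail in the presentation; this only illustrates how spell data supports monthly estimates.

    using System;
    using System.Linq;

    public static class CoverageMonths
    {
        // Spells are (startMonth, endMonth) pairs, 1..12, inclusive.
        public static bool[] ToMonthlyFlags((int Start, int End)[] coverageSpells)
        {
            var covered = new bool[12];
            foreach (var (start, end) in coverageSpells)
                for (int m = start; m <= end; m++)
                    covered[m - 1] = true;
            return covered;
        }

        public static void Main()
        {
            // Example: covered January-March and June-December -> 2 uncovered months.
            var flags = ToMonthlyFlags(new[] { (1, 3), (6, 12) });
            Console.WriteLine($"Months without coverage: {flags.Count(c => !c)}");
        }
    }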

  10. Pause for a quick demo, if time allows

  11. Re-engineered SIPP Electronic Prototype Field Test Objectives
  • Systems testing
     • The production Case Management and Regional Office Survey Control systems will be evaluated, as they will be required to handle additional log and program files
  • Training development
     • The training used in the paper field test will be modified, based on lessons learned, and applied to the electronic test
     • Evaluations will consist of focus group debriefings as well as summary evaluations with interviewers and trainers

  12. Re-engineered SIPP Electronic Prototype Field Test Objectives
  • Field data collection evaluation
     • As in the paper test, we will evaluate reactions to the interview with a sample of respondents and interviewers; headquarters staff will collect evaluation information as observers; and focus groups will be conducted with the field representatives involved in the project soon after field data collection concludes
  • Processing development
     • The data collected will be the foundation for processing system development, enabling systems and edits to be tested and evaluated before going into production
  • Wave 2+ instrument development
     • The test will provide the requirements and information necessary to develop the dependent interviewing systems for wave 2+ interviewing

  13. Re-engineered SIPP Electronic Prototype Field Test Objectives
  • Content evaluation
     • Addresses the primary concern voiced by most stakeholders: how comparable are the estimates, patterns, and relationships in the data collected with the re-engineered SIPP instrument to those collected with the traditional SIPP data collection procedures? (A sketch of a simple weighted comparison follows below.)
  • Key estimates from the EHC
     • Program receipt among the low-income population – Food Stamps as a key program – estimates and coverage units
     • Social Security receipt and estimates, and the ability to provide necessary inputs to stakeholders' models
     • Health insurance coverage – patterns of uninsurance, relationships between public and private insurance, and coverage units
     • Poverty status during the reference period – ability to examine specific populations and transitions into and out of poverty
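A minimal C# sketch of the kind of comparison this slide calls for: a weighted receipt rate (e.g., Food Stamps) computed the same way from the electronic prototype and from the 2008 panel subsample, then differenced. The weights and values are invented placeholders; no real estimates are implied, and the actual evaluation methodology is an open question posed later in the presentation.

    using System;
    using System.Linq;

    public static class EstimateComparison
    {
        // Weighted share of persons reporting receipt.
        public static double WeightedRate((double Weight, bool Receives)[] persons) =>
            persons.Sum(p => p.Receives ? p.Weight : 0.0) / persons.Sum(p => p.Weight);

        public static void Main()
        {
            var prototype = new[] { (1.0, true), (2.0, false), (1.5, true) };   // placeholder data
            var panel2008 = new[] { (1.2, true), (1.8, false), (1.5, false) };  // placeholder data

            double ratePrototype = WeightedRate(prototype);
            double ratePanel = WeightedRate(panel2008);
            Console.WriteLine($"Prototype: {ratePrototype:P1}, 2008 panel: {ratePanel:P1}, " +
                              $"difference: {ratePrototype - ratePanel:P1}");
        }
    }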

  14. Re-engineered SIPP Electronic Prototype Field Test Objectives
  • Content evaluation – Key estimates from Blaise topic sections
     • The measurement of assets and wealth, and thereby eligibility for various government programs
     • Disability status with a new sequence of questions
     • Medical expenses and utilization – estimates of medical out-of-pocket (MOOP) spending
     • Child and adult well-being
     • Work and commuting expenses by job, and how these could be applied to alternative poverty estimates
     • Annual program receipt and lump-sum payments
     • Evaluate content to develop and refine edit and recode specifications in advance of implementing the production instrument

  15. Re-engineered SIPP Electronic Prototype Field Test Sample
  • Sample of 5,000 or more households (budget dependent)
  • Selected from the same frame as the current SIPP 2008 panel
  • Sample focused in selected areas with higher-than-average poverty rates – financially more efficient than a national sample
  • Sub-select similar cases from the SIPP 2008 panel to match the electronic prototype field test sample geographically and by poverty area
  • Ability to weight both samples comparably for monthly weights and the 2009 reference year (an illustrative ratio-adjustment sketch follows below)
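A minimal C# sketch of one way to make the two samples comparable: a simple ratio adjustment that scales each sample's weights within a stratum (e.g., a poverty area) to a common control total. The actual SIPP weighting procedures are far more involved and are not specified in the presentation; this only illustrates the idea of weighting both samples to the same targets.

    using System;
    using System.Linq;

    public static class RatioAdjust
    {
        // Scale base weights so they sum to the stratum's control total.
        public static double[] Adjust(double[] baseWeights, double controlTotal)
        {
            double factor = controlTotal / baseWeights.Sum();
            return baseWeights.Select(w => w * factor).ToArray();
        }

        public static void Main()
        {
            double[] prototypeWeights = { 120, 95, 210 };   // placeholder base weights
            double[] panelWeights = { 80, 150, 160 };       // placeholder base weights
            const double stratumTarget = 500.0;             // common control total for the stratum

            Console.WriteLine(string.Join(", ",
                Adjust(prototypeWeights, stratumTarget).Select(w => w.ToString("F1"))));
            Console.WriteLine(string.Join(", ",
                Adjust(panelWeights, stratumTarget).Select(w => w.ToString("F1"))));
        }
    }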

  16. ASA-SRM Evaluation Recommendations – Questions
  • What makes a successful test?
     • Instrument collects data and cases are returned with data
     • Usability in the field by FRs and regional office systems
     • Respondents and FRs are able to navigate and complete the instrument
  • What are the key content characteristics that indicate success or failure?
  • What are the indications that would dictate pursuing an alternative instrument?

  17. ASA-SRM Evaluation Recommendations – Questions
  • What are the key comparisons?
     • We believe the primary comparisons will come from the 2008 SIPP panel – waves 2-5, collected during the same 2009 reference year
     • Are there additional comparisons?
  • What methodology would you recommend as most informative for evaluating the level of differences between these data sources?
     • Which differences should be considered acceptable?
     • Which comparisons are most meaningful?

  18. ASA-SRM Evaluation Recommendations – Questions
  • What about using our imperfect standard as a metric?
     • SIPP has known problems with sample attrition, reporting inconsistencies within households across a calendar year, seam bias, and idiosyncrasies in measurement (an illustrative seam-bias metric is sketched below)
     • Maybe different is good?
     • How can an evaluation against an imperfect standard indicate the success or failure of the new method?
  • What are your recommendations on:
     • how to evaluate the test?
     • the metrics of success?
     • the study design in light of these issues?
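A minimal C# sketch of one way to quantify the seam bias mentioned above: the share of reported month-to-month status changes that fall on the "seam" months where a new wave's reference period begins. The evaluation design is an open question in the presentation; the metric, month numbering, and data below are illustrative assumptions only.

    using System;
    using System.Linq;

    public static class SeamBias
    {
        // monthlyStatus: one value per month (months numbered from 1);
        // seamMonths: months where a new wave's reference period begins.
        public static double SeamTransitionShare(int[] monthlyStatus, int[] seamMonths)
        {
            int transitions = 0, atSeam = 0;
            for (int m = 1; m < monthlyStatus.Length; m++)
            {
                if (monthlyStatus[m] == monthlyStatus[m - 1]) continue;   // no change into month m+1
                transitions++;
                if (seamMonths.Contains(m + 1)) atSeam++;                 // change falls on a seam month
            }
            return transitions == 0 ? 0.0 : (double)atSeam / transitions;
        }

        public static void Main()
        {
            int[] status = { 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0 };   // 12 months, two changes
            int[] seams = { 5, 9 };                                   // hypothetical wave seams
            Console.WriteLine($"Share of transitions at the seam: {SeamTransitionShare(status, seams):P0}");
        }
    }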
