Data capturing strategies used in Istat to improve quality


  1. Data capturing strategies used in Istat to improve quality Conference of European Statisticians Work session on statistical data editing (Bonn, 25-27 September 2006) 'Editing nearer the source' session Rossana Balestrino, Stefania Macchia, Manuela Murgia ISTAT – Italian National Statistics Bureau Rome, Italy balestri@istat.it, macchia@istat.it, murgia@istat.it

  2. CASIC techniques were introduced at Istat in the 1980s → CATI and CAPI were adopted first • nearly one decade later, CASI was taken into consideration • CATI/CAPI already offer mature and well-tested solutions, so they have a higher rate of consolidation • CASI techniques are younger and more dependent on the continuous evolution of IT solutions and network tools

  3. In Istat, for all the techniques: • the internal demand shows an increasing trend • experience has taught that it is important that Istat plays a very active role and keeps at least the design and monitoring phases of the process inside the Institute, in order to obtain standard solutions driven by quality requirements and enriched with suggestions coming from previous results

  4. Strategies for CATI and CAPI surveys • Strategies for CASI

  5. CATI and CAPI advantages • reduction of the costs and time necessary to have data ready to be processed (Groves et al. 2001) • help in preventing non-sampling errors, through the management of vast consistency plans during the interviewing phase (CAPI is not as widely used as CATI in Istat, because it is more expensive)

  6. Organisation for CATI surveys: the content of the survey, made clear in the questionnaire, is designed in Istat, while private companies are charged with the entire data collection procedure.

  7. Frequent problems encountered with this organisation Private companies: • had never before faced the development of electronic questionnaires as complicated in terms of skip and consistency rules between variables • had never put into practice strategies to prevent and reduce non-response errors • did not have at their disposal a robust set of indicators to monitor the interviewing phase.

  8. New organisation for CATI surveys: the in-house strategy It consists in relying on a private company for the call centre, the selection of the interviewers and the carrying out of the interviews, but in giving it the entire software procedure, developed in Istat, to manage the data capturing phase: • calls scheduler • electronic questionnaire • set of indicators to monitor the interviewing phase

  9. In-house strategy: the software procedure It integrates different software packages, but the core is developed with the Blaise system (produced by Statistics Netherlands and already used by many national statistical administrations for data capturing carried out with different techniques)

  10. Quality-oriented procedure planning Quality standards have been defined for: • the data capturing phase • the monitoring phase • the secure transmission of data

  11. Standards for the data capturing phase • the layout of the electronic questionnaire → to reduce the 'segmentation effect' • the customisation of the questions' wording → to make the interview friendlier and the questions easier to answer (a minimal sketch of this customisation follows below) • the management of errors → to prevent all possible types of error without increasing the respondent burden, while making the interviewers' job easier
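A minimal sketch of the question-wording customisation idea, not taken from Istat's Blaise questionnaires: answers already collected during the interview are inserted into the text of later questions so that they read naturally. The question text and field names below are hypothetical.

```python
# Minimal sketch (not Istat's Blaise code): customising the wording of a question
# with answers already collected during the interview, so that it reads naturally.
# The question text and field names below are hypothetical.

def customise(question_template: str, answers: dict) -> str:
    """Fill the placeholders in a question text with previously given answers."""
    return question_template.format(**answers)

answers = {"first_name": "Maria", "diploma_year": 2001}
question = ("So, {first_name}, in which month of {diploma_year} "
            "did you obtain your diploma?")

print(customise(question, answers))
# -> So, Maria, in which month of 2001 did you obtain your diploma?
```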

  12. Standards for the data capturing phase • the control of the data against information from previous surveys or administrative archives → to improve the quality of the collected data • the assisted coding of textual answers → to improve the coding results and to speed up the coding process (a minimal sketch follows below) • the scheduling of contacts → to enhance the interviewers' productivity and to avoid distorting the probability of respondents being contacted.
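A minimal sketch of what an assisted-coding step can look like, not the actual coding module used at Istat: the interviewer types a free-text occupation and the system proposes candidate codes from a classification dictionary. The dictionary entries and codes below are invented.

```python
# Minimal sketch of assisted coding during the interview (not the Istat module).
# The classification codes and descriptions are invented for illustration.
from difflib import SequenceMatcher

OCCUPATION_DICTIONARY = {
    "2.6.3.1": "secondary school teacher",
    "2.1.1.4": "software analyst",
    "3.3.1.2": "accounting clerk",
}

def propose_codes(answer: str, top_n: int = 3):
    """Rank dictionary descriptions by string similarity to the typed answer."""
    scored = [
        (SequenceMatcher(None, answer.lower(), desc).ratio(), code, desc)
        for code, desc in OCCUPATION_DICTIONARY.items()
    ]
    return sorted(scored, reverse=True)[:top_n]

# The interviewer picks one of the proposals, so the answer is coded during the
# interview instead of being coded manually afterwards.
for score, code, desc in propose_codes("teacher in a secondary school"):
    print(f"{code}  {desc}  (similarity {score:.2f})")
```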

  13. Standards for the monitoring phase • A limited but exhaustive set of indicators to monitor the trend of contact results • Ad hoc instruments to monitor particular aspects of the survey

  14. Set of indicators to monitor the trend of contact results: n-way contingency tables, useful to keep the interviewers' productivity under control and to detect odd behaviours in assigning contact results; implemented in Visual Basic, based on an Access database, which produces Excel files (a minimal sketch of such a table follows below). Ad hoc instruments to monitor particular aspects of the survey: for example, control charts to monitor the assisted coding of textual variables (if used), like Occupation; produced with the SAS QC procedure, which generates 'control charts' for particular variables.
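A minimal sketch of the kind of contingency table meant here, not the Visual Basic/Access procedure itself: an interviewer-by-contact-result cross-tabulation used to spot odd behaviours in assigning contact results. The contact records are invented.

```python
# Minimal sketch of an interviewer-by-contact-result contingency table
# (not the Visual Basic/Access procedure described above); records are invented.
import pandas as pd

contacts = pd.DataFrame(
    {
        "interviewer": ["A01", "A01", "A02", "A02", "A02", "A03"],
        "result": ["interview", "refusal", "interview", "no answer",
                   "interview", "refusal"],
    }
)

# Two-way table: rows = interviewers, columns = contact results, with margins,
# e.g. to spot an interviewer who assigns 'refusal' far more often than colleagues.
table = pd.crosstab(contacts["interviewer"], contacts["result"], margins=True)
print(table)
```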

  15. Standards for the secure transmission of data The aim is to ensure both the secure transfer of survey data from the private company to Istat and vice versa, and the timeliness of the delivery. The daily transmission is based on a secure protocol (HTTPS) and places the data on an Istat server, INDATA, located outside the firewall and devoted to data collection (a minimal sketch of such a transfer follows below)
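A minimal sketch of a daily HTTPS transfer of this kind; the upload URL path, credentials and file name are hypothetical and not documented in the presentation.

```python
# Minimal sketch of a daily HTTPS upload to the data-collection server.
# The endpoint path, credentials and file name are hypothetical.
import requests

UPLOAD_URL = "https://indata.istat.it/upload"   # hypothetical endpoint path
AUTH = ("survey_company", "secret")             # hypothetical credentials

def send_daily_batch(path: str) -> None:
    """Upload the day's interview data over HTTPS and check the server reply."""
    with open(path, "rb") as batch:
        response = requests.post(UPLOAD_URL, files={"data": batch}, auth=AUTH)
    response.raise_for_status()   # fail loudly if the transfer did not succeed

send_daily_batch("interviews_2006-09-25.zip")
```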

  16. Surveys which used the in-house strategy

  17. Surveys which used the in-house strategy Characteristics of the questionnaires

  18. Checking rules in the data capturing phase with the in-house strategy The number of checking rules included in the data capturing phase (together with the number of variables) is surely a significant indicator of the complexity of the survey questionnaire. This complexity has not negatively affected the response and refusal rates because:

  19. • the trade-off between the quality of the data and the fluency of the interview has been taken into consideration • different treatments of the rules to detect errors have been implemented

  20. The trade-off between the quality of the data and the fluency of the interview The consistency plans included in the electronic questionnaires comprised a great part, even if not all, of the rules belonging to the edit and imputation plans → avoiding, during the interview, a too frequent display on the PC screen of a dialog window asking for confirmation of the given answer (including the complete edit plan in the data capturing phase would have guaranteed a high quality of the answers, but would have definitely burdened the respondent and the interviewer, thus increasing the interruption rate)

  21. Different treatments of the rules to detect errors • 'hard mode': it is not possible to go on with the interview without solving the error • 'soft mode': the respondent can confirm his 'inconsistent response' without compromising the completion of the interview (a minimal sketch of the two modes follows below)
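A minimal sketch of the hard/soft distinction, not Blaise syntax: a hard rule blocks the interview until the error is solved, while a soft rule only warns and can be confirmed. The example rules and field names are hypothetical.

```python
# Minimal sketch of 'hard' and 'soft' edit rules applied during the interview.
# The rules, thresholds and field names are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class EditRule:
    message: str
    check: Callable[[dict], bool]   # returns True when the answers are consistent
    hard: bool                      # hard = must be solved, soft = may be confirmed

RULES = [
    EditRule("Age must be between 14 and 99.",
             lambda a: 14 <= a["age"] <= 99, hard=True),
    EditRule("Graduation year looks inconsistent with the year of birth.",
             lambda a: a["graduation_year"] >= a["birth_year"] + 17, hard=False),
]

def apply_rules(answers: dict) -> None:
    for rule in RULES:
        if rule.check(answers):
            continue
        if rule.hard:
            # Hard mode: the interview cannot go on until the error is solved.
            raise ValueError("HARD error: " + rule.message)
        # Soft mode: the respondent may confirm the inconsistent response.
        print("SOFT warning (can be confirmed): " + rule.message)

apply_rules({"age": 22, "birth_year": 1984, "graduation_year": 2000})
```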

  22. Performance of the in-house strategy in terms of quality Case study → two surveys • Upper secondary school graduates survey • University-to-work transition survey and perspectives Carried out in: • 2001 → old strategy • 2004 → in-house strategy

  23. 2004 and 2001 response and refusal rates

  24. Prevention of non-sampling errors • Upper secondary school graduates survey • Errors per record

  25. Prevention of non-sampling errors • Upper secondary school graduates survey • Incidence of errors on the variables Most positive result → Occupation: with the 'in-house strategy' it was coded during the interview with an assisted coding function, while with the 'external company strategy' it was manually coded after the interview. In 2001 (external company strategy) 4.92% of the raw data had to be corrected during the edit and imputation phase; in 2004 (in-house strategy) only 0.81% had to be corrected.

  26. Strategies for CATI and CAPI surveys • Strategies for CASI

  27. CASI • prototype experiences realised in the late 1990s • the current situation comprises several web sites, hosted at Istat and dedicated to the capture of survey data for approximately 30 surveys • the need emerged to design a new environment and new rules aimed at introducing more standard solutions and effective security measures.

  28. Strategy for CASI surveys To set up a cross-survey data capturing web site to be used as a single front-end for respondents to any survey: INDATA (https://indata.istat.it). This new policy, already launched, is still in progress

  29. INDATA web site: aims • To present the Institute to the outside with a homogeneous and stable public image and identity; • To guarantee the mutual identification of data sender and receiver; • To guarantee data confidentiality in the data collection phase and comprehensive security of the production environment; • To minimize the impact on the technical environment of the respondent (no software needs to be installed on the client workstation).

  30. INDATA web site: aims • To give the user feedback on the action he has carried out (confirmation e-mail); • To facilitate the monitoring of collection activities; • To facilitate internal management and contain the costs of the operational environment dedicated to data capturing.

  31. Main functions offered to users • To be informed about the survey; • To get and print forms and instructions; • To fill in electronic forms online; • To download electronic forms; • To upload forms completed offline; • To transfer any dataset in a safe way.

  32. In synthesis • Both primary data collection (single questionnaire, CSAQ = Computer Self-Administered Questionnaire) and secondary data collection (collection of datasets) are dealt with. • Primary data collection is dealt with in both online and offline mode.

  33. The INDATA web platform The platform was initiated in the late '90s with prototype applications. Present technological features: • Operating system: Linux Red Hat 2.6.9; • Web server: Apache 2.0.52; • DBMS: MySQL and Oracle 10; • Application language: PHP 5.1.2; • Authenticity certificate by Postecert; • Secure HTTP (HTTPS).

  34. INDATA architecture: requirements and constraints • Three-level architecture (WEB, APPLICATION, DB) • Secure system, safe back-end intranet • Balanced load • High level of reliability

  35. System architecture (diagram): a front end behind a firewall, with duplicated load balancers and web servers; a second firewall separating the back end, with duplicated web application servers and DB servers.

  36. Web Surveys and Directorates

  37. Electronic Questionnaire Type

  38. CSAQ and Editing Rules PDF questionnaire: the editing rules are implemented in JavaScript and comprise both range and consistency rules; the outcome of the editing activity is presented to the respondent globally, as a sequence of error messages, at the end of compilation after pressing the submit button (a minimal sketch of such batched checks follows below). EXCEL questionnaire: no editing macro is implemented, in order not to discourage the respondent with alarm messages; all cells are locked apart from the input ones; data validation in single cells and default formulas for calculated variables are available; no or minimal consistency checking is performed.
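In the PDF questionnaires the rules are JavaScript embedded in the form; the sketch below only illustrates, in Python, the logic of collecting range and consistency errors and showing them all together at submit time. The field names and thresholds are hypothetical.

```python
# Minimal sketch of batched range and consistency checks run at submit time
# (the production rules are JavaScript inside the PDF form; fields are hypothetical).

def check_on_submit(form: dict) -> list:
    errors = []
    # Range rule: the number of employees must be non-negative.
    if form["employees"] < 0:
        errors.append("The number of employees cannot be negative.")
    # Consistency rule: labour cost should not exceed total turnover.
    if form["labour_cost"] > form["turnover"]:
        errors.append("Labour cost is greater than turnover: please check.")
    return errors

# All error messages are shown to the respondent together, after the submit button.
for message in check_on_submit({"employees": 12, "labour_cost": 900, "turnover": 500}):
    print(message)
```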

  39. E-response rates for Structural Business Statistics

  40. Surveys and data capture mode

  41. Surveys and data capture mode

  42. Surveys and data capture mode

  43. Thanks
