Does mode matter? Comparing response burden and data quality of a paper and an electronic business questionnaire. Deirdre Giesen, Statistics Netherlands. Presentation for QUEST, Ottawa, April 24th-26th 2007
Outline • Pilot electronic Structural Business Survey (eSBS) • Methods used for the evaluation • Main results and conclusions • Discussion
Sources and methods used • Telephone interviews with early respondents and respondents with doubtful data (N=17) • Retrospective interviews and observations on location (N=8) • Audit trails • Website usage data • Analyses of unit and item non-response • Inbound call centre (helpdesk) information
Results: Do respondents accept the e-form? • (situation November 2006) • Only 6% actively ask for the paper version • Reasons for requesting the paper version (n=232) • 31% “prefers paper” • 27% download problems • 18% no internet • 9% no computer • 7% not enough knowledge about computers • 7% configuration not suited (incl. Apple)
Results: response rates • (situation November 2006)
Results: problems with downloading and installing • Hardly any requests for technical support • It is problematic that it is not evident that each downloaded questionnaire is unique • Tips and instructions on the website are hardly viewed (only about 20% open the tips file)
Results: respondent friendliness of the questionnaire • A small error in the questionnaire for temp agencies had large consequences • General impression: • Very positive reactions • Similarity with tax forms is appreciated • Easier than on paper to make corrections • Easier to find instructions • Easy to find the questionnaire • Automated counting reduces response burden
Results: interviews with respondents • Vertical scrolling is risky when the approve button is visible but the last question is not • Different presentation of related questions can cause mode effects • The calculation aid option is not visible and its use is problematic • Some fields are incorrectly defined as allowing only positive amounts • It is not obvious that changes in approved screens have to be approved again • Respondents expect more automatic checks • Routing might reduce response burden • Explanation texts should also be printable • It must be possible to submit a corrected questionnaire • Questions should be numbered
Results, data quality: unit and item response • Overall unit response was better in 2006 than in 2005, due to earlier reminders • Item non-response (INR) in the pilot groups was 58% in 2005 and 60% in 2006 (a calculation sketch follows below) • “Scroll questions” do not show higher INR • Strange outlier with high INR in 2006 for some variables in temp agencies
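The slide above reports unit response and item non-response (INR) rates without showing how they are derived. As a purely illustrative aid (not part of the original presentation), the sketch below shows one common way to compute both rates from pilot micro-data; the data frame and column names ("responded", "turnover", "employees") are hypothetical.

```python
import pandas as pd

# Hypothetical pilot data: one row per sampled business unit.
df = pd.DataFrame({
    "responded": [True, True, True, False, True],   # questionnaire returned?
    "turnover":  [120.0, None, 95.5, None, 310.0],  # missing = item non-response
    "employees": [8, 12, None, None, 40],
})

# Unit response rate: share of sampled units that returned the questionnaire.
unit_response_rate = df["responded"].mean()

# Item non-response per variable: share of missing answers among respondents only.
respondents = df[df["responded"]]
inr_per_variable = respondents[["turnover", "employees"]].isna().mean()

print(f"Unit response rate: {unit_response_rate:.0%}")
print(inr_per_variable.rename("item non-response rate"))
```

On this toy data the unit response rate is 80% and the INR is 25% for both variables; the same calculation, applied per pilot group and year, would yield figures comparable to those quoted above.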
Recommendations • Keep • Method for downloading • At first, offer only the electronic form • Paper form on request • Send reminders quickly • Change • Do not send a paper form with the second reminder • Make clear that the questionnaire is unique for each firm • Offer tips in the questionnaire and not on the webpage • Make it possible to submit a corrected questionnaire
Recommendations for questionnaires • Overall: the instrument works • Change • the development process • present essential clarifications next to the question (not behind a button) • make clarifications printable • improve the spreadsheet • give a clear visual signal (with colour) that a changed field must be approved again • give questions numbers • add more automatic checks
Mode effects? • Qualitative indications of possible mode effects, so far not seen in item or unit response; further research will be done with the data • If there is an effect, it is probably higher data quality, thanks to the automated calculations