Cross-Cultural Survey Guidelines and Quality Monitoring
Beth-Ellen Pennell
2009 International Total Survey Error Workshop (ITSEW 2009), Tällberg, Sweden
Survey Research Operations, Survey Research Center, Institute for Social Research
Unique Challenges
• Locating and engaging respondents
Literacy
Gatekeepers/Privacy
From the Institute for Social Research's Population and Ecology Laboratory in Nepal
Infrastructure
*From the International Telecommunication Union (http://www.itu.int/ITU-D/ict/statistics/ict/index.html)
Unique Data Collection Challenges
Other:
• Research traditions
• Languages
• Seasonal
• Political
• Religious
• Geographical
Standardization Issues
• Concentrate on key design aspects
• Decide when to be rigid and when to be flexible
• Experience shows that adherence to standards and regulations must be checked
• ISO 20252
What is special about quality in an international setting? • Procedural equivalence is important • Concepts must have a uniform meaning • Scientific and administrative challenge • Risk management differs • Financial and methodological resources differ • National pride is at stake • Conflicts of interest
Examples of Quality Assurance • Central planning and support organization • Deep bench of senior experts • Up-to-date translation procedure • Pretesting of questions and questionnaires • Interviewer training • Probability sampling design • Call scheduling algorithm • Formulas for calculating base weights • Documentation system • User communication channels • A set of operational specifications
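One of the tools listed above, a formula for calculating base weights, reduces in its simplest form to the inverse of each unit's selection probability. A minimal illustrative sketch; the function name and example numbers are assumptions for illustration, not taken from any particular study's specifications:

```python
# Base weight: the inverse of a sampled unit's selection probability.
# Illustrative sketch only; real cross-national designs layer
# multi-stage selection and nonresponse adjustments on top of this.

def base_weight(selection_probability: float) -> float:
    """Return the design (base) weight for one sampled unit."""
    if not 0.0 < selection_probability <= 1.0:
        raise ValueError("selection probability must be in (0, 1]")
    return 1.0 / selection_probability

# Example: an equal-probability sample of 1,000 households drawn
# from a frame of 250,000 gives each unit probability 0.004,
# so each sampled household represents about 250 households.
print(base_weight(1000 / 250000))
```

Verifying that each country applied such a formula correctly is exactly the kind of check that central quality assurance makes possible.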
Deviations from Specifications: Examples from the 1995 IALS
• Average interviewer workload varied between 6 and 30
• Two countries chose sampling control instead of 100% keystroke validation
• One country did not calculate the base weights correctly
• One country informed respondents that the survey was just a pretest
Evaluations and Peer Reviews
• Most recently: the ESS
• Examples of recommendations:
• Develop quantitative indicators for all process steps
• Standardize contact forms
• Set bounds on effective sample size
• Improve capacity building
• Expand the user base
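The recommendation above to set bounds on effective sample size is commonly operationalized by dividing the nominal sample size by the design effect. The sketch below uses the standard Kish clustering approximation as an illustration; it is an assumption for this example, not the formula prescribed in the ESS recommendations:

```python
# Effective sample size: nominal n divided by the design effect.
# The Kish approximation deff = 1 + (b - 1) * rho is used here as
# an illustrative assumption; actual specifications may differ.

def design_effect(avg_cluster_size: float, intraclass_corr: float) -> float:
    """Kish clustering approximation: deff = 1 + (b - 1) * rho."""
    return 1.0 + (avg_cluster_size - 1.0) * intraclass_corr

def effective_sample_size(n: float, deff: float) -> float:
    """n_eff = n / deff."""
    return n / deff

# Example: 2,000 interviews, average cluster size 10, rho = 0.05.
deff = design_effect(10, 0.05)             # 1.45
n_eff = effective_sample_size(2000, deff)  # about 1,379
print(round(n_eff))  # -> 1379
```

A bound on n_eff, rather than on nominal n, keeps countries with heavily clustered designs comparable to those with simpler ones.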
Some Thoughts
• For some countries it is a challenge to reach the minimum standard
• Process stability is difficult to obtain in decentralized survey environments
• Survey organizations must be aware of the meaning of specifications and the effects of certain methodological choices
• Reasons for deviations must be checked
• Vigorous monitoring and performance checks are necessary
Cross-Cultural Survey Guidelines: Contributors and Reviewers
Contributors:
• Kirsten Alcser, UM-SRC
• Ipek Bilgen, UNL
• Ashley Bowers, UM-SRC
• Rachel Caspar, RTI
• Judi Clemens, UM-SRC
• Peter Granda, UM-ICPSR
• Sue Ellen Hansen, UM-SRC
• Janet Harkness, UNL
• Frost Hubbard, UM-SRC
• Rachel Levenstein, UM-SRC
• Christina Lien, UM-SRC
• Zeina Mneimneh, UM-SRC
• Rachel Orlowski, UM-SRC
• Beth-Ellen Pennell, UM-SRC
• Emilia Peytcheva, UM-SRC
• Ana Villar, UNL
Graphic design assistance:
• Larry LaFerte, UM-SRC
• Ruth Shamraj, UM-ICPSR
Formatting and copy-editing:
• Gail Arnold, UM-SRC
• Shaw Hubbard, independent consultant
Programming and website maintenance:
• Tricia Blanchard, UM-SRC
Reviewers:
• Dorothee Behr, Gesis
• Bill Blyth, TNS Europe
• Pam Campanelli, independent consultant
• Somnath Chatterji, WHO
• Rory Fitzgerald, European Social Survey
• Steve Heeringa, UM-SRC
• Tim Johnson, University of Illinois, Chicago
• Achim Koch, Gesis
• Frauke Kreuter, University of Maryland
• Paul Kussumaul, European Social Survey
• Kristin Miller, National Center for Health Statistics
• Peter Mohler, University of Mannheim
• Meinhard Moschner, Gesis
• José L. Padilla, University of Granada
• Alisú Schoua-Glusberg, Research Support Services
• Eleanor Singer, UM-SRC
• Tom W. Smith, NORC
• Jare Struwig, Human Sciences Research Council
• Rick Valliant, University of Maryland
• Gordon Willis, National Institutes of Health
• Christine Wilson, Heriot-Watt University
• Christof Wolf, Gesis
Goal • To develop and promote internationally recognized guidelines that highlight best practice for the conduct of comparative survey research across cultures and countries • Initiative of Comparative Survey Design and Implementation (CSDI); 2005 annual meeting
Guidelines Initiative In response to: • Increasing number and scope of cross-cultural surveys over the past decade • Desire to increase operational equivalence and survey quality through harmonization • Within and across “units” (e.g. countries) • Across waves of a panel study • Lack of published materials on implementation • Balance standardization versus localization
Target Audience • Researchers and survey practitioners planning or engaged in cross-cultural or cross-national research • Basic to advanced information • References • Suggested further reading
Process • Developed over two and a half years • Weekly meeting of core staff • Each guideline underwent iterative, internal reviews • Sent to selected external reviewers with expertise in topic area • Published last summer • Revised last fall
Guideline Topics I. Study structure II. Tenders, bids, and contracts III. Ethical considerations IV. Sample design V. Questionnaire design (in development) VI. Translation VII. Adaptation (in development) VIII. Survey instrument design IX. Pretesting X. Interviewer recruitment and training XI. Data collection XII. Harmonization of data XIII. Data processing XIV. Dissemination XV. Assessing quality
Components of Guidelines • Introduction to topic area • Goal of guideline • Guideline • Rationale • Procedural steps • Lessons learned • Glossary • References • Further suggested reading
Format of Guidelines • “Drill-down” approach • Increasing level of detail • Links available to • Glossary • References • Other modules • External information
Quality Framework (figure)
• Fitness for use: Relevance, Accuracy, Timeliness, Accessibility, Interpretability, Coherence, Comparability
• Constraints (ranging from low to high): Cost, Burden, Design Constraints, Professionalism

Accuracy (figure)
• Total Survey Error components: Construct Validity, Measurement Error, Sampling Error, Coverage Error, Nonresponse Error, Processing Error, Adjustment Error
• Constraints: Cost, Burden, Design Constraints, Professionalism
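The error sources in this accuracy decomposition combine in the textbook total survey error formulation; the expression below is the standard one, not a formula given on the slide:

```latex
\mathrm{MSE}(\hat{\theta}) \;=\; \mathrm{Bias}^{2}(\hat{\theta}) \;+\; \mathrm{Var}(\hat{\theta})
```

where the bias term collects systematic coverage, nonresponse, measurement, and processing errors, and the variance term collects sampling error and the variable components of the same sources. Adjustment (weighting) can reduce bias at the cost of added variance.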
Summary • Website has had ~105,214 hits since it was published in June 2008 • ~288 hits per day • Evolving and dynamic: feedback and comments welcome • Provides framework for quality control monitoring
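The per-day figure above follows directly from the cumulative count: roughly one year elapsed between the June 2008 launch and this workshop (the 365-day window is an assumption for the check):

```python
# Sanity check of the "~288 hits per day" figure from the slide.
# The 365-day window (June 2008 to June 2009) is an assumption.
total_hits = 105_214
days_online = 365
hits_per_day = total_hits / days_online
print(round(hits_per_day))  # -> 288
```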
Thank you. http://ccsg.isr.umich.edu/