The Art & Science of Designing a Survey Instrument
Frederick C. Van Bennekom, Dr.B.A.
Great Brook Consulting – Enhancing Organizational Improvement Through Customer Feedback
421 Main Street, Bolton, MA 01740
(978) 779-6312 • (877) GreatBr Toll Free
fred@greatbrook.com • www.greatbrook.com
Art versus Science
A continuum from art (top) to science (bottom):
• None – ignorance
• Know a good outcome from bad
• Know the characteristics of a quality outcome
• Prioritization of these quality characteristics
• Know the variables that lead to these outcomes
• Know the impact of individual variables
• Know the interaction effects among variables
• Able to measure the variables
• Able to control process to achieve quality outcomes – repeatedly & consistently
Art and Science?
• The Art – crafting the wording of the questions
• The Science – the design process; design of scales
• Not to mention the survey administration
What is a Survey?
• Surveying a sample is more efficient than a full census
• Design & administer the instrument to a sample, then generalize the results to the population (a sampling sketch follows)
[Diagram: respondents (r) drawn as a sample from a larger population (x)]
• Instrument Validity + Administration Accuracy = Reliability
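A minimal Python sketch of the sample-to-population idea above: estimate a population mean from a random sample and attach a margin of error. The population here is synthetic, invented purely for illustration.

```python
# Estimate a population mean from a sample instead of a full census.
# The "population" is synthetic data, assumed only for illustration.
import random
import statistics

random.seed(42)
population = [random.gauss(3.8, 0.9) for _ in range(100_000)]  # satisfaction scores

sample = random.sample(population, 400)          # survey a sample, not a census
mean = statistics.mean(sample)
sd = statistics.stdev(sample)
margin = 1.96 * sd / (len(sample) ** 0.5)        # ~95% normal-approximation interval

print(f"Sample estimate: {mean:.2f} +/- {margin:.2f}")
print(f"True population mean: {statistics.mean(population):.2f}")
```

With a well-designed (valid) instrument and accurate administration, the sample estimate lands reliably close to the population value at a fraction of the cost of a census.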
The Role of Surveying in Achieving Loyal Behavior
[Diagram: the value-added chain – Design… Replication… Sales… Service… – supported by good service delivery, continuous improvement, effective problem handling, and problem solicitation]
A Rigorous Instrument Design Process (Science)
1) Interview management
2) Identify questions to ask
3) Draft survey instrument
4) Review by project team
5) Revision iterations
6) Conduct pilot
7) Redraft & finalize instrument
Identifying Questions to Ask
• Attributes of Service Delivery – that need to be understood & tracked
• Attitudinal Outcomes – driven by perceptions of service delivery performance
• Demographic Segmentations – for data analysis
Identify the Attributes (Art)
• Draw a Service Blueprint – a process flow diagram
• Highlights the Moments of Truth = where we “touch” the customer
• Review complaint data; conduct focus groups, interviews, or other critical incident studies
• What are the critical service attributes?
• What are customers’ major concerns?
Classify the Attributes – Service Quality Dimensions (Science)
• Reliability: Delivering on promises
• Responsiveness: Being willing to help
• Assurance: Inspiring trust and confidence
• Empathy: Treating customers as individuals
• Tangibles: Representing the service physically
• A useful framework for thinking about the instrument design – and analyzing the data (a grouping sketch follows)
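One way the framework can support analysis is to tag each survey attribute with its dimension and roll attribute-level scores up to dimension-level averages. A small Python sketch; the attribute names and scores are invented for illustration:

```python
# Hypothetical mapping of survey attributes to the five service quality
# dimensions; attribute names and scores are made up for illustration.
from collections import defaultdict

DIMENSION_OF = {
    "kept_promised_schedule": "Reliability",
    "willingness_to_help": "Responsiveness",
    "staff_inspired_confidence": "Assurance",
    "treated_as_individual": "Empathy",
    "facility_appearance": "Tangibles",
}

scores = {"kept_promised_schedule": 4.2, "willingness_to_help": 3.9,
          "staff_inspired_confidence": 4.4, "treated_as_individual": 3.5,
          "facility_appearance": 4.0}

# Roll attribute-level scores up to dimension-level averages.
by_dimension = defaultdict(list)
for attr, score in scores.items():
    by_dimension[DIMENSION_OF[attr]].append(score)

for dim, vals in by_dimension.items():
    print(dim, sum(vals) / len(vals))
```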
Instrument Design – Attitudinal Outcomes • Perception of service delivery leads to attitudes • Likelihood of repurchase • Willingness to provide reference • Overall satisfaction • Any others? • Use of attitudinal measures • Summary measure for the survey • Dependent variable for regression tests • Link attributes to true behavioral outcomes if data are available
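The slide above notes that an attitudinal measure can serve as the dependent variable in regression tests. A minimal Python sketch of that idea, using ordinary least squares on fabricated ratings (two illustrative attribute columns, one overall-satisfaction outcome); in practice each row would be one respondent's answers:

```python
# Regress an attitudinal outcome (overall satisfaction) on attribute
# ratings. All numbers are fabricated for illustration.
import numpy as np

# Columns: reliability, responsiveness ratings (1-5); y: overall satisfaction.
X = np.array([[5, 4], [4, 4], [3, 2], [5, 5], [2, 3], [4, 3], [1, 2], [3, 4]], float)
y = np.array([5, 4, 3, 5, 2, 4, 1, 4], float)

X1 = np.column_stack([np.ones(len(X)), X])       # add intercept term
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)    # ordinary least squares

print(f"intercept={coef[0]:.2f}, reliability={coef[1]:.2f}, "
      f"responsiveness={coef[2]:.2f}")
```

The relative size of the coefficients suggests which service attributes drive the summary attitude, which is what makes the attitudinal measure useful as a dependent variable.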
Drafting the Survey Instrument • Overall form of the survey instrument • Issues with the construction of the questions • Selecting a scale • Question formats • Question sequencing
Overall Form of the Instrument (Science)
• Pre-Administration Announcement Letter – letter or email from a senior executive
• Survey Introduction – set the mental state & be consistent; define critical terms
• Initiation – First Questions: engage the respondent; get the respondent thinking
• Instructions – even if it seems silly...
With each contact, motivate the respondent!
Overall Form of the Instrument (Art)
• Grouping strategies – by topic, by scale, by chronology
• Conditional branching – “Skip & Hit” (a branching sketch follows)
• Routine – a response rut: a long series of questions that read in a rhythm, so respondents just give the same answer
• Fatigue – caused by a long list of choices; leads to choosing the first or last item
• Especially important for telephone surveys
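A toy Python sketch of “skip & hit” conditional branching: each question routes to a next question based on the answer. The question IDs, texts, and routing are hypothetical, purely to show the mechanism:

```python
# Conditional branching ("skip & hit"): each question names the next
# question per answer. IDs and routing are hypothetical.
QUESTIONS = {
    "q1": {"text": "Did you use the FAX-Back support system? (y/n)",
           "next": {"y": "q2", "n": "q3"}},
    "q2": {"text": "How effective was the FAX-Back system? (1-5)",
           "next": {"default": "q3"}},
    "q3": {"text": "Overall, how satisfied are you? (1-5)",
           "next": {"default": None}},
}

def run_survey():
    qid, answers = "q1", {}
    while qid is not None:
        q = QUESTIONS[qid]
        answers[qid] = input(q["text"] + " ")
        routes = q["next"]
        # Follow the answer-specific route, else the default route.
        qid = routes.get(answers[qid], routes.get("default"))
    return answers

if __name__ == "__main__":
    print(run_survey())
```

Branching keeps non-applicable questions away from respondents, which reduces burden and avoids the non-response that piles of N/A answers can trigger.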
Issues with the Construction of Questions
• 3 key attributes: Focus, Brevity, Clarity
• Critical criterion: common interpretation – otherwise you’re asking the respondents different questions
• Control for instrumentation bias & response bias
Avoiding Instrumentation Bias
Instrumentation bias = bias introduced by the survey instrument. To avoid it:
• Clearly Stated Criteria for Evaluation
• Question Must Apply to Respondent
• Examples Should Not Lead Response
• Reasonable Recall Expectations
• Unambiguous Word Choice
• Ask One Question at a Time
• Don’t Ask Leading or Loaded Questions
Scale Anchoring Options
• Fully Anchored:
  Extremely Satisfied (1) · Satisfied (2) · Undecided (3) · Dissatisfied (4) · Extremely Dissatisfied (5)
  Is this an interval scale or just an ordinal scale? (a sketch follows)
• Endpoint Anchored:
  Extremely Satisfied (1) · 2 · 3 · 4 · Extremely Dissatisfied (5)
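Why the interval-versus-ordinal question matters at analysis time: a mean assumes equal spacing between scale points, while ordinal-safe summaries (median, frequency counts, top-box shares) do not. A short Python sketch on made-up responses, coded here so that 5 = most satisfied:

```python
# Interval vs. ordinal treatment of the same 1-5 ratings.
# Responses are fabricated for illustration; 5 = most satisfied.
import statistics
from collections import Counter

responses = [5, 4, 4, 3, 5, 2, 4, 1, 5, 4]

print("Mean (interval assumption):", statistics.mean(responses))
print("Median (ordinal-safe):", statistics.median(responses))
print("Distribution (ordinal-safe):", Counter(responses))
# "Top-2-box": share answering 4 or 5 -- a common summary when equal
# spacing between scale points can't be defended.
print("Top-2-box:", sum(r >= 4 for r in responses) / len(responses))
```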
How Should I Solicit a Response?
• Remember the objective of a survey: maximize information gained while minimizing respondent burden
• Unstructured – free-form or open-ended response: “Please describe...”, “Is there anything else...”
• Structured – response on a pre-determined list or scale: “Check all that apply...”, “Please rate...”
Question Format: Unstructured
Free-form or open-ended response
• Advantages: response not constrained to predetermined categories; may uncover unexpected answers
• Disadvantages: very long to complete (respondent burden); cost to administer; textual data difficult to analyze and summarize
Question Format: Structured
Coded response – multiple choice & scaled data
• Advantages: clearer responses; easy to summarize & analyze (a tabulation sketch follows); easy to administer
• Disadvantages: limits responses; may bias responses; requires more investment in question design
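A brief Python sketch of why coded responses are “easy to summarize & analyze”: one line cross-tabulates a structured rating by a demographic segment. The data frame is fabricated for illustration; free-text answers would first need manual coding before any such tabulation:

```python
# Cross-tabulate a structured (coded) rating by a demographic segment.
# Segment labels and ratings are fabricated for illustration.
import pandas as pd

df = pd.DataFrame({
    "segment": ["enterprise", "smb", "smb", "enterprise", "smb", "enterprise"],
    "rating":  [5, 3, 4, 4, 2, 5],
})

print(pd.crosstab(df["segment"], df["rating"]))   # counts per segment x rating
print(df.groupby("segment")["rating"].mean())     # segment-level averages
```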
Interval Rating Scales – Elements
• Instructions: “Listed below are several statements. Please indicate your agreement with each by selecting a number from 1 to 5, where 1 represents Strongly Disagree and 5 represents Strongly Agree.”
• Question item: “I was on hold for a short time”
• Scale: N/A 1 2 3 4 5
• Anchors: Strongly Disagree (1) ... Strongly Agree (5) (a sketch of N/A handling follows)
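A small Python sketch of handling the N/A choice at analysis time: exclude N/A from the scale statistics rather than coding it as a number, so it cannot distort the mean. The raw responses are invented for illustration:

```python
# Exclude "N/A" responses from scale statistics instead of coding
# them as a number. Raw responses are invented for illustration.
raw = ["4", "5", "N/A", "3", "5", "N/A", "2"]

valid = [int(r) for r in raw if r != "N/A"]
print("n =", len(valid), "of", len(raw), "responses applicable")
print("mean =", sum(valid) / len(valid))
```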
Clearly Stated Criteria for Evaluation
• Wrong: How would you rate the ability of the project team to define business requirements?
• Right: Compared to other projects done for you, how would you rate the ability of the project team to define business requirements?
Applicability to Respondent
• Wrong: How effective did you find the FAX-Back support system?
• Right: If you used the FAX-Back support system, how effective did you find it?
• Include a “not applicable” response choice
• Multiple N/As may lead to non-response. Use skip & hit.
Do Not Lead With Examples
• Wrong: What aspect of our service is most critical to you, for example, the speed of response?
• Right: What aspect of our service is most critical to you?
• Most critical with the open-ended question format
Reasonable Recall Expectations
• Wrong: In your support calls over the past year, how many minutes was it before the phone was answered?
• Right: During the past three months, has the time for a support representative to answer the phone been reasonable?
Unambiguous Wording
• Wrong: In your last support call, was the response time reasonable?
• Right: Consider your last request for support. How reasonable was the time from when you called until you spoke with a support representative?
• A major source of construction flaws
• Avoid jargon
Examples of Ambiguous Phrasing • The ability of the help desk to resolve problems on the first try • The promptness with which you received the Service Engineer’s estimated time of arrival… • Satisfaction with the functionality of the equipment • Responsiveness of the Customer Support Personnel • Have you received service of consistent quality? • Was your call answered promptly?
Ask One Question at a Time
• Wrong: Was the staff technically competent and courteous?
• Right: Was the staff member who handled your issue technically competent? Was the staff member who handled your issue courteous?
Avoid Loaded & Leading Wording
• Wrong: How did our interest in you, our customer, match your expectations?
• Right: To what extent did our concern for you match your expectations?
Thanks for Attending! Any Questions?