Multi-Mode Data Collection: Why, When, How. International Conference on Establishment Surveys, Montreal, June 18-21, 2007. Richard Rosen, U.S. Bureau of Labor Statistics, rosen.richard@bls.gov, 202-691-6524. Any opinions expressed in this paper are those of the authors and do not constitute policy of the Bureau of Labor Statistics.
Automated Collection in the Current Employment Statistics Survey
Goals for Paper • To provide insights/experiences based on CES • To review the collective experience of all BLS Establishment-based surveys • To draw conclusions about the relative benefits of multi-mode collection
Current Employment Statistics Program • Monthly survey of employment, payroll, and hours conducted by the U.S. Bureau of Labor Statistics • A Federal-State cooperative system • A sample of 300,000 business establishments • Data are published after only 12 collection days • Limited number of data elements (but greatly increased in 2006) • Multi-mode since 1984
Automated Collection Methods in the CES In 1984, BLS began to examine alternative collection methods. New automated collection modes, by year of introduction: • Computer Assisted Telephone Interview (CATI): 1984 • Touchtone Data Entry (TDE): 1986 • Voice Recognition (VR): 1988 • Electronic Data Interchange (EDI): 1994 • FAX: 1995 • Electronic Mail/Internet/WWW: 1996
Multi-mode Myths • It will lower my collection cost • It will improve my response rate • Implementation will be easy • It will solve all of my problems
Cost • Depends on what mode is being replaced • Development costs • Initial start-up costs • Economies of scale • Cost structures change over time
Response Rates • Depends on the mode being replaced • Mail vs any mode = improved response • Electronic Data Interchange = generally lower initial response; time lag • CATI can achieve high response rates • TDE, FAX, Web, E-mail: Self-response • Lower response than CATI • Higher response than Mail
Easy • Requires R&D • Acquisition of new HW/SW • IT Support • Prototype • Testing
Solve All Problems • Creates a new set of problems • New protocols • Integration with other collection modes • Integration with other survey operations • Impact on current staffing/workflow
TDE Respondent Contact Program CES has implemented procedures designed to maximize response. Contact type (contact method): • Advance Notice (FAX or Postcard) • Nonresponse Prompt (FAX or Call) • "Last Chance" Prompt (FAX) • Secondary NRP (Call) • Long-term NRP (Call) • Refusal Conversion (Letter/Call)
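To make the contact program concrete, here is a minimal Python sketch of how such an escalation sequence might be represented in a collection system. Only the contact types and methods come from the slide above; the data structure, ordering logic, and function names are illustrative assumptions, not CES code.

```python
# Hypothetical sketch of the CES contact-escalation sequence described above.
# Contact types and methods come from the slide; the escalation ordering and
# "next step after N prior contacts" logic are illustrative assumptions.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class ContactStep:
    name: str
    methods: tuple[str, ...]


CONTACT_SCHEDULE = [
    ContactStep("Advance Notice", ("FAX", "Postcard")),
    ContactStep("Nonresponse Prompt", ("FAX", "Call")),
    ContactStep("Last Chance Prompt", ("FAX",)),
    ContactStep("Secondary NRP", ("Call",)),
    ContactStep("Long-term NRP", ("Call",)),
    ContactStep("Refusal Conversion", ("Letter", "Call")),
]


def next_contact(prompts_already_sent: int) -> ContactStep | None:
    """Return the next contact step for a nonrespondent, or None if exhausted."""
    if prompts_already_sent < len(CONTACT_SCHEDULE):
        return CONTACT_SCHEDULE[prompts_already_sent]
    return None


if __name__ == "__main__":
    step = next_contact(1)
    print(f"Next step: {step.name} via {' or '.join(step.methods)}")
```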
Example of Protocol Effect • Effect of Advance Notice and Nonresponse prompting on TDE response • Elimination of Advance Notice: • Reduced response by 10 percentage points • Elimination of the Nonresponse message: • Reduced response by 10 percentage points Rosen and Hertwig, "The Impact of Prompting on Response Rates: Experience with Touchtone Reporting in the CES Program," American Statistical Association, August 2002.
What Factors to Consider? • Characteristics of your survey • Periodicity • Survey length/complexity • Sample composition • Characteristics of your respondents • Knowledge/education • Environment (office vs mobile) • Commitment/willingness to report
Distribution of CES Sample by Collection Mode [charts: March 2000 and May 2003]
Profile of CES Population and Collection Methods [chart: TDE/Web/E-mail; Fax/XLS/Fillable Form; EDI]
Web Collection: Some Advice • KISS (keep it simple, stupid) • Respondents are not sophisticated computer users • Beware of overbearing security requirements • Digital certificates, complicated passwords, and passwords that must be updated pose a significant barrier • Keep edits simple • Don't try to replicate all of your edit checks on-line • Edit failures when data are correct will frustrate respondents • In April 2007, BLS launched a more streamlined Website for data reporting (Web-lite).
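As an illustration of the "keep edits simple" advice, here is a hypothetical sketch of a soft edit that asks the respondent to confirm an unusual value instead of blocking submission. The field names and the 30% threshold are invented for the example; they are not CES edit rules.

```python
# Illustrative "soft" edit for a web reporting form: flag an unusual
# month-over-month employment change for confirmation rather than rejecting
# the submission outright. Threshold and field names are assumptions.
def soft_edit_employment(previous: int, current: int, threshold: float = 0.30) -> list[str]:
    """Return warning messages; an empty list means no confirmation is needed."""
    warnings = []
    if previous > 0:
        change = abs(current - previous) / previous
        if change > threshold:
            warnings.append(
                f"Employment changed {change:.0%} from last month; "
                "please confirm the figure is correct."
            )
    if current < 0:
        warnings.append("Employment cannot be negative.")
    return warnings


print(soft_edit_employment(previous=100, current=150))
```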
E-mail vs Web • Easier for respondent • Eliminates the login process • Can't forget your account number or password • Can be done securely with HTTPS • Can embed an HTML form or PDF directly into the E-mail • Respondent fills out the form and hits a "submit" button • Data sent via browser to the agency server using HTTPS • Several products on the market
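The flow described above (the respondent submits the embedded form and the browser sends the fields to the agency server over HTTPS) could be received by something as simple as the following sketch. This is not a BLS system; the field names and port are assumptions, and a production service would add TLS termination, authentication, and validation.

```python
# Minimal sketch of an agency-side endpoint receiving a browser-submitted
# form. In practice HTTPS would be terminated by the front-end web server
# or by wrapping this socket with TLS.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs


class FormReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode("utf-8"))
        # A real system would validate and store the report here.
        print({name: values[0] for name, values in fields.items()})
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Report received. Thank you.")


if __name__ == "__main__":
    HTTPServer(("localhost", 8443), FormReceiver).serve_forever()
```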
E-mail Drawbacks • Limited editing capability • With PDF, the size of the file may be an issue • With HTML: • The number of data items/questions is limited (single page) • Not all respondents have HTML E-mail • Available products don't offer a "total solution," so back-end support must be developed
TDE vs Web vs E-mail Collection Rate Comparison [chart] *CES experienced web server problems during April 2007
EDI: Some Comments • Takes time to work with firms • Most Gov’t surveys are voluntary • Must “get in line” for IT resources • Have a standard file format but be prepared to take what they have • Data item response can be an issue • New directions: XML
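To illustrate the "new directions: XML" point, below is a hypothetical sketch of parsing a multi-establishment XML submission into flat records. The element and attribute names are invented for the example and do not reflect any BLS or CES schema.

```python
# Sketch of parsing a hypothetical multi-establishment XML payroll submission.
# The schema shown here is invented for illustration only.
import xml.etree.ElementTree as ET

SAMPLE = """
<payroll_report firm_id="12345" period="2007-06">
  <establishment id="001"><employment>250</employment><hours>40.1</hours></establishment>
  <establishment id="002"><employment>75</employment><hours>38.5</hours></establishment>
</payroll_report>
"""


def parse_report(xml_text: str) -> list[dict]:
    root = ET.fromstring(xml_text)
    return [
        {
            "firm_id": root.get("firm_id"),
            "period": root.get("period"),
            "establishment": est.get("id"),
            "employment": int(est.findtext("employment")),
            "hours": float(est.findtext("hours")),
        }
        for est in root.findall("establishment")
    ]


for row in parse_report(SAMPLE):
    print(row)
```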
Review of BLS Establishment Surveys • Eight surveys • Four monthly • Two quarterly • Two annual • Over the past 10 years, all have adopted multi-mode collection
Survey Questions • Distribution of modes used • Reason for using multiple modes • Do you target certain populations for specific modes • Benefits and Drawbacks • Process used to determine new modes • New modes being considered • Key factors when considering new modes • Advice to other survey organizations
Do you Target Modes? • CES: Take into account the characteristics of the firm in terms of size, number of units/reports, and past reporting history in terms of timeliness and ability to self-report. • MWR: Large multi-state employers are encouraged to use the EDI Center; small multi-unit respondents (2-30 worksites) are offered Web. Also target software developers and outsourcing firms to include electronic reporting in their systems or services. • NCS: Mode of collection is determined by size of establishment, location of establishment, reporting capability of the firm, and level of cooperation. • SOII: All modes are available to all respondents. Certain respondents get booklets designed to encourage Internet and other electronic methods of reporting. • IPP: During the visit respondents are told that web is the preferred mode of collection but that mail/fax is also available if that is their preference. IPP wants the respondent to choose whatever method they feel most comfortable with. Since September 2005, just over 70% of IPP respondents have selected web. • PPI: Prefers electronic data collection, which currently is limited to Fax. For important respondents, we allow them to email in a spreadsheet with pricing data. • JOLTS: With the exception of our largest respondents, we normally encourage our respondents to provide data via TDE after a 6-month period of data collection using CATI. • OES: COCs or establishments with high weight; previous response via a particular mode; availability of email or web site addresses
Drawbacks • CES: Time spent researching and developing each new mode. Need to develop new protocols and procedures for each mode. Often end up with separate databases for each mode that need to be managed. Almost like running multiple/separate survey operations. • MWR: Requires a sizable amount of staff resources to maintain systems and deal with coordination and timing issues. Keeping States, State systems, and employers in sync can be a challenge. • NCS: It is virtually impossible to isolate and evaluate the effectiveness of any single mode. Have to maintain instructions and enforce protocols in each area. Additional tracking and maintenance is required to keep up with each firm's collection mode. Need to customize update materials and data requests to match the mode used by each firm. Electronic collection carries risks of confidentiality breaches. • SOII: For some reason, electronic collection has a slightly lower response rate than collection via the standard booklets. • IPP: Additional costs/resources to maintain multiple modes. • PPI: The major drawbacks for PPI are the BLS limitations on email, not having web repricing, and systems limitations on broadcast fax pricing. • JOLTS: None. • OES: Increased occupational coding burden on State analysts; internal and external security/confidentiality issues; costs.
Process to Determine New Mode • CES: Initial research on the mode and its possible use/benefit. Proof-of-concept project. If it appears successful, limited production. Then full implementation. Constant evaluation and monitoring. • MWR: Evaluated pros and cons of various collection modes in terms of expected costs, accuracy, security, and periodicity of collection. For example, if a survey is only done every 3 years, do you want to invest in a new, costly collection mode? • NCS: The last mode to be added, electronic collection through email to a secure server, arose due to demand from respondents and field economists. It was then developed and tested. • SOII: Does the new mode reduce cost and/or help capture additional narrative? • IPP: More respondents started requesting additional electronic alternatives, such as Internet-based collection. Initially piloted the web survey with a small sample of reporters. Results of the pilot were very favorable. • PPI: Broadcast fax was a pilot project for several years with limited respondents. When the anthrax problem disrupted mail pricing, we started repricing forms by fax and gradually increased our broadcast fax capabilities. • OES: Feedback; research and IT consultation; security review; pilot testing and process refinement
New Modes Under Consideration • CES: E-mail collection either with embedded HTML form or fillable XLS/PDF • MWR: Fillable forms • NCS: secure, encrypted data files • SOII: None • IPP: Fillable PDF survey forms via email • PPI: Web repricing • JOLTS: Secure E-mail • OES: Fillable forms, Web
Factors to Implement • CES: Ease of use for the respondent; respondent acceptance; data security. • MWR: Cost, employer acceptance, State acceptance, employer familiarity with the survey. • NCS: Whether the test is successful, the mode is relatively easy to use, and respondents find it acceptable. • IPP: Respondent demand, costs, and other measurables compared to IPP's current collection alternatives. • JOLTS: Cost of using the central BLS facility for secure email. Ease of use for the respondent. • OES: Feasibility to collect meaningful data; cost; security
Study Mode Effects • CES: Mostly response rate effects and data item response. Some review of data quality. • MWR: Web collection has higher response rates, but this may be somewhat biased as we restricted initial solicitation to good reporters. • NCS: No • SOII: response rates, processing times • IPP: With web repricing the IPP gets substantially faster data turnaround than mail/fax; Clerical and quality edits on the front-end of the web survey yield better quality and more usable data than mail/fax; response rates have been consistently greater for web than mail/fax respondents. • PPI: No • JOLTS: No • OES: Response Analysis Survey will include review of mode effects
Advice • CES: Surveys need to modernize their collection. Carefully consider alternative collection modes, and do the needed development work and evaluation. Don't expect the new mode to solve all of your problems; if you can solve just one problem or meet one objective, that may be enough. Constantly evaluate your "mix": as things change, you have to adapt, adding new modes and de-emphasizing others. • MWR: Consider costs, employer familiarity with the survey, and timing and coordination issues. • NCS: Evaluate how secure any new mode of collection will be and whether any policy precludes a specific mode from use. Assess what information is available in more than one mode, whether respondent burden or collection costs can be reduced, and whether efficiency or quality is improved by adding a new mode. Need to have technology support and staff training on any new collection mode employed. • SOII: Do it. • IPP: Know your respondents; get support from upper management; iterative development and pilot testing; usability testing is a must; keep stakeholders informed; provide resources for ongoing project management. • PPI: Be prepared to budget in advance, as developing electronic collection methods is costly. • OES: Talk to others to find out their lessons learned; establish and maintain a good working relationship with IT; lots of research; have a tech close by; pilot testing; training
Summary • Most BLS surveys are using multi-mode • Respondent preference/acceptance is top priority • Cost should NOT be primary goal (secondary or side benefit) • Improved response is a possible benefit • Improved timeliness is possible/likely • Additional modes add complexity to survey operations • Continuous evaluation of effectiveness; can’t stand still