1. Developing and Administering Web-based Surveys: A Tool-Kit for Assessment Professionals Presentation to the Mid-Atlantic Personnel Assessment Consortium
November 12, 2008
Surveys offer a very practical way to get information about important organizational issues efficiently, and they are relatively cost-effective compared to other methods (e.g., interviews, focus groups).
In addition, surveys allow organizational members to express their views on sensitive issues anonymously.
Further, with the development of new survey software packages, it has become increasingly easy to produce, publish, and disseminate surveys.
3. Surveys are Becoming Increasingly Popular Surveys provide an efficient means for collecting data on a broad range of issues
Within the Assessment Function
Job Analysis
Evaluation of Rater Training
Evaluation of New Assessments
Evaluation of Assessment Program
Within HRM
Skills Assessment
Applicant Feedback
Reasons for Attrition
Surveys are becoming increasingly popular for a variety of reasons.
Surveys offer a very practical way to get information about important organizational issues efficiently, and they are relatively cost-effective compared to other methods (e.g., interviews, focus groups).
In addition, surveys allow organizational members to express their views on sensitive issues anonymously.
Further, with the development of new survey software packages, it has become increasingly easy to produce, publish, and disseminate surveys.
Aside:
I’m curious, how many of you got a web-based survey within the past week?
This year I have been involved either as a developer, reviewer, or manager, in about 10 survey-related projects within CBP—and I’m not the only one doing surveys in my agency.
4. Surveys are Becoming Increasingly Popular Surveys provide an efficient means for collecting data on a broad range of issues
Within the Organization
Customer Satisfaction
Employee Attitudes
Succession Planning
Policy and Program Evaluation
Organizations use surveys to examine a broad range of issues.
Customer Satisfaction: Is the organization meeting the needs of its customer base?
LER, BMW, CPRO
Employee Attitude Surveys: Do employees in the organization have the values and attitudes they need to sustain organizational effectiveness?
An evaluation of the merger of different Department of Homeland Security agencies (BUIS)
OPM’s Federal Human Capital Survey, and the off-year administration by DHS/CBP
Program Evaluation Surveys:
Are broad policies and practices having the intended effect?
MSPB surveys on personnel policies
Communications pulse survey
Are specific programs and procedures working as intended?
Job Fair
Evaluation of switch to computer-based testing
Reasons for attrition from BP during the first year
5. Conducting a Survey is Easier than Ever! Software facilitates survey design
Software libraries include examples
Questions and scales
Customizable templates
Links to organizational databases simplify administration and analysis
Demographic information
Sample selection
E-mail addresses
Publication is automatic
Errors can be corrected “on the fly”
Web-based surveys make it easier than ever to launch a survey.
It is easier than ever to design a survey.
Most software libraries include examples of questions, scales, and templates for different kinds of surveys.
Most packages come with tips, tricks, and sample survey questions, and some include extensive help libraries with white papers on survey topics.
Because they are electronic, they simplify all aspects of survey administration.
Some packages can link to demographic information, so you may not need to include it on your survey.
You can also link to your organization's e-mail system and automatically send out the survey and reminders.
Finally, publication is automatic: once you are happy with the format and content, the survey is done. There are no publication costs or delays.
6. Conducting a Survey is Easier than Ever! Electronic distribution and return save time and resources
Internal e-mail system handles delivery
Respondents click on “submit” to return surveys
Reminders are easy to send out
Responses can be monitored in “real time”
Response rate
Question functioning
Data are handled efficiently and with precision
Responses are automatically collected, scored, and stored
Survey software analyzes data and generates reports
Electronic distribution is a real plus. I remember, about 10 years ago, my office was conducting a survey that went to all Federal supervisors, managers, and executives. Everyone on the staff sat in the basement stuffing envelopes for what seemed like weeks.
Also, once you have obtained the electronic distribution list from your IT department, it is very easy to send reminders about the survey deadline or to extend the survey to increase the response rate.
As the returns come in, they are automatically counted. This means that you can check your response rate in real time.
Also, as returns come in, the data are automatically entered into a file that can be imported into SPSS or Excel. Many survey packages have built in analysis tools.
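The real-time monitoring described above can be sketched in a few lines. This is a hypothetical example: the export layout and column names ("respondent_id", "submitted") are assumptions, since every survey package uses its own schema for the file it hands off to SPSS or Excel.

```python
import csv
import io

# A small stand-in for the CSV file a survey package might export;
# in practice you would open the real export file instead.
export = io.StringIO(
    "respondent_id,submitted\n"
    "1001,yes\n"
    "1002,no\n"
    "1003,yes\n"
    "1004,yes\n"
)

invited = 4  # size of the e-mail distribution list
# Count completed returns and compute the running response rate.
returns = sum(1 for row in csv.DictReader(export) if row["submitted"] == "yes")
response_rate = returns / invited
print(f"Response rate so far: {response_rate:.0%}")  # prints "Response rate so far: 75%"
```

Because the export is just a flat file, the same few lines can be rerun every day of the fielding window to decide when a reminder is worth sending.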
7. Why Get into the Survey Business? How many of you have already developed a survey?
Job analysis surveys count.
How about selection procedures?
Multiple choice items?
Structured interviews?
T&E questionnaires?
Performance Rating Questionnaires?
Survey and Test development have a lot in common. Both require the same basic skills, knowledge, and abilities.
Think about it . . .as assessment professionals, our basic business is asking questions.
We also have a lot of other skills that we’ve acquired because of the work involved in developing assessments.
Because of the skills we already have as assessment professionals, we have the opportunity to make a real contribution.
8. We Have the Knowledge, Skills, and Abilities Job Analysis
Ability to run SME panels
Ability to write clearly & concisely
Ability to format a survey
Knowledge of descriptive statistics
Assessment Design
Ability to operationally define constructs
Ability to select appropriate assessment method
Most of us have had some experience with survey methodology, especially in the areas of design, implementation, and analysis.
Job Analysis
Ability to work with SMEs
Ability to write clearly & concisely
Ability to format a survey
Knowledge of descriptive statistics
We also know how to define research issues and operationalize constructs.
Assessment Design
Ability to operationally define constructs
Ability to develop or select appropriate measures
9. We Have the Knowledge, Skills, and Abilities Assessment Development
Ability to write instructions to participants
Ability to write multiple-choice item stems and response options
Ability to sequence assessment content logically
Assessment Validation
Knowledge of research design
Ability to coordinate large-scale research projects
Knowledge of sampling issues and procedures
Ability to draw appropriate conclusions from statistical data
Knowledge of descriptive statistics
We have the basic skills needed to develop survey questions, and to assemble surveys.
Assessment Development
Ability to write directions to participants
Ability to write multiple-choice item stems
Ability to write multiple-choice response options
Ability to sequence assessment content logically
We know how to evaluate what we do.
Test Validation
Knowledge of research design
Ability to coordinate large-scale research projects
Knowledge of sampling issues and procedures
10. We Have the Knowledge, Skills, and Abilities Data Analysis
Knowledge of sampling issues and procedures
Knowledge of descriptive and inferential statistics
Ability to draw appropriate conclusions from data
Report Preparation
Ability to write complex, technical reports
Ability to brief higher-level officials on critical issues or problems requiring resolution
Ability to provide constructive feedback to individuals who have been assessed
Analysis of test results
Knowledge of inferential and descriptive statistics
Ability to draw appropriate conclusions from statistical data
And we are able to communicate the findings to others.
Reporting Test Results
Ability to write complex, technical reports
Ability to brief higher-level officials on critical issues or problems requiring resolution
Ability to provide constructive feedback to individuals who have been assessed
Can you think of anything else?
So as you can see—as assessment professionals, even if we’ve never technically put together an organizational survey, we are way ahead of the game.
11. Some Things You Should Know Before You Begin . . . We’ve talked about the things that assessment development and survey development have in common.
But there are also some important differences. Some are obvious, others are not.
12. Tests vs. Surveys Your Role
Necessary customer involvement affects your role
Expert vs. Internal Consultant
Director vs. Developer
Analyst vs. Communicator
Implications
Good news: customers have ownership from the outset
Bad news: Everyone knows how to ask questions; therefore, everyone is an expert
When you are involved in survey design, your role shifts from prime mover to facilitator.
As a test developer, although you get input from program managers and SMEs, you are the recognized expert in assessment design. The ultimate decisions about the design and content of the assessment typically reside with the assessment professionals.
This is not the case with surveys. Your job is to shape your customer’s ideas into an instrument that is likely to address them. Sometimes you wind up with some very interesting questions.
The good news is that your customer, if appropriately involved, really owns the final product. (I can’t say that the same is true for some of the assessments we’ve developed.)
Another important difference is how you spend your time.
As a test developer, I spend a lot of time looking at item statistics, scoring tests, equating scores to past examinations, and documenting test validity.
As the survey wizard, I spend a lot of time communicating. I work with customers to develop the instrument, brief them on the results, and develop briefing materials for them to use.
13. This slide pretty much sums up the process of survey construction.
It is a picture of the procession of the Ringling Brothers Circus proceeding through New York City around 1919.
Ringling Bros. downsized their hitch team and went to eight camels pulling the circus wagon, as seen in this 1919 photo of them passing through town.
Bottom line, you have camels doing what horses would have been doing. They may not look as elegant, but they get the job done.
Source: http://camelphotos.com/GraphicsP7/RinglingBros1919.jpg
14. Tests vs. Surveys The Stakeholders
Some stakeholders are the same, but their concerns are not
External Policy & Decision-makers
Organizational Trustees
Government Officials
Oversight Agencies (e.g., GAO, OPM, OMB)
Internal Policy & Decision-makers
Executives, Managers, and Supervisors
Union(s)
Respondents
Applicants
Current employees
Some stakeholders are the same, but their concerns are not.
With employee selection, your primary customers are the hiring managers and the prospective applicants.
Managers want the best people for the jobs.
Applicants want a job that is suited to their capabilities and needs. Current employees may be involved in assessment, development, and validation.
Occasionally, an external organization (e.g., EEOC, DOJ) will get involved, as would be the case if your test is found to have adverse impact.
With surveys, some of the stakeholders may look the same, but their role differs, because they have different concerns.
You are more likely to have close involvement by top level agency managers, unions, and current employees.
Also, if you are doing a large, high impact survey, external policy and decision makers are likely to be involved.
15. Tests vs. Surveys The Stakes
Tests have high stakes for the individual
Who gets the job?
Who gets promoted?
Individuals are highly motivated to complete tests
Surveys have high stakes for the organization
What intervention is plausible?
What gets funded?
What gets scrutinized?
Is the agency green, red, or yellow?
Did we make the top fold of the Washington Post?
Individuals are less motivated to complete surveys
As I mentioned, the stakes are quite different.
Tests have high stakes for the individual. If an individual wants a particular job, they will be motivated to complete the assessment procedure.
Tests have major effects on individuals’ lives
Of course, a good test can have a dramatic effect on organizational productivity.
Individuals may be less motivated to complete surveys. OPM did a follow-up study of non-respondents on the FHCS and found that “being too busy” was the top reason for non-response.
Survey results can have much broader impact and can affect:
Public perception—large Federal Surveys are likely to hit the front fold of the Washington Post.
Funding—based on survey results, executives may decide to fund some initiatives and not others.
Policies—provide input into which policies are working and which ones are in need of change
Day-to-day operations
Implementation of organizational improvements
Program plans
16. Tests vs. Surveys The Design & Development Process
Tests
Determine what to measure
Select/develop the most appropriate measure
Evaluate consequences as part of your final validation report
Surveys
Determine what your customer wants to know
Envision the “final report”
Consider the consequences of having that information
If you don’t ask, you won’t find out
If you do ask, you can’t put it back
Select/develop the most appropriate measure
There are important differences in the design and development process.
Tests focus on skills, abilities and other job related attributes
When selecting or developing selection procedures, the job analysis tells you what skills and attributes are important to measure.
You look at past research and practice and organizational constraints and decide on the best measurement procedure to use.
With Surveys, your focus is on attitudes and perceptions
First you determine what your customer wants to know, then you work backwards from the final product. It’s helpful to ask what they want the final report to say.
You have to think about the consequences of having this information.
There is information that, if released, could damage your agency. Surveys are public, in the sense that they are not secure. If you ask the question, respondents know that the information exists in a file somewhere.
Because you have to continually focus on consequences, it’s not a bad idea to think about what should appear in the final report and work backward to get the necessary information.
17. Tests vs. Surveys Administration
Security and Confidentiality
For tests, both security and confidentiality are critical
For surveys, security is not an issue, but anonymity and confidentiality of results are critical
Conditions of Administration
Surveys can be easily administered over the internet because compromise is not an issue
Assessments of ability typically require proctoring or verification
With tests, you must keep the test material secure and the results confidential. Of course, access to test results is restricted to those who need to know—the examinee and the hiring official.
However, the consequences of poor test security are major.
If a test is compromised, at a minimum, your organization must expend the time and staff needed to develop a new test.
How many of you work with law enforcement personnel? If we could sequester the item-writing panel, we would.
No one wants to hear that the new firefighters’ promotional exam has been leaked prior to administration.
How many of you don’t let your test takers out of the exam room until everyone is finished?
With surveys, the security of the survey instrument is not an issue.
Once you administer a survey, particularly if you put it on the web or administer it by mail, it’s out there. It’s the answers that must be kept secure. Anonymity is critical and must be protected. Your results cannot be released until your customer says they can.
Internet administration creates some challenges, which I will be talking about, but nothing like the ones we are experiencing in the assessment field.
18. Tests vs. Surveys Communicating the Results
Tests
Results go to the individual
Individuals may become sensitive
“Bad news” becomes personal
Litigation is possible
Surveys
Results go to your customer
Results can be politically sensitive
You may have to deliver “bad news” to influential people
Public scrutiny is possible
The “top fold” factor: Would you want to see this on the top fold of the Washington Post?
Another difference is in how the results are transmitted and how they are used.
Detailed test results go to the individual. The hiring official may receive more general results—such as a list of qualified applicants.
Bad news becomes personal. There are a few winners and a lot of losers. Typically losers are not happy campers, but you deal with them. Sometimes you get sued.
Survey results go to your customer, and it is your customer who decides how and when they are released.
Results can be very politically sensitive—you may find yourself giving bad news to some very important people—people who can hurt you.
Survey results can easily wind up in the public eye. You have to be very careful with the results. The last thing you want is for the unfavorable results to be leaked to the media before your customer has a chance to develop an action plan.
19. Some Things You Should Know Before You Begin . . . But there are also some important differences. Some are obvious, others are not.
20. Additional Considerations Resources: Personnel Costs
Assessment Staff
Specialized skills
Train or procure?
Current workload
IT Support
Subject Matter Experts
Survey Audience
Employees complete surveys on work time
Before you begin, assess your in-house capabilities.
Can your staff “hit the ground running”?
If they can, do they have the time to take on another project? Analysis and reporting are particularly time-intensive.
Can you get the support you need from your IT people?
E.g., preparing databases, getting e-mail addresses, handling firewall issues.
Or helping with hardware requirements, or software malfunctions if you decide to purchase survey software.
For some projects you will also need to meet with subject matter experts to define key issues, or to ensure that your terminology is correct.
Also, consider the personnel costs associated with survey response.
Employees typically complete surveys on work time, so survey administration directly affects productivity!
It helps to calculate the cost per respondent.
Last year I did the math for a 10 minute survey of all supervisors and managers in CBP. If everyone responded (not gonna happen) it would equal about 1 staff year.
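The staff-year math above is easy to reproduce. A minimal sketch, assuming a standard 2,080-hour work year; the respondent count is hypothetical:

```python
def survey_cost_staff_years(respondents, minutes_per_response,
                            work_hours_per_year=2080):
    """Rough cost of survey response time, expressed in staff years."""
    total_hours = respondents * minutes_per_response / 60
    return total_hours / work_hours_per_year

# Hypothetical: ~12,500 supervisors/managers all answer a 10-minute survey.
cost = survey_cost_staff_years(12_500, 10)
print(round(cost, 2))  # about 1.0 staff year
```

Running the numbers like this before launch makes the productivity cost concrete for your customer.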
21. Additional Considerations Resources: Hardware and Software
Hardware
Does the survey audience have easy computer access?
Consider alternate arrangements (e.g., hard copy, telephone)
Software
Does the software have the features you need?
(e.g., question templates, data analysis, reporting flexibility)
Is it compatible with other software systems (SPSS, MS Excel)
Hardware/Software Compatibility
Will the software work with your organization’s hardware?
Will the software run on your network?
You also need to determine whether a web-based survey will work for you, or whether a low-tech solution would be better.
Hardware
Does your organization have computers that can run a web-based survey? For most organizations, this is not an issue.
You also need to consider whether your organization’s network will handle the demands.
We ran a DHS survey earlier this year; our firewall considered it an intruder.
Finally, can the majority of your employees get access to computers during work hours?
If not, can you make alternate arrangements?
We provided a phone number or a hard copy of the attitude survey to BP Agents who were in the field all day.
Software
Will the software package do what you need to do?
Question templates
Data Analysis/compatibility with data analysis software (SPSS, Excel)
Data reporting (charts, graphs)
22. Organizational Impact
Surveys raise expectations
Participants want feedback
Participants expect changes to be made
Surveys can be overused
Without coordination, organizations may stage multiple, competing surveys
Employees get survey fatigue
Response rate decreases
Administering a survey is in itself an organizational intervention.
Surveys raise expectations
If employees complete a survey, especially one that asks them about potentially sensitive issues, they expect action to be taken. At a minimum they want a summary of results.
Over-surveying is a problem.
Coordination is critical.
We need to keep tabs on the number of surveys that go out and the timing of these surveys
The more surveys that go out, the fewer that come back.
23. Seven Simple Steps for Survey Design 0. Kill the Survey
Assess Needs and Objectives
Develop an Action Plan
Design the Data Collection Strategy
Develop the Survey Materials
Conduct the Survey
Analyze the Responses
Report the Results
I’d now like to walk you through the life cycle of a survey.
24. 0. Kill the Survey Is a survey the best way of getting information?
I told a friend of mine that I was doing a tutorial on survey development.
She said, “I thought that your job was killing surveys!”
25. 0. Kill the Survey What are the goals of your survey?
What broad questions do you want answered?
Use of existing programs
Need for new programs
How will you use the answers?
Who will participate in your survey?
Who will be involved?
Organizations, organizational level
How many will be involved?
Sample or Census Survey
The proliferation of surveys can become a problem, so before we begin, we first ask about the customer’s need for information.
Do they really need more information than they already have?
Is a survey the best way to get that information?
Before we begin the survey process, we meet with potential customers and hand out a sheet with a series of questions.
By the time we finish, if we are lucky, the customer has decided against the survey. We focus on why they want the survey, who will get the survey . . .
You will also find these questions on page 1 of your handout.
26. 0. Kill the Survey What is your time frame for survey administration and reporting?
When do you need the answers?
What other surveys will be taking place at the same time?
What is your communications plan for this survey?
Notify chain of command
Notify respondents
Invite Respondents
Follow up if needed
The time frame for the survey process: if everything works well, 3-4 months is comfortable.
How they plan to get the word out.
27. 0. Kill the Survey How will you be delivering the survey to participants?
Will you require primary and secondary survey delivery strategies?
Web-Based
Paper & Pencil
Telephone
Other?
What strategy will you use to follow up with non-responders?
What is your target response rate?
What if you don’t reach it?
Reminders
Extensions
The delivery system
Following up
28. 0. Kill the Survey How will you handle the incoming data?
Who will be analyzing the data
What software will be used to analyze the data?
How will the data be maintained?
Who will have access to the data?
What are your plans for reporting the data?
Who is the audience for survey results
How will results be delivered?
Formal report
Conference
Focus groups
How do you plan to follow up?
Data. People tend not to think about what they are going to do with the data, and how long analysis and reporting can take. Actually, we’ve found that these latter stages can take longer than everything else combined.
Following up. In some cases, it involves providing feedback to participants. However if you are monitoring long-term trends, as the Office of Personnel Management is with its Federal Human Capital Survey, you will need to plan for repeated administration at regular intervals. You will also have to consider the best way to get results out to a wide audience.
At the end of this process, many potential customers decide that it’s really too much work to do a survey—maybe an interview, a few focus groups, or a search of archival data might be less labor intensive.
After getting through this list of questions and doing the survey, one customer said, “I know you warned us, but I still did not imagine that it would be this much work!”
However, if you get through this process and they still want to do a survey, you’ve done the essential groundwork of establishing the goals and objectives of your survey.
These objectives will drive decisions that you will be making in later steps in the survey process.
29. 1. Clarify Expectations Clarify expectations
Project Budget/Costs
Deliverables
Ownership of results
What data will be reported, and to whom?
Working relationships
The steering committee
Your role
Put them in writing
Prepare a written proposal
Early in the process, we clarify the scope of the project and the role our customer wants us to play.
Sometimes we help with reviewing or designing the questions and our involvement ends there.
Other projects have included analysis and interpretation of findings.
Depending on your skills and your role, you might even be asked to implement an organizational intervention.
In one project, we designed the survey, interpreted the findings, prepared briefings for management to give. We then facilitated focus groups. These groups developed the recommendations.
More recently, we got involved in setting up award ceremonies and designing and distributing pins.
It doesn’t hurt to put things in writing. A long formal proposal may not be required, but at the minimum, submit an action plan showing the major activities and deliverables.
This actually takes us to the next step!
30. 2. Develop an Action Plan Considerations
How much information do you have about the issues to be investigated?
Who needs to review the plan, the survey, and the survey results?
Who will participate on the survey development team?
Does your staff have all of the required skills?
What steps must be coordinated with other organizations?
These considerations should look familiar to those of you who have managed assessment projects.
How much information do you have, and how much information do you still need?
Who is going to be involved on the survey development, and what other work commitments do they have?
Who needs to review the survey and its results?
Do you need to allow time for external reviews (Senior Management, Union)
What steps must be coordinated with other organizations?
I need to coordinate dissemination of large-scale surveys with the Commissioner’s staff and the Office of Information Technology.
31. 2. Develop an Action Plan This sample plan is based loosely on our plan to develop and administer an organizational survey. Initially, we planned about 4 months. In reality, our involvement in the process began in July of 2005 and started winding down in March of 2006. We are not done yet.
Although we deviated from the time frame, the basic steps remained the same. Whenever we had a major deviation from our timeframe, we submitted a revised plan.
For example,
You may find that coordination takes longer than expected—which it can, especially if the survey involves sensitive content.
Or if you work with law enforcement, like we do, you need to be open to operational needs. For example, we’ve had to postpone surveys during hurricane season because we’re involved in rescue missions. We also have to make sure that we do not administer surveys during peak travel season.
32. 3. Design the Data Collection Strategy What
Survey Dimensions
Demographic Variables
Who
Population/Sample
When
Timing of Launch
Extensions
How
Delivery System
Your next step is designing the data collection strategy.
You may find yourself working out many of the details of your data collection strategy at the same time that you are developing your action plan.
By the time you get to the design phase, you should have a pretty good idea of what you will be doing.
What you will be asking about
Who will be getting the survey
When you plan to launch the survey
And,
What your delivery platform will look like—face-to face, telephone, paper-and-pencil, or web-based.
33. 3. Design the Data Collection Strategy Identify the Dimensions to be Measured
Research the issue
Consult existing research and best practices
Collect questionnaires on similar topics
Propose dimensions
Coordinate with your customer, steering committee
During this phase you should also be working with your customer or steering committee to define your survey dimensions. You will need these dimensions in order to write survey questions (your next step).
You want to have some ideas to propose to your customer.
They will have some ideas of their own—some may involve things that can’t be measured, or shouldn’t be measured.
If you can get them to agree on things that we know we can measure, you are ahead of the game.
Also, if you can draw on past efforts, you will have answers for the inevitable question, “how are we doing compared to other organizations.”
Let’s start with what you plan to measure: the survey dimensions, because that will drive decisions about the sample size and composition and the timing of the survey.
34. 3. Design the Data Collection Strategy Proposed Dimensions
Mission and Strategy
Leadership
Change Management
Information Sharing
Shared Values
Expertise Retention and Development
Commitment to the Organization Final Dimensions
Mission and Goal Clarity
Communication of Changes
Training and Development
Your Job and Career
Remember the organizational survey I mentioned earlier?
This is an example of how our dimensions changed as we vetted them through our steering committee.
Once I proposed a series of dimensions to them, they were able to focus more narrowly on what really made a difference to them.
35. 3. Design the Data Collection Strategy Demographic information is sensitive!
Plan your final report: Anticipate important breakouts or comparisons
If you don’t collect it you can’t analyze it!
Only collect information that you plan to use
If you don’t need it, don’t collect it!
When possible, rely on the organization’s database
More accurate
Less threatening
You should also decide in advance how you want to break out your data during your analyses—your dependent variables. This determines the demographic data that you need to collect.
Demographic information is sensitive. Respondents will give you information about their attitudes, but not if they think there is a chance of a breach in anonymity.
If you don’t need it, don’t collect it.
In paper-and-pencil surveys, demographic questions typically appear in the back. This is because people are more likely to send back the survey without demographic information, than they are to skip the demographic section and then to complete the survey.
With Web-based surveys, especially those tied to your internal organizational database, you can collect demographic information automatically.
One advantage, is that the survey becomes shorter.
Also, data from the organization’s database tends to be very accurate; people often don’t remember their service computation dates or their job titles or series.
36. 3. Design the Data Collection Strategy Demographic Information: Some Examples
Organizational Unit
Location
Headquarters vs. Field
Geographic Region
Organizational Level
Executive, Manager, Supervisor, Team Lead
Occupational Series
Office Automation Clerk
Personnel Research Psychologist
IT Specialist
Years of Service
The demographic variables you include will depend on the nature of your questions.
On a recent customer satisfaction survey, we were most interested in regional differences so that we could compare the effectiveness of different field units and their specific needs for services.
Organizational Level, which translates into salary, and tenure are important when you are looking at job satisfaction. People who earn more money tend to be more satisfied.
Occupational series was important in our survey assessing an organizational change. The magnitude of the change was greater for some of the job series.
37. 3. Design the Data Collection Strategy When would you ask . . . ?
What is your age? ________
What is your gender? □ Male □ Female
Do you have children living at home? □ Yes □ No
Do you have elderly parents living at home? □ Yes □ No
Are you a full-time employee? □ Yes □ No
How long have you worked for the Federal Government? _______
How long have you worked for this agency? _______
How long have you worked in your current position? _______
What is your marital status? (check one)
Married/Living with partner □
Single □
Separated/Divorced □
Widow/Widower □
1. Do you really need to know how old they are? If so, DOB is available on the organization’s database.
2. Need to know? Gender is also on the organization’s database.
3. Not a yes/no question. Shared custody. You don’t want to go there. If you want to know whether an on-site daycare facility would be used, ask that.
4. Same as #3
5. available on the organization’s database
6. available on the organization’s database
7. available on the organization’s database
8. Depends on the survey. For some surveys, it is not important. If it is, it is a reasonable question to ask.
9. Sensitive. Not sure why they needed to know this.
The purpose of this survey was to find out what Worklife Programs should be established or expanded. Many of these questions, while appropriate for an Employee Assistance Program, seemed overly intrusive.
Now I’d like to look a little more closely at some of the considerations that go into deciding who will get the survey: a sample, or the population.
See page 2 of your handout.
38. 3. Design the Data Collection Strategy Census
More resource intensive
More accurate
Greater precision; lower “margin of error”
Risks over-surveying
Necessary for some purposes
Organizational communication
Feedback to smaller groups Sample
Less resource intensive
Accuracy may be questioned
Judicious sampling overcomes objections
Avoids over-surveying
Sufficient for many purposes
Estimating employee attitudes
Following up on prior survey
The Population
There are occasions when you will want to survey the entire population. We recently did a census for a customer satisfaction survey. Our population consisted of about 2,000 supervisors, and our customer wanted breakouts by organization and by region. Given an expected response rate of about 35%, sampling would not have given him the accuracy he needed to examine even the largest subgroups.
You might also want to send the survey to the population if you are using the survey as a communication tool. Our Office of Information and Technology used a survey about one of its modernization initiatives as a means of raising employees’ awareness. As a result of reading about and completing the survey, I now know a lot more about how CBP processes commercial trade.
One downside of a population survey is that you risk over-surveying.
A Sample
Samples have many benefits and for most purposes, a sample will be sufficient for estimating employee attitudes—provided your organization is sufficiently large.
However, be assured that you will get the question, “How can we be sure that these findings are accurate?” at least once.
39. 3. Design the Data Collection Strategy Determining the Sample Size
Margin of Error
Determines sample size for a specified level of precision
Greater precision requires larger samples
Confidence Level
Likelihood that population value is within the specified margin of error
Higher confidence levels require larger samples
I will touch briefly on some of the considerations that go into drawing a sample.
The first is margin of error, which is basically “wobble.”
If you say that 50% of the employees in your organization are satisfied with health benefits, is that 50% plus or minus 3 percentage points or 50% plus or minus 10 percentage points.
The second consideration is how sure you want to be that the real value lies within that range.
E.g., let’s say that 35% of your respondents say that they have the resources they need to do the job.
With a 5% margin of error, you can estimate that the value is between 30% and 40%.
If you have set your significance level at .05 (a 95% confidence level), you know that there is a 95% chance that the true percentage falls in that 10-point range.
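The wobble in the example above follows from the standard formula for the margin of error of a proportion, MOE = z·sqrt(p(1−p)/n). A quick sketch; the sample sizes are illustrative:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for an observed proportion p from n respondents,
    at a 95% confidence level by default (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# An observed 35% shrinks from roughly +/-9 points to +/-5 points
# as the number of respondents grows.
for n in (100, 200, 400):
    print(n, round(100 * margin_of_error(0.35, n), 1))
```

Quadrupling the sample only halves the margin, which is why chasing very tight precision gets expensive fast.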
40. 3. Design the Data Collection Strategy Determining the Sample Size
Population Size
Larger populations require proportionately smaller samples
Expected Response Rate
Higher response rates require smaller samples
In deciding how large a sample you need,
You will also want to consider whether you need to do subgroup analyses
How many people you expect to respond: if you get about 40% you are doing well; if you get in the high 50s, it’s cause for celebration. Lately we’ve been estimating about 20%.
41. 3. Design the Data Collection Strategy Determining the Sample Size
Stratification
Specifying subgroups and selecting from them requires larger samples
Check for balanced representation if not stratifying
Occurrence of the Characteristic or Attitude
Consider how frequently a characteristic (e.g., occupation, gender) is observed or an attitude (e.g., support/disapproval) is expressed
Uneven splits (e.g., 90/10) are harder to detect and require larger samples
In deciding how large a sample you need,
You will also want to consider whether you need to do subgroup analyses—in which case you may choose to stratify.
Also, you need to consider how many people are likely to have an attitude on a subject.
In a recent customer satisfaction survey for the Office of Equal Opportunity, about 1% of our nearly 50,000 employees (about 500) had dealings with that office during the past year. Because some of the questions dealt with this low-probability event, we had to increase the size of our sample accordingly.
42. 3. Design the Data Collection Strategy This is based on an actual sampling plan we used. I’ve given the groups new names so that I could share this with you. We assumed a 40% response rate.
HQ is relatively small, so we sent the survey to everyone.
In Region 1, we sent out 2,672 surveys, which is what we needed for an acceptable margin of error.
In Region 2, which has about twice the number of employees, we only needed about 40 more for the same margin of error.
The links at the bottom of the page are basic but easy-to-use web-based sample size estimators.
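The Region 1 versus Region 2 observation, that doubling the population adds only about 40 people to the required sample, falls out of the finite population correction. A quick loop (a sketch of the same standard formula, not the web calculator the presenters actually used) shows how quickly the required sample plateaus:

```python
import math

def fpc_sample(N: int, e: float = 0.05, z: float = 1.96, p: float = 0.5) -> int:
    """Sample size with the finite population correction applied."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / N))

# Required sample barely grows once the population passes a few thousand.
for N in (2000, 10000, 20000, 100000):
    print(f"population {N:>6}: sample {fpc_sample(N)}")
```

This is why small headquarters groups are often surveyed as a census while large regions can be sampled.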
43. 3. Design the Data Collection Strategy Determine When to Administer the Survey
Coordinate Survey Administration
Other surveys
Holidays, seasonal events
Peak vacation times
Major organizational changes
RIFs, reorganizations
Establish the Timeframe
Initial period
Extensions
Reminders increase participation
One last thing you want to consider in designing your strategy is when you will be administering your survey. Timing is critical.
You definitely need to coordinate with others who are administering surveys.
Also, consider seasonal events such as holidays, or predictable busy periods. (At CBP the two are synonymous.)
You don’t want to administer a survey to Customs and Border Protection Officers during peak travel time—summer vacation. They are responsible for inspecting passengers at the airport.
You also don’t want to administer a survey after a major disruption, unless you are assessing the impact of the disruption.
If you are conducting an organizational survey for research purposes, you may not want to collect data right after a major organizational change or downsizing.
44. 3. Design the Data Collection Strategy Evaluate the Burden on the Organization
Monitor overall number of surveys
Try to limit surveys to 3-5 per employee per year
Minimize survey length
20-40 multiple choice questions, 1-3 open-ended questions
30-minute completion time
I can't overemphasize the problem of proliferation of surveys. The more surveys people get, the less likely they are to respond.
You can reduce the perception of oversurveying through careful coordination and judicious timing.
Limit the number of surveys you send out and avoid overlap. 3-5 is a good rule of thumb. If you must overlap, communicate the importance of the data.
Avoid holidays and periods of peak workload
E.g., no surveys during the summer peak travel period; survey postponed when Katrina hit—involved in rescue initiatives.
Keep the length of the survey down. People should be able to complete the survey—including demographic information, multiple choice questions and open-ended questions—in less than 30 minutes.
In terms of number of questions that’s a few demographic questions, 30-40 multiple choice questions and 1-3 open-ended questions.
45. 4. Develop the Survey Materials Develop the Survey
Write or select questions
Write instructions
Prepare official correspondence
Pre-test the survey
Some of the points that I will be making in this section will seem very familiar to many of you.
The principles for designing good measures stand, regardless of the kind of measure that you are developing.
46. 4. Develop the Survey Materials The Survey Questions
Write or select the questions
Sources of survey questions
Public domain (OPM, GAO, MSPB)
Software libraries
Researchers
Colleagues
Review the questions
Assessment staff
Customer/SMEs
There are a lot of good sources of well-written survey questions out there and I encourage you to take advantage of them. Good starting point.
OPM has done a lot of good work—the Federal Human Capital Survey is on their web site and is worth looking at. MSPB has also done a number of surveys of interest to those of us in the Human Resources field.
You may know someone in a similar organization who is doing similar work. There used to be working groups sharing survey questions for Total Quality Management.
I’ve also shared questions with other agencies—Bureau of Prisons and FAA.
There are also a lot of surveys in the I/O psychology literature, in journal articles and in books.
Also web-survey software tends to include a questionnaire library. They have good suggestions for survey topics and for response options.
Whether you grow your own or order out, you still want to get both a technical and a customer review.
47. 4. Develop the Survey Materials Guidelines for Good Survey Questions
Word items as clearly as possible
Use simple, direct, familiar language
Provide a frame of reference
Use unambiguous terms
Use negatives sparingly; avoid double negatives
Keep the questions short
If it doesn't fit on one line, it may be too long.
I'm sure that many of you recognize these tips. Many of them also appear in guides to writing multiple-choice item stems.
Clear, simple wording is the best. If the stem is much longer than one line, think about rewording it.
Also try to ask respondents about issues that are likely to be familiar to them, in terms that they are likely to understand.
See Page 3 of your handout.
48. 4. Develop the Survey Materials Guidelines for Good Survey Questions (continued)
Include only one issue per question
Tip-offs: and, but, because
Special case: the “double-barreled” questions
Questions that address two contradictory issues
Avoid “loaded” questions
Questions that contain a false, disputed, or question-begging assumption
Don't ask respondents to "tell more than they know"
To avoid confusion, stick to one issue per question. If you see a conjunction (and/or), see if you can go one level up. E.g., instead of "boys and girls," use "children."
"My organization offers comp time or pay when we have to work overtime" vs. "Workers are compensated for overtime."
A double-barreled question is a special case because it addresses two contradictory issues. You will see some concrete examples shortly.
A "loaded question", like a loaded gun, is a dangerous thing. A loaded question is a question with a false or questionable presupposition, and it is "loaded" with that presumption. The question "Have you stopped beating your wife?" presupposes that you have beaten your wife prior to its asking, as well as that you have a wife. If you are unmarried, or have never beaten your wife, then the question is loaded.
Don't ask for opinions on things that people are not likely to understand or have not experienced personally.
49. Some Good Survey Questions The people I work with cooperate to get the job done
I have trust and confidence in my supervisor.
My work unit is able to recruit people with the right skills.
My workload is reasonable.
Physical conditions allow employees to perform their jobs well.
My performance appraisal is a fair reflection of my performance.
Strongly Agree . . . Strongly Disagree, Do Not Know
How satisfied are you with paid vacation time?
Considering everything, how satisfied are you with your pay?
Very Satisfied . . . Very Dissatisfied, Do Not Know
These questions come to you courtesy of the U.S. Office of Personnel Management. They were included in the 2007 Federal Human Capital Survey.
Notice how succinct they are. I counted—the average length is 10 words, which would take the average person a couple of seconds to read.
Word counts: 12, 9, 13, 5, 10, 11, 9, 10 (an average of about 10 words per question); 10 / 250 words per minute = .04 minutes, and .04 × 60 ≈ 2.4 seconds.
Also, using the readability statistics in MS Word (Flesch-Kincaid Grade Level), the reading level came out at about the 7th grade level, which is a good target for survey questions.
MS Word will do this for you: Tools/Options/Show readability statistics.
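The back-of-the-envelope reading-time check in the notes is easy to reproduce. A quick sketch (question texts taken from the slide; the 250 words-per-minute figure is the reading speed assumed in the notes):

```python
questions = [
    "The people I work with cooperate to get the job done.",
    "I have trust and confidence in my supervisor.",
    "My workload is reasonable.",
    "My performance appraisal is a fair reflection of my performance.",
]

WORDS_PER_MINUTE = 250  # reading speed assumed in the notes

for q in questions:
    words = len(q.split())
    seconds = words / WORDS_PER_MINUTE * 60
    print(f"{words:2d} words, ~{seconds:.1f}s to read: {q}")
```

A similar loop over your own draft questions will quickly flag any stem that runs well past the one-line rule of thumb.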
50. 4. Develop the Survey Materials Guidelines for Good Response Options
The response should fit the question
Response options should be on the same continuum
Don’t make it easy to select the most desirable response
For example, begin the response options with “1-Strongly Disagree” rather than “5-Strongly Agree”
Make the most positive or negative response viable
If possible, avoid universals -- all, always, none, never
Use “almost always” or “rarely or never”
Give respondents an escape hatch
Does not apply; neither agree nor disagree
Think carefully about how many options you will include
You will also recognize some of these tips for writing good response options.
Basically, the options should answer the questions, and all of the options should be on the same scale.
There is some debate about whether you should force a choice or give respondents an escape hatch. My preference is to let them indicate when they don't have an opinion—or don't know. Their responses are more likely to be accurate. Also, they will be less likely to skip questions, or to get frustrated and opt out of the survey early.
51. 4. Develop the Survey Materials Satisfaction
Very dissatisfied
Dissatisfied
Neither satisfied nor dissatisfied
Satisfied
Very satisfied
Adequacy
Very poor
Poor
Adequate
Very good
Excellent
Agreement
Strongly disagree
Disagree
Neither agree nor disagree
Agree
Strongly agree
Frequency
Very infrequently
Infrequently
As often as not
Frequently
Very frequently
These are examples of some commonly used response scales.
I tend to use “Agreement” or “Satisfaction” most frequently.
Sticking to one or two basic response scale makes it easier on the respondent.
It also simplifies data analysis by making it easier to compare attitudes across questions.
You will notice that each of these examples uses a 5-point scale.
I’ll get into some of the reasons, shortly.
52. 4. Develop the Survey Materials I wanted to show you this picture because it is a very clever way to get in 10 scale points using 2 5-point groupings. You have to decide whether you really want that level of precision
A 5-point scale works well for most surveys. George Miller had a good idea in 1956 when he wrote about the magic number "7 plus or minus 2."
The optimal number of scale points depends on the complexity of the issue being examined. Three points are not enough. There are some questions that can be answered “yes” “No” or “Don’t know” most of the questions we ask in organizational surveys are more complicated. Survey respondents want to be able to express this complexity.
Survey respondents also vary in the strength with which they hold opinions. If they are given the choice of "yes," "no," or "undecided," they tend not to commit, opting instead for the "undecided" response. As a result you lose data.
Also, the number of options you include affects the accuracy of measurement. Research has shown that the reliability, or the accuracy of measures, increases up to the use of five points; after that reliability levels off.
You can collapse 5 points into 3. If you start with only 3 scale points, respondents may opt for a neutral response because they don't have the option of agreeing or disagreeing a little bit.
53. 4. Develop the Survey Materials Other Response Options
Dichotomous Response (Yes/No)
Branching—a special case
If you answered "yes" to the previous question, . . .
If you are a supervisor, . . .
Open-ended Responses
Types
Specific, targeted questions
What recommendations do you have for improving the quality of future surveys?
General questions
Is there anything else that we should know about . . .
Web facilitates responding and analysis of responses
In addition to the Likert-scale multiple-choice questions, there are some other kinds of questions you may want to include.
For some questions a dichotomous response (yes or no) is sufficient. But make sure that the question is really dichotomous.
For others, you may want to provide a list of options, for example, if you were evaluating services provided by your Labor-Employee Relations group, you might ask “which of the following services did you use within the past 3 months” and then provide a list of the services.
You might also want to include branching questions.
Use branching questions when you want additional information from people who have responded in a particular way to a question.
We used a branching question to get at reasons for wanting to remain with CBP or wanting to leave CBP. People who intended to stay got one set of options to check. People who intended to leave got a second set of options.
The nice thing about Web-based surveys is that they will branch automatically. With paper-and-pencil surveys everyone gets all of the questions and response options.
No more “if you answered ‘yes to question 19” please answer the next 3 questions.
Finally, You can also include open-ended questions. Web makes responding easier and people tend to write more.
Years ago, the “standard practice” used to be to include one general question at the end.
Is there anything else you want to tell us about?
Now the trend seems to be to ask very specific questions and at the conclusion of a topic or section.
How can we improve mandatory supervisory training?
Although electronic files are easier to analyze for key words and to sort into categories, analysis is still fairly labor intensive.
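The automatic branching described above can be modeled as a small lookup table. This is a hypothetical sketch: the question ids and wording are illustrative, not the actual CBP survey, and real web-survey packages handle this routing for you:

```python
# Each question maps each possible answer to the id of the next
# question; an empty dict means the branch ends there.
survey = {
    "q1": {"text": "Do you intend to stay with this organization?",
           "next": {"yes": "q2_stay", "no": "q2_leave"}},
    "q2_stay": {"text": "Which factors most influence your decision to stay?",
                "next": {}},
    "q2_leave": {"text": "Which factors most influence your decision to leave?",
                 "next": {}},
}

def next_question(current: str, answer: str):
    """Return the id of the next question, or None when the branch ends."""
    return survey[current]["next"].get(answer)

print(next_question("q1", "yes"))  # q2_stay
```

The payoff over paper is that each respondent only ever sees the questions on their own branch.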
54. The Novice-Designed Survey
Like anything scientific, you have to be careful what you try at home!
55. Class Exercise: Item Review Review the items beginning on Page 4 of your handout. There are 19 items.
Some are OK. Some are good. Some are real stinkers. I’d like you to find these and tell me
What “rule” they violate.
What you’d do to improve them
Any suggestions for improving acceptable items are also welcome. We can always do better.
56. Would you include this question on YOUR survey? Work delays are uncommon in this organization.
Strongly Disagree
Disagree
Neither Agree nor Disagree
Agree
Strongly Agree
I'd now like to get your opinion of some questions that I've seen in the process of reviewing surveys.
[hand out 20 questions] Split, have group report out.
This is basically a good question.
One possible problem is that it is stated negatively.
Double negatives provide more complexity than you want if your audience has a low reading level.
When I worked for the Army Research Institute several years ago, we targeted an elementary-school reading level for enlisted personnel and a middle-school level for non-commissioned officers.
57. Would you include this question on YOUR survey? Little feedback, mostly negative vs. Timely feedback, both good and bad.
Now: 1 2 3 4 5 6 7 8 9 10
Want: 1 2 3 4 5 6 7 8 9 10
I have no clue what possessed the person who wrote this. And there were many others just like this.
Double barreled
Multiple question
Scale overly complex
58. Would you include this question on YOUR survey? Do you believe that there was fair and open competition for the vacancies for which you have applied?
Always
Usually
Sometimes
Never
No recent hires or Don't Know
Double barreled stem
Fair/open—two ideas
Assumes that the person applied for a vacancy.
Response options do not match the stem.
Recent hires and vacancies are not the same thing
Could rewrite as a branching question
Have you applied for a vacancy in this agency within the past year?
List the name of the position
(Automatic Branch for yes) For this position, do you believe that competition was open to all who wanted to apply?
In your opinion, how fair was the selection process?
59. Would you include this question on YOUR survey? My work unit has enough resources (people and money) to accomplish its major tasks.
Fully
Mostly
Somewhat
Not at all
Don't know
The stem is on the right track.
Better not to say “people AND money.” and is a tip-off for a double-barreled question.
I’d probably put (for example, staff, funding, equipment) in parentheses.
The scale is a little odd, but not horrible. It also doesn’t quite answer the question.
Could rewrite as an agree/disagree question pretty easily
My work unit has the resources it needs to accomplish its major tasks.
60. Would you include this question on YOUR survey? In my agency/department
Plans identify improvement priorities critical to organization’s mission that will be relatively difficult to attain; resources are not allocated to support these objectives
Plans identify improvement priorities central to organization’s mission; plans aim for higher objectives each year; resources are related to major goals.
Goals identify quality priorities that may or may not be central to organization’s mission; goals do not require major effort or change in organization.
Implementation strategy for introducing TQM in organization is underway.
None of these applies to my agency/organization.
Do not know whether or not any of these applies.
Excessively wordy. I'd give up after one question like this. This is pretty representative of the questions in this survey. It would take entirely too much time to complete.
Response options are not on the same scale, and don’t appear to be mutually exclusive.
I wonder how many people selected option 6—either because they read all of the options, or because they didn't want to.
11th-grade reading level
61. Would you include this question on YOUR survey? Risk-taking is rewarded in this organization.
Strongly Disagree
Disagree
Undecided
Agree
Strongly Agree
This is actually a pretty good question, but it is context dependent.
It would work well for some organizations, for example, it would be appropriate for a research and development group.
It would not work in a law enforcement environment. Officers risk their lives every day, but under most circumstances, an officer would not risk challenging an order.
62. Would you include this question on YOUR survey? Are you free to make decisions about joining or not joining a labor union without fear of reprisal?
Yes
No
Don't know
Double barreled on more than one level
Would the person fear reprisal for joining or for not joining a labor union?
Is the person free to join a labor union, even if reprisal is not an issue?
63. Would you include this question on YOUR survey? Given a choice, I would rather prefer to stay here than move to a similar job in a different organization.
Always
Usually
Seldom
Never
Don't Know
This is a reasonable topic, but there are much better ways to ask the question.
Double barreled.
Staying and leaving are really two separate issues.
I plan to have a long-term career with this organization
During the coming year, I intend to seek employment outside of this organization.
Scale does not match the question. Agree/Disagree would be a much better scale.
64. Would you include this question on YOUR survey? So many things wrong. So little time. What’s worse is that the original list of programs was much longer.
Needless to say there are better ways of finding out whether a specific program is
(a) needed
(b) utilized
(c) supported by management
(d) communicated to employees
65. Would you include this question on YOUR survey? Over the past year I have experienced health related issues as a result of balancing my work and personal life.
Strongly Disagree
Disagree
Can’t Decide
Agree
Strongly Agree
This seems a little personal to me.
It may also be a good example of asking more than people can reliably report. Who’s to say what really caused a “health issue.”
Again, I’m not sure how one would use this information in program planning or evaluation.
66. Would you include this question on YOUR survey? My supervisor or I conduct trend analyses to proactively identify and address unlawful discrimination trends (e.g. trend analyses of the workforce's composition and reward system conducted by race, national origin, sex, and disability).
Agree Strongly
Agree Somewhat
Neither Agree nor Disagree
Disagree Somewhat
Disagree Strongly
Not Applicable

Grammar (My supervisor conduct?)
In real life, sometimes the supervisor does it, sometimes it’s delegated to a staff member. The important thing is that someone in the immediate work unit conducts trend analysis.
If you agree, it’s not clear who actually conducted the trend analysis
Are you trying to get at whether someone knows how to do trend analysis, or whether measures are being taken to identify trends?
Q1: Avoid addressing multiple topics with a single question. The focus of this statement is unclear. The respondent can agree or disagree with each question.
Having the skills/ability to conduct trend analysis
Identifying discrimination trends before they become a problem.
Addressing discrimination trends before they become a problem
33 words
Reading level of 23 (0-30 best understood by college students)
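The “reading level of 23” cited above is a Flesch Reading Ease score (lower means harder; a score of 0–30 is typically understood only by college graduates). As a rough illustration of how such a score is computed, here is a sketch; the syllable counter is a crude heuristic of my own, not the algorithm any particular readability tool uses:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels, dropping a
    # trailing silent "e". Real readability tools use dictionaries
    # or more careful rules.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def flesch_reading_ease(text):
    # Flesch Reading Ease:
    #   206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

hard = ("My supervisor or I conduct trend analyses to proactively "
        "identify and address unlawful discrimination trends.")
easy = "I like my job."
hard_score = flesch_reading_ease(hard)   # well below 30 (college level)
easy_score = flesch_reading_ease(easy)   # far easier
```

Even this simplified scorer puts the trend-analysis item deep in college-graduate territory, while a plain short sentence scores far higher.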
67. Would you include this question on YOUR survey? How satisfied are you with the degree of follow-up by our staff to ensure your needs were met?
Very Satisfied
Satisfied
Somewhat Satisfied
Dissatisfied
Very Dissatisfied
Not Applicable (N/A)

We used this one. Of course one can always do better, but it’s not bad.
Any suggestions?
Easy to read—7th grade; no egregious grammatical mistakes.
68. This is what happens when you are given one week to design and deploy a survey.
One might be OK. Presentation could be improved. There were 16 of these. Fortunately branching was involved.
This is what happens when you have 1 week to design, program, and implement a web-based survey.
We had top level support. The project was one of a series of projects conducted for the Commissioner of CBP.
We got a 66% response rate.
69. Would you include this question on YOUR survey? I have received ADR training emphasizing the benefits associated with utilizing ADR.
Agree Strongly
Agree Somewhat
Neither Agree nor Disagree
Disagree Somewhat
Disagree Strongly
Not Applicable

What’s an ADR?
Alternative Dispute Resolution.
Simplified language a little. Also they defined ADR in the prior question.
I have received training that explained the benefits of using the ADR process.
70. Would you include this question on YOUR survey? I am aware that my organization has an Office of Equal Opportunity (OEO).
Yes
No
Disagree Somewhat
Disagree Strongly

Question is fine. Response options are goofy. Technical term.
71. Would you include this question on YOUR survey? Is your organization satisfied with the employee performance rating system? Please explain.
Open ended question. Asks individual employee to give an opinion for the entire organization. (Telling more than you can know.) What about divergent viewpoints?
In general, how satisfied are you with your organization’s performance appraisal system? Please write a brief statement describing your general impression.
72. Would you include this question on YOUR survey? On an annual basis, approximately how many times do you receive support from OIT Field Support?
1-100 scale is unwieldy—who’s to say whether it’s 3 or 5
Give a range
Better yet: weekly, monthly, every few months, about twice a year or less.
73. Would you include this question on YOUR survey? Please identify specific areas where you would like to see improved service from OIT Field Support. *
Say what??????
My response to this question was “you have got to be kidding”
“How can we improve our service to you?” is a better question. Also, it should not be required. Be glad that they responded at all.
74. Would you include this question on YOUR survey? Do current scheduling practices allow for the regularization of schedules, enabling rating supervisors to work the same schedules as rated subordinates?
Yes
No

After making some phone calls, I found out what they were trying to accomplish. I recommended the following 3 questions:
In your organization, approximately how many employees work in the same location as their supervisors? (Consider, for example, telecommuters, workers in remote field locations, and workers who cover a large territory.)
90% or more
about 75%
Between 50% and 75%
Between 25% and 50%
Fewer than 25%
Below are common schedules that employees work. Please check all that apply in your organization.
Compressed work schedules (e.g., 4 10-hour days per week)
Flexible work schedule (e.g., Flexing arrival and departure time around fixed core hours.)
Shift work (e.g., 24/7 coverage)
Other (please indicate)
In your organization, approximately how many of your supervisors work a different shift than their employees?
90% or more
about 75%
Between 50% and 75%
Between 25% and 50%
Fewer than 25%
75. Exercise: Write a Survey Item Availability of Support – The degree to which the customer can contact the provider.
Responsiveness of Support – The degree to which the provider reacts promptly to the customer.
Timeliness of Support – The degree to which the job is accomplished within the customer’s stated time frame and/or within the negotiated time frame.
Completeness of Support – The degree to which the total job is finished.
Pleasantness of Support – The degree to which the provider uses suitable professional behavior and manners while working with the customer.

Now it’s your turn. Here are 5 dimensions of customer service. I’d like you to come up with appropriate survey questions for each dimension. They are also on Page 9 of your handout.
Assign groups to work on each dimension.
I’ll give you about 20 minutes. Each group will read one question. Then I’ll show you what R.E. Hayes decided to ask.
76. Exercise: Write a Survey Item Availability of Support – The degree to which the customer can contact the provider.
I could get help from the staff when I needed it.
The staff was always available to help.
The staff was there when needed.
I could arrange convenient meeting times with the staff.
77. Exercise: Write a Survey Item Responsiveness of Support – The degree to which the provider reacts promptly to the customer.
They were quick to respond when I asked for help.
They immediately helped me when I needed it.
I waited a short period of time to get help after I asked for it.
78. Exercise: Write a Survey Item Timeliness of Support – The degree to which the job is accomplished within the customer’s stated time frame and/or within the negotiated time frame.
They completed the job when expected.
They met my deadline(s).
They finished their responsibilities within the stated time frame.
The project was completed on time.
79. Exercise: Write a Survey Item Completeness of Support – The degree to which the total job is finished.
They ensured that every aspect of the job was completed.
They completed everything they said they would do.
They were there to provide help from the beginning to the end of the project.
80. Exercise: Write a Survey Item Pleasantness of Support – The degree to which the provider uses suitable professional behavior and manners while working with the customer.
The staff members conducted themselves in a professional manner.
The staff listened to me.
The staff was courteous.
The staff cared about what I had to say.
81. 4. Develop the Survey Materials Next Steps
Assemble the Questionnaire
Instructions
Questions
Demographics
Prepare Correspondence to Respondents
Announcement
Invitation
Reminders
Pretest the Survey

OK, now that we’ve had a taste of question writing, it’s time to move on. We need to
Assemble the questionnaire
Write instructions
Prepare correspondence to respondents
Make sure that the survey works.
82. 4. Develop the Survey Materials Assemble the Questionnaire
Sequence the fixed-response questions
Lead with a question that is likely to be non-threatening
Include a few questions that can be answered positively, especially if the overall news is not likely to be good
Sequence the material logically
Subheadings or lead-in statements help organize the material for respondents
The next section will ask you about . . .
Shorter is better!

Start them off easy.
It’s a good idea to lead with a non-threatening question.
Also, make sure you include questions that can be answered positively, especially if the news is not likely to be good. A good example is “I work hard on my job”
Sequence the material logically. Group questions by topic.
Put all similar questions together.
I recently reviewed a questionnaire about the Immigration and Customs inspection process. The author presented questions about the inspection facility and questions about the officer conducting the inspection in no particular order. Grouping questions about, say, inspectors together makes responding to the survey much easier.
You might want to begin the section with lead-in statement, or a header. Include any section-specific instructions.
Resist the temptation to add “just one more question.” Shorter really is better.
83. 4. Develop the Survey Materials Assemble the Questionnaire
Sequence the remaining content
Instructions
Provide general instructions for completing the survey
Provide any special instructions for sections and individual questions
Open-ended questions
If relevant to specific section, place at end of section
Place general questions at the end
Demographic questions
Place at end if information is not in your database
Thank You!
Always end by thanking the respondents
Instructions
Write any general instructions for completing the survey. Also, if your survey has multiple sections, prepare any specific instructions you might need.
Open-ended questions
Put section-specific open-ended questions at the end of the section.
General open-ended questions typically follow the last content section.
Demographics
If you need to ask demographic questions, place them at the end of the survey.
End by thanking respondents.
84. 4. Develop the Survey Materials Prepare Correspondence to Participants
The Announcement
The announcement should come from the highest possible level of management
The announcement should include:
The purpose of survey
The originator of the survey
Who will be asked to participate
How long it takes to complete the survey
When the survey will arrive
Assurance of confidentiality
How data will be used
Before you launch the survey, prepare all of the correspondence you will need. This includes the announcement, the invitation, and any follow-up messages.
The announcement should come from the highest possible level in the organization. Determine whether or not your customer wants to prepare this message personally. If so, you might want to provide a sample or draft.
The announcement should include:
The purpose of survey
The customer of the survey
Who will be asked to participate
How long it takes to complete the survey
When the survey will arrive
Assurance of confidentiality
How data will be used
85. 4. Develop the Survey Materials This is a sample of an announcement I received recently from the U.S. Office of Personnel Management. I’ve highlighted the relevant information in green.
I got this, but did not receive the invitation. Although I was not among the chosen, I was aware of the survey.
It’s especially important to let managers know that some of their employees will be given a survey.
Speaking of invitations . . .
86. 4. Develop the Survey Materials Prepare Correspondence to Participants
The Invitation
State survey purpose, originating organization, participants, participation requirements
Assurance of anonymity and confidentiality
How results will be used
Whether and how feedback will be provided
General instructions for completing survey
Open period for completing the survey
Point of Contact
Electronic link to survey

You’ll also want to prepare an invitation to participate. Important points to hit include:
State survey purpose, originating office, participants, participation requirements
Assurance of anonymity and confidentiality
Use of results.
Whether and how feedback will be provided
General instructions for completing survey
Open period for completing the survey
Point of Contact
Link to survey
87. 4. Develop the Survey Materials This is a sample invitation. We used a variation of this for a recent organizational survey we did at CBP.
Again, salient points are highlighted in green.
88. 4. Develop the Survey Materials
Finally, you will want to prepare any reminders or extension messages.
It’s a good idea to send out a reminder to people a few days before the survey is about to close.
Reminders really do help boost participation. I think OPM used about three reminders for its Federal Human Capital Survey in 2004. They were able to boost participation from the 20% range to over 50%, which is excellent. After 3 reminders, the incremental value seems to level off.
Also, consider extending your survey if your response rate is lagging.
This reminder is patterned after one written by one of my survey customers. He really wanted everything in his own words. He left the survey open a couple of weeks extra and was able to get a 37% response rate—also pretty good considering that his boss expected a 10% response rate.
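As an illustration of the timing discussed above, the following sketch spaces a capped number of reminders evenly across the open period, with the last one landing a few days before the close date. The function and its spacing rule are my own illustration, not a standard formula:

```python
from datetime import date, timedelta

def reminder_schedule(launch, close, n_reminders=3, buffer_days=3):
    # Space the reminders evenly across the open period, with the
    # last one landing a few days before the survey closes. Capping
    # at three reminders reflects the observation that incremental
    # value levels off after that.
    last = close - timedelta(days=buffer_days)
    step = (last - launch) / n_reminders
    return [launch + step * i for i in range(1, n_reminders + 1)]

# A survey launched Nov 3 and closing Nov 24 gets reminders on
# Nov 9, Nov 15, and Nov 21.
dates = reminder_schedule(date(2008, 11, 3), date(2008, 11, 24))
```

In practice you would also check the response rate before each reminder and, as noted above, consider extending the close date if it is lagging.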
89. 4. Develop the Survey Materials
This is the letter you don’t want to have to send. There was a glitch in a survey I received recently. I got this after completing the survey. I didn’t have the time to go back and re-do it, but one of my colleagues did. He got this letter twice.
Just a word of warning. Web surveys—even those done by professionals—are not foolproof.
90. 4. Develop the Survey Materials Web-based Surveys: Design Considerations
Format counts!
Is it readable?
Font size
Line spacing
Color (font and background)
Does the screen contain all pertinent information
The WHOLE question
ALL response options
Can the respondent see the response options when answering the question?
Pretest your format: Monitors differ!

Before we move on to pre-testing, I’d like to talk a little about design considerations for web surveys.
Monitors differ. Be sure to check out font size, line spacing, and font color on monitors that are typical in your organization.
Also navigate through the survey to make sure that the whole question appears on a single screen, and that all response options are visible for each question.
I responded to a survey recently that listed the response options as a header on the first screen. That was the last time they appeared. Not good.
91. 4. Develop the Survey Materials Section 508 Compatibility: Some Guidelines
If you include video or audio content, provide equivalent alternatives
Text equivalent for non-text information for the hearing impaired
Auditory presentation for the visually impaired
Provide alternative navigation for the motion impaired
If you use color to convey information, provide a black and white alternative
For tables, include clear row and column headers
http://www.section508.gov/

Those of us who work for the Federal Government or contract with the Federal Government must ensure that the survey and delivery platform are 508 compliant.
Section 508 applies to Federal agencies and organizations that are receiving federal funds or under contract with a Federal agency.
Section 508 deals with access to electronic and information technology, which includes any product used to acquire, store, manipulate, or transmit information.
Much of the guidance provided is geared toward the IT profession and provides explicit instructions on how to program applications.
I’ve tried to summarize the most relevant sections to those of us who are implementing a web-based survey. It goes a little further than the guidance I just provided. You need to attend to color, navigation controls, presentation of data, timing of presentation.
Ask your software vendor or contractor whether their application is 508 compliant. Recently we called Vovici, and they were, in fact, 508 compliant. Many web-based tools advertise themselves as 508 compliant.
92. 4. Develop the Survey Materials This is the message that OPM used in a recent survey that they administered.
93. 4. Develop the Survey Materials Web-based Surveys: Some Design Options
Exiting and re-entering the survey
Opting out of the survey
Mandatory response fields
Allowing more than one answer
Branching
Item-level
“If you are planning to leave the agency within the next year . . .”
Group-based
“If you supervise 3 or more employees . . .”
Sample-based
Randomly select respondents to answer a set of questions
When you put together your web-based survey, you will face some additional decisions that are not required for paper-and-pencil surveys.
Decide in advance:
Do you want the person to be able to exit and re-enter the survey, or can the survey be completed in one sitting? (Password, vs. default unfinished survey option)
Do you want to give the respondent the ability to exit the survey without viewing all of the questions?
Do you want to force respondents to respond to each question before proceeding to the next?
Each of these decisions has consequences which must be weighed.
Also, if you want to include branching questions, you should make sure that they are programmed correctly.
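The three branching types above (item-level, group-based, sample-based) can be thought of as routing rules that map the current question and the respondent’s answers to the next question. A minimal sketch, with entirely hypothetical question IDs; real survey packages implement this through their own branching configuration:

```python
import random

def next_question(current_id, answers, supervises_three_or_more):
    # Item-level branch: the follow-up depends on the answer just given.
    if current_id == "intent_to_leave":
        if answers.get("intent_to_leave") == "Yes":
            return "reasons_for_leaving"
        return "job_satisfaction"
    # Group-based branch: supervisor module only for respondents who
    # supervise 3 or more employees.
    if current_id == "job_satisfaction":
        if supervises_three_or_more:
            return "supervisor_module"
        return "demographics"
    # Sample-based branch: randomly route half of respondents to an
    # extra experimental section.
    if current_id == "supervisor_module":
        if random.random() < 0.5:
            return "experimental_items"
        return "demographics"
    return "end"
```

Writing the logic out this way makes it easier to pretest every path, which matters because a mis-programmed branch silently loses data.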
94. 4. Develop the Survey Materials Web-based Surveys: Delivery Considerations
Be aware of limitations imposed by your equipment, browser, or transmission capability
Complex designs require more computer memory and take longer to transmit
Some programming languages (e.g., Java) cannot be accessed by all computers
Some browsers will not support the technology
Make arrangements for respondents who do not have easy access to the internet or intranet
Phone
Paper-and-pencil

Also think about limitations that may be imposed by your organization’s equipment, browser, and transmission capability.
If you include fancy graphics, it may take longer to transmit the questionnaire, which could cut down on the response rate.
Also, some programming languages cannot be accessed by all computers
Some browsers may be too old to support the technology.
Finally, if you work in an organization where respondents may not have easy access to the internet or to a computer, make arrangements for them to respond by telephone or to provide a hard copy.
In a recent CBP survey, we provided telephonic access because so many of our people work in the field, or at work stations without computers. Out of the 2100 respondents, 125 elected to respond by telephone.
95. 4. Develop the Survey Materials Pretest the Survey
Content
Sensitivity
Reading level
Form
Layout
Attractiveness
Function
Access
Navigation
Technology

Before you go live, it’s a good idea to pretest the survey. The pretest does not have to be elaborate.
Ask people who are similar to those in your sample to complete the survey.
Ask these people about the sensitivity of the content, the appropriateness of the reading level, and monitor the time needed to complete the survey.
Packaging is important. Does it look good on the page?
Also, people are less likely to complete a survey if it presents obstacles.
If you are administering via the internet, you also want to make sure that the technology works--and that the target population is able to make it work.
Can they get in, get through, and get out?
96. 4. Develop the Survey Materials People will be more likely to respond if the survey is:
Meaningful
Survey addresses important issues
Respondents stand to benefit from results
Anonymous
Confidential
Minimally intrusive
Attractive
Easy to navigate and complete

Remember, respondent motivation is critical to an acceptable response rate. You want to do everything in your power to
Make sure that the survey is meaningful
Assure anonymity and confidentiality
Be minimally intrusive
You also want to make sure that the survey has an attractive, professional look, and that it is easy to complete.
97. 5. Conduct the Survey Publicize the Survey
Announce the survey with an e-mail message
Communicate throughout the chain of command
Management meetings
Staff meetings or musters
Place announcements on electronic bulletin boards
Distribute posters
Do not underestimate the importance of publicizing a survey.
In addition to an announcement from top management, consider how you might communicate through the chain of command.
For example, encourage announcements in management meetings and staff meetings or musters.
Recent SES survey—it’s amazing what support from the #2 in the agency will do for the response rate: 66%.
OPM distributed posters for the most recent Human Capital Survey. We also got an e-mail from the Commissioner on its importance.
It’s also good to publicize the survey on your organization’s internal intranet and in any publications your organization may have.
98. 5. Conduct the Survey Launch the Survey
Coordinate the launch through appropriate channels
Push the button!
Follow Up
Reminders improve response rate!
The Web makes follow-up easier
Extend the survey if necessary
Once you’ve publicized the survey, you’re ready to push the button.
In CBP, our Office of Information Technology is responsible for distributing the survey. We can only send out “mass mailings” when the Commissioner is not sending one out. The Commissioner doesn’t like more than 3 of these going out in any given week.
Think about distribution of reminders. At a minimum, you’ll want one to go out 1 week before the survey period is due to end.
Plan to extend if your response rate is not high enough to give you an acceptable margin of error.
OPM did some research. I think they decided that 2-3 reminders were optimal. After that they received diminishing returns.
99. 6. Analyze the Responses Coordinate Analyses with Your Customer
When is the analysis needed (time frame)?
Which questions are most critical to analyze?
What breakouts and cross tabulations are needed?
Satisfaction with service quality
Supervisors vs. non supervisors
Headquarters vs. field offices
Who will be preparing recommendations?
You
Your customer
Working group/task force
We try to meet with our customer just before the data comes in to confirm what analyses are needed and when they are needed.
Time Frame. Time frames differ—the higher the level, the shorter the timeframe.
The Assistant Commissioner wanted preliminary results for the organizational survey immediately.
We had more wiggle room with the results for the customer service survey.
Focus. Some customers know exactly what analyses they want and ask for them. Others want more guidance from you.
CPBO vs. CBPAS (occupational comparisons)
SES hopefuls vs. happy where I am
Headquarters vs. Field.
Which customer services are most tied to overall satisfaction? A series of cross-tabs showed that written products were most important.
Plan to provide analyses and interpretation—Your opinion will be requested.
Level of involvement depends on the project. For the organizational survey, we provided analyses, interpretation and some preliminary recommendations. We also prepared materials that our customer used to brief executives and field managers. We then conducted focus groups in which they provided recommendations.
We worked with our Customer Service contact and helped him prepare briefing materials, but he took charge of preparing recommendations.
100. 6. Analyze the Responses Respect the limits of your data—Keep it simple!
Most survey data relies on nominal or ordinal scales
Complicated statistical techniques are inappropriate
Frequencies are both sufficient and appropriate
Report the % positive, % neutral, and % negative
Decide how you will handle rounding error
Add a note if the total % doesn’t add to 100
Don’t average things that should not be averaged
“Don’t know” and “Not Applicable” are qualitatively different from “Satisfied” or “Dissatisfied”
What is most important when you are analyzing survey data is to keep it simple. Most likely, the data you collected relied on a nominal or ordinal scale.
Nominal scales merely identify the response. The response categories cannot be ranked.
Surveys mostly use ordinal scales. The categories can be ranked.
But because the points on an ordinal scale are not equidistant, complicated statistical techniques are inappropriate. You need an interval scale for that.
For most purposes, frequencies are sufficient.
You’ll want to decide in advance how to handle rounding error. You don’t have to force numbers to add to 100%, but it’s a good idea to add a note if they don’t.
Another common mistake in data analysis is averaging things that should not be averaged.
Scale points. Perhaps you assigned the scale point “3” to neither agree nor disagree, or “6” to don’t know. These responses should not be included with the “disagree to agree” continuum. They are qualitatively different.
Differing N’s. Someone is going to ask you to average percentages that are based on different N’s. This doesn’t work either.
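As a sketch of this kind of frequency reporting, the snippet below tabulates % positive, % neutral, and % negative while setting “Don’t know” responses aside, and flags rounding error rather than forcing the total to 100%. All response counts and category labels here are invented for illustration.

```python
from collections import Counter

# Hypothetical responses; the counts are invented for illustration.
responses = (["Strongly agree"] * 141 + ["Agree"] * 312 + ["Neutral"] * 97
             + ["Disagree"] * 160 + ["Strongly disagree"] * 58
             + ["Don't know"] * 32)

POSITIVE = {"Strongly agree", "Agree"}
NEUTRAL = {"Neutral"}
NEGATIVE = {"Disagree", "Strongly disagree"}

def percent_breakdown(responses):
    """% positive/neutral/negative, excluding "Don't know"/"N/A" responses,
    which are qualitatively different and should not be averaged in."""
    counts = Counter(responses)
    scored = sum(counts[k] for k in POSITIVE | NEUTRAL | NEGATIVE)
    pct = {
        "positive": round(100 * sum(counts[k] for k in POSITIVE) / scored, 1),
        "neutral":  round(100 * sum(counts[k] for k in NEUTRAL) / scored, 1),
        "negative": round(100 * sum(counts[k] for k in NEGATIVE) / scored, 1),
    }
    # Don't force the numbers to add to 100%; add a note instead.
    total = round(pct["positive"] + pct["neutral"] + pct["negative"], 1)
    note = "" if total == 100.0 else "Percentages do not sum to 100% due to rounding."
    return pct, note
```

The 32 “Don’t know” answers are reported separately, not folded into the agree-disagree percentages.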
101. 6. Analyze the Responses Conduct subgroup analyses carefully!
Watch the margin of error
Smaller subgroups have larger margin of error
Weight total responses for unequal subgroup contribution
Large subgroups can obscure results
Small dissimilar groups can bias results
Protect respondents’ confidentiality
Aggregate data to protect confidentiality
Small subgroups (fewer than 10) risk respondent confidentiality
Some other things to keep in mind. . .
Be very careful of response rates and subgroup sizes. The smaller the subgroup, the more room for error. I’ll show you a few examples in a minute.
Also, examine the contribution each subgroup makes to the total response.
It is a good practice to weight the total response for unequal subgroup contribution. In our organizational survey, our smallest group of respondents was the most extreme in their responses. If we didn’t weight their responses, it would have given an inaccurate picture of the overall results.
Finally, make sure that you aggregate data to protect confidentiality.
I was doing some organizational development work a while ago. I was working with groups that ranged from about 10-25. One of the managers in the smaller groups wanted me to break out the responses for the clerical staff--all 2 of them. If I had done that, it wouldn’t have taken much to figure out how they responded to each question.
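The weighting idea can be illustrated with a small sketch. The subgroup sizes and percentages below are hypothetical; the point is that weighting each subgroup by its share of the population keeps an over-represented (or extreme) group from distorting the overall figure.

```python
# Hypothetical subgroups: respondents (n), population size (pop), % positive.
# Headquarters is heavily over-represented among respondents.
subgroups = {
    "Headquarters": {"n": 400, "pop": 500,  "pct_pos": 72.0},
    "Field":        {"n": 300, "pop": 4500, "pct_pos": 48.0},
}

def unweighted_total(groups):
    """Naive total: every respondent counts equally."""
    n = sum(g["n"] for g in groups.values())
    return sum(g["pct_pos"] * g["n"] for g in groups.values()) / n

def weighted_total(groups):
    """Total weighted by each subgroup's share of the population."""
    pop = sum(g["pop"] for g in groups.values())
    return sum(g["pct_pos"] * g["pop"] for g in groups.values()) / pop
```

With these numbers the naive figure (about 62% positive) paints a much rosier picture than the population-weighted one (about 50%), because headquarters answered at a far higher rate than the field.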
102. Analyze the Responses Analyzing Open-Ended Questions
Develop a coding scheme
Develop content categories or themes for grouping responses
Have 2 people work independently on the first draft
Test the categories by coding a sample of responses
Check inter-rater agreement
On categories
On coded responses
Apply the coding scheme to remaining responses
If you included open-ended questions, you are going to want to content-analyze them.
How many of you have done this before?
One way to do this is to sort a sample of the responses into content groupings.
This is a lot like establishing duty categories and sorting tasks by duty.
It’s better if you can have 2 people do this independently and then reach agreement on the categories.
Once the categories have been established, if possible, get others to see if they are able to apply the categories to an independent set of responses. Check your inter-rater agreement.
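The slides don’t name a specific agreement statistic, so treat this as one reasonable option: Cohen’s kappa corrects raw agreement between two coders for the agreement you’d expect by chance. The category labels and codings below are made up for illustration.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Agreement between two coders, corrected for chance agreement."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement from each coder's marginal category proportions.
    ca, cb = Counter(codes_a), Counter(codes_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical: two coders independently sort four comments into categories.
coder_1 = ["pay", "pay", "training", "training"]
coder_2 = ["pay", "training", "training", "training"]
```

Values near 1 mean the coding scheme is being applied consistently; low values suggest the categories need to be renegotiated before coding the full set.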
103. Analyze the Responses Analyzing Open-Ended Questions
Summarize findings
By % of total responses
By % of respondents
Plan for the final report/presentation
Don’t get swayed by “the evidence”
Example:
2000 people responded,
500 provided written comments.
100 gave similar responses for content category “X”
Is this (5% of respondents) really meaningful?
Select salient quotes to illustrate survey findings
Once you’ve gotten everything sorted, you’ll need to summarize the responses.
You will find that open-ended responses are not neat--some respondents will write a page or more. Each paragraph may be coded into a different content category, so one answer could carry 4 or 5 code marks. You have 2 choices: count the total number of comments, or count the number of people who make the comment. Both are accepted practice.
We opted for the number of people making a particular comment, and reported the percentage of respondents who made it.
We also read all of the comments.
Let’s say 2000 people responded, 500 provided comments, and 100 were negative comments for category “x.” After reading all of these comments, it is so easy to be swayed by the so-called evidence! But you have to step back and ask yourself whether these comments—in reality made by 5% of the respondents—are really meaningful.
As you are reading through the comments, keep your data tables handy so that you can earmark salient comments. Direct quotes are a good way to illustrate your major findings.
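The two counting choices, and the “don’t get swayed” denominator check, can be sketched like this. All respondent IDs, categories, and counts are invented; the point is that the same 3 mentions read very differently as a share of coded comments, of commenters, and of all respondents.

```python
# Hypothetical coded comments: respondent id -> content categories mentioned.
# One long answer can be coded into several categories.
coded = {
    101: {"promotions", "workload"},
    102: {"promotions"},
    103: {"workload"},
    104: {"promotions", "training"},
}
total_respondents = 2000  # everyone who completed the survey

def summarize(category):
    mentions = sum(category in cats for cats in coded.values())
    total_codes = sum(len(cats) for cats in coded.values())
    return {
        "pct_of_comments": round(100 * mentions / total_codes, 1),
        "pct_of_commenters": round(100 * mentions / len(coded), 1),
        "pct_of_all_respondents": round(100 * mentions / total_respondents, 2),
    }
```

Here “promotions” is half of all coded comments but was raised by only 3 of 2,000 respondents; reporting both figures keeps “the evidence” in perspective.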
104. 7. Report the Results Do not share results until your customer releases them
Present the results only to the agreed upon audiences
Who gets the results? And in how much detail?
Executives
Managers
Employees
The Union
Who absolutely does not get the results?
Keep the results secure
Shred the drafts—they are more sensitive than the questions
Things you never want to hear
The Commissioner hasn’t seen the results yet. How did the Union get them?
How did this wind up on the “front fold” of the Washington Post?
Your next step will be to present the results to the agreed-upon audiences at an appropriate level of detail.
Do not share results until your customer releases them. If the results are particularly sensitive, you may want to work with your organization’s public relations office. You will get requests for results from a variety of sources.
Be very careful about what you release and to whom.
The last thing you want to happen is for your results to be spun in an inappropriate manner on the front fold of your local paper—which in our case is the Washington Post!
Finally, keep the results secure. We were diligent about shredding drafts and locking up analyses and reports.
105. 7. Report the Results Who will present results?
Executives
Unit managers/supervisors
HR staff
How will results be presented?
Executive Briefings
Supervisor/Manager coaching and counseling sessions
Action planning sessions for managers
Problem solving sessions for workgroups
All hands meetings for employees
Town Hall meetings for the community
Written report with baseline data for improving services
You can save yourself a lot of work by discussing reporting requirements in advance.
Your customer will have definite ideas about
Who will be presenting the results
Where they are likely to be presented
And what materials you will be responsible for.
106. 7. Report the Results Tailor your presentation to your audience
General rules of thumb
Keep details to a minimum
Make materials self-explanatory
Support text with graphics
Find out what works best for your audience
Bullet points
Bar charts
Tables with highlighting on key findings
An Excel spread sheet
In general, it’s best to keep details to a minimum.
Also, the materials should require little or no explanation. You know that someone will pick the briefing package up next week—and you want the materials to make sense.
Also, it’s good to provide both words and pictures.
Finally, It’s important to know your audience.
One high-level manager didn’t read. Everything had to be a bullet—one line or less.
Another high-level manager was very quantitative—In addition to the briefing materials we prepared, she wanted data tables and raw responses to open-ended questions.
But for most folks, you will be safe with bullet points supported liberally by graphics.
107. 7. Report the Results Be responsive
Report results as quickly as possible
Be aware of your customer’s external commitments
Be prepared to provide extra copies of reports
And of course, everyone wanted the results yesterday. We did prepare preliminary findings within a few days of the end of the survey.
It is a good idea to keep a copy of any reports and briefings you produce.
You will get questions about a particular briefing package you prepared.
Also someone who is higher on the food chain than you are will lose their copy and want yours.
Now, let’s look at some of the ways in which you can present findings.
108. 7. Report the Results
% Positive | Key Concerns
27.40% | 31. In my work unit, differences in performance are recognized in a meaningful way.
29.40% | 24. In my work unit, steps are taken to deal with a poor performer who cannot or will not improve.
31.50% | 32. In my work unit, personnel decisions are based on merit.
33.80% | 23. Promotions in my work unit are based on merit.
35.30% | 63. How satisfied are you with your opportunity to get a better job in your organization?
Would you use this format? The answer is “It depends on the customer’s needs.”
Level of detail is good
Findings are presented attractively
If each question represents a key concern, this may be enough information.
Unless the basis for comparison is understood (other years, other groups), the findings are hard to interpret
I would want to know, “What about the other 2/3 of the respondents.”
Or—how are we doing compared to other groups, or past years.Would you use this format? The answer is “It depends on the customer’s needs.”
Level of detail is good
Findings are presented attractively
If each question represents a key concern, this may be enough information.
Unless the basis for comparison is understood (other years, other groups), the findings are hard to interpret
I would want to know, “What about the other 2/3 of the respondents.”
Or—how are we doing compared to other groups, or past years.
109. 7. Report the Results This chart gives a more complete picture of how the group responded. Depending on your customer’s concerns, you may want to show how two groups compared (e.g., DHS vs. CBP, or two subgroups within the agency.)
As you can see, there are at least as many dissatisfied people and a fair number who may not be unhappy, but they are not happy either.
110. 7. Report the Results
This chart was prepared for a report on the results of the 2006 Federal Human Capital Survey.
Again, you see only the % positive. However, these results are included as part of a trend analysis including the two previous surveys.
I think that this tells a much more complete story. It might give too much detail for some people.
OPM also rank ordered the questions in terms of % positive. This also provides a context. It enables you to compare across questions.
111. 7. Report the Results Useful Euphemisms--Making your point without numbers
Overwhelming majority (more than 80%)
Widespread agreement (75-80%)
A large percentage (70%)
Significant or meaningful majority (About 60%)
More respondents than not (52%)
Nearly half (45-49%)
Some, but not the majority (35-40%)
Only a handful (10-15%)
Our Division Director is a real pro in communicating survey findings.
She emphasized how quickly managers glaze over when you present them with lots of numbers. For this particular audience, it was more important to verbally emphasize key findings.
Here are some of the phrases we used when we put together our bullet points.
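If you generate a lot of bullet points, phrase bands like these can be wired into reporting boilerplate. Note the assumptions: the slide leaves gaps between its stated ranges, so the band edges below are my own interpolation, and anything outside a band falls back to a plain numeric statement.

```python
# Phrase bands adapted from the slide. Edges between the slide's stated
# ranges are interpolated (an assumption); unmatched values fall through
# to a plain "about X%" statement.
BANDS = [
    (80, 101, "an overwhelming majority"),
    (75, 80, "widespread agreement"),
    (65, 75, "a large percentage"),
    (55, 65, "a significant majority"),
    (50, 55, "more respondents than not"),
    (45, 50, "nearly half"),
    (30, 45, "some, but not the majority"),
    (10, 16, "only a handful"),
]

def phrase(pct):
    """Verbal description for a percentage, for use in bullet points."""
    for lo, hi, text in BANDS:
        if lo <= pct < hi:
            return text
    return f"about {pct:.0f}%"
```

So `phrase(82)` reads “an overwhelming majority” while an awkward in-between figure is simply stated as a number.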
112. 7. Report the Results Explaining the Data
First, consider the baseline in your organization
Then consider these “rules of thumb”
Over 65% positive is good
Less than 35% positive is a red flag
More than 35% negative is a red flag
If you have 20% or more neutral responses, this is significant and worth reporting
If your margin of error is reasonable (2-3%), a 10% difference between groups is worth mentioning
Of course, your customer is going to ask, “What does this all mean?” I’ve found that it’s useful to talk about “red flags.” This strategy also works well with bar charts that show % positive in green, % neutral in yellow, and % negative in red, like the one I showed you earlier.
In organizational attitude surveys, if about 2/3 of your people are positive—this is good.
If more than 1/3 are negative, you have a ‘red flag’ issue.
Less than 1/3 positive responses is a really serious ‘red flag’
It’s also worth noting if you have more than 20% who are neutral.
Finally, if your margin of error is reasonable (in the 2-3% range) a difference of at least 10% between groups is worth mentioning. Say for example, if 55% of supervisors responded positively, but only 43% of non-supervisors responded positively.
Finally, you need to consider what is reasonable in the context of your organization.
At CBP, we’ve just undergone a major merger, and for some topics we considered a 50% positive rating pretty good.
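The rules of thumb above are mechanical enough to sketch in code. The threshold values are the slide’s; the function names and note wording are mine, and the organizational-baseline caveat still applies on top of whatever this returns.

```python
def flag_question(pct_pos, pct_neu, pct_neg):
    """Apply the slide's rules of thumb to one question's percentages."""
    notes = []
    if pct_pos > 65:
        notes.append("good")
    if pct_pos < 35:
        notes.append("red flag: low positive")
    if pct_neg > 35:
        notes.append("red flag: high negative")
    if pct_neu >= 20:
        notes.append("neutral block worth reporting")
    return notes

def group_gap_worth_noting(pct_a, pct_b, margin_of_error):
    """A 10-point gap between groups is worth mentioning when the
    margin of error is reasonable (2-3%)."""
    return margin_of_error <= 3 and abs(pct_a - pct_b) >= 10
```

For example, 55% positive among supervisors vs. 43% among non-supervisors, with a 2.5% margin of error, would be worth a bullet point.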
113. 7. Report the Results In presenting findings, strategy is important, especially if the news is mostly bad. It’s good to start with something positive.
This is an example of a question used in the 2004 Federal Human Capital Survey. We also used it in our organizational survey. It’s a good one to include because most people, as you can see, tend to believe that the work they do is important.
Needless to say, this is not a red flag issue.
# of Respondents
Government wide 147,893
Non-Supervisors 74,546
Team Leaders 20,112
Supervisors 28,516
Managers 17,117
Executives 3,906
114. 7. Report the Results Here is another question from the Federal Human Capital Survey. I want to use it to illustrate differences between groups.
This question deals with promotions. Looks like a red flag issue to me, at least for the worker bees.
There are also some pretty dramatic group differences.
It’s interesting how the higher up people go, the fairer they see promotions.
# of Respondents
Governmentwide 147,893
Non-Supervisors 74,546
Team Leaders 20,112
Supervisors 28,516
Managers 17,117
Executives 3,906
115. 7. Report the Results This is data from a recent customer satisfaction survey. The data are real, but the organizational names have been changed.
It shows the % of people in each subgroup who are satisfied with the service they received.
Based on this chart, where would you devote your efforts to improve service? Programs & Planning and Public Relations, maybe?
But, wait . . .This is data from a recent customer satisfaction survey. The data are real, but the organizational names have been changed.
It shows % of people in each subgroup who are satisfied with customer satisfaction.
Based on this chart—where would you devote your efforts to improve service. Programs & Planning and Public Relations maybe?
But, wait . . .
116. 7. Report the Results The picture changes a bit when you look at the raw numbers.
Only a handful of people are disgruntled in HQ. Field Unit 3 probably should get the most attention based on the numbers.
117. 7. Report the Results You can see where he needs to focus his attention more clearly from this table. HR’s big red blotch on the first chart was actually one person who was not satisfied.
There are times when the data table really is the best thing to present, provided you are there to explain it.
118. 7. Report the Results When you consider the margin of error, or how much wobble there is in the subgroup results, you realize that the most important “red flag” really is Field Unit 3.
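That “wobble” can be quantified with the standard margin-of-error formula for a proportion. This is a sketch, not the presenter’s exact method: p = 0.5 gives the worst case, z = 1.96 corresponds to 95% confidence, and the finite-population correction matters when you’ve surveyed most of a small unit.

```python
import math

def margin_of_error(n, population=None, p=0.5, z=1.96):
    """Approximate 95% margin of error, in percentage points, for a
    proportion estimated from n respondents. p=0.5 is the worst case."""
    se = math.sqrt(p * (1 - p) / n)
    if population is not None:
        # Finite-population correction: sampling most of a small unit
        # shrinks the wobble.
        se *= math.sqrt((population - n) / (population - 1))
    return 100 * z * se
```

A 25-person field unit has roughly a 20-point margin of error, so an apparent “red flag” there may be noise, while a 1,000-respondent total is good to about 3 points.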
119. 7. Report the Results Next Steps:
Act on the Results
Briefings for respondents
Planning meetings
Focus groups
Interventions
Document the Findings
Final report to the customer
Technical reports*
Professional presentations*
*If permitted by the customer
Over the past few hours, we’ve gone from survey conception, design, and administration to the analysis and presentation of results.
A final comment, commitment to action on the part of your customer is critical.
Without this commitment, the survey process is likely to boomerang.
Last year I did two surveys that are good examples of how customers can follow through on their commitments to use the results. In both cases, an organizational assessment survey, and a customer satisfaction survey, our customers were committed to take action based on the survey results.
Org survey—held focus groups at a nationwide managers conference and got managers to share successes, recommend changes.
Cust. Sat—used the survey to demonstrate how much more could be done with extra staff. Got 6 additional positions.
This commitment to action could be the most critical factor in judging the success of a survey!
We have some time for a few questions.
120. Questions? References on Page 10 of your handout.
121. Contact Information Ilene Gast
Senior Personnel Research Psychologist
Personnel Research and Assessment Division
US Customs and Border Protection
1400 L Street, NW, Rm. 714
Washington, DC 20005
ilene.gast@dhs.gov
202-863-6291