Qualitative Research Dr Ian Jones Centre for Event and Sport Research Bournemouth University
Today – NOT an outline of methods, but an outline of underlying considerations…
1. A few common myths/misconceptions
2. Some challenges facing the qualitative researcher
3. Some possible issues in fieldwork
Qualitative research is soft, unscientific and atheoretical. True… if done badly (as with all forms of research).
You can’t have a hypothesis with qualitative research. False – a hypothesis can be a useful tool to guide you, e.g. Cressey’s (1953) study of fraud:
• Hypothesis 1 – people felt it was a technical offence. Rejected with initial data collection
• Hypothesis 2 – people undertook the behaviour when they felt other means were unavailable. Rejected with further data collection
• Hypothesis 3 – people undertook the behaviour when they felt other means were unavailable and the problem was non-shareable
You should have an ‘audit trail’. True – explain through an “audit trail” (Maykut & Morehouse 1994) aspects such as:
• theoretical decisions/choices
• practical contingencies
This renders your research transparent, allowing others to:
• critique your research
• emulate your research
The researcher should be part of the research process. True – part of the ‘audit trail’. Explaining your position clarifies to others your choices, analyses, interpretations, etc. Reflexivity is a key element of any qualitative write-up.
You should be flexible with your choice of methods. True – don’t restrict yourself to one method; be a “bricoleur”:
• Interviews
• Observation
• Autoethnography
• Content analysis
Let your methods emerge!
Analysis should commence as soon as data collection starts. True – qualitative research should be emergent; early analysis will allow refinement of research questions/hypotheses.
You should use computer software to analyse your data. False AND true – each approach has strengths and weaknesses, depending on purpose, e.g.:
• Analysis software = objective, good for large data sets, reliable
• Manual analysis = feel for the data, easier to identify idiosyncrasies
Krane et al. (1997: 215) note, with regard to computer versus manual analysis: “none of these procedures directly affects the value of the study; they are merely ways for the inquirers to work with their data... If individuals use NUD*IST or Hyperqual computer programs, or 3 x 5 cards and paste them to the wall, they are really doing the same thing conceptually.”
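To make the “good for large data sets” point concrete, here is a minimal sketch of the kind of clerical work analysis software takes over: tallying how often each code appears across coded transcript segments. The segments and code names below are invented for illustration, and no particular package (NUD*IST, Hyperqual, etc.) is being reproduced – as Krane et al. note, index cards and a program are doing the same thing conceptually.

```python
# Sketch: counting code frequencies across coded transcript segments.
# The segments and code labels below are hypothetical, for illustration only.
from collections import Counter

# Each transcript segment carries the codes a researcher assigned to it.
coded_segments = [
    {"text": "I only fly budget airlines for short trips", "codes": ["price_sensitivity"]},
    {"text": "The delays put me off completely", "codes": ["service_quality", "negative_experience"]},
    {"text": "It felt like good value overall", "codes": ["price_sensitivity", "positive_experience"]},
]

# Tally how often each code was applied across the whole data set.
code_counts = Counter(code for segment in coded_segments for code in segment["codes"])

for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```

The point of the sketch is scale, not sophistication: with three segments you could do this on index cards; with three hundred interviews the software simply keeps the bookkeeping reliable.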
Numbers aren’t important. True… “placing a frequency count after a category of experiences is tantamount to saying how important it is; thus value is derived by number. In many cases, rare experiences are no less meaningful, useful, or important than common ones. In some cases, the rare experience may be the most enlightening one” (Krane et al. 1997, p. 214).
Your analysis should be a lone endeavour. False – e.g. ask a fellow researcher to code the data and compare findings. This will identify problems in coding and help ensure a valid set of codes. Check reliability by comparing your coding with others’. Miles & Huberman (1994) suggest the following:
Reliability = number of agreements / (number of agreements + disagreements)
You may begin with a low score (e.g. 60%), but with continual discussion and clarification you should achieve a score of up to 90% (if not higher). Also consider use of a ‘devil’s advocate’.
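A minimal sketch of how that agreement figure can be computed, following the Miles & Huberman formula above. The code labels and the two coders’ assignments are hypothetical, invented purely for illustration.

```python
# Intercoder agreement per Miles & Huberman (1994):
# reliability = agreements / (agreements + disagreements)
# The codings below are hypothetical, for illustration only.

def intercoder_agreement(coder_a, coder_b):
    """Proportion of segments to which both coders assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must have coded the same segments")
    agreements = sum(a == b for a, b in zip(coder_a, coder_b))
    return agreements / len(coder_a)

# Ten transcript segments, coded independently by two researchers.
coder_a = ["barrier", "motive", "motive", "identity", "barrier",
           "motive", "identity", "barrier", "motive", "identity"]
coder_b = ["barrier", "motive", "identity", "identity", "barrier",
           "motive", "identity", "motive", "motive", "identity"]

print(f"Agreement: {intercoder_agreement(coder_a, coder_b):.0%}")
```

With the example codings above the score comes out at 80% – the sort of starting point that discussion and clarification of the code definitions would aim to push towards 90% or higher.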
Qualitative research must be generalisable. False – many qualitative researchers are not even concerned with generalising; the focus is often upon generating rich descriptions of the phenomena.
Challenges to the qualitative researcher (Gummesson 1991)
1. Access to ‘reality’
• availability of the detailed ‘rich’ data required
• characteristics of the researcher
2. Pre-understanding and understanding What is your understanding of the topic before data collection? How does this influence your understanding developed during data collection?
3. Ensuring quality? • Reliability • Validity • Plausibility/Authenticity • Credibility • Relevance • Transparency
Ask yourself a number of questions to assist the analysis (Frankfort-Nachmias & Nachmias 1996):
• What type of behaviour is being demonstrated?
• What is its structure?
• How frequent is it?
• What are its causes?
• What are its processes?
• What are its consequences?
• What are people's strategies for dealing with the behaviour?
Potential errors. Some errors are more ‘accidental’:
• Selection bias
• Measurement bias
• Confirmation bias
Hartman et al. (2002) identified 64 sources of bias. You need to be aware of the range, e.g.…
Selection bias. You can easily get the results you want by biasing your sample. Example: attitudes towards low-cost airlines. If you want a positive response, ask those waiting for an EasyJet flight. What about a negative response?
Measurement Bias http://uk.youtube.com/watch?v=2yhN1IDLQjo
Positivity effect
• Was tourism better in the past?
• Were tour operators more knowledgeable?
As time progresses, our memories are distorted in a positive direction (the positivity effect), so we don’t tend to remember the negatives… This impacts upon any question that requires recall.
Lake Wobegon effect. “How well do you get along with others?” Almost ALL respondents report that they get along with others much better than average. Named after the fictional town where “all the women are strong, all the men are good-looking, and all the children are above average” – this is often how people perceive themselves!
You are (or so you believe):
• more sociable
• more popular
• more intelligent
• better at getting on with others
This may relate to aspects of your research question.
Researcher-led bias. The researcher can also influence behaviour through their (often unconscious) actions.
“Clever Hans” – a horse who appeared to answer simple sums set by his owner, hence apparently:
• understanding language
• understanding mathematical concepts
In fact, Hans was responding to unconscious cues from his questioners. This shows the importance of non-verbal influence in:
• interviews
• focus groups
Confirmation bias. We can easily select data that supports our own point of view, and just as easily reject data that goes against it. Egocentric thinking – especially with sport: “if I do/think/act this way, then everyone else does as well”.
There are MANY other sources of bias. So… think about all sources of potential bias before and during fieldwork and analysis.
Finally… What do examiners look for?
• How were the setting and the subjects selected?
• What was the researcher's perspective, and has this been taken into account?
• What methods did the researcher use for collecting data, and are these described in detail?
• Were the data appropriately and systematically analysed? Is there discussion of how themes were derived?
• What conclusions were drawn, and are they justified by the results?
• Is context presented?
• Are the quotes representative or exemplary?
• Have alternative interpretations been considered?
• Is a clear distinction made between data and interpretation?