Operationalization, Validity, & Reliability Soc 302 Spring 2009
Review • Conceptualization: what is meant by this term? (a specific definition) • Operationalization: how is it measured? (the specific way it is observed)
Sometimes thought of as translation • Translating a conceptual definition into a functional construct (operationalization)
Prevalence of substance abuse study • Employing the DSM-IV definition yields a significantly different operationalization than using respondents' personal definitions of substance abuse, even if a survey method is used for both • DSM-IV: • Have you ever been expelled from school or fired from a job because of using drugs or alcohol? • Has your drug and/or alcohol use caused arguments between you and your family/friends? • Respondent definition: • How would you define substance abuse? • What behaviors do you associate with substance abuse? • Do you now, or have you ever had, patterns of behavior that fit this definition?
Important components to consider • Detail • Variation range • Number of dimensions
Variation range • How wide does the net need to be? • If educational attainment is one variable: are four categories enough (didn't finish high school, graduated high school, some college, graduated college), with AS, BA/BS, MA, MD, JD, and PhD all treated as one category?
Detail • How fine does your measurement need to be? • If household size is one of your variables… • Can you get away with a) exactly one, b) 2-5, c) 6 or more? • Or do you need to ask exactly how many household members there are? (A minimal coding sketch follows below.)
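A minimal sketch of this detail decision in Python; the function name code_household_size and the idea of first collecting an exact count are hypothetical illustrations, while the three coarse categories come from the slide above:

def code_household_size(n_members):
    # Collapse an exact household count into the three coarse categories from the slide
    # (assumes every household has at least one member)
    if n_members == 1:
        return "exactly one"
    elif n_members <= 5:
        return "2-5"
    else:
        return "6 or more"

print(code_household_size(4))   # -> "2-5"

Whether the coarse categories suffice depends on the research question; collecting the exact count keeps both options open, since detail can be collapsed later but not recovered.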
Number of dimensions • Are you focusing on one component of a concept or taking a global perspective? • e.g., the impact of church attendance (one aspect of religiosity) on marital longevity
Solid conceptualization and operationalization • Necessary for good reliability and validity • Reliability: will your measures generate similar results again? • Validity: are your procedures really measuring what you think they are measuring?
Not separate issues • Reliability is a necessary but not sufficient condition for validity • Reliable measures are needed for valid measures, but reliability does not automatically ensure validity
Types of reliability • Inter-rater reliability • Do two (or more) researchers see the same thing? • Used frequently in qualitative research • Our recent group observations in the Student Center employed inter-rater reliability (a minimal agreement check is sketched below) • Test-retest reliability • Does a repeat study generate similar results? • Results do not have to be identical because of variations in population, sample, etc. • Used in qualitative and quantitative research
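A minimal sketch of an inter-rater check in Python, assuming two observers coded the same set of Student Center observations; the slide does not name a specific statistic, so simple percent agreement is used here for illustration, and all codes are hypothetical:

def percent_agreement(rater_a, rater_b):
    # Share of observations that two raters coded identically
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical codes from two observers watching the same interactions
rater_a = ["greeting", "studying", "eating", "studying", "greeting", "eating"]
rater_b = ["greeting", "studying", "eating", "phone", "greeting", "eating"]
print(percent_agreement(rater_a, rater_b))   # 5/6, about 0.83 agreement

Low agreement would suggest the observation guidelines need clearer coding rules before the measures can be treated as reliable.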
Internal reliability/consistency • How reliable are measures within one project? • Used frequently for assessing the reliability of scales and typologies, but only good for unidimensional constructs • Split-half reliability: randomly divide the measure's items into two halves and compare outcomes • Cronbach's alpha: an average of all possible split-half scores • Parallel-forms reliability: divide the questions into two forms and administer each separately to the same sample • For all of these, the closer the score is to 1, the more reliable the scale (a Cronbach's alpha sketch follows below)
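A minimal sketch of the Cronbach's alpha calculation in Python, assuming a small, hypothetical unidimensional scale; the respondent scores are made up purely to illustrate the formula alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores):

from statistics import pvariance

def cronbach_alpha(responses):
    # responses: one inner list of item scores per respondent (unidimensional scale assumed)
    k = len(responses[0])                                          # number of items in the scale
    item_vars = [pvariance([r[i] for r in responses]) for i in range(k)]
    total_var = pvariance([sum(r) for r in responses])             # variance of each respondent's total score
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical data: 5 respondents answering a 4-item scale (e.g., 1-5 agreement items)
scores = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(scores), 2))   # closer to 1 = more internally consistent

In practice, statistical packages report alpha directly; this sketch only shows what the number summarizes.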
Two general types of validity • Internal validity: the logic of the study design; accounting for alternative (or additional) explanations of causal relationships, if the study focuses on causal relationships • External validity: is the study generalizable (quantitative) or transferable (qualitative)?
Transferability • Not all studies are intended to be generalizable to an entire population • Refers to the ability to apply research results in another context or to inform other research • Also refers to the ability of the research to connect the reader with the research • Makes the study environment, respondents, and social phenomena "come alive" • Invites comparisons between the reader's own experiences and the experiences described in the research
All of these measures • of validity and reliability are assessed after the research is conducted • It is frustrating to have to report that your measures were invalid or unreliable • But that is still a legitimate finding! • Just as it is sometimes frustrating to have to report that you found no support for your hypothesis!
No way to know a priori • You can't know for certain how reliable or valid something is before you've conducted the research • Unless you are using a measure whose reliability/validity has been previously established • That's why so much time and effort is put into research design • Conceptualization • Operationalization • Reviewing past research • Exploring theories • Exploring methods
Pretesting and preliminary investigation • Can also increase reliability and validity • As well as improve overall research design • Pre-testing • Occurs after the research instrument/guidelines are established • Involves giving your survey, using your observation guidelines in the field, or doing a few interviews with respondents or informants • Analyzing the data generated and soliciting feedback from respondents about the instrument (if applicable) • Revising measurements and the instrument
Preliminary investigation • Often occurs prior to creating research instrument/guidelines • May talk informally with individuals from the target population or otherwise associated with social phenomena • May do field observations • May collect and analyze social artifacts associated with research topic