Explore the crucial concepts of reliability and validity in research measurement and learn how to translate ideas into valid and reliable measures. Discover ways to estimate reliability and compute correlation values, as well as understand the importance of validity in accurately measuring research outcomes. Find inspiration for unique and innovative research topics through curiosity, experience, assignments, scholarly articles, and serendipity.
Chapter 2 Finding Ideas to Research
Generating Topics • Translate ideas into valid and reliable ways of measuring them • Collect evidence • Unique Topics • Innovative but Difficult
Reliability and Validity • Collecting data (measurement) and conducting research always raise the issues of reliability and validity. The issue of reliability is essentially the same for both measurement and research design. Reliability addresses our concerns about the consistency of the information collected, while validity focuses on its accuracy.
Reliability and Validity -- Relationship • The relationship between reliability and validity can be confusing because measurements and research can be reliable without being valid, but they cannot be valid unless they are reliable. • For a study to be valid, it must consistently (reliability) do what it purports to do (validity). • For a measurement to be judged reliable, it should produce a consistent score. • For a research study to be considered reliable, it should produce similar results each time it is replicated.
Definition: Reliability • Reliability is the consistency of your measurement, or the degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects. • It is important to remember that reliability is not measured; it is estimated.
Ways to Estimate Reliability • Test/retest is a conservative method for estimating reliability. The three main components of this method are as follows: • administer your measurement instrument at two separate times to each subject • compute the correlation between the two sets of measurements (see the sketch below) • assume there is no change in the underlying condition (or trait you are trying to measure) between administrations
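A minimal sketch of the correlation step, assuming the same instrument was administered twice to the same six subjects; the scores below are hypothetical, and the Pearson coefficient is used here (other coefficients may suit other kinds of data).

```python
# Test/retest sketch: correlate scores from two administrations of the same instrument.
import numpy as np

time1 = np.array([12, 15, 9, 20, 17, 11])   # scores at the first administration (hypothetical)
time2 = np.array([13, 14, 10, 19, 18, 12])  # scores at the second administration (hypothetical)

# Pearson correlation between the two administrations; values near 1.0
# suggest the instrument produces consistent scores over time.
r = np.corrcoef(time1, time2)[0, 1]
print(round(r, 3))
```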
Estimating Reliability • Internal consistency estimates reliability by grouping questions in a questionnaire that measure the same concept. • One common way of computing correlation values among the questions on your instrument is Cronbach's alpha. • Cronbach's alpha splits all the questions on your instrument every possible way and computes correlation values for each split (SPSS will do this; a sketch in Python follows). • Like a correlation coefficient, the closer the value is to one, the higher the reliability estimate of your instrument.
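A minimal sketch of the calculation, assuming item responses are arranged as one row per respondent and one column per question (the scores below are hypothetical). It uses the standard variance form of alpha, the form statistics packages typically compute: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score).

```python
# Cronbach's alpha sketch for a respondents-by-questions score matrix.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal-consistency estimate for a set of items measuring the same concept."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                         # number of items (questions)
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 5 respondents answering 4 Likert-style questions.
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(scores), 3))  # closer to 1 = higher reliability estimate
```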
Definition: Validity • Cook and Campbell (1979) define validity as the "best available approximation to the truth or falsity of a given inference, proposition or conclusion." • Basically, were we right?
Thought to Ponder • It is my belief that validity is more important than reliability: if an instrument does not accurately measure what it is supposed to, there is no reason to use it, even if it measures consistently.
Measurement Concepts • Measurement is the process of assigning numbers to represent the amount of a variable (a characteristic, attribute, or trait present in a person, object, or situation under study). Measurement results that contain little error are said to be reliable. • Sources of measurement error include • the instrument (e.g., improper calibration) • the environment (e.g., noise level) • the researcher (e.g., fatigue, mood) • data processing (e.g., data entry errors)
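One standard way to make the link between error and reliability precise (classical test theory; an addition here, not stated on the slide) is to write each observed score as a true score plus error, X = T + E. Reliability is then the proportion of observed-score variance that is true-score variance, Var(T) / Var(X): the smaller the error component, the closer that ratio is to one.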
How to find a topic? • Curiosity and Experience • Too big or too narrow • Assignments, Theses, and Grants • RFPs, RFAs, work-related assignments • Other Research Findings • Scholarly articles, secondary sources, replication, ‘filling in the hole’ • Serendipity (by accident) • A finding that you were not expecting
What is a Research Grant? • Research grants and contracts are written agreements with external sponsors. They contain one or more of the following provisions: • A research protocol or other statement of work • A designated principal investigator (or investigators) • A designated period of performance • A budget • An obligation to account for costs and return unspent funds • Disposition of intellectual property rights
Searching for Research • Internet • Academic versus Nonacademic • Library Databases • Ask New Questions • Once you have articles, use their references • Popular newspapers/magazines
Literature Reviews • Evaluate Previous Research • Create a database (or collection); pg. 26 • Attention to Methodology • Sampling • Questions/Hypotheses • Variables • Measurement • Analyses • Conclusions • Limitations
Once you assess • Do you want to replicate? • Same measures or modifications • Making changes limits the comparisons • Are there themes? Or links? • Organize the literature (Chapter 10)
Writing a Literature Review • Your literature review should reflect the important thinking in the area that will impact your work, and it should provide context for the background and importance of your question. You should identify existing knowledge and the gaps in that knowledge, and indicate the methodologies that have been used to address similar research questions. The literature review is often included as part of your research proposal.
Two Levels of Review • Conducting a literature review • Your investigation of the existing literature • Writing a literature review • The review you write for your own project
UK COE Library • Looking for resources? Start here: • http://www.uky.edu/Libraries/educ.html
Theory and Reasoning • Theory (pg. 27): a set of statements logically linked to explain some phenomena in the world around us
Deductive Reasoning • Deductive reasoning works from the more general to the more specific. • It is informally called a "top-down" approach. • We might begin by thinking up a theory about our topic of interest. We then narrow that down into more specific hypotheses. We narrow down even further when we collect observations to address those hypotheses. This lets us test the hypotheses with specific data, confirming (or not) our original theory.
Inductive Reasoning • Inductive reasoning moves from specific observations to broader generalizations and theories. • It is sometimes called a "bottom-up" approach. • In inductive reasoning, we begin with specific observations and measures, begin to detect patterns and regularities, formulate some tentative hypotheses that we can explore, and finally end up developing some general conclusions or theories.
Deductive Versus Inductive • "Deductive reasoning" refers to the process of concluding that something must be true because it is a special case of a general principle that is known to be true. • If you know the general principle that the sum of the angles in any triangle is always 180 degrees, and you have a particular triangle in mind, you can then conclude that the sum of the angles in your triangle is 180 degrees. • "Inductive reasoning" is the process of reasoning that a general principle is true because the special cases you've seen are true. • If all the people you've ever met from a particular town have been very strange, you might then say "all the residents of this town are strange".
The Ethics of Research • Ethical standards are designed by your governing institution, granting agencies, organizations, and yourself • One such set of ethics standards is presented by AERA • http://www.aera.net/about/policy/ethics.htm
IRB • Institutions that conduct research set up an IRB (Institutional Review Board) • University of Kentucky • Office of Research Integrity • http://www.rgs.uky.edu/ori/