Explore the systematic examination of facts, theories, and practices through basic, applied, and evaluation research approaches. Learn how inquiry shapes decision-making processes and investigative thinking, highlighting the key traits of a successful investigator.
What is inquiry? • A studious, systematic examination of facts or principles; research • Investigation or experimentation aimed at the discovery and interpretation of facts • Revision of accepted theories or laws in light of new facts • Practical application of such new or revised theories or laws
Inquiry, from three different perspectives • Basic research • scientific investigation to develop or enhance theory • Applied research • testing theory to assess its "usefulness" in solving (instructional or educational) problems • Evaluation research • determining whether a program, product, or process warrants improvement, has made a difference or impact, or has contributed to general knowledge or understanding
Inquiry calls for systematic thinking • What research (as we’ll study it) is not: • Mere information gathering • Mere information assembly • Mere rummaging for information • An abstraction (e.g., suggesting that “years of research” have led to ________________)
"Defining" evaluation • Merriam-Webster Dictionary (online): "… to determine the significance, worth, or condition of something, usually by careful appraisal and study." • Patton (1996): "… the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming" (p. 23).
The research/evaluation dichotomy: real or contrived? • The distinction lies not in method or subject matter • But in intent: the purpose for which the work is done • Evaluation leads to conclusions; reaching them requires identifying standards, gathering performance data, and integrating the two.
The research/evaluation dichotomy: real or contrived? • We would argue that evaluation also involves studious inquiry … but its intents are far different. • Well-constructed evaluations show how theory translates to practice. • Evaluation research (in particular) is guided by explicit (not merely implicit) standards of conduct
The research/evaluation dichotomy: real or contrived? In summary, then, evaluation differs from other kinds of research in that... • central questions are derived from policymakers and practitioners • results are generally used to improve programs, projects, products, or processes • it tends to occur in turbulent action settings • results are often reported to non-research audiences
Working with complex terminology • Theory • Approach • Model • Principle • Guideline • Heuristic • Framework • Frame of reference • Orientation
Working with complex terminology • One can be dedicated to inquiry … while not invested in developing or enhancing theory, per se
Why conduct research (broadly) • To judge merit or worth • (accountability, accreditation/licensing, cost-benefit decisions) • To improve programs • (identify strengths and weaknesses, ensure quality, or check progress toward goals) • To generate knowledge • (make generalizations about effectiveness, build theory, make policy, extrapolate principles that may be applied to other settings)
Why conduct research (specifically) • To describe what happens • thus providing evidence regarding the short- and long-term effects of …. • To determine cost-effectiveness • To improve existing programs • To document successes and mistakes • To predict how variables might affect specific situations • To explain or identify promising theories associated with specific phenomena
Research and decision-making (1) • Applied research and evaluation help people make a wide array of instrumental action decisions, e.g.: • making midcourse corrections • continuing, expanding, or institutionalizing a program … or cutting, ending, or abandoning it • testing a new program idea • choosing the best of several alternatives • deciding whether or not to continue funding
Research and decision-making (2) Applied research and evaluation help people make a wide array of organizational decisions, e.g.: • recording program history • providing feedback to practitioners • highlighting program goals • establishing accountability • understanding social intervention
Investigative thinking: different strands • Inductive thinking • one reasons from particular instances (patterns, results) to general conclusions • Deductive thinking • one's conclusions about particulars are drawn from general or universal premises
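The two strands above can be sketched in code. This is a minimal, hypothetical illustration: the "effectiveness threshold" premise, the scores, and the function names are all invented for demonstration, not drawn from the source.

```python
# Hypothetical sketch contrasting deductive and inductive thinking.

# Deductive: apply a general premise to reach a conclusion about a particular case.
def deduce(premise_threshold: float, observed_score: float) -> bool:
    """Premise: any program scoring above the threshold counts as 'effective'.
    The conclusion about this particular program follows from the premise."""
    return observed_score > premise_threshold

# Inductive: examine particular observations and draw a (tentative) general conclusion.
def induce(observations: list, threshold: float) -> str:
    """Generalize from several observed program results."""
    if all(score > threshold for score in observations):
        return "All observed programs were effective; tentatively, the approach works."
    return "Results are mixed; no general conclusion is warranted yet."

# Deductive move: one premise, one case, one conclusion.
print(deduce(70.0, 82.5))  # True

# Inductive move: many cases, one tentative generalization.
print(induce([81.0, 77.5, 90.2], 70.0))
```

Note that the inductive conclusion is hedged ("tentatively"): unlike a valid deduction, an inductive generalization can always be overturned by a new observation.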
Personal traits of a "good" investigator • Methodical • Logical • Systematic • Organized • Able to manage time and prioritize tasks • Good with people: able to connect, commands respect • Inquisitive and curious • Not easily swayed by rumor and innuendo • Persistent, tenacious, calm, etc.
"Criteria" • As a researcher, then, you are expected... • to be competent • to be honest and demonstrate integrity • to show respect for people • to be politically savvy • to work systematically • to make data-based decisions
References • Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Cambridge, MA: Perseus Publishing. • Kifer, E. (1995). Evaluation: A general view. In G. J. Anglin (Ed.), Instructional technology: Past, present, and future (2nd ed., pp. 384-392). Westport, CT: Libraries Unlimited, Inc. • Patton, M. Q. (1996). Utilization-focused evaluation (3rd ed.). Thousand Oaks, CA: Sage Publications. • Patton, M. Q. (2002). Qualitative evaluation and research methods (3rd ed.). Thousand Oaks, CA: Sage Publications. • Russ-Eft, D., & Preskill, H. (2001). Evaluation in organizations: A systematic approach to enhancing learning, performance, and change. Cambridge, MA: Perseus Publishing. • Scriven, M. (1991). Evaluation thesaurus (4th ed.). Thousand Oaks, CA: Sage Publications. • Weiss, C. H. (1998). Evaluation. Upper Saddle River, NJ: Prentice-Hall.