Toolbox of a professional researcher
Master Class on the Development of Analytical Skills
Žilvinas Martinaitis, 2009
What does a researcher look like?
• Professional experience: 5 years at the Public Policy and Management Institute; over 30 applied research projects for the Government and the European Commission.
• Academic experience: 4 years of teaching at VU; PhD student at VU.
Key questions:
• What is the difference between a researcher, a fiction writer and Georg Wilhelm Friedrich Hegel?
• What is the meaning of life for a researcher?
• What does the toolbox contain?
  • Objectives, questions and problems;
  • The hypothesis;
  • Research methods.
How to recognize a researcher?
• They create value added. Value added = asking important questions + providing generalizable answers + testing whether the answers are correct!
• You can actually read their papers! Rules of thumb: KISS! Grandmother test! 5 tells; 1 and 8 rules.
The meaning of life:
• Although the debate about THE TRUTH goes on, we attempt to get closer to it;
• We raise questions, develop answers and systematically test them (hypotheses, theories and laws).
• What is the best indicator of your success?
The toolbox:
• Objectives, problems and questions;
• The solutions and the answers: hypotheses;
• Research methods.
The purpose statement of a research paper
Highway to hell:
• I want to write about…
• I focus on Estonian foreign policy;
• Foreign policy is a very important policy.
Highway to paradise:
• I seek to explain why…
• I seek to explain why Estonian–Russian relations changed over the past 10 years;
• I seek to explain empirical and theoretical puzzles.
The problem/puzzle:
• Why bother? No problems, no solutions! "We fail more often not because we find the wrong solution, but because we solve the wrong problems" (Ackoff, 1974).
• A good problem:
  • Doesn't lead to the question "so what?";
  • Doesn't have an obvious answer;
  • Looks like an interesting puzzle, which is worth solving;
  • Involves empirical and/or theoretical contradictions: "Although…, however…".
The research question:
• Once you have the problem, the question is easy: "Why…? How…?"
• Link your question to wider theoretical debates (the literature review!).
Developing a hypothesis
• Once you have a question and a puzzle, you need an answer/solution.
• The answers should create value added, because they are:
  • Based on the knowledge already available;
  • Empirically testable;
  • Generalizable to other cases.
• A hypothesis is a theoretically driven statement about causal relationships between variables (the causes and effects).
• A good hypothesis could be written as follows: A → B.
Key ingredients of a hypothesis: variables
• Dependent variable = the phenomenon to be explained. Example: sunshine causes grass to grow (grass growth is the DV);
• Independent variable = the factor explaining the dependent variable. Example: sunshine causes grass to grow (sunshine is the IV);
• Intervening variable = is caused by the IV and in turn causes the DV. Example: sunshine causes photosynthesis, which causes grass to grow;
• Condition variable = frames antecedent conditions. Example: sunshine causes grass to grow, but only when there is enough rainfall.
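The four roles can be made concrete with a toy simulation of the grass example. This is a minimal sketch in Python; all coefficients, thresholds and function names are invented purely for illustration.

```python
def photosynthesis(sunshine_hours):
    # Intervening variable: caused by the IV (sunshine) and in turn
    # causing the DV (grass growth). The coefficient is invented.
    return 0.8 * sunshine_hours

def grass_growth(sunshine_hours, rainfall_mm):
    # Condition variable: sunshine matters only when the antecedent
    # condition (enough rainfall) holds. Threshold is invented.
    if rainfall_mm < 10:
        return 0.0
    # DV (growth) produced through the intervening variable.
    return 0.5 * photosynthesis(sunshine_hours)

# IV: sunshine; condition variable: rainfall; DV: growth.
for sun, rain in [(8, 25), (8, 5), (2, 25)]:
    print(f"sun={sun}h, rain={rain}mm -> growth={grass_growth(sun, rain):.1f}")
```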
Operationalizing your hypothesis
• Clearly define your variables;
• Identify criteria for verifying the values of your variables;
• Identify observable implications of your hypothesis.
Example: left-leaning coalitions promote employment security.
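One way to make the example operational is to express each variable as an explicit coding rule whose value can be verified. The indicators, thresholds and field names below are assumptions for illustration, not established measures.

```python
# Hypothetical operationalization of "left-leaning coalitions promote
# employment security". Both coding rules are illustrative assumptions.

def is_left_leaning(cabinet):
    # IV coded from the share of cabinet seats held by
    # left-of-centre parties (assumed threshold: a majority).
    return cabinet["left_seat_share"] > 0.5

def employment_security(country_year):
    # DV proxied by a strictness-of-employment-protection index
    # (assumed 0-6 scale, higher = more security).
    return country_year["epl_index"]

# Observable implication: country-years governed by left-leaning
# cabinets should score higher on the index than the rest.
case = {"left_seat_share": 0.6, "epl_index": 3.1}
print(is_left_leaning(case), employment_security(case))
```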
Research methods:
• In principle, the data for testing the hypothesis can be collected in the following ways:
  • Experiment;
  • Observation:
    • Large-n analysis (quantitative);
    • One or two case studies (qualitative).
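A minimal large-n sketch: correlate the hypothetical IV and DV codings from the previous slide across country-years. All data points are invented; a strongly positive coefficient would count as evidence for the hypothesis.

```python
def pearson(xs, ys):
    # Plain Pearson correlation coefficient, standard library only.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

left_share = [0.6, 0.3, 0.7, 0.2, 0.5]  # IV per country-year (invented)
epl_index = [3.1, 2.2, 2.9, 2.4, 2.7]   # DV per country-year (invented)
print(f"r = {pearson(left_share, epl_index):.2f}")
```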
Comparative analysis: the method of difference
• Choose two cases which are similar in all respects except for the phenomenon you want to explain (DV) and the factor explaining it (IV).
• Variables with the same values cannot explain the difference in the results.
• Hence the variation in the outcomes is explained by the difference in the values of the IV.
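The method of difference is, at heart, an elimination rule, so it can be sketched in a few lines: variables with identical values in both cases cannot explain a difference in the outcome. The case codings below are hypothetical placeholders.

```python
def method_of_difference(case_a, case_b):
    # Keep only the variables whose values differ between the cases;
    # these are the surviving candidate explanations (IVs).
    return {v for v in case_a if case_a[v] != case_b[v]}

case_a = {"wealth": "high", "electoral_system": "PR", "left_govt": "yes"}
case_b = {"wealth": "high", "electoral_system": "PR", "left_govt": "no"}
print(method_of_difference(case_a, case_b))  # {'left_govt'}
```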
Example: reforms of the pension systems in the Czech Republic and Poland (Müller 2001).
Comparative analysis: the method of similarity
• It is exactly the opposite: find two cases which are very different in all respects except for the values of the IV and DV.
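The mirror image of the previous sketch: in two otherwise very different cases that share the same outcome, only variables with the same value in both can explain the common result. The codings are again hypothetical.

```python
def method_of_similarity(case_a, case_b):
    # Keep only the variables whose values agree across the two
    # otherwise dissimilar cases.
    return {v for v in case_a if case_a[v] == case_b[v]}

case_a = {"wealth": "high", "electoral_system": "PR", "left_govt": "yes"}
case_b = {"wealth": "low", "electoral_system": "majoritarian", "left_govt": "yes"}
print(method_of_similarity(case_a, case_b))  # {'left_govt'}
```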
Issues and problems in comparative analysis
• Comparing apples with oranges;
• Additional variation: what if, when using the method of difference, you find additional variation between the supposedly similar cases? Use a shadow case study.
Case studies:
• "For example…" is not a case study!
• The overall logic is the same as in comparative analysis: seek to explain variation!
• Two strongest types of case studies:
  • Process tracing;
  • Critical case studies.
Example of process tracing: the electoral threshold and the number of effective parties in Poland
Example of process tracing: the electoral threshold and the number of effective parties in Lithuania
Critical case studies
Key steps:
• Clearly define the hypothesis;
• Find a case which "perfectly" matches the conditions set out in the hypothesis;
• Show that despite the "perfect match" the hypothesis is wrong;
• Add additional antecedent conditions to the theory.
Two variations: "best-suited cases" and "worst cases".
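These steps can be sketched as a selection-and-check routine: pick the cases that "perfectly" match the hypothesis' conditions and verify whether the predicted outcome occurred. The hypothesis, cases and codings below are all hypothetical.

```python
CONDITIONS = {"left_govt": "yes"}     # antecedent conditions (assumed)
PREDICTION = ("high_security", True)  # predicted outcome (assumed)

cases = [
    {"name": "A", "left_govt": "yes", "high_security": True},
    {"name": "B", "left_govt": "yes", "high_security": False},
    {"name": "C", "left_govt": "no", "high_security": False},
]

def matches(case):
    # A critical case satisfies every condition of the hypothesis.
    return all(case.get(k) == v for k, v in CONDITIONS.items())

key, expected = PREDICTION
for case in filter(matches, cases):
    verdict = "supports" if case[key] == expected else "disconfirms"
    print(f"case {case['name']} {verdict} the hypothesis")
```

Here case B is the interesting one: despite the "perfect match" with the conditions, the predicted outcome is absent, which is exactly the disconfirming pattern the slide describes.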
Summary: what is a good academic or policy paper?
• It solves puzzles which are embedded in the academic discussion and are relevant for "real world" people;
• It raises questions and provides answers (hypotheses);
• It creates value added on top of the existing knowledge;
• It performs systematic empirical tests to assess the validity of the answers (hypotheses);
• It is readable and understandable!
Further readings:
• Stephen Van Evera, Guide to Methods for Students of Political Science, Ithaca and London: Cornell University Press, 1997;
• Gary King, Robert O. Keohane and Sidney Verba, Designing Social Inquiry: Scientific Inference in Qualitative Research, Princeton, NJ: Princeton University Press, 1994.