
Research Design



Presentation Transcript


  1. Research Design Quantitative Study Design - B

  2. Descriptive Study Designs • These studies are conducted to examine variables in naturally occurring situations. They look at relationships between variables as part of the overall description, but they do not examine the type or degree of those relationships. They protect against bias through conceptual and operational definitions of variables, sample selection, valid and reliable instruments, and control of the environment in which the data are collected.

  3. Types of Descriptive Studies • Purely descriptive studies • study the variables within a particular situation with a single sample of subjects • Comparative descriptive studies • examine the difference in variables between two or more groups that occur in a particular situation • Time dimensional studies • Longitudinal – changes in same subjects • Cross-sectional – changes in groups of subjects at different stages of development, simultaneously • Trend – take samples of population at pre-set intervals • Event partitioning

  4. Descriptive Study Designs • Time dimensional cont. • Retrospective studies – a manifestation of some phenomenon existing in the present is linked to phenomena that occurred in the past. • Prospective studies – examine a presumed cause, then go forward in time to the presumed effect. It is more costly and the researcher may have to wait a long time, but the correlation is stronger.

  5. Descriptive Study Designs • Case study design • Investigation of an individual, group, institution or other social unit to determine why the subject thinks, behaves or develops in a particular manner. It requires detailed study over time. You can use any data collection method. • Strength – the depth of the study – it’s not superficial • Weakness – subjectivity of the researcher

  6. Descriptive Designs cont. • Survey Design • Research activity that focuses on the status quo of some situation. Information is collected directly from the group that is the object of the investigation by interview, telephone, Internet, or questionnaire. Purposes can be to: • describe – people’s characteristics, attitudes or beliefs – sub-samples may be compared • explain – a variable of interest by examining its relationship to other variables – nothing is manipulated • predict – people report their plans or intentions and extrapolations can be made • explore – use probing, loosely formulated questions to learn background information about subjects; to gain information to formulate research questions or hypotheses; to help develop theory for qualitative research • Strength – flexibility and broad scope • Weaknesses – superficial, ex post facto, time and resources

  7. Evaluation Research • A highly applied form of research that looks at how well a program, practice or policy is working. Its purposes are: • To evaluate the success of a program – not why it succeeds, but whether it is succeeding • To answer practical problems for persons who must make decisions

  8. Evaluation Research cont. • The classical approach • Determine objectives of the program • Develop means of measuring attainment of objectives • Collect data • Interpret data vis-à-vis the objectives • Goal-free evaluation • Evaluation of the outcomes of a program in the absence of information about intended outcomes • Must describe the repercussions of a program or practice, or of various components of the overall system

  9. Categories of Evaluation • Formative evaluation – the ongoing process of providing evaluation feedback in the course of developing a program or policy (Process or Implementation) • Summative evaluation – the worth of a program after it is already in operation – to help decide whether it should be discarded, replaced, modified or continued (Outcome Analysis: a. Impact, b. Cost) • Comparative evaluation – assesses the worth of two or more programs or procedures • Absolute evaluation – assesses the effects of a program in and of itself – no contrast with other programs – called criterion-referenced – measures against criteria

  10. Needs Assessment • Similar to evaluation research, it provides informational input in a planning process. It is usually done by an agency or group with a service component. It helps in establishing priorities. There are three approaches: • Key informant • Survey • Indicators

  11. Evaluation Research Weaknesses • Threatening to individuals • Seen as a waste of time • Role conflicts if researcher is in-house • Censor by “politicians” in-house • When some goals are satisfied and others are not, how is the whole thing evaluated • Goals may be for the future so can’t see outcome now

  12. Other Types of Research • Secondary Analysis – studying data that have been previously gathered • Strength – it is efficient and economical • Weaknesses • Variables may have been under-analyzed • You may want to look at different relationships among variables • You may want to change the unit of analysis • You may want data from a sub-sample • You may want to change the method of analysis

  13. Other Types of Research • Replication Studies • Meta-analysis – merging findings from many studies that have examined the same intervention, then using statistics to determine overall effects of intervention. • Methodological – designed to develop the validity and reliability of instruments that measure constructs/variables. They are controlled investigations of ways to obtain, organize and analyze data.
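The merging step of a meta-analysis described above can be sketched as a fixed-effect, inverse-variance weighted average of per-study effect sizes. This is a minimal illustration; the function name and the effect-size numbers are invented for the example, not taken from the source:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study effect sizes.

    Each study is weighted by 1/variance, so more precise studies
    contribute more to the overall estimate."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, se

# Hypothetical standardized mean differences from three studies
effects = [0.30, 0.45, 0.25]
variances = [0.02, 0.05, 0.01]
est, se = pooled_effect(effects, variances)
```

A real meta-analysis would also test for heterogeneity between studies and possibly use a random-effects model; this sketch shows only the basic pooling idea.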

  14. Research Design Considerations • Research Control – the design should maximize the control an investigator has over the research situation and the variables. In quantitative research, rigor is exerted through the methodology used • Constancy of conditions – conditions under which the data are collected must be as similar as possible • Environment • Time, day, year • One interviewer – if not, minimize the variability among interviewers • Communication and treatment should be constant (the same)

  15. Research Design Considerations • Research control cont. • Manipulation as control – the ability to manipulate the independent variable is very powerful • Assures that the conditions under which information was obtained were constant, or at least similar – this cannot be done in ex post facto research • Allows a more complex treatment because of the control over it • Can test two independent variables and their effects at the same time

  16. Research Design Considerations • Research control cont. • Comparison groups as control – scientific knowledge requires some type of comparison – even case studies have an implied reference – “normal” • Randomization as control – if you can’t randomize the subjects, then at least vary the order in which questions are asked – especially for attitudes
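The two controls above – random assignment of subjects, and randomizing question order when subjects cannot be randomized – can be sketched in a few lines. The subject IDs and question texts are hypothetical placeholders:

```python
import random

rng = random.Random(42)  # seeded so the assignment is reproducible

# Randomization as control: randomly assign 20 subjects to two groups
subjects = [f"S{i:02d}" for i in range(1, 21)]
shuffled = subjects[:]
rng.shuffle(shuffled)
treatment, control = shuffled[:10], shuffled[10:]

# When subjects cannot be randomized, at least vary the question order
questions = ["Q1 attitude", "Q2 belief", "Q3 intention"]
order = questions[:]
rng.shuffle(order)  # each respondent could receive a different order
```

Seeding the generator keeps the assignment auditable; in practice each respondent would get a freshly shuffled question order.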

  17. Research Design Considerations • Research control cont. • Control over extraneous individual characteristics of subjects • Use only homogeneous subjects • Include extraneous variables as independent variables – randomly assign subjects to sub-blocks • Matching – use knowledge of subjects from comparison groups – matching on more than three characteristics is difficult. Matching may be done after the fact • Use statistical procedures (ANOVA) after the fact • Use subjects themselves as their own controls • Use randomization
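The after-the-fact statistical procedure named in the list (ANOVA) can be illustrated with a minimal one-way ANOVA F statistic in pure Python. The data below are invented for the sketch; a real analysis would use a statistics package and also report a p-value:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group variance over
    within-group variance for a list of numeric groups."""
    k = len(groups)                       # number of groups
    n = sum(len(g) for g in groups)       # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    ms_between = ss_between / (k - 1)     # between-group mean square
    ms_within = ss_within / (n - k)       # within-group mean square
    return ms_between / ms_within

# Hypothetical scores for three comparison groups
f_stat = one_way_anova_f([[1, 2, 3], [2, 3, 4], [5, 6, 7]])  # F = 13.0
```

A large F relative to the F distribution with (k−1, n−k) degrees of freedom suggests the group means differ beyond what within-group variability explains.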

  18. Research Design Considerations • Validity – the measure of truth or accuracy of a claim • Internal validity shows that the findings are due to the independent variable. It is maintained by using the controls on the previous slides, and by preventing threats to internal validity

  19. Research Design Considerations • Threats to internal validity • History – external events that affect the dependent variable • Selection – biases from pre-treatment differences • Maturation – changes within the subject over time – not from the treatment • Testing – the effect of taking a pretest on posttest scores • Mortality – loss of subjects during the study • Other factors

  20. Research Design Considerations • External validity – the generalizability of research findings to other settings or samples specifically to the population from which the sample came – there is no problem generalizing to the accessible population. Threats to external validity are: • Population factors: • The Hawthorne effect – awareness of participation causes different behavior • Novelty effect – newness of the treatment might cause alteration in behavior • Placebo Effect

  21. Research Design Considerations • External validity cont. • Ecological Effects • Interaction between history and treatment • Interaction between selection and treatment • Interaction between setting and treatment

  22. Research Design Considerations • Experimenter effects – research is affected by characteristics of the researcher • Paradigm effect – basic assumptions and ways of conceptualization • Loose protocol – step-by-step detail not planned • Mis-recording effect – especially if subjects record their own responses • Unintentional expectancy effect – influences subjects’ responses • Analysis effect – deciding how to analyze after the data are collected • Fudging effect – reporting effects not obtained
