Science in General
• Definition: a systematically organized body of knowledge
• Assumptions
  1. there is order in nature
  2. every event has an explanation
  3. we will never know everything
• Definitions of important terms
• Variable: any trait or characteristic that can take on a range of values
Science (continued)
• Hypothesis: a question or statement about the relationship between two or more variables, e.g., is there a relationship between the number of police on the streets and the crime rate?
• Independent variable (IV): a variable thought to have an effect
• Dependent variable (DV): the variable that is affected (a sketch of testing the example hypothesis follows below)
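A minimal sketch of how the example hypothesis might be tested, assuming a simple correlational design; all of the data values below are invented for illustration.

```python
# Hypothetical sketch: is the number of police on the streets (IV)
# related to the crime rate (DV)? Data are invented for illustration.
import numpy as np
from scipy import stats

officers_per_1000 = np.array([1.2, 1.8, 2.1, 2.5, 3.0, 3.4, 4.1])      # IV
crimes_per_1000 = np.array([52.0, 48.5, 47.0, 44.2, 41.8, 40.0, 37.5])  # DV

r, p_value = stats.pearsonr(officers_per_1000, crimes_per_1000)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A strong negative r would be consistent with the hypothesis, but
# correlation alone does not establish causation (see the later slides).
```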
Science (continued)
• Theory: an explanation that systematically organizes observations and hypotheses
• Basic vs. Applied Research: basic research asks "why" questions; applied research solves problems
• Cross-sectional vs. Longitudinal Research
• Experimental vs. Ex Post Facto Research
Some notes about research
• Much research does not "pan out"
• Some research gets results "accidentally"
• Research that seems trivial sometimes turns out to be important (the Golden Fleece award is sometimes undeserved)
• We cannot assume that "common sense" is correct
• Research is the study of patterns, not individuals
Errors in observation (with safeguards)
• Inaccurate observation (safeguard: measure and record)
• Overgeneralization (safeguards: a sufficient number of subjects, replication of studies)
• Selective observation (safeguard: a sufficient number of observations)
• Illogical reasoning, such as the gambler's fallacy and ex post facto reasoning (safeguard: logic; a simulation of the gambler's fallacy follows below)
• Ego involvement, premature closure, reductionism
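A small simulation, as a sketch only, of why the gambler's fallacy is illogical: after a run of heads, the chance of another head is still one half. All numbers here are illustrative.

```python
# Simulate coin flips and check P(heads) immediately after 3 heads in a row.
import random

random.seed(42)
trials = 100_000
after_streak_heads = after_streak_total = 0
streak = 0  # current run of consecutive heads

for _ in range(trials):
    flip = random.random() < 0.5          # True = heads
    if streak >= 3:                       # we just saw 3 heads in a row
        after_streak_total += 1
        after_streak_heads += flip
    streak = streak + 1 if flip else 0

print(f"P(heads | 3 heads in a row) = {after_streak_heads / after_streak_total:.3f}")
# The result stays near 0.5: past flips do not make tails "due".
```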
Research methods
• Experiments (manipulation and control)
• Surveys (written questionnaires and interviews)
• Field or observational research
• Record or archival research (content analysis, secondary analysis)
• Case studies
• Evaluation research
Theories
• Importance of theories: they drive research
• Criteria for a good theory
  1. consistent with known facts
  2. internally consistent, not contradictory
  3. parsimonious
  4. subject to empirical investigation
  5. able to predict
Theory building
• Deductive reasoning: start with an explanation, then derive hypotheses and test them (e.g., family instability as a result of social upheaval)
• Inductive reasoning: gather information and then develop a theory (e.g., Durkheim on crime and suicide)
• The relationship between research and theory runs in both directions
Examples of research studies
• Hirschi and social control theory
• Policewomen on patrol
• The Kansas City Preventive Patrol Experiment
• Group therapy in California prisons
Relationships vs. Causation
• To be a cause, one variable must be necessary and sufficient to affect another variable
• Something may be necessary but not sufficient (e.g., intelligence and good grades)
• Something may be sufficient but not necessary (e.g., isolation in early life and mental retardation) (a toy Boolean sketch of the distinction follows below)
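To make the necessary/sufficient distinction concrete, here is a toy Boolean sketch using the slide's grades example; the good_grades rule is a caricature invented for illustration, not an empirical claim.

```python
# Toy model: intelligence is treated as necessary but not sufficient
# for good grades. This rule is invented purely for illustration.
def good_grades(intelligent: bool, studies: bool) -> bool:
    return intelligent and studies

# Necessary: without intelligence, never good grades in this model.
assert not good_grades(intelligent=False, studies=True)
# Not sufficient: intelligence alone does not guarantee good grades.
assert not good_grades(intelligent=True, studies=False)
print("In this toy model, intelligence is necessary but not sufficient.")
```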
Three criteria to establish a cause
• 1. The cause must precede the effect
• 2. The two variables must be empirically correlated (as one changes, the other changes in a systematic fashion)
• 3. The relationship must not be explained away by a third variable (a partial-correlation sketch follows below)
• Classic spurious relationships: storks and babies, polio and pavements, large family size and delinquency
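One common way to check criterion 3 is a partial correlation, which holds the third variable constant. The sketch below uses simulated data with hypothetical variable names (storks, babies, and rural population as the shared cause); it is an illustration, not a real dataset.

```python
# Sketch: a sizable raw correlation that a third variable explains away.
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y with z (the third variable) held constant."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

rng = np.random.default_rng(0)
rural = rng.normal(size=500)                        # third variable
storks = rural + rng.normal(scale=0.5, size=500)    # both depend on rural
babies = rural + rng.normal(scale=0.5, size=500)

print(f"raw r(storks, babies)           = {np.corrcoef(storks, babies)[0, 1]:.2f}")
print(f"partial r, controlling for rural = {partial_corr(storks, babies, rural):.2f}")
# The raw correlation is large; the partial correlation collapses toward
# zero once the shared cause is controlled.
```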
Relationship: child abuse and delinquency
• Two methods of study
• Retrospective findings:
  • Private residential treatment center: 66% abused
  • Runaway shelter, Ohio: 75% abused
  • Juvenile delinquents: 40% abused, neglected, or abandoned
Relationship (continued)
• Prospective findings:
  • 5,000 children referred for abuse were followed: after 5 years, 14% had been adjudicated delinquent; after 10 years, 32%
  • A New York study found that 50% of families reported for abuse had at least one child taken to court as delinquent
Conclusions
• This is not cause and effect
• We need a base rate for comparison: how many children are abused, and how many go to juvenile court? (a base-rate sketch follows below)
• There would appear to be a relationship
• Abused children are at greater risk, i.e., a higher probability of delinquency
• Other variables may explain the relationship
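A short sketch of why the base rate matters, comparing the abused group against a hypothetical comparison group. The 32% figure echoes the prospective study above; every other count is invented for illustration.

```python
# Without a base-rate group, a percentage by itself proves little.
abused_total, abused_delinquent = 5000, 1600          # 32% after 10 years
comparison_total, comparison_delinquent = 5000, 700   # hypothetical base rate

p_abused = abused_delinquent / abused_total
p_base = comparison_delinquent / comparison_total
print(f"P(delinquent | abused)     = {p_abused:.0%}")
print(f"P(delinquent | not abused) = {p_base:.0%}")
print(f"Relative risk              = {p_abused / p_base:.1f}x")
# Only the comparison with the base rate shows whether abused children
# are at elevated risk.
```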
Purposes of research
• Exploration: satisfy curiosity, test the feasibility of a study, develop methods
• Description (the Census, polls)
• Explanation
• Units of analysis: the units observed and described, in order to create summary descriptions of all such units and to explain differences among them
Units of analysis
• Individuals
• Groups (e.g., families, gangs)
• Organizations (e.g., police departments)
• Social artifacts (e.g., traffic accidents, court cases, prison riots)
Steps in designing research
• Choosing a research problem
• Reviewing the literature: abstracts and journals, books, collected readings, computer searches (NCJRS), CD-ROMs, and the internet
• Conceptualization of variables, hypotheses, and questions
Steps in research (continued)
• Selecting how to measure the variables (operationalization)
• Selecting subjects for the study: population and sample (a simple-random-sample sketch follows below)
• Method: making observations and measurements
• Data processing and analysis
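A minimal sketch of drawing a simple random sample from a population. The "population" here is just ID numbers standing in for a sampling frame; in a real study it would be a list of people, households, or case files, and the sample size is arbitrary.

```python
# Simple random sample: every unit has an equal chance of selection.
import random

random.seed(7)
population = list(range(1, 10_001))        # sampling frame of 10,000 units
sample = random.sample(population, k=400)  # simple random sample, n = 400

print(f"Population size: {len(population)}, sample size: {len(sample)}")
print("First five sampled IDs:", sorted(sample)[:5])
```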
Steps in research (continued)
• Interpreting the results and their applications
Research articles
• Who does research?
• The process for getting into print
• Abstract
• Introduction: the problem and the literature review
• Method: description of subjects, instruments, procedure, and the data analysis used
• Results: descriptive and inferential statistics
Research articles (continued)
• Tables and graphs in the results section
• Discussion: interpretation of results, cautionary notes, and directions for future research (the next questions)
Research proposal
• Abstract
• Introduction (introduction of the topic, literature review, statement of what this study would do)
• Method
  • Subjects: how many? What are their characteristics? How will they be selected?
  • Instruments: what questions will be asked?
Research proposal (continued)
• Procedure: how will the study be carried out?
• Schedule: list each step, estimate how much time each would take (sometimes steps can be done simultaneously), and indicate the total length of the project
• Budget: list all items, the cost of each, and the total cost
Research proposal (continued)
• Indicate supplies, travel, personnel costs, etc., and justify each item
• Bibliography (based on the Introduction: every reference in the introduction should have an entry in the bibliography, and vice versa)
• Appendices
Layout of research proposal
• Abstract (a short paragraph)
• Introduction (a few paragraphs)
• Method
  • Subjects
  • Instruments
• Procedure
• Schedule (a table is also useful)
• Budget (again, a table may be used, with columns for Item and Cost)
Layout (continued)
• Bibliography entry formats:
  • Journal article: author, year of publication, title, journal, volume number, pages
  • Book: author, year of publication, title, city of publisher, publisher
Measurement
• Concepts (hypothetical constructs): theoretical ideas based on observation but which cannot themselves be observed directly, e.g.:
  • aggressiveness
  • intelligence
  • prejudice
Measurement (continued)
• Difficulty of measuring hypothetical constructs (e.g., the LaPiere study of expressed attitudes vs. actual behavior)
• Interchangeability of indicators: if several different indicators follow the same pattern, they are measuring the same concept
• Conceptual definition: a dictionary-style definition of the concept
Operational definition
• A definition that describes how a concept will be measured (e.g., intelligence will be measured by scores on the Stanford-Binet and the WAIS-R)
• Considerations for operational definitions: reliability, validity, norms, precision
• Reliability: consistency of measurement, which is different from accuracy
Assessing reliability
• Test-retest: scores should not change much over a short period of time
• Split-half: divide the test into two parts; for the same individual, scores should be about the same on one part as on the other (a split-half sketch follows below)
• Reliability is lowered by (1) unreliable observers and (2) poorly worded questions
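A hedged sketch of the split-half method: correlate odd-item and even-item scores, then apply the Spearman-Brown correction, a standard adjustment for the full test length. The test scores below are simulated, not real data.

```python
# Split-half reliability with the Spearman-Brown correction.
import numpy as np

rng = np.random.default_rng(1)
true_trait = rng.normal(size=200)  # 200 simulated test-takers
# 20 items, each reflecting the trait plus measurement error:
items = true_trait[:, None] + rng.normal(scale=1.0, size=(200, 20))

odd_half = items[:, 0::2].sum(axis=1)    # items 1, 3, 5, ...
even_half = items[:, 1::2].sum(axis=1)   # items 2, 4, 6, ...
r_half = np.corrcoef(odd_half, even_half)[0, 1]
r_full = (2 * r_half) / (1 + r_half)     # Spearman-Brown correction

print(f"split-half r = {r_half:.2f}, corrected reliability = {r_full:.2f}")
```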
Validity
• Does the test measure what you want it to measure?
• Four types of validity: face, criterion (or predictive), content, and construct
• Face validity: does the measure appear to measure what you want it to? Do the questions appear relevant?
Validity (continued)
• Criterion or predictive validity: does the measurement predict something we would like to predict?
• Examples: the ACT and success in college (GPA); screening tests and future job performance (a sketch follows below)
• Determined by applying the measure and then checking how well it would have predicted the criterion
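A minimal sketch of checking criterion (predictive) validity: correlate the measure with the outcome it is supposed to predict. The ACT scores and GPAs below are invented for illustration.

```python
# Validity coefficient: correlation between a predictor and its criterion.
import numpy as np

act = np.array([18, 21, 23, 24, 26, 28, 30, 32])         # predictor
gpa = np.array([2.4, 2.6, 2.9, 3.0, 3.1, 3.3, 3.6, 3.7])  # criterion

r = np.corrcoef(act, gpa)[0, 1]
print(f"validity coefficient r = {r:.2f}")
# The stronger the correlation with the criterion (college GPA),
# the stronger the evidence of predictive validity.
```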
Validity (continued)
• Content validity: the degree to which a measure covers the range of meanings within the concept
• Examples: an achievement test, a senior test, an attitude test, a personality trait
• Construct validity: based on the way a measure relates to other variables within a system of theoretical relationships (e.g., Hirschi)
Other considerations
• A measure can be reliable but not valid; it cannot be valid unless it is reasonably reliable
• Norms: measures that provide a basis for comparison (a z-score sketch follows below)
• Precision: the fineness of distinction in measurement. How precise? In the social sciences, we are not very precise
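A minimal sketch of how norms are used: convert a raw score to a z-score against a norm group so it can be compared across measures. The norm-group mean and standard deviation below are hypothetical.

```python
# Comparing a raw score to a norm group via a z-score.
norm_mean, norm_sd = 100.0, 15.0  # hypothetical norm-group parameters
raw_score = 118.0

z = (raw_score - norm_mean) / norm_sd
print(f"z = {z:.2f}")  # about 1.2 SDs above the norm-group mean
```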