
MSA830: Introduction


Presentation Transcript


  1. MSA830: Introduction Petter Mostad mostad@chalmers.se

  2. MSA830 homepage http://www.math.chalmers.se/Stat/Grundutb/GU/MSA830/H07/

  3. Chapter 1: Scientific investigation • Scientific investigations • In a research group • In an organization, workplace, factory… • Empirical basis of science • Making observations • Experimenting

  4. Inductive – deductive learning • Model for science: inductive – deductive iteration • Increasingly good prediction/explanation of data (i.e., the real world) • [Slide diagram: a repeated cycle in which a model, theory or idea leads to data by deduction, and data lead back to a revised model, theory or idea by induction]

  5. Alternative formulation: Updating knowledge • At any time, we have a “model for our knowledge about something” • The model may contain probabilities, to indicate uncertainties • When we make observations (either directly or after experimentation) we update our model about reality (changing probabilities, or changing the model) • We make observations to update the parts of the model that interest us • The process is iterative
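
A minimal Python sketch (my own illustration, not part of the original slides) of what "changing probabilities" in the knowledge model can look like in the simplest case: two competing models and one observation whose likelihood under each model is assumed known. The model names and numbers are hypothetical.

# Bayes' rule: prior knowledge + likelihood of the observation -> updated knowledge
prior = {"model_A": 0.5, "model_B": 0.5}        # current state of knowledge
likelihood = {"model_A": 0.8, "model_B": 0.3}   # P(observation | model), assumed known

unnormalized = {m: prior[m] * likelihood[m] for m in prior}
total = sum(unnormalized.values())
posterior = {m: p / total for m, p in unnormalized.items()}
print(posterior)  # probability shifts towards the model that predicted the observation better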

  6. Different paths of discovery • Many different paths can lead to similar models • Goal: An efficient path • Example: the 20-questions game • Very different sequences of questions can lead to the same result • Efficient investigation: each question should have roughly equal probability of a yes or no answer • No “objective” probabilities: they depend on your current model • Subject matter knowledge: in this case, your opponent's cultural background and way of thinking
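
As a concrete check of the "equal probability of yes or no" rule (my own sketch, not from the slides), the expected information gained from a yes/no question, measured by the binary entropy, is largest when the two answers are equally probable under your current model:

import math

def expected_information(p_yes):
    # Binary entropy in bits: expected information from a yes/no question
    # whose answer is "yes" with probability p_yes under your current model.
    p_no = 1.0 - p_yes
    return -sum(p * math.log2(p) for p in (p_yes, p_no) if p > 0)

for p in (0.1, 0.3, 0.5, 0.9):
    print(f"P(yes) = {p}: {expected_information(p):.3f} bits")
# The maximum, 1 bit per question, is reached at P(yes) = 0.5.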

  7. Complexity • Any model models only a small part of reality • Challenge: To simplify away all that is irrelevant in this context • Essential feature: observable predictions • Identify the parts of the model you want to learn more about • Any model predicts only approximately

  8. Experimental error • Not that something has been done wrong! • The discrepancy between the model and the observed values • Good models have small experimental errors (while still being as simple as possible) • Statistics can be used to formulate models that explicitly contain experimental error
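
A hypothetical sketch of what a model containing experimental error can look like: observations are a systematic part plus a random error term, and the experimental error is the discrepancy between the model's prediction and the observed value. All numbers below are made up for illustration.

import random

# Hypothetical linear model with experimental error:
#   y = a + b * x + error,   error ~ Normal(0, sigma)
a, b, sigma = 2.0, 0.5, 0.3   # assumed "true" values for the simulation

def observe(x):
    # One observation: the model's prediction plus random experimental error.
    return a + b * x + random.gauss(0.0, sigma)

x_values = [0, 1, 2, 3, 4]
observations = [observe(x) for x in x_values]
discrepancies = [y - (a + b * x) for x, y in zip(x_values, observations)]
print(discrepancies)  # the experimental errors: observed values minus model predictions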

  9. Designing experiments • Idea of experiment: To learn as much as possible about what you have questions about • The effects you want to learn about must not be obscured (confounded) by experimental error • Objective: Design experiment so that observing outcomes is likely to give as much information as possible about the questions you have

  10. Link between experimental design and statistical analysis • In order to optimize the experiment, you necessarily have to consider how you are going to learn from it • This means considering the statistical analysis before the experiment is performed • Possible framework: update of a statistical model

  11. Correlation versus causation • Examples: • Storks cause births? • Smoking causes depression? • … • Problem: Several different causal models can explain the same data • Often: An underlying unobserved factor influences both the observed factors, making them correlated
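
A small simulation (my own illustration, not from the slides) of the last point: x and y do not influence each other at all, but an unobserved factor z drives both, so they come out correlated.

import random

n = 10000
z = [random.gauss(0, 1) for _ in range(n)]          # unobserved common cause
x = [zi + random.gauss(0, 1) for zi in z]           # influenced by z, not by y
y = [zi + random.gauss(0, 1) for zi in z]           # influenced by z, not by x

def correlation(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    var_u = sum((a - mu) ** 2 for a in u) / n
    var_v = sum((b - mv) ** 2 for b in v) / n
    return cov / (var_u * var_v) ** 0.5

print(correlation(x, y))  # about 0.5, despite no causal link between x and y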

  12. The advantage of experiments over just making observations • In an experiment, the experimenter goes in and decides some of the experimental parameters • The way this happens may make some causal explanatory models very unlikely • Example: The experimenter rolls a die to decide which patients will receive which treatment • The key is being able to refute certain explanatory models in favour of others. Example: Mendelian randomization.
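
A sketch of the randomization step in the die-rolling example (patient and treatment names are hypothetical): the assignment depends only on the random draw, not on any property of the patients, which rules out explanatory models in which, say, the sicker patients systematically end up on one treatment.

import random

# Randomly assign each patient to one of two treatments.
# The assignment is generated independently of patient characteristics.
patients = ["patient_1", "patient_2", "patient_3", "patient_4", "patient_5", "patient_6"]
assignment = {p: random.choice(["treatment_A", "treatment_B"]) for p in patients}
print(assignment)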

  13. Experimental versus non-experimental inference • Randomized experiments are generally the “gold-standard” of science • Sometimes, randomized experiments are difficult. Examples: • Effects of social parameters on people • Clinical testing of treatments for fatal diseases • Questions in cosmology, geology, … • We must then find other ways to differentiate between different explanatory models

  14. Scientific investigation in practice • Iterative in nature • Models should predict as well as possible, while being as simple as possible • Statistics is a precise way to formulate models with experimental errors • Models should reflect relevant parts of all available knowledge • Define your objective! • The statistical analysis should be considered before designing any experiment
