Part 2: Evaluating your program
• You can download this presentation at: http://faculty.smcm.edu/acjohnson/PREP/
Evaluating your program
• What are your objectives for your program?
• What data would let you know you’re meeting those objectives?
• What data would convince your administration to keep funding the program?
Evaluating your program
• Informal assessments: to let you know that the program is working, and to fine-tune it as you go along
  • Observations of student progress, conversations with students, informal surveys
Evaluating your program
• Formal assessments
  • Of the first year
  • Longer-term
Evaluating your program
• The basics:
  • Comparison groups
  • Independent variables
  • Dependent variables
Comparison groups
• Historical: ESP (Emerging Scholars Program) participants vs similar students before ESP existed
• Comparable: ESP participants vs similar students not in ESP
• To the norm: ESP participants vs all non-participants
• To decliners: ESP participants vs students who declined an invitation to ESP
Independent variables
• ESP participation
• Race
• Gender
• Academic preparation (SAT scores; Calculus Concept Inventory (CCI) pre-test)
• Financial need
• Motivation
Dependent variables
• CCI post-test scores
• CCI growth scores
• Calc grades
  • Raw data; % receiving an A or B; % failing
• Enrollment/grades in Calc II
• Declaring a SEM (science, engineering, or math) major
• Graduating at all
• Graduating with a SEM major
Analyzing the data
• Descriptive statistics: simply compare the performances of the relevant groups
• Are differences in grades or scores significant? Independent-samples t-tests
• Are differences in the percent of students doing something (getting As & Bs, graduating) significant? Chi-square tests
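The two tests above can be sketched in Python with scipy (assumed available); every grade and count below is invented purely for illustration:

```python
# Hypothetical grade data for ESP participants vs a comparison group;
# all numbers here are made up for illustration only.
from scipy import stats

esp_grades = [3.7, 3.3, 4.0, 3.0, 3.7, 3.3, 2.7, 3.7]
comparison_grades = [3.0, 2.7, 3.3, 2.3, 3.0, 2.0, 2.7, 3.0]

# Independent-samples t-test: are the mean grades significantly different?
t_stat, t_p = stats.ttest_ind(esp_grades, comparison_grades)
print(f"t = {t_stat:.2f}, p = {t_p:.4f}")

# Chi-square: is the proportion earning an A or B significantly different?
# Rows: ESP / comparison; columns: counts of (A or B, below B).
table = [[30, 10],   # ESP: 30 of 40 earned an A or B
         [45, 45]]   # comparison: 45 of 90 earned an A or B
chi2, chi_p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {chi_p:.4f}")
```

A p-value below your chosen threshold (conventionally 0.05) indicates the group difference is unlikely to be chance alone.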
Analyzing the data
• Controlling for preparation: divide the data into groups according to some measure of preparation, then compare ESP and non-ESP students within each group
Analyzing the data
• Controlling for preparation: construct a regression equation using all your available independent variables; see whether ESP participation is a significant predictor of the dependent variable of interest
• For a continuous dependent variable: OLS regression; for a binary one: logistic regression
Calculus Concept Inventory
• Pros
  • This is the gold standard: did the students learn calculus? Did they learn more than other students?
  • This approach (at least the descriptive stats) can be used for small n
• Cons
  • Limited number of test items; the test might not be reliable or valid enough for your comfort
  • Requires access to all calculus students, not just ESP students
Calc grades, SAT & GPA data
• Pros
  • Lets you control for preparation
  • Administrators like statistical analyses
• Cons
  • Someone has to like stats; you might need SPSS or similar software
  • You have to find someone in institutional research to give you the data
  • Requires a substantial n
Examples
• Fullilove & Treisman, 1990
  • Comparison groups
    • Historical: pre-MWP (Mathematics Workshop Program) African Americans
    • African American accepters & decliners
  • Preparation measures
    • Special admission? Math SAT scores
  • Dependent variables
    • Calc performance, graduation
Examples
• Johnson, 2007a
  • Comparison groups (all with 1st major in science): White/Asian vs Black/Latino/American Indian
  • Independent variables: financial need, predicted GPA
  • Dependent variables: graduation with a science/math major, grad GPA
Expanding your program
• Evidence that matriculation-to-graduation programs produce even bigger benefits:
  • Johnson (2007a)
  • Maton, Hrabowski & Schmitt (2000)
  • Maton & Hrabowski (2004)
  • Gándara (1999)