Evolution of Institutional Capacity to Support the Assessment-Change-Effectiveness Cycle: An Undergraduate Science Program Case Study Dr. Mary Spencer Dr. Chris Lobban Dr. María Schefter Dr. Greg Witteman University of Guam
Introduction • Pre-assessment of attendees • Synthesis of 5 years of student outcomes assessment at UOG (Dr. Spencer) • Evaluation and the RISE Program • Background and framework (Dr. Schefter) • Example from the classroom (Dr. Lobban) • Assessment and information technology (Dr. Witteman) • Interactive sharing and discussion • Wrap-up • Post-assessment
UOG’s NIH RISE Program • RISE program is a broad and flexible grant for student, faculty, and institutional development. • Long-term goal is more minority PhDs in biomedical research. • Short-term goal is to increase motivation and capacity for biomedical research.
UOG’s NIH RISE Program • Evaluation is required. • Goals/objectives for NIH must be in terms of measurable outcomes for students, faculty, or the institution. • Present program includes student apprenticeships in research labs and a science technology classroom, plus faculty development opportunities.
How reluctant scientists are getting involved in evaluation • Evaluation standards • Engaging scientists • Goals and measurable objectives • Student input • Closing the loop
The “big picture” • In the present paradigm of biology, life is organized into “levels,” or systems, with each level having “emergent” properties not seen in the parts. • Analogous to an institution of higher education? • The “big picture” of our assessment efforts. • Assessment of RISE is multi-level as well as multidisciplinary.
Levels of Organization: organism (individual bat), body system (skeletal system), organ (leg bone). [Part of an illustration in Lobban & Schefter (1997)]
Courses and workshops • Factual/conceptual knowledge of course content • Lab skills appropriate to the course • Specific training in (e.g.) computer skills • Specialized science reading/writing skills (e.g., lab reports) • Links between course objectives and program/institutional/gen. ed. outcomes [Diagram: educational outcomes “levels” for assessment, nested as Community > University > Majors > Courses & workshops]
Majors (discipline-specific training/education) • Factual/conceptual knowledge of the field • Proficiency in using scientific literature • Ability to perform appropriate data collection/analysis • Apprenticeship experiences in research labs [Diagram: educational outcomes “levels” for assessment, nested as Community > University > Majors > Courses & workshops]
University education • Reading/writing/analytical skills (GRE) • Critical thinking skills • Computer literacy • Presentation skills • General education outcomes [Diagram: educational outcomes “levels” for assessment, nested as Community > University > Majors > Courses & workshops]
Career/Community level • Career success (as a PhD researcher or other) • Number of scientific findings • Number of PhD researchers • Community service: science education, biota and (endangered) species survey work [Diagram: educational outcomes “levels” for assessment, nested as Community > University > Majors > Courses & workshops]
NIH RISE Program: Assessment-change-effectiveness cycle(s)? [Diagram: Assessment and Change acting on the nested levels Community > University > Majors > Courses & workshops]
Grass roots • Learning objectives • 3 parts (Mager) • Observable behavior (esp. verb… Bloom) • Of what…? • Criteria, e.g., scoring rubric
COGNITIVE PROCESS DIMENSION (columns): Remember, Understand, Apply, Analyze, Evaluate, Create
KNOWLEDGE DIMENSION (rows): Factual knowledge, Conceptual knowledge, Procedural knowledge, Metacognitive knowledge
Grass roots • Strengths and weaknesses • Student self-ratings • Faculty ranking of skills by courses
Faculty assessment of skills for courses:
A. Students need this skill as a prerequisite.
B. Students need the basic skill and I help them with it.
C. I teach students this.
D. Could be helpful in the course but not necessary.
Cross out the skill if it is not useful in your course. Also, please put a star by the number if you think students will need this skill in most graduate biomedical/behavioral programs.
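A minimal sketch of how responses to a survey like this might be tabulated to spot curriculum gaps, assuming each faculty response is recorded as a (course, skill, rating) triple; the course names, skill names, and ratings below are hypothetical placeholders, not data from the UOG survey.

```python
# Hypothetical tabulation of the faculty skill-by-course survey described above.
# Ratings: "A"-"D" as defined on the form, "*" suffix = starred as needed for
# graduate biomedical/behavioral programs.
from collections import defaultdict

responses = [
    ("BI 157", "reading primary literature", "B*"),
    ("BI 302", "reading primary literature", "C*"),
    ("BI 157", "graphing data", "C"),
    ("BI 302", "statistical analysis", "A*"),
    ("BI 419", "statistical analysis", "D"),
]

# Group ratings by skill so each skill can be examined across courses.
by_skill = defaultdict(dict)
for course, skill, rating in responses:
    by_skill[skill][course] = rating

# Flag skills starred as needed for graduate work that no surveyed course teaches
# (no "C" rating anywhere).
for skill, ratings in by_skill.items():
    starred = any(r.endswith("*") for r in ratings.values())
    taught = any(r.startswith("C") for r in ratings.values())
    if starred and not taught:
        print(f"Gap: '{skill}' is needed for graduate programs but not taught "
              f"(ratings: {ratings})")
```

A summary like this is one way the faculty rankings could feed the assessment-change-effectiveness cycle: gaps between starred skills and what courses actually teach become candidates for curricular change.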
Introduction • Pre-assessment of attendees • Synthesis of 5 years of student outcomes assessment at UOG (Dr. Spencer) • Evaluation and the RISE Program • Background and framework (Dr. Schefter) • Example from the classroom (Dr. Lobban) • Assessment and information technology (Dr. Witteman) • Interactive sharing and discussion • Wrap-up • Post-assessment
To gain an understanding of Pacific Island environments and the ecological principles on which they operate: the ecosystems (reefs, forests, savanna, wetlands); the biological, physical, and chemical processes and interactions that regulate these systems; and the ways in which humans affect and are affected by the natural environment. • Your understanding will be tested through your skills in: • interpreting – e.g., changing classification diagrams into text or vice versa; reading graphs • exemplifying – e.g., giving an example of … • classifying – e.g., being able to classify the trophic level of an animal from a food web diagram • summarizing – e.g., being able to summarize the process by which Darwin arrived at his hypothesis of atoll formation • inferring – e.g., drawing a logical conclusion from presented information • comparing – e.g., determining how similar things are as a criterion for applying analogy • explaining – e.g., explaining the cause of drought during El Niño
Understand the scientific process Darwin used and how his hypothesis of atoll formation was tested. Became… • Be able to summarize the process by which Darwin arrived at his hypothesis. (Do NOT state or explain his hypothesis.) • Be able to explain why Darwin’s model of atoll formation was a scientific hypothesis (i.e., not a belief/statement of faith, nor idle speculation). • Using Darwin’s hypothesis, be able to infer the relative ages of two oceanic islands given maps of them. • Be able to recall what was done to test Darwin’s hypothesis.
Evaluation Plan to Determine Program Outcomes (NIGMS-MORE) • Describe formative evaluations: evaluations carried out during the course of implementing activities to assess their suitability for the need. • Describe summative evaluations: evaluations carried out at the end of the activity to assess the outcome. • Discuss the use of qualitative and quantitative data collection methods. • State when in the course of implementing the activity data will be collected. • State any plans to make a mid-course modification of activities if formative evaluations indicate a need to change. • Provide examples of questionnaires to be used to collect qualitative information such as perceptions of participants. • State how data will be analyzed and provide the types of statistical methods to be used, if any, to test the reliability of the data. • Identify who will collect and analyze the data and provide credentials of the person(s) selected for collection and analysis of data. • Source: NIGMS-MORE Division
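As a concrete illustration of the "statistical methods to test the reliability of the data" item, here is a minimal sketch computing Cronbach's alpha for a Likert-style questionnaire. Cronbach's alpha is one common reliability statistic but is not prescribed by NIGMS-MORE, and the response matrix below is made-up placeholder data, not RISE program data.

```python
# Hedged sketch: internal-consistency reliability (Cronbach's alpha) for a
# questionnaire whose items are numeric Likert ratings. All data are invented.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: rows = respondents, columns = questionnaire items."""
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 5 respondents x 4 items rated 1-5, purely illustrative.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

In practice the same response matrix could also feed the formative/summative comparisons the plan calls for, e.g., pre- versus post-activity means for each item.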
Introduction • Pre-assessment of attendees • Synthesis of 5 years of student outcomes assessment at UOG (Dr. Spencer) • Evaluation and the RISE Program • Background and framework (Dr. Schefter) • Example from the classroom (Dr. Lobban) • Assessment and information technology (Dr. Witteman) [Click to continue slide show or to download next ppt file] • Interactive sharing and discussion • Wrap-up • Post-assessment