PROGRAM REVIEW & ASSESSMENT Overview SERC Program Assessment
Why assess your instructional program? • For all the reasons Cathy reviewed! • May be required in program performance reviews or curricular reform proposals • Part I: Program Performance Reviews • Part II: Instructional Program Assessment • Assessment strategies • Data collection
PROGRAM REVIEW & ASSESSMENT Part 1: Program Performance Reviews (PPR)
Common Elements of a PPR • Department Context • SWOT Analysis • Learning Assessment • Goals/strategies/actions for next review period
Background Work • A. Department Context • Mission statement • What we are • Vision statement • What we strive to be • Student Learning Outcomes (SLO) statement • What our graduates are capable of • Examples of these statements are available at • Assessment Planning Documents • Case Study: Cal State Fullerton’s Department of Geological Sciences
Vision Statement • Details from the mission-crafting discussion were distilled into a Vision:
Student Learning Outcomes Statement • Carleton's SLO
A. Department Context • Mission statement • What we are • Vision statement • What we strive to be • Student Learning Outcomes (SLO) statement • What our graduates are capable of • What is the basis for these claims? • The scientific “urge” to back up Mission/SLO claims with data motivates program assessment. • How do we attain our Vision? • The problem-solving “urge” motivates identifying goals, strategies, and action plans in response to the Vision.
B. SWOT Analysis • An exercise in hearing all voices: • Arm yourself with a pen and sticky notes • IN SILENCE, visit all 4 stations (S-W-O-T) and post your ideas about OUR department • You may revisit stations and reflect on notes, but do so in silence • Each group of 3-4 should gather in front of one of the stations and, IN SILENCE, take turns grouping/rearranging notes into themes. • Once done, construct statements that summarize the various themes at your station. • Report back to whole group.
B. SWOT Analysis • Suggestion: Solicit alumni input prior to the exercise regarding their perceptions of strengths/weaknesses. • Have one member of each group volunteer for a SWOT subcommittee that finalizes the SWOT statement. • The subcommittee can also perform the following analyses: • Leverage (O+S) • Vulnerability (T+S) • Constraints (O+W) • Problems (T+W)
C. Assessment of Student Learning • Develop an assessment plan to gauge the degree to which the SLOs and the teaching-related elements of the Vision are achieved, e.g.:
C. Assessment of Student Learning • UT Austin site on Assessment of Instructional Programs Assessment Resources • Planning steps (with worksheets!): • Describe program context • Identify stakeholders and their central questions • Students: Will this degree prepare me/help me get a job? • Determine the evaluation purpose • Identify intended uses of data • Create an evaluation plan
D. Goals • Goals can be identified by inspecting Vision statements and asking • What are we currently doing to embody this Vision? • Is there something we could do differently to more fully realize our vision? • WHAT IS YOUR THEORY OF CHANGE?? • Make a logic model! • Prioritize goals (1-3 years; 3-6 years) • Identify strategies to achieve goals • Develop an action plan • Assign something to everyone!
INSTRUCTIONAL PROGRAM ASSESSMENT (IPA) Part 2: Assessment Strategies
Common elements of an IPA • Assessment of Student Learning • Define learning objectives for individual courses • Analysis of degree curriculum • Benchmark exams • Capstone experiences • Student surveys • Employer surveys • There are LOTS of examples of these on the SERC site... • Program Metrics and Instruments
Student Learning: Individual Courses • Define learning objectives for individual courses • “Successful completion satisfactorily demonstrates student mastery of the SLOs” • Assessed by pass/fail rates • Student ratings of instruction • Annual student achievement data for high-enrollment, “standardized” classes • Performance exams • Pre-/post-testing • Assessed by changes in scores (see the gain calculation sketched below)
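A common way to summarize pre-/post-test changes is the normalized gain: the actual gain expressed as a fraction of the maximum possible gain. A minimal sketch in Python, assuming scores are percentages on a 0–100 scale; the function name and sample scores are illustrative, not taken from any particular instrument:

def normalized_gain(pre: float, post: float) -> float:
    """Normalized gain: actual gain divided by the maximum possible gain.

    Assumes scores are percentages (0-100); a value of 1.0 means the
    student reached a perfect post-test score.
    """
    if pre >= 100:
        raise ValueError("Pre-test score already at ceiling; gain undefined.")
    return (post - pre) / (100 - pre)

# Illustrative class-level use: average the individual gains.
pre_scores = [42, 55, 63, 70]
post_scores = [68, 74, 81, 85]
gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
print(f"Mean normalized gain: {sum(gains) / len(gains):.2f}")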
Analysis of degree curriculum • “Map” courses to the degree’s SLO statement using the SLOs for individual courses • Assessment then consists of demonstrating that the SLOs are met when students pass the required coursework (a small mapping sketch follows) • Pt. #4, Carleton College's Assessment Plan • “Maps” can also be made to the institution’s Mission and Goals • This is a useful springboard for redesigning the curriculum…
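One lightweight way to keep and check such a map is a simple table of which required courses address which SLOs; coverage gaps then surface automatically. A minimal sketch; the course numbers and SLO labels below are hypothetical placeholders, not any department's actual curriculum:

# Hypothetical curriculum map: required course -> SLOs it addresses.
curriculum_map = {
    "GEOL 101": {"SLO-1"},
    "GEOL 210": {"SLO-1", "SLO-2"},
    "GEOL 330": {"SLO-3"},
    "GEOL 490": {"SLO-2", "SLO-4"},
}

program_slos = {"SLO-1", "SLO-2", "SLO-3", "SLO-4", "SLO-5"}

covered = set().union(*curriculum_map.values())
uncovered = program_slos - covered
print("SLOs covered by required coursework:", sorted(covered))
print("SLOs not addressed by any required course:", sorted(uncovered))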
Student Learning: Benchmark exams • Annual exams • e.g., Winona State • http://serc.carleton.edu/departments/assessment/instruments/Winona_annual_exam.html • How to keep students serious about taking a demanding, ungraded exam? • Exit exams • e.g., ASBOG • Does the content reflect your degree’s SLO?
Student Learning: Capstone Experiences • Research experiences/theses, assessed by • satisfactory completion • Rubrics used by advisors • presentation at professional meetings • In-house oral presentation • Research Days http://serc.carleton.edu/departments/assessment/instruments.html • Judged by industry and/or colleagues from neighboring institutions • Field camp, assessed by • satisfactory completion • final field exam
PROGRAM ASSESSMENT Part 3: Collecting Data
Student Surveys • Exit surveys and interviews • Surveys given in classes populated by seniors, or by mail • Interviews conducted by the Chair or by an advisor • Alumni surveys • Collect similar data over an extended time period • May want to separate data by graduation date (a small grouping sketch follows) • Can add one-time questions to guide curriculum development, scheduling, outreach, etc. • Collect data on graduate programs for use in advising seniors • Lists of Exit/Alumni Survey Questions
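Once survey responses live in a spreadsheet or CSV, separating them by graduation date is a short grouping operation. A minimal sketch using pandas, assuming a hypothetical export named alumni_survey.csv with grad_year and preparation_rating columns (both names are placeholders):

import pandas as pd

# Hypothetical survey export; column names are placeholders.
responses = pd.read_csv("alumni_survey.csv")  # columns: grad_year, preparation_rating, ...

# Bin graduates into 5-year cohorts and compare response counts and mean ratings.
responses["cohort"] = (responses["grad_year"] // 5) * 5
summary = responses.groupby("cohort")["preparation_rating"].agg(["count", "mean"])
print(summary)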
Employer Surveys • Brainstorm re possible survey questions • How many graduates employed? • Strengths/weaknesses? • Compare to graduates of other programs? • Incentives to participate? • Possibility of improving the “product”? • Opportunity to give expert advice • Role of Advisory Boards
Web-based Survey Instruments • http://serc.carleton.edu/departments/assessment/survey_tools.html • SurveyMethods, Zoomerang, SurveyMonkey • Note factors to consider • Cost (group rate for your institution?) • Number of expected responses • Types of questions you’d like to use • Desired access to raw data • How to “code” qualitative data (a simple tallying sketch follows) • http://www.utexas.edu/academic/diia/assessment/iar/programs/report/focus-QualCode.php?task=programs
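“Coding” qualitative data means tagging each open-ended response with one or more agreed themes and then tallying them. In practice the tagging is done by human readers working from a codebook; the sketch below only automates the bookkeeping with naive keyword matching, and every code and response in it is invented:

from collections import Counter

# Invented codebook: theme -> keywords that suggest it.
codebook = {
    "fieldwork": ["field camp", "field trip", "mapping"],
    "advising": ["advisor", "mentor", "advice"],
    "career prep": ["job", "career", "internship"],
}

responses = [
    "Field camp was the highlight; more mapping practice would help.",
    "My advisor helped me find an internship that led to a job.",
    "I wanted more career advice earlier in the program.",
]

tally = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(k in lowered for k in keywords):
            tally[theme] += 1

print(tally.most_common())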
Web-based Survey Instruments • Brainstorm: How could you get acceptable response rates? • Consider low/no cost options • Response rates (link/eval prog w/surveys) • Note: “Guidelines for maximizing response rates” • http://www.utexas.edu/academic/diia/assessment/iar/programs/gather/method/survey.php • On-line surveys: • forewarn; remind; provide incentives
Closing Thoughts • A possible key to obtaining faculty “buy-in” to program reviews and learning assessments is crafting the Mission, Vision, and SLO statements together • Get their claims of success on record, then let the scientist’s urge to collect supporting data and to solve problems kick in! • Hold a retreat, and hire a professional facilitator • Successful implementation of a quantitative assessment plan will increase the integration of your educational and research missions.