General Education Assessment AAC&U GE and Assessment Conference March 1, 2007
Program Assessment: an ongoing process designed to monitor and improve student learning. Faculty: • Develop learning outcomes • Verify alignment • Collect assessment data • Close the loop
Definitions • Direct vs. Indirect Assessment • Embedded Assessment • Authentic Assessment • Formative vs. Summative Assessment • Triangulation
Assessment Steps • Define learning outcomes • Check for alignment • Develop an assessment plan • Collect assessment data • Close the loop • Improve the assessment process
Learning Outcomes • Clarify what faculty want students to learn • Clarify how the assessment should be done
Types of Outcomes • Knowledge • Skills • Values
Levels of Outcomes • Course • Program • Institutional
GE Program Outcomes • Focus on how students can demonstrate their learning • Should be widely distributed • Should be known by all stakeholders • Guide course and curriculum planning • Encourage students to be intentional learners • Focus assessment efforts
Goals vs. Outcomes: Examples
Types of GE Outcomes • Short list of more general outcomes • Longer list of outcomes related to specific requirements
Ensuring/Verifying Alignment • Course Certification • Course Recertification • Alignment Projects
GE Alignment Questions • Curriculum Cohesion • Pedagogy and Grading • Support Services • GE Instructors • Learning-Centered Campuses
Cohesive Curriculum • Coherence • Synthesizing experiences • On-going practice • Systematically created opportunities to develop increasing sophistication
Alignment Matrix (Curriculum Map) • I = Introduced • D = Developed & Practiced with Feedback • M = Demonstrated at the Mastery Level Appropriate for Graduation
Assessment Plan • Who? • What? • When? • Where? • How?
Assessment should be meaningful, manageable, and sustainable.
We don’t have to assess every outcome in every student every year.
Levels of GE Assessment • Course-Level • Program-Level • Institution-Level
Sampling • Relevant Samples • Representative Samples • Reasonably-Sized Samples
Ethical Issues to Consider • Anonymity • Confidentiality • Informed Consent • Privacy
Sample Assessment Plans Find examples of: • Direct assessment • Indirect assessment • Formative assessment • Summative assessment • Alignment-related assessment • Triangulation
Assessment Techniques • Direct Assessment • Indirect Assessment
Properties of Good Assessment Techniques • Valid • Reliable • Actionable • Efficient and Cost-Effective • Engaging to Respondents • Interesting to Us • Triangulation
Direct Assessment • Published Tests • Locally-Developed Tests • Embedded Assessment • Portfolios • Collective Portfolios
Indirect Assessment • Surveys • Interviews • Focus Groups
Rubrics • Holistic Rubrics • Analytic Rubrics
Rubric Strengths • Efficient use of faculty time • Precisely define faculty expectations • Training can be effective • Criterion-referenced judgments • Can be used by others
Using Rubrics for Grading and Assessment • Numbers for grading • Categories for assessment • Numbers and other criteria under individual faculty control • Speed up grading • Provide formative feedback
Using Rubrics in Courses 1. Hand out rubric with assignment. 2. Use rubric for grading. 3. Develop rubric with students. 4. Students apply rubric to examples. 5. Peer feedback using rubric. 6. Self-assessment using rubric.
Creating a Rubric • Adapt an existing rubric • Analytic approach • Expert-systems approach
Managing Group Readings • One reader per document • Two independent readers per document • Paired readers
Before inviting colleagues: • Develop and pilot test rubric. • Select exemplars. • Develop a recording system. • Consider pre-programming a spreadsheet.
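The pre-programmed recording sheet suggested above can be sketched in code. This is a minimal, hypothetical example — the rubric levels, outcome name, and summary functions are assumptions for illustration, not part of the presentation — showing how a group reading's rubric scores might be tallied so readers can see the score distribution and the share of work meeting expectations.

```python
from collections import Counter

# Hypothetical four-level rubric (e.g., 1 = beginning ... 4 = exemplary)
LEVELS = [1, 2, 3, 4]

def tally_scores(scores):
    """Count how many student documents received each rubric level.

    scores: list of integer rubric levels, one per document read.
    Returns a dict mapping each level to its count (0 if unused).
    """
    counts = Counter(scores)
    return {level: counts.get(level, 0) for level in LEVELS}

def percent_at_or_above(scores, threshold):
    """Percentage of documents scored at or above a target level."""
    if not scores:
        return 0.0
    return 100.0 * sum(1 for s in scores if s >= threshold) / len(scores)

# Example: scores for one outcome, as recorded during a group reading
writing_scores = [3, 4, 2, 3, 3, 4, 1, 3]
print(tally_scores(writing_scores))            # distribution across levels
print(percent_at_or_above(writing_scores, 3))  # share meeting expectations
```

In practice the same tallies are easy to set up as spreadsheet formulas (COUNTIF per level); the point is simply to have the recording and summarizing machinery ready before colleagues sit down to read.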
Closing the Loop • Celebrate! • Change pedagogy • Change curriculum • Change student support • Change faculty support
Bringing It All Together • Campus-wide conversations • Institution-wide implications for faculty/staff development • Quality-assurance process • Reporting structure • Implications for funding or infrastructure development
Some Friendly Suggestions • Focus on what is important. • Don’t try to do too much at once. • Take samples. • Pilot test procedures. • Use rubrics. • Close the loop. • Keep a written record.