Blazing a Path and Not Getting Burned: Developing Assurance of Learning Objectives in General Education
Valerie Whittlesey, Mary Lou Frank, Marlene Sims, Amy Howton, Jan Phillips, and Tom Doleys
Kennesaw State University (KSU) • KSU is part of the University System of Georgia; located in metropolitan Atlanta • Enrollment is approximately 18,000 students • More than 55 bachelor’s and master’s degree programs • Ethnic minority enrollment is 21%; more than 1,000 international students representing 123 countries
FOCUS OF LEARNING OUTCOMES • UNIT OF ANALYSIS: THE GRADUATE OF THE PROGRAM • STUDENT LEARNING ASSESSMENT ISSUES: KNOWLEDGE OF GRADUATES • SKILLS OF GRADUATES • ATTITUDES/VALUES OF GRADUATES • EDUCATIONAL GROWTH/GAINS
ASSESSMENT REPORTING PRIOR TO 2003 • From 1994 to 2003, academic departments at KSU sent their annual assessment reports to the Office of Institutional Planning. • These nonstandard annual reports, ranging from 1 to 50+ pages, were simply filed away with no follow-up or feedback.
ASSURANCE OF LEARNING INITIATIVE AT KSU • Established an Assurance of Learning (AOL) Council in Spring of 2003 with High-Level Academic Leadership (AVPAA as Chair). • Council Membership Includes Two Assessment-Oriented Faculty from each College, the Director of Excellence in Teaching & Learning, and the Director of Institutional Effectiveness.
AOL COUNCIL FUNCTIONS • The AOL Council serves as a steering committee and advisory/consultative group to academic programs as they design, implement, and improve their assessments of student learning outcomes. • The AOL Council provides common structures and good practices for learning outcomes assessment and collegial formative feedback on annual reports at the program level.
SEVEN ELEMENTS OF STUDENT LEARNING OUTCOMES ASSESSMENT • Articulating Student Learning Outcomes • Connecting Outcomes to the Curriculum • Connecting Outcomes to Assessment Methods • Articulating Expected Results with Respect to Outcomes • Articulating the Assessment Plan & Timetable for Collecting Data • Collecting, Analyzing, & Interpreting Data • Using Results for Improvement
Campus Environment • Not excited • Questioning the usefulness of assessment • Fear of change • Fear of more, more, more….
Development of a Process: Not getting burned • Campus-wide committee • Encourage and support leadership • Open meetings • Continue to support and encourage the process
Developing General Education Learning Outcomes: Opportunities • Interdisciplinary • Connect all of the disciplines in the General Education Program • Applicable across the General Education curriculum • Each course contributes to the student’s attainment of several different outcomes • Each outcome is reinforced by multiple courses in the curriculum
Challenges: Sublimating departmental goals and objectives to those of the General Education Program • The English Department proposed an additional general education learning outcome addressing proficiency in writing and rhetoric • The Communication Department interpreted Goal 2, “Demonstrate proficiency in communication,” as referring to their discipline
Challenges • Incorporating faculty input from both the General Education Council and the KSU faculty as a whole vs. completing the process of writing outcomes in order to move to the next step in the assessment process
STANDARDIZED TESTS EXAMINED AT KSU • CAAP • ETS • BASE • NSSE
NSSE • Used by KSU because of our involvement with • Foundations of Excellence • American Democracy Project • Enables KSU to assess various elements of student success, especially student engagement
PROS • The instruments already exist • Statistical results would be sent to us by the parent company • They have been used by other General Education programs • Results could be compared with those of schools having similar demographics • Might help us in drafting our Specific Learning Objectives (SLOs)
CONSIDERATIONS • Would we choose to administer to • All General Education Students? • A random sample of our General Education Students? • At the end of the sophomore year (the usual choice by institutions employing these assessment instruments)? • At the end of the degree program?
CONS • Disconnect between the standardized instruments and our Goals and Objectives at KSU • Do not address many of our General Learning Objectives (GLOs) • Do not assess whether students have made connections between the various GLOs • Often do not assess the material within the discipline in the way it is taught at KSU
EXAMPLES • MATH • KSU’s courses are geared toward application and toward explaining what results mean in terms of a problem or situation • Standardized instruments asked students to solve stand-alone problems and equations
EXAMPLES • WRITING • 1. KSU teaches a semester-long, process approach to writing • Standardized instruments asked for a short, product-oriented approach • 2. KSU teaches writing related to the various disciplines • Standardized instruments predominantly asked for writing about literature
EXAMPLES • KSU’s General Education General and Specific Learning Objectives address multiculturalism, global perspective, and other topics not addressed in the standardized instruments
UNDECIDED FACTORS • If a generalized instrument is used to assess General Education, will it be administered • At the end of the sophomore year? • At the end of the degree program?
Unique Teaching Background • I have taught in both the English and Communication Departments at KSU.
I have brought an understanding of both disciplines to the assessment table. • As our committee was developing the Global Learning Outcomes, we emphasized communication skills as Global Learning Outcome 2. • Since communication is an academic discipline, problems arose regarding the responsibility and ownership of the assessment process.
We emphasized written communication skills as Global Learning Outcome 2. • Colleagues in the English Department became concerned. • They felt they were the most qualified to assess the writing portion of the specific learning outcome.
Resolution through Impassioned Discussion • The assessment committee reassured our colleagues in the English Department that they would continue to assess writing for their portion of the General Education Program. • However, departments in other disciplines would also assess written communication skills in courses beyond English Composition.
Once ownership was determined, the confusion disappeared. • This pattern of negotiating meaning came from a social constructionist perspective. • For example, the word communication was interpreted not only as a skill but also as a discipline. • It was this interpretation of communication as a discipline that added to the confusion for our colleagues in the English Department.
Negotiation of Meaning Among Faculty Members • The resolution of this issue came through a socially constructed model of negotiating meaning. • Therefore, negotiating meaning is an essential part of blazing an assessment path without getting burned.
A View From One Department • Department of Political Science & International Affairs • Faculty: 24 Full Time, 10 Part Time • Majors: 411 • Relevant General Education Course • POLS 1101: American Government in a Global Perspective • Sections in Fall 2004: 26 • Total Students: 1,796
The General Challenge How does a department that is focused primarily on the delivery of discipline-specific knowledge and skills assess General Education-oriented learning objectives?
Specific Challenge #1: What is the Objective? • Option #1: Determine which SLOs we should be fulfilling. • PRO: Encourages consistency across instructors & classes. • CON: Raises issues of faculty independence and autonomy. • Option #2: Determine which SLOs we are fulfilling. • PRO: Discover whether we are doing what we think we are doing. • CON: None apparent. • Decision: Option #2.
Specific Challenge #2: Collecting the Data • Mechanism = Faculty Survey (see handout)
Who Should be Surveyed? • Option #1: Full-time Faculty only • PRO: Focuses on faculty/courses at the heart of program assessment. Provides consistency across time (for measurement purposes). • CON: Incomplete picture of what is being done. • Option #2: All Faculty who teach POLS 1101 • PRO: Complete picture of what is being done in the department. • CON: Instructor turnover raises measurement difficulties. • Decision: Option #1
How Should Results be Compiled? • Option #1: Include SLOs selected by at least one faculty member. • PRO: Collective representation of what we think we are doing as individuals. • CON: Misleading representation of what we are doing as a department. • Option #2: Include SLOs selected by some threshold number of faculty. • PRO: Better picture of what is being done both as individuals and as a department. • CON: Threshold selection is arbitrary. • Option #3: Include only SLOs selected by all faculty. • PRO: Provides a picture of what the department is doing. • CON: May exclude SLOs that are widely – but not universally – covered. • Decision: Option #3
The Expectations Gap • POLS 1101 and the Expectations Gap • Expectation: The course title and catalogue description give the impression that several SLOs under GLO #4 should be covered (esp. SLOs 4.1, 4.4 & 4.5) • The Data: Only SLO 4.5 was covered by all faculty • What to Do About It? • Option #1: The data say what the data say. • Option #2: Give the survey again. • What Was Done About It: The survey was administered again, highlighting the gap. • SLOs 4.1 & 4.5 are now included • Implications for the Assessment Process…
The Next Steps • Begin with Goal 1 • Design an assessment plan that will connect outcomes to assessment methods • Each course in the program will determine how to assess the specific learning outcomes it has chosen • Articulate a plan for data collection and analysis