
Lehman College School of Arts and Humanities The Dean’s Office Assessment Workshop Fall 2013



  1. Lehman College School of Arts and Humanities The Dean’s Office Assessment Workshop Fall 2013

  2. WORKSHOP: PART 1 • Assessments take place across an entire institution: in our case, across CUNY, within each college, and within each area of each college, including, at Lehman College, Academics, Administration, and Student Services. • This workshop focuses on how Academic Assessment within Lehman College’s School of Arts and Humanities is connected to CUNY, on Academic Assessment basics and how they are currently employed in one department, and on getting into our new Assessment template. • Dean Deirdre Pettipiece will discuss her role as chair of the Middle States Commission on Higher Education report committee, and why Lehman’s assessment work, in conjunction with CUNY’s Mission and Strategic Plan and the College’s Mission, is vital to the report and to our Accreditation. • Associate Dean Gina Dominique Hersey will introduce or review Academic Assessment Basics in the context of an Assessment Cycle, program and course alignment, learning outcomes, various types of assessment, grading vs. assessment, and common assessment terms. • Professor Janette Tilley will present how Lehman’s current Music Program assessments are affecting the department curriculum, and where the department is in its Assessment Cycle. • Assessment Manager Ray Galinski will introduce our School’s new database templates, guide us through an assessment exercise, and assist during the open lab session.

  3. Middle States Commission on Higher Education standards for Institutional Effectiveness and Student Learning: (A Refresher) MSCHE is looking for “sound institution-wide assessment [in which] faculty guide decisions about curriculum and pedagogy, and thus guide effective assessment of student learning outcomes,” a process that is supported by the administration. BOTTOM LINE: We need to be able to demonstrate that we use assessment results to continuously improve our programs.

  4. MSCHE will look for the following documentation regarding assessment: • Do institutional leaders support and value a culture of assessment? • Are there adequate guidance, resources, coordination, and support? • Are assessment efforts recognized and valued? • Are efforts to improve teaching recognized and valued? • Are goals, including learning outcomes, clearly articulated at every level: institutional, unit-level, program-level, and course-level? • Do they have appropriate interrelationships? • Are all learning outcomes for master’s programs more advanced than those for undergraduate programs? • Have appropriate assessment processes been implemented? • Do assessment results provide convincing evidence that the institution is achieving its mission and goals, including key learning outcomes? • Have assessment results been shared in useful forms and discussed widely with appropriate constituents? • Have results led to appropriate decisions and improvements about curricula and pedagogy, programs and services, resource allocation, and institutional goals and plans? • Have assessment processes been reviewed regularly? • Have the reviews led to appropriate decisions and improvements in assessment processes and support for them? • Where does the institution appear to be going with assessment? • Does it have sufficient engagement and momentum to sustain its assessment processes? • Or does it appear that momentum may be slowing? • Are there significant gaps in assessment processes?

  5. Characteristics of Assessment Processes that Meet MSCHE Expectations: • Useful • Cost-effective • Reasonably accurate and truthful • Planned • Organized, systematized, and sustained

  6. Assessment Should Support Strategic Decisions in . . . • Student Success • Resources and their allocation (including personnel) • Planning • Curriculum • Enrollment • Space requests and equipment purchases

  7. Good Assessment is Connected • What students learn should be connected to the institution’s stated mission, goals and plans. • What our programs do should be connected to how they are accredited. • What our faculty do in classrooms often serves both assessment areas. • No assessment activities should be undertaken that are not directly related to goals for student learning and program success.

  8. CUNY’s Mission As outlined in Investing in our Future: The City University of New York’s Master Plan 2012-2016 • The Mission, Part One: The University Will Continue to Maintain and Expand its Commitment to Academic Excellence • The Mission, Part Two: Maintain the University as an Integrated System and Facilitate Articulation Between Units • The Mission, Part Three: Expanding Access • The Mission, Part Four: The University Must Remain Responsive to the Needs of its Urban Setting

  9. CUNY’s Strategic Plan The Mission, Part One (pages 28-29): A Culture of Evidence and Assessment “As indicated repeatedly in this plan, the University’s efforts to maintain academic excellence are reflected in its concern for evidence-based practices and its interest in metrics that indicate specific aspects of student achievement and success. CUNY will take additional steps during the next four years to assess specific learning gains and other indicators of student achievement…” (p. 28) The Collegiate Learning Assessment (CLA) “Another example concerns CUNY’s assessment of the general education of its students. For approximately a decade, the University administered the CUNY Proficiency Exam (CPE); students pursuing both associate and baccalaureate degrees were required to pass it as a condition of advancing to upper-division work and of receiving their degrees. This past year, after extensive deliberation, a task force recommended, and the Board of Trustees voted, that CUNY discontinue the use of the CPE…” Ultimately, the CLA is now being used, focusing on broad institutional assessment, with individual student assessments as one part of the larger assessment (p. 28). The University as a National Model for Generation and Use of Data “…In a consolidated data warehouse, CUNY integrates the information necessary to track from beginning to end the academic career of every degree-seeking student who has matriculated at a CUNY college…” (p. 29)

  10. Lehman College’s (Academic) Mission A core mission of Lehman College is to provide quality education and a nourishing and supportive environment for the academic success of its students. Our undergraduate students have access to many essential services and informational resources provided by the college, and our graduate students and adult learners learn from dedicated and distinguished faculty and engage in cutting-edge research and studies at our state-of-the-art facilities. http://www.lehman.edu/academics/

  11. Departmental Mission: An Example Lehman College’s Art Department: From the Chair The mission of the Lehman College Art Department is to offer students from the Bronx and surrounding regions a unique opportunity to study art and art history in a program where both traditional studio and digital art practice share common goals and vision. • Fundamental skills of traditional and digital art making are taught as a vehicle for the development of creative thinking and innovative problem solving. Historical and analytical skills are emphasized, broadening students’ cultural literacy while helping prepare them for advanced study. • Through a wide range of degrees and concentrations at the graduate and undergraduate levels, students interested in education, fine art, and digital media production build a strong foundation to advance their careers while studying in New York City, one of the world’s most important cultural centers. Professor Flavia Bacarella, Chair

  12. Lehman College General Education Goals and Learning Objectives (Outcomes) The entire General Education curriculum is designed around a set of core fluencies, which each of the courses develops to varying degrees. The core fluencies are basic to all the coursework, including the required English composition, foreign language courses, mathematics, natural science courses, Distribution Area courses, capstone LEH300-LEH301 sections, and writing-intensive sections. These fluencies represent skills and abilities that overlap one another, as the diagram at the link below suggests. http://www.lehman.edu/academics/general-education/core-fluencies.php

  13. School of Arts and Humanities Dean’s Office’s Role in Assessing Student Learning • Coordinate and oversee assessment at the undergraduate and graduate levels in the school. • Ensure assessment activities are integrated with strategic plans and resource allocation, and that they are ongoing. • Identify the school’s particular needs for assessment expertise (e.g., providing workshops, documentation, reassigned time). • Ensure all departments and programs have an assessment coordinator or committee. • Encourage coordination between departments and Ray Galinski for the development and revision of assessment plans. • Ensure each department and program has an updated assessment plan, which provides data that drives unit decision-making. • Provide feedback to the departmental assessment committee or chairperson regarding the assessment plan and data findings.

  14. WORKSHOP: PART 2 What is academic assessment? An introduction or review • What do we want students to have experienced and to know upon completion of an assignment, a course, or a program? And how will we know what they have learned? These questions are at the heart of assessment. • We may be brilliant in our field of expertise, and we may have a genuine desire and ability to educate others, but we do not necessarily come to higher education with a background in education or assessment. • We may think that because we have given exams, led meaningful discussions, and/or assigned creative projects and research papers, we are engaged in assessing our students; but only when we use what we have discovered about student learning to refine our teaching are we actively, fully engaging in assessment. • Assessment is the process of collecting, recording, and using information about student learning and performance to improve education: both teaching and learning. For assessment to be meaningful, and not simply busy work, teaching to the test, or exclusively accreditation report data collection (though that must be done), it must be done systematically, thoughtfully, reflectively, and on an ongoing or cyclical basis. Academic assessment is faculty driven so that the information gathered honestly: • Reflects the goals and values of particular disciplines • Helps instructors refine their teaching practices and grow as educators • Helps departments refine their program curriculum to better prepare students with a current, relevant education for their futures as active citizens and thoughtful discerners of their place in the world, especially in what will soon become their work lives

  15. Academic Assessment Basics Assessment is a broad and rapidly growing field, with a strong theoretical and empirical base. Some of us are becoming (academic) assessment experts, some of us are familiar with assessment but clearly still learning, and some of us are brand new to assessment. In any case, we all need to effectively and collaboratively employ agreed-upon, sound assessment practices. To review or refine what we already know, or as a first introduction, this presentation highlights basic academic assessment concepts. It is intended to assist us in becoming more adept at academic assessment planning and implementation. • Why should program, course, and assignment learning outcomes be aligned with assessments and instructional strategies? • Why should the components of a course be clearly aligned? • What do well-aligned assessments look like? • Bloom’s Taxonomy • What is the difference between formative and summative assessment? • What is the difference between assessment and grading? • Glossary of common assessment terms

  16. Why should program, course, and assignment learning outcomes be aligned with assessments and instructional strategies? • Assessment results reveal how well (or not) students are apprehending our intended learning outcomes. Good instruction ensures that they learn our defined outcomes. For this to occur, learning outcomes, our assessments of them, and our instructional strategies ought to be closely aligned to complement and reinforce one another, and to give us valuable feedback to continue improving. • To ensure that assignments, courses, and/or programs are aligned, let’s ask ourselves a few questions: • Learning outcomes: What do I want students to know how to do when they complete this assignment, this course, and/or this program? • Assessments: What kinds of tasks will reveal these defined learning outcomes, or let me know what students have achieved? • Instructional strategies: What kinds of activities in and out of class will reinforce my learning outcomes and prepare students for assignment, course, or program assessments? • Aligning learning outcomes with assessments and instructional strategies produces informative data that allows us to improve teaching and learning.

  17. Why should the components of a course be clearly aligned? We want to clearly align project and course objectives with instructional strategies, and to do the same between course and program objectives. This way students have opportunities to learn what we say they ought to be learning, and to demonstrate that this particular learning is taking place. If assessments are misaligned with learning outcomes or instructional strategies, this can undermine both student motivation and learning. Consider this scenario: if our assessment measures students’ ability to compare and contrast works by two different sculptors, but the instructional strategies focus entirely on memorization and recitation of artists, the works of art they created, dates of creation, and the periods in which the works were made, then students do not learn or practice the skills of comparison and evaluation that will be assessed. http://assessment.uconn.edu/primer/mapping1.html

  18. What do well-aligned assessments look like? This table presents examples of the kinds of activities that can be used to assess different types of learning objectives (adapted from the revised Bloom’s Taxonomy). Type of learning outcome / Examples of appropriate assessments: • Recall, Recognize, Identify: objective test items such as fill-in-the-blank, matching, labeling, or multiple-choice questions that require students to recall or recognize terms, facts, and concepts • Interpret, Exemplify, Classify, Summarize, Infer, Compare, Explain: activities such as papers, exams, problem sets, class discussion, or concept maps that require students to summarize readings, films, or speeches, or to compare and contrast two or more theories, events, or processes • Apply, Execute, Implement: activities such as problem sets, performances, labs, prototyping, or simulations that require students to use procedures to solve or complete familiar or unfamiliar tasks, or to determine which procedure(s) are most appropriate for a given task • Analyze, Differentiate, Organize, Attribute: activities such as case studies, critiques, labs, papers, projects, debates, or concept maps that require students to discriminate or select relevant and irrelevant parts, determine how elements function together, or determine bias, values, or underlying intent in presented material • Evaluate, Check, Critique, Assess: activities such as journals, diaries, critiques, problem sets, product reviews, or studies that require students to test, monitor, judge, or critique readings, performances, or products against established criteria or standards • Create, Generate, Plan, Produce, Design: activities such as research projects, musical compositions, performances, essays, business plans, website designs, or set designs that require students to make, build, design, or generate something new This table does not list all possible examples of appropriate assessments. Multitudes of other assessments have been and will continue to be developed.

  19. Bloom’s Taxonomy This taxonomy was originally created by Benjamin Bloom in 1956 to categorize a continuum of educational objectives. These objectives are described in terms of student-centered actions that represent the kind of knowledge and intellectual engagement we want our students to demonstrate. Many updated versions exist, and they often incorporate new knowledge into the original framework, which remains as relevant today as when it was created. Within assessment practice, Bloom’s Taxonomy is very commonly used to define outcomes. Whatever outcomes and assessments we create and employ, again, the assessments ought to align with learning outcomes and instructional strategies. Here are a few readily available interpretations of Bloom’s Taxonomy, beginning with the original: http://upload.wikimedia.org/wikipedia/commons/9/9e/BloomsCognitiveDomain.svg

  20. http://en.wikipedia.org/wiki/Bloom's_Taxonomy

  21. What is the difference between formative and summative assessment? • Formative assessment • The goal of formative assessment is to monitor student learning to provide ongoing feedback that can be used by instructors to improve their teaching and by students to improve their learning. More specifically, formative assessments: • help students identify their strengths and weaknesses and target areas that need work • help faculty recognize where students are struggling and address problems immediately • Formative assessments are generally low stakes, which means that they have low or no point value. Examples of formative assessments include asking students to: • draw a concept map in class to represent their understanding of a topic • submit one or two sentences identifying the main point of a lecture (a Writing Across the Curriculum (WAC) activity) • turn in a research topic for early feedback

  22. Summative assessment • The goal of summative assessment is to evaluate student learning at the end of an instructional unit by comparing it against some standard or benchmark. Summative assessments are often high stakes, which means that they have a high point value. Examples of summative assessments include: • a midterm exam • a final project (see, e.g., http://www.youtube.com/watch?v=xB901tfi5j0) • a paper • a capstone recital, exhibition, thesis, etc. • Information from summative assessments can be used formatively when students or faculty use it to guide their efforts and activities in subsequent courses.

  23. What is the difference between assessment and grading? Assessment and grading are not the same. Generally, the goal of grading is to evaluate individual students’ learning and performance. Although grades are sometimes treated as a proxy for student learning, they are not always a reliable measure. Moreover, they may incorporate criteria, such as attendance, participation, and effort, that are not direct measures of learning. The goal of assessment is to improve student learning. Although grading can play a role in assessment, assessment also involves many ungraded measures of student learning (such as concept maps and WAC exercises). Moreover, assessment goes beyond grading by systematically examining patterns of student learning across courses and programs, and using this information to improve educational practices (e.g., to revise course projects, quizzes, and/or exams; to update, rewrite, or eliminate courses; and to revise and guide programs). Here is a sample scoring rubric. Is it used for grading or for an assessment of discussion board posts? If the former, how do you know? If the latter, is it for a formative or a summative assessment? http://askelearning.csuohio.edu/kb/?View=entry&EntryID=231
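The idea of “systematically examining patterns of student learning across courses” can be made concrete with a small sketch. The following Python example uses invented sections, criteria, and scores purely for illustration; it shows rubric results aggregated by criterion rather than by student, which is the shift from grading to assessment:

from statistics import mean

# Hypothetical rubric scores on a 1-4 scale: (section, criterion, score).
# All section names and criteria are invented for illustration.
scores = [
    ("ART 101-01", "analysis", 3), ("ART 101-01", "citation", 2),
    ("ART 101-02", "analysis", 4), ("ART 101-02", "citation", 2),
    ("ART 301-01", "analysis", 3), ("ART 301-01", "citation", 1),
]

# Group scores by rubric criterion across all sections.
by_criterion = {}
for _section, criterion, score in scores:
    by_criterion.setdefault(criterion, []).append(score)

for criterion, values in sorted(by_criterion.items()):
    print(f"{criterion}: mean {mean(values):.2f} (n={len(values)})")

# A consistently weak criterion (here, "citation") points to a curricular
# gap to address, independent of any individual student's course grade.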

  24. Glossary of Common Assessment Terms Assessment for Accountability The assessment of some unit, such as a department, program, or entire institution, which is used to satisfy some group of external stakeholders. Stakeholders might include accreditation agencies, state government, or trustees. Results are often compared across similar units, such as other similar programs, and are always summative. An example of assessment for accountability would be Middle States accreditation at Lehman College, whereby Middle States creates a set of standards that must be met in order for Lehman to receive Middle States accreditation status. Assessment for Improvement Assessment activities that are designed to feed the results directly, and ideally immediately, back into revising the course, program, or institution with the goal of improving student learning. Both formative and summative assessment data can be used to guide improvements. Concept Maps Concept maps are graphical representations that can be used to reveal how students organize their knowledge about a concept or process. They include concepts, usually represented in enclosed circles or boxes, and relationships between concepts, indicated by a line connecting two concepts. Direct Assessment of Learning Direct assessment occurs when measures of learning are based on student performance or demonstrations of the learning itself. Scoring performance on tests, term papers, or the execution of lab skills would all be examples of direct assessment of learning. Direct assessment of learning can occur within a course (e.g., performance on a series of tests) or across courses or years (e.g., comparing writing scores from sophomore to senior year).

  25. Embedded Assessment A means of gathering information about student learning that is integrated into the teaching-learning process. Results can be used to assess individual student performance, or they can be aggregated to provide information about the course or program. Embedded assessment can be formative or summative, quantitative or qualitative. Example: as part of a course, expecting each senior to complete a research paper that is graded for content and style, but is also assessed for advanced ability to locate and evaluate Web-based information (as part of a college-wide outcome to demonstrate information literacy). External Assessment Use of criteria (a rubric) or an instrument developed by an individual or organization external to the one being assessed. This kind of assessment is usually summative, quantitative, and often high stakes, such as the SAT or GRE exams. Formative Assessment Formative assessment refers to the gathering of information or data about student learning during a course or program that is used to guide improvements in teaching and learning. Formative assessment activities are usually low-stakes or no-stakes; they do not contribute substantially to the final evaluation or grade of the student, or may not even be assessed at the individual student level. For example, posing a question in class and asking for a show of hands in support of different response options would be a formative assessment at the class level. Observing how many students responded incorrectly would be used to guide further teaching. High-stakes Assessment The decision to use the results of assessment to set a hurdle that needs to be cleared for completing a program of study, receiving certification, or moving to the next level. Most often, the assessment so used is externally developed, based on set standards, carried out in a secure testing situation, and administered at a single point in time. Examples: at the secondary school level, statewide exams required for graduation; in postgraduate education, the bar exam. Indirect Assessment of Learning Indirect assessments use perceptions, reflections, or secondary evidence to make inferences about student learning. For example, surveys of employers, students’ self-assessments, and admissions to graduate schools are all indirect evidence of learning.

  26. Individual Assessment Uses the individual student, and his/her learning, as the level of analysis. Can be quantitative or qualitative, formative or summative, standards-based or value-added, and used for improvement. Most of the student assessment conducted in higher education is focused on the individual. Student test scores, improvement in writing during a course, or a student’s improvement in presentation skills over his or her undergraduate career are all examples of individual assessment. Institutional Assessment Uses the institution as the level of analysis. The assessment can be quantitative or qualitative, formative or summative, standards-based or value-added, and used for improvement or for accountability. Ideally, institution-wide goals and objectives would serve as a basis for the assessment. For example, to measure the institutional goal of developing collaboration skills, an instructor and peer assessment tool could be used to measure how well seniors across the institution work in multicultural teams. Local Assessment Means and methods that are developed by an institution’s faculty based on their teaching approaches, students, and learning goals. An example would be an English Department’s construction and use of a writing rubric to assess incoming freshmen’s writing samples, which might then be used to assign students to appropriate writing courses, or might be compared to senior writing samples to get a measure of value added. Program Assessment Uses the department or program as the level of analysis. Can be quantitative or qualitative, formative or summative, standards-based or value-added, and used for improvement or for accountability. Ideally, program goals and objectives would serve as a basis for the assessment. Example: How well can senior engineering students apply engineering concepts and skills to solve an engineering problem? This might be assessed through a capstone project, by combining performance data from multiple senior-level courses, by collecting ratings from internship employers, etc. If a goal is to assess value added, some comparison to the performance of newly declared majors would be included. Qualitative Assessment Collects data that does not lend itself to quantitative methods but rather to interpretive criteria (see the first example under “Standards”). Quantitative Assessment Collects data that can be analyzed using quantitative methods (see “Assessment for Accountability” for an example).

  27. Rubric A rubric is a scoring tool that explicitly represents the performance expectations for an assignment or piece of work. A rubric divides the assigned work into component parts and provides clear descriptions of the characteristics of the work associated with each component, at varying levels of mastery. Rubrics can be used for a wide array of assignments: papers, projects, oral presentations, artistic performances, group projects, etc. Rubrics can be used as scoring or grading guides, to provide formative feedback to support and guide ongoing learning efforts, or both. Standards Standards refer to an established level of accomplishment that all students are expected to meet or exceed. Standards do not imply standardization of a program or of testing. Performance or learning standards may be met through multiple pathways and demonstrated in various ways. For example, instruction designed to meet a standard for verbal foreign language competency may include classroom conversations, one-on-one interactions with a TA, or the use of computer software. Assessing competence may be done by carrying on a conversation about daily activities or a common scenario, such as eating in a restaurant, or by using a standardized test, with a rubric or grading key to score correct grammar and comprehensible pronunciation. Summative Assessment The gathering of information at the conclusion of a course, program, or undergraduate career to improve learning or to meet accountability demands. When used for improvement, it impacts the next cohort of students taking the course or program. Examples: examining student final exams in a course to see if certain specific areas of the curriculum were understood less well than others; analyzing senior projects for the ability to integrate across disciplines. Value Added The increase in learning that occurs during a course, program, or undergraduate education. Can either focus on the individual student (how much better a student can write, for example, at the end than at the beginning) or on a cohort of students (whether senior papers demonstrate more sophisticated writing skills, in the aggregate, than freshmen papers). To measure value added, a baseline measurement is needed for comparison. The baseline measure can be from the same sample of students (longitudinal design) or from a different sample (cross-sectional). Adapted from the Assessment Glossary compiled by the American Public Library System, 2005.

  28. Bibliography links: http://www.cmu.edu/teaching/assessment/index.html http://www.bridgew.edu/AssessmentGuidebook/overview.cfm http://www.csustan.edu/oaqa/index.html http://gradstudies.csusb.edu/documents/OutcomesAssessment/Closing_the_Assessment_Loop_by_Trudy_Banta_Charles_Blaich_2011.pdf http://www.lehman.edu/academics/

  29. WORKSHOP: PART 3 Music Department Assessments Music Department Mission The Music Department is committed to providing the highest-quality musical education in an environment that provides an inclusive and engaging learning experience for all students. The Department serves the Bronx and surrounding areas with its diverse and active free-admission performance schedule, involving students, alumni, college faculty and personnel, and many members of the community. The department offers comprehensive undergraduate and graduate degree programs with a variety of opportunities for individual creative development. The graduate program prepares students to teach music in the public schools with New York State Professional Certification.

  30. Program Goal • Demonstrate a functional knowledge of music’s grammar and formal structures. Learning Objectives: • Be able to analyze music of the Western classical tradition from the 18th to the 21st centuries • Write brief exercises using musical materials and concepts from those periods • Compose appropriately for instruments of the standard orchestra, for small and large ensembles

  31. Curriculum Map (legend: I = Introduced, P = Practiced, A = Advanced mastery)
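For departments that maintain their curriculum map electronically, the legend above translates naturally into a small data structure. Here is a minimal Python sketch; the outcome labels and all course numbers except MST 236 (the only course named in these slides) are invented for illustration:

# A hypothetical curriculum map using the I/P/A legend above.
# Outcome labels and most course numbers are assumptions, not the
# department's actual map.
curriculum_map = {
    "MST 236": {"analysis": "I", "writing": "I", "composition": None},
    "MST 237": {"analysis": "P", "writing": "P", "composition": "I"},
    "MST 336": {"analysis": "A", "writing": "P", "composition": "P"},
}

# Print each outcome's progression across the course sequence, a quick
# way to spot outcomes that are advanced before they are introduced.
for outcome in ["analysis", "writing", "composition"]:
    levels = [row[outcome] for row in curriculum_map.values() if row[outcome]]
    print(outcome, "->", " ".join(levels))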

  32. Assessment Project • MST 236 Theory I is the important first step for majors in the music theory sequence. It typically has a high attrition rate. • Tradition held that it met once per week, but intuitively we knew this was poor pedagogy. • We wanted evidence to support a scheduling change and to try to improve retention.

  33. Findings and Implementations • Fall semester: class average grade of 2.75 and a 45% pass rate • Spring semester: class average of 3.5 and a 92% pass rate Many variables are not accounted for, but preliminary evidence suggests students are better prepared to go on in the sequence. The scheduling change is now permanent. BUT. . .

  34. Only 58% of the students who successfully passed Theory I in the Spring have gone on to enroll in Theory II this semester. This number is lower than normal. So… improved student outcomes did not lead to improved enrollment. What is going on? Instead of “closing the loop” on this, we’ve opened another one.
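The figures in these two slides reduce to a few simple roster calculations. Here is a minimal Python sketch with invented student IDs and grades (a real analysis would draw on registrar data; a 2.0, or C, is assumed here as the passing threshold):

# Hypothetical Theory I grades on the 4.0 scale and a hypothetical
# Theory II roster; both are invented for illustration.
theory1_grades = {"s01": 3.7, "s02": 3.3, "s03": 2.0, "s04": 3.7,
                  "s05": 0.0, "s06": 3.3, "s07": 4.0, "s08": 3.0}
theory2_roster = {"s01", "s02", "s04", "s06"}

PASSING = 2.0  # assumed threshold (C or better)
passed = {s for s, g in theory1_grades.items() if g >= PASSING}

class_average = sum(theory1_grades.values()) / len(theory1_grades)
pass_rate = len(passed) / len(theory1_grades)
continuation = len(passed & theory2_roster) / len(passed)

print(f"class average: {class_average:.2f}")                # 2.88 here
print(f"pass rate: {pass_rate:.0%}")                        # 88% here
print(f"continuation into Theory II: {continuation:.0%}")   # 57% here

The continuation rate is the key metric the slide flags: the share of students who passed Theory I and then appear on the Theory II roster.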

  35. WORKSHOP: PART 4 GETTING STARTED WITH ASSESSMENTS

  36. Assignment 1 (If the Department is just getting started on assessment) Write the Department Mission and Program Missions: • Work with the Dept. Chair to schedule an assessment meeting to construct the Department Mission and Program Mission Statements, if they are not already written • Brainstorm at the meeting, and collaboratively write the statement(s) • Each ought to be concise: three to four sentences long • Review and edit the statement(s) • Post on the website • Document in TaskStream

  37. Assignment 2 (If the Department has Mission and/or Program statement(s) done, and is ready to begin Program Assessments) Write relevant Program Outcomes: • Work with the Dept. Chair to schedule an assessment meeting to construct Program Outcomes, if they are not already written • Brainstorm at the meeting, and collaboratively write the outcomes • Each ought to be concise: one brief sentence • Review and edit the outcomes • Post on the website • Document in TaskStream (Option: Bloom’s Taxonomy may be used to get started; begin each Program Outcome with a Bloom’s verb, for example. Again, be concise and list 3-5 Program Outcomes.) Program graduates will: 1 2 3 4 5

  38. Assignment 3 (If the Department has Mission and/or Program statements written and posted, has constructed Program Outcomes, and is ready to move into Program Course Assessments) Work with Department members to write the designated Course Assessments. (Option: Again, Bloom’s Taxonomy may be used to get started; begin each Course Outcome with a Bloom’s verb, for example. And again, be concise: list 2-4 Course Outcomes per course.) 1 2 3
