ASSESSMENT METHODS AND ACTIONABLE DATA. Richard Mahon, Riverside City College, facilitator • Lesley Kawaguchi, Santa Monica College • Maggie Taylor, Fresno City College • Linda Umbdenstock, member, ASCCC-RP SLO Collaborative
AUTHENTIC ASSESSMENT • “. . . [A]uthentic assessment simulates a real world experience by evaluating the student’s ability to apply critical thinking and knowledge or to perform tasks that may approximate those found in the work place or other venues outside of the classroom setting.” SLO Terminology Glossary: A Resource for Local Senates, pp. 2-3
Sample Student Learning Outcome: • History: “Demonstrate the ability to interpret historical information by applying analytical skills used by historians – such as synthesizing evidence from both primary and secondary sources, comparing and contrasting multiple perspectives, contextualizing information, and/or identifying causes and effects of change and continuity – to the course content.”
Possible Ways to Assess: • Essays • Research Papers • Midterm or Final Exams • Oral Discussion and/or Presentation
Registered Nursing SLOs • Fundamentals of Nursing Skills Lab • SLO: “Provide safe, comprehensive and proficient care for the adult client in a simulated environment.” • Each procedure has a rubric. • At the end of the course, a patient scenario is given that involves a set of procedures. • Performance of each procedure, including rationale • Documentation • Core RN faculty discussed the rubrics and scenarios and used the results to make any needed changes.
ENGL 252: Writing Improvement • Assigned English faculty met several times and developed a rubric and an exam. • Held several norming sessions to assist in evaluating papers. • When finished, conducted an overall critique of the exam. • Results will be used in the next semester.
Identifying Accessible External Sources of Data • Chancellor’s Office Data Mart http://www.cccco.edu/ChancellorsOffice/Divisions/TechResearchInfo/MIS/DataMartandReports/tabid/282/Default.aspx • NCES National Center for Education Statistics http://nces.ed.gov/ • CPEC California Postsecondary Education Commission http://www.cpec.ca.gov/ • CalPASS - http://www.cal-pass.org/
Defining Actionable Data – Using Data Interactively • Data collection does not equate to action or improvement. • Even the most valid and reliable data are not a substitute for action and will not by themselves motivate action. • Actionable data provide information that leads to improved practice. • Actionable data result from “action-oriented” research.
Actionable Data? Are these data actionable? (Four example data slides; the charts are not reproduced in this text export.)
Student Learning Outcomes Data: ESL Level 1 (Outcome, Assessment, % Passed) • Listening/Speaking • SLO: Demonstrate understanding of frequently used words, phrases and questions in familiar contexts. Engage in limited social conversations to communicate basic survival needs. • Assessment = Interview • Outcome = 84% passed successfully • Reading • SLO: Construct meaning from simplified print materials on familiar topics. • Assessment = CASAS Level A (a comprehensive standardized test) • Outcome = 73% success rate • Writing • SLO: Produce simple sentences in paragraph format and complete simple forms. • Assessment = Written paragraph, scored with a rubric • Outcome = 74% pass rate
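Pass rates like the ones above come from applying a cut score to raw assessment results. A minimal sketch in Python, using hypothetical rubric scores and a hypothetical cut-off (not the college's actual data):

```python
def pass_rate(scores, cutoff):
    """Percent of students scoring at or above the cutoff, as a whole number."""
    passed = sum(1 for s in scores if s >= cutoff)
    return round(100 * passed / len(scores))

# Hypothetical rubric scores for a writing assessment (0-4 scale, cutoff of 2).
writing_scores = [3, 2, 4, 1, 2, 3, 0, 2, 3, 2]
print(pass_rate(writing_scores, 2))  # prints 80
```

The same calculation works for any of the three ESL assessments; only the score source and cut-off change.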
Exploring Data - Data 101 • Principle 1 – Use longitudinal data when possible • Principle 2 – Use data in context • Principle 3 – Look for both direct and indirect data • Principle 4 – Do not oversimplify cause and effect of data • Principle 5 – Use appropriate levels of data for appropriate levels of decisions • Principle 6 – Perception is the reality within which people operate • Principle 7 – Use of data should be transparent • Principle 8 – Consider carefully when to aggregate or disaggregate data • Principle 9 – Focus on data that are actionable • Principle 10 – Consider implications and the “What if?”
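Principle 8 (aggregate vs. disaggregate) can be made concrete: an overall rate can hide large differences between groups. A sketch using entirely hypothetical records, where "day" and "evening" are invented group labels:

```python
from collections import defaultdict

# Hypothetical records: (group, passed?) for eight students.
records = [
    ("day", True), ("day", True), ("day", True), ("day", False),
    ("evening", True), ("evening", False), ("evening", False), ("evening", False),
]

def rates(rows):
    """Pass rate (whole percent) per group."""
    by_group = defaultdict(lambda: [0, 0])  # group -> [passed, total]
    for group, passed in rows:
        by_group[group][0] += int(passed)
        by_group[group][1] += 1
    return {g: round(100 * p / t) for g, (p, t) in by_group.items()}

overall = round(100 * sum(p for _, p in records) / len(records))
print(overall)         # 50 - the aggregate looks average
print(rates(records))  # {'day': 75, 'evening': 25} - disaggregation reveals a gap
```

The aggregate (50%) suggests no problem; the disaggregated view shows one group succeeding at three times the rate of the other, which is the kind of actionable finding the principles point toward.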
Creating Institutional Processes for Conducting Research and Using Data to Inform Practice • Information Capacity Challenges • Research Processes and Procedures • Research Agendas • Action Research Guided Questions
Information Capacity Challenges • Building an Evidence-based Infrastructure • Managing and responding to myriad requests • Maintaining quality and integrity of data process • Making data and information widely accessible • Keeping Up with the Demand • Responding to heightened accountability mandates • Linking research to (resource) planning • Supporting data-driven decision-making • Turning Data into Action • Making data available and applicable at all levels • Making sense of and taking action on the data • Building a Culture of Inquiry
Processes & Procedures • Guidelines for use of data and information • Protection of human subjects policy • Review panels and committees • Request and fulfillment procedures • Criteria for prioritizing ad hoc requests • Linking requests to broader goals & initiatives • Creating and using Research Agendas
Research Agendas • College-wide Research Agenda • Supports major college-wide initiatives & activities • Tied to college-wide plan (goals & priorities) • Includes recurring requests • Topical Research Agenda • Focused on a single topic or group of interest • Tied to a specific initiative or activity • Fewer research activities than college-wide
Action Research Guided Questions Developing the Research Agenda • What and who will be researched? • How is research tied to college plans, goals, initiatives and/or activities? • How will the information be used, by whom and how often? • Which methodology or approach will be used? Turning Data into Information • What do the data tell us? • Which questions were fully answered by the research and which need more exploration? • What are reasonable benchmarks based on the research? Taking Action on the Information • What interventions or strategies do we need to deploy in order to move the needle? • How should this information be shared and applied across the college?
Developing a Culture of Collaborative Inquiry • Operational Definition • Building a Culture of Inquiry • Data Integration & Inquiry Strategies
Characteristics of Evidence Good evidence used in evaluations has the following characteristics: • It is intentional, and a dialogue about its meaning and relevance has taken place. • It is purposeful, designed to answer questions the institution has raised. • It has been interpreted and reflected upon, not just offered up in its raw or unanalyzed form. • It is integrated and presented in a context of other information about the institution that creates a holistic view of the institution or program. • It is cumulative and is corroborated by multiple sources of data. • It is coherent and sound enough to provide guidance for improvement. Source: ACCJC/WASC Guide to Evaluating Institutions, 2009
Evidence and Inquiry • Culture of Evidence: • Data are provided for multiple purposes. • Data demonstrate performance & outcomes. • Data are accessible to all constituency groups. • Data are continuously distributed. • Culture of Inquiry: • Dialogue centers around data/evidence. • Dialogue focuses on taking action. • Dialogue is open and collaborative. • Dialogue is continuous and widespread. • Dialogue is reflective and dynamic.
Building a Culture of Inquiry • Make evidence and inquiry the paradigm. • Communicate often, widely and clearly. • Embed the cycle of collectively assessing, planning, implementing, re-assessing and re-planning….. • Create opportunities for continuous collaboration with multiple and mixed constituency groups.
Organizing data: Weave, TracDat, Taskstream, eLumen (“The New Guys in Town”)
Questions? • Thank you! Please fill out the evaluations.