Data Collection – to Curricular Change: Using Your Assessment Data to Drive Curriculum Changes
Assessment • “…the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development.” • Palomba & Banta, 1999, p. 4
The 50,000-Foot Level • Start with the culture, because faculty must be comfortable with the culture surrounding assessment before they feel comfortable doing assessment! • Determine the role that assessment will play in college processes – and what it WON’T be used for! • Improving Student Learning • Program Evaluation • Budgeting • NOT Faculty Evaluation
Start with the culture • Value Campus Culture & History (Assessment is not a one size fits all) • Respect and Empower People – especially Faculty! • Value Assessment by providing appropriate resources and infrastructure • Value innovation & risk taking to improve teaching (even if it fails)
Value Innovation “I have not failed. I've just found 10,000 ways that won't work.” ― Thomas A. Edison
I think we are about to turn the corner on this whole “Assessment” thing…
Effective Assessment • Is linked to decision making about the curriculum – Palomba & Banta • Measures real-life gaps in desired skills & performance – Swing et al. • Leads to reflection and action by faculty – Palomba & Banta
We’re collecting data but… • Institutions and faculty are good at collecting data – just not as good at using data to drive curricular changes • This part of the assessment cycle is often called: • “Closing the loop” • What loop – and who left it open?
Helping Faculty get to Closing the Loop • Start with asking the right questions • Questions to help faculty focus their assessment activities • What should students learn? • How well are they learning it? • What evidence do you have? • What are you doing with the evidence?
Ask the “Right” Question • The right question is: • Meaningful – a question that faculty want answered, where knowing the answer will help them impact student learning • Measurable – work at asking a question that faculty can actually answer – usually that means narrowing down the question • Manageable – keep the question and the process of collecting data manageable – this isn’t the only or primary job that faculty have
Assessment for Learning • Once faculty have determined the actual question to be answered, that question drives the methodology of assessment • Pre/Post Test • Embedded test questions • Project-Based Assessment • Portfolios • Surveys • Performance, etc., etc. • How often will they collect the data, and in what classes – What makes sense? (Remember: keep it manageable)
Data Collection – Now What? • First, encourage faculty to take time to organize the data • Excel is an easily accessible tool available to 99% of all faculty • Offer data management workshops • What goes in a row or column, and what analyses you can run in Excel
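The "rows and columns" idea from the workshop bullet above can be sketched in a few lines. This is a minimal illustration, not part of the original presentation: the student names and scores are hypothetical, and the layout (one row per student, one column per learning outcome) mirrors what a faculty member might set up in Excel.

```python
import statistics

# Hypothetical assessment data: each row is one student,
# each column after the name is one learning outcome score.
scores = [
    # (student,    outcome_1, outcome_2)
    ("Student A",  78,        85),
    ("Student B",  92,        70),
    ("Student C",  64,        88),
]

# One analysis any spreadsheet can run: the average per outcome column.
outcome_1_avg = statistics.mean(row[1] for row in scores)
outcome_2_avg = statistics.mean(row[2] for row in scores)
print(outcome_1_avg, outcome_2_avg)  # 78 and 81
```

The point of the layout, whether in Excel or code, is that once each outcome has its own column, per-outcome summaries fall out of a single formula.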
Qualitative Data Analysis • Qualitative Data is an appropriate tool for many disciplines • Map out the requirements of the assignment • Search for themes in student responses • Track how often key curricular themes appear
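Tracking how often key curricular themes appear in student responses, as the bullets above suggest, can be done with a simple frequency count. A minimal sketch, with entirely hypothetical themes and responses (real qualitative coding is more careful about synonyms and context than this keyword match):

```python
from collections import Counter

# Hypothetical curricular themes to track across student responses.
themes = ["audience", "evidence", "revision"]

# Hypothetical student responses from an assignment.
responses = [
    "I learned to think about my audience before drafting.",
    "Using evidence from sources made my argument stronger.",
    "Revision helped me reconsider my audience and my evidence.",
]

# Count how many responses mention each theme.
theme_counts = Counter()
for response in responses:
    text = response.lower()
    for theme in themes:
        if theme in text:
            theme_counts[theme] += 1

print(theme_counts)  # audience: 2, evidence: 2, revision: 1
```

Even a rough count like this helps faculty see which curricular themes are surfacing in student work and which are absent.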
Statistics - Smatistics • If you are an IR person – or big on stats – close your ears for a minute… • Rather than worry so much about reliability, validity, and statistical significance, think about these assessment projects as Action Research • Action Research focuses on getting information that will enable faculty to change conditions in a particular situation in which they are personally involved • Seeks to solve a problem, or • Inform local practice – specifically, classroom/course practice!
Steps in Action Research • Identify the Research Question • Gather the necessary information • Analyze and interpret the information • Develop an action plan • Sound familiar?? – Think Cycle of Assessment
Looking at the Data • Move beyond “Averages” • Look at the Spread of the Data • What does the spread indicate? • Is the data evenly distributed? • Are there large gaps? Where do they exist? • Has the faculty member or department decided what is an acceptable level of performance?
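The "move beyond averages" advice above can be made concrete with a small example. The scores and benchmark below are hypothetical, chosen to show how an average can mask a gap in the spread:

```python
import statistics

# Hypothetical assessment scores and an assumed benchmark of 75.
scores = [55, 58, 60, 61, 88, 90, 92, 94]
benchmark = 75

mean = statistics.mean(scores)     # the average alone hides the story
spread = statistics.stdev(scores)  # a large spread signals "look closer"
below = [s for s in scores if s < benchmark]

print(f"mean={mean:.1f} stdev={spread:.1f} below_benchmark={len(below)}")
```

Here the average (74.75) sits just under the benchmark, which sounds like a near miss, but the spread tells a different story: half the students are well below the benchmark and half are well above, with a large gap in between. That pattern, invisible in the average, is exactly what the questions in the slide above are meant to surface.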
When to Make a Curricular Change – and When to Make Other Changes? • Follow the data trail – and then talk to invested participants • The data indicate students are not understanding a concept • What is the benchmark of performance? • Curriculum mapping – where does the concept occur? • How is the concept taught? (Pedagogy) • Where is the concept reinforced? (Scaffolding) • What changes can we (faculty/department) make to the curriculum to help students understand and apply the concept? • How will we measure this curricular change to see if it is successful?
When the data supports Other Changes • Case Study – Visual Communications • General Education Curriculum Assessment Cycle • Visual Communications • Mass Exodus of Visual Communications Classes • Faculty couldn’t come up with acceptable assignments to show how they were assessing visual communications • Focused discussion with department chairs/faculty groups on the issue • Faculty felt unqualified to “teach” visual communications • Professional Development, in-service workshops, teaching circles, etc.
Making the Curricular Change • If the focus throughout the process has been on student learning (versus report writing), then faculty will be more open to making curricular changes • Next assessment cycle – what difference did the change make? Was there a difference in performance? • Make sure appropriate time has elapsed for changes to take effect • Make sure the measurement is parallel to the previous assessment
Are We Done Yet? – “Closing the Loop” • “When are we done with a learning outcome?” • Did you see improvement? • Did you meet your benchmark performance? • Are you satisfied? • Do you see a greater need/question that needs to be asked?
Keep the Focus on What Matters Most – Student Learning • Reports and data analysis that don’t focus on student learning are a waste of paper • Faculty must be engaged in making sense of and interpreting assessment results – administration can’t do it for them • Share Successes • With permission – share assessment results from other departments/disciplines • Get the faculty to tell their success stories – it carries more weight with their peers
Make Realistic Plans to Avoid Pitfalls • Effective assessment takes time to plan, implement, and sustain • Don’t expect instant results – reliable data takes time to gather • Make sure the assessment is asking a question that can be answered • This means being narrow in scope – rather than throwing as much stuff as possible at a wall and seeing what sticks!
Don’t Forget to Engage Administration • While it is important for faculty to “own” assessment, support needs to come from administration: • Recognizing faculty efforts • Attending faculty functions • Providing appropriate resources • Considering policy implications
Questions?? Sheri H. Barrett, EdD Director of Outcomes Assessment Johnson County Community College sbarre13@jccc.edu 913-469-7607