
Achieving Excellence in Blended Learning Assessment & Evaluation

Learn about the principles, methods, and tools for successful Scholarship of Teaching and Learning (SOTL) research, focusing on effective assessment and impact evaluation in blended learning environments. Gain insights into clear goal setting, adequate preparation, and interpreting significant results.


Presentation Transcript


  1. MAINTAINING QUALITY IN BLENDED LEARNING: FROM CLASSROOM ASSESSMENT TO IMPACT EVALUATION. PART II: IMPACT EVALUATION. Patsy Moskal, (407) 823-0283, pdmoskal@mail.ucf.edu, http://rite.ucf.edu

  2. DEFINING SOTL

  3. SCHOLARSHIP OF TEACHING AND LEARNING (SOTL) • Scholarly research on effective teaching and student learning • Ernest Boyer, 1990, Scholarship Reconsidered • Carnegie Academy for the Scholarship of Teaching and Learning • Scholarship Assessed (1997), Charles Glassick, Mary Taylor Huber, and Gene Maeroff

  4. NATIONAL ORGANIZATIONS DEVOTED TO THE SCHOLARSHIP OF TEACHING & LEARNING • International Society for the Scholarship of Teaching & Learning • American Association for Higher Education & Accreditation • Carnegie Academy for the Scholarship of Teaching & Learning • The Carnegie Foundation for the Advancement of Teaching

  5. MOTIVATION FOR SOTL • Research new instructional methods or classroom changes for improvement • Provides opportunities for publication and presentation • Tenure and promotion • Supporting data for accreditation, grant proposals, etc.

  6. JOURNALS DEVOTED TO SOTL • Journal of the Scholarship of Teaching & Learning • Teaching in Higher Education • New Directions for Teaching & Learning • Journal on Excellence in College Teaching • Active Learning in Higher Education

  7. DISCIPLINE SPECIFIC JOURNALS • Journal of Education for Business • American Biology Teacher • Journal of Research in Science Teaching • Studies in Art Education • Teaching and Learning in Medicine • Journal of Nursing Information • Arts and Humanities in Higher Education

  8. HOW TO ACCOMPLISH SOTL

  9. CHALLENGES IN COMPLETING SOTL RESEARCH • Faculty may lack expertise in research and statistics • Lack of time and resources • Need to minimize class disruption • Challenges in designing the research

  10. The Alice in Wonderland approach to assessment: “Would you tell me, please, which way I ought to go from here?” said Alice. “That depends a good deal on where you want to get to,” said the Cat. “I don’t much care where—” said Alice. “Then it doesn’t matter which way you go,” said the Cat. “—so long as I get somewhere,” Alice added. “Oh, you’re sure to do that,” said the Cat, “if you only walk around long enough.” --Lewis Carroll

  11. Principles that guide our evaluation • Evaluation must be objective. • Evaluation must conform to the culture. • Uncollected data cannot be analyzed. • Data do not equal information. • Qualitative and quantitative approaches must complement each other. • Evaluation must show an impact. • Results may not be generalized.

  12. THE KEY TO SUCCESSFULLY ACCOMPLISHING SOTL… • Clear Goals • Adequate Preparation • Appropriate Methods • Significant Results • Effective Presentation • Reflective Critique (Glassick, Huber, Maeroff, 1997)

  13. CLEAR GOALS • Do you have a clear goal? • Is your goal doable?

  14. THE K.I.S.S. PRINCIPLE OF ASSESSMENT DESIGN Keep It Simple and Straightforward! A simple, doable design is better than a complex, impossible design that is never completed!

  15. HITTING THE TARGET… • What do you want to know? • You must clearly define your questions. • And, if your data doesn’t answer your questions… what’s the point?!

  16. ADEQUATE PREPARATION • Have you looked at the literature? • Can you do your study? • If you need help, can you get support?

  17. FINDING SOURCES OF HELP • Faculty development center • Institutional research • Office of Assessment • Statistics or research folks • Content experts • Other researchers

  18. APPROPRIATE METHODS • Do you have data or can you get it? • Do your methods “fit” your goal and objectives? • Be prepared to rewind and repeat!

  19. SOME ASSESSMENT/EVALUATION TOOLS • Surveys • Focus groups • Course-based performance • Observations • Tests/exams • Pre-collected data • E-portfolios • Rubrics

  20. SIGNIFICANT RESULTS • Did you achieve your objectives? • Does this work inform and add to the field?

  21. STATISTICALLY OR PRACTICALLY SIGNIFICANT?? • Don’t let the statistics run your design! • Statistically significant may not be practically significant. • What about the random sample? • Quantitative and qualitative approaches must complement each other
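To make the distinction concrete, here is a minimal Python sketch (not from the original presentation; the simulated scores, group sizes, and seed are invented for illustration) showing how a very large sample can yield a tiny p-value even when the effect size is negligible:

  # Hypothetical illustration: with very large samples, a trivial difference
  # can be "statistically significant" while the practical effect stays negligible.
  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(42)
  blended = rng.normal(loc=75.5, scale=10, size=20000)       # simulated course scores
  face_to_face = rng.normal(loc=75.0, scale=10, size=20000)

  t, p = stats.ttest_ind(blended, face_to_face)

  # Cohen's d: standardized mean difference (a measure of practical significance)
  pooled_sd = np.sqrt((blended.var(ddof=1) + face_to_face.var(ddof=1)) / 2)
  d = (blended.mean() - face_to_face.mean()) / pooled_sd

  print(f"p-value = {p:.4g}")    # tiny p-value, driven largely by n = 20,000 per group
  print(f"Cohen's d = {d:.3f}")  # roughly 0.05, a negligible practical effect

A result like this is "statistically significant" mainly because the sample is huge; the half-point difference would rarely matter in practice.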

  22. SOME EXAMPLES: SURVEYS

  23. SURVEY PROS AND CONS
  PROS: • Student opinions • Easy to administer • Electronic is possible • Researcher’s questions • Can tell you “what” • Open ended can tell you more • Can look at demographics
  CONS: • Low response rate • Timing can impact results • Wording of questions is important • “Over surveyed” students

  24. Student Satisfaction in Blended Courses (N = 36,801) [Chart: percent of students at each satisfaction level] Very Satisfied 49% • Satisfied 28% • Neutral 17% • Unsatisfied 6% • Very Unsatisfied 2%

  25. STUDENTS’ POSITIVE PERCEPTIONS ABOUT BLENDED LEARNING • Convenience • Reduced Logistic Demands • Increased Learning Flexibility • Technology Enhanced Learning. Net effect: Reduced Opportunity Costs for Education

  26. STUDENTS’ LESS POSITIVE PERCEPTIONS ABOUT BLENDED LEARNING • Reduced Face-to-Face Time • Technology Problems • Reduced Instructor Assistance • Overwhelming • Increased Workload. Net effect: Increased Opportunity Costs for Education

  27. Student satisfaction in fully online and blended courses [Chart]
  Blended (N = 36,801): Very Satisfied 49%, Satisfied 28%, Neutral 17%, Unsatisfied 6%, Very Unsatisfied 2%
  Fully online (N = 67,433): Very Satisfied 47%, Satisfied 28%, Neutral 16%, Unsatisfied 6%, Very Unsatisfied 3%
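Percentage breakdowns like these can be tabulated from raw survey responses along the following lines. This is a hypothetical pandas sketch; the DataFrame, its modality and satisfaction columns, and the toy rows are assumptions, not the actual UCF survey data.

  # Hypothetical sketch: summarize Likert-style satisfaction responses by modality.
  import pandas as pd

  # Assumed layout: one row per respondent, with course modality and a rating.
  responses = pd.DataFrame({
      "modality": ["Blended", "Fully Online", "Blended", "Fully Online", "Blended"],
      "satisfaction": ["Very Satisfied", "Satisfied", "Neutral",
                       "Very Satisfied", "Satisfied"],
  })

  order = ["Very Satisfied", "Satisfied", "Neutral", "Unsatisfied", "Very Unsatisfied"]
  table = pd.crosstab(responses["modality"], responses["satisfaction"], normalize="index") * 100
  table = table.reindex(columns=order, fill_value=0).round(1)
  print(table)  # percent of respondents in each category, by modality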

  28. SOME EXAMPLES: PRE-EXISTING DATA

  29. PRE-EXISTING DATA PROS AND CONS
  PROS: • Already collected • May have longitudinal data • Often in electronic spreadsheet
  CONS: • Someone else decided what to collect • May not be in a form of your choosing • Requires permission and obtaining from others

  30. STUDENT SUCCESS AND WITHDRAWAL

  31. Success rates by modality, Spring 2009 through Spring 2010 [Chart: percent of successful students] F2F (n = 456,125), Blended (n = 30,361), Fully Online (n = 83,274)

  32. Withdrawal rates by modality, Spring 2009 through Spring 2010 [Chart: percent of students withdrawing] F2F (n = 456,125), Blended (n = 30,361), Fully Online (n = 83,274)
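Rates like these are usually computed from pre-existing registrar records. Below is a minimal sketch of that calculation, assuming a hypothetical grade roster with modality and grade columns; it is not the actual UCF data source or coding scheme.

  # Hypothetical sketch: success (A/B/C) and withdrawal (W) rates by course modality.
  import pandas as pd

  roster = pd.DataFrame({
      "modality": ["F2F", "Blended", "Fully Online", "F2F", "Blended"],
      "grade": ["A", "C", "W", "B", "F"],
  })

  success_grades = {"A", "B", "C"}
  summary = roster.groupby("modality").agg(
      success_rate=("grade", lambda g: g.isin(success_grades).mean() * 100),
      withdrawal_rate=("grade", lambda g: (g == "W").mean() * 100),
  )
  print(summary.round(1))  # one row per modality, rates in percent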

  33. STUDENT EVALUATION OF INSTRUCTION (SEI)

  34. A decision rule for the probability of a faculty member receiving an overall rating of Excellent (n = 1,280,890). If the instructor is rated Excellent on facilitation of learning, communication of ideas, and respect and concern for students, then the probability of an overall rating of Excellent = .97 and the probability of an overall rating of Fair or Poor = .00. (Rating scale: Excellent, Very Good, Good, Fair, Poor.)
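Decision rules of this kind typically come from a classification-tree analysis. The sketch below is a hypothetical scikit-learn example of that general approach; the item names, toy ratings, and model settings are assumptions and do not reproduce the method or data behind the UCF figures.

  # Hypothetical sketch: fit a small classification tree relating item ratings
  # (5 = Excellent ... 1 = Poor) to whether the overall rating was Excellent.
  import pandas as pd
  from sklearn.tree import DecisionTreeClassifier, export_text

  ratings = pd.DataFrame({
      "facilitation_of_learning": [5, 5, 4, 3, 5, 2],
      "communication_of_ideas":   [5, 4, 4, 3, 5, 2],
      "respect_and_concern":      [5, 5, 4, 2, 5, 1],
      "overall_excellent":        [1, 1, 0, 0, 1, 0],  # 1 = overall rating of Excellent
  })

  tree = DecisionTreeClassifier(max_depth=3, random_state=0)
  tree.fit(ratings.drop(columns="overall_excellent"), ratings["overall_excellent"])
  print(export_text(tree, feature_names=list(ratings.columns[:-1])))  # the fitted rules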

  35. A comparison of excellent ratings by course modality, unadjusted and adjusted for instructors satisfying Rule 1 (n = 1,171,664)
  Modality      Overall % Excellent    If Rule 1 % Excellent
  Blended       48.9                   97.2
  Online        47.6                   97.3
  Enhanced      46.8                   97.5
  F2F           45.7                   97.2
  ITV           34.2                   96.6

  36. EFFECTIVE PRESENTATION • Did you remember your objectives? • Did you remember your audience? • Did you present your message in a clear, understandable manner?

  37. DATA DO NOT EQUAL INFORMATION… • Data, by itself, answers no questions and is nothing more than a bunch of numbers and/or letters. • How you interpret the data for others can determine how well they understand. • Visuals are good! • Ongoing assessment is best.
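One way to act on "visuals are good": a short matplotlib sketch (hypothetical, reusing the blended-course satisfaction percentages shown earlier) that turns the numbers into a chart an audience can read at a glance.

  # Hypothetical sketch: present satisfaction percentages as a simple bar chart.
  import matplotlib.pyplot as plt

  categories = ["Very Satisfied", "Satisfied", "Neutral", "Unsatisfied", "Very Unsatisfied"]
  percents = [49, 28, 17, 6, 2]  # blended-course figures from the survey slide above

  plt.figure(figsize=(7, 4))
  plt.bar(categories, percents)
  plt.xticks(rotation=20)
  plt.ylabel("Percent of respondents")
  plt.title("Student satisfaction in blended courses")
  plt.tight_layout()
  plt.savefig("satisfaction.png")  # or plt.show() in an interactive session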

  38. REFLECTIVE CRITIQUE • Did you critique your own work? • What worked well and what didn’t work? • Where do you go from here?

  39. Evaluation Cycle • Define your question(s). • Determine methods that can answer question(s). • Implement methods, gather, and analyze data. • Interpret results – did you answer your question? • Make decisions based on results.

  40. Additional things to consider

  41. Some issues to ponder with data • Uncollected data cannot be analyzed! • Data are not always “clean” and can require work • Look for data you already have
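As a rough illustration of the cleanup that "dirty" institutional data can require, here is a hypothetical pandas sketch; the file name and column names are assumptions, not an actual UCF export.

  # Hypothetical sketch: typical cleanup steps before analyzing pre-existing data.
  import pandas as pd

  raw = pd.read_csv("enrollment_export.csv")  # assumed registrar export

  clean = (
      raw.drop_duplicates(subset=["student_id", "course_id", "term"])       # repeated rows
         .assign(modality=lambda d: d["modality"].str.strip().str.title())  # "blended " -> "Blended"
         .dropna(subset=["grade"])                                          # records with no grade
  )
  print(f"{len(raw) - len(clean)} records removed during cleaning")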

  42. TACKLING IRB (INSTITUTIONAL REVIEW BOARD) • What is it? • Training may be required • A MUST if you want to publish or present • Don’t be intimidated!

  43. UNEXPECTED ISSUES • Technology challenges • People challenges • No response • Dirty data

  44. MAKE AN IMPACT WITH YOUR ASSESSMENT… • The final step in assessment (or evaluation) should be determining how your results can impact decisions for the future.

  45. EXTENDING SOTL TO PROGRAM RESEARCH AND BEYOND

  46. IMPACT EVALUATION AS A MATTER OF SCALE [Diagram: evaluation spans levels from Individual, Program, Department, and College up to Institution; as the scale grows from few to many, opportunity costs grow from a little to a lot. There is added value at every level.]

  47. RESEARCH INITIATIVE FOR TEACHING EFFECTIVENESS What services do we provide? • Research design • Survey construction & administration • Data analysis & interpretation • Results provided in “charts and graphs” format • Publication and presentation assistance

  48. Contact Information: Patsy Moskal, Ed.D., (407) 823-0283, pdmoskal@mail.ucf.edu, http://rite.ucf.edu
