
Online Course Assessment Methods: How Effective Are They?



Presentation Transcript


  1. Online Course Assessment Methods: How Effective Are They? Dr. Kim Marie McGinnis

  2. My Background • Doctorate in Educational Leadership and Policy Analysis, ETSU • Master’s in Education, WCU • BS in Landscape Horticulture, NCSU • Director of Occupational Extension Training, Carteret Community College • Dean of Technical and Vocational Programs, Mayland Community College

  3. My Interest in this Study • I develop Internet courses • I want to continually improve the quality and effectiveness of the courses I teach • I am also an Internet course student and have questions about the assessment choices made in those courses • Assessment is a current accountability issue, and distance education is becoming a common method of instruction

  4. Need for this Study • Assessment has become an integral part of accountability in the American educational system • Distance education is the fastest growing form of education • With increased emphasis on accountability in general and increased scrutiny of online teaching and learning, issues of assessment have taken on more importance than ever before (Comeaux, 2005)

  5. Research Questions • Academic discipline: Are there differences in assessment methods being used among faculty who teach in different academic disciplines in the online environment? • Learning objectives being met: Are there differences in perceived effectiveness of the assessment methods being used among individual instructors in determining if the course learning objectives have been met?

  6. Research Questions • Internet course development training: Are there differences in assessment methods used between those online instructors who received training in Internet course development as compared to those who did not? • Number of assessments per course: Are there differences in the number of different types of assessments being used per course by each instructor?

  7. Research Questions • Years teaching Internet courses: Are there differences in the types of assessments being used by online instructors who have been teaching in the online environment for more than three years, as compared with instructors who have been teaching in the online environment for three or fewer years?

  8. Research Questions • Number of Internet courses per year: Are there differences in the types of assessments being used by online instructors who teach more than one Internet course per year, as compared with instructors who teach only one Internet course per year?

  9. Research Design • Primary data analysis was conducted on responses to an original survey instrument developed by the researcher • The survey was administered online to all instructors who had taught an Internet course or a web-enhanced course during the 2004-2005 academic year at the 15 Western North Carolina community colleges that serve the Appalachian region (population N = 371) • 174 responses were received, a 47% response rate

  10. The Survey • The survey consisted of 16 questions • Questions 10 and 12 were open-ended questions • Question 1 identified the academic department for which the instructor taught • Questions 2, 3, 4, and 5 addressed the number of unduplicated Internet courses that the instructor taught during different academic years • Question 6 identified the assessment methods that individual instructors were using in the Internet courses they taught

  11. The Survey • Question 8 required a yes or no response to receiving training in Internet course development • Question 9 focused on the area in which the training was received • Question 11 provided the number of different types of assessment being used by individual instructors per course • Questions 13-16 retrieved demographic information

  12. Demographics • Gender: Of 167 responses, 71 respondents were male (40.8%) and 96 were female (55.2%) • Age: Of 164 responses, the mean age was 44.65 years, with a range from 25 to 76 • Academic degree: Of 166 responses, 8 had an associate’s degree (4.6%), 19 had a bachelor’s degree (10.9%), 125 had a master’s degree (including EdS) (71.8%), and 14 had a doctoral degree (8.0%) • Note: percentages are calculated against all 174 respondents, so items with missing responses sum to slightly less than 100%

  13. Demographics • Years of experience in education: Of 169 responses, the mean was 13.95 years, with a range from 1 to 45

  14. Research Question 1 • Are there differences in assessment methods being used among faculty who teach in different academic disciplines in the online environment? • Chi-square with a two-way contingency table analysis was conducted for each assessment method

  15. Research Question 1 • Academic discipline and portfolio use were not found to be significantly related • Uses portfolio: Business (21.3%), Vocational (23.1%), Health Occupations (0%), Arts (20%), Public Safety (0%), Continuing Education (0%), Social Sciences (18.2%), Hard Sciences (0%), Other (0%)

  16. Research Question 1 • Academic discipline and true/false test use were found to be significantly related • Uses true/false tests: Business (68.0%), Vocational (76.9%), Health Occupations (57.1%), Arts (30%), Public Safety (60%), Continuing Education (100%), Social Sciences (27.3%), Hard Sciences (56.3%), and Other (100%)

  17. Research Question 1 • Academic discipline and multiple-choice test use were found to be significantly related • Uses multiple-choice tests: Business (92%), Vocational (92.3%), Health Occupations (100%), Arts (63.3%), Public Safety (100%), Continuing Education (100%), Social Sciences (81.8%), Hard Sciences (87.5%), and Other (100%)

  18. Research Question 1 • Academic discipline and short-answer test use were found to be significantly related • Uses short-answer tests: Business (44%), Vocational (84.6%), Health Occupations (57.1%), Arts (36.7%), Public Safety (60%), Continuing Education (100%), Social Sciences (31.8%), Hard Sciences (62.5%), Other (100%)

  19. Research Question 1 • Academic discipline and essay use were not found to be significantly related • Uses essay tests: Business (28%), Vocational (53.8%), Health Occupations (42.9%), Arts (56.7%), Public Safety (80%), Continuing Education (50%), Social Sciences (50%), Hard Sciences (37.5%), Other (100%)

  20. Research Question 1 • Academic discipline and discussion use were found to be significantly related • Uses discussion: Business (48%), Vocational (84.6%), Health Occupations (57.1%), Arts (80%), Public Safety (100%), Continuing Education (100%), Social Sciences (90.9%), Hard Sciences (37.5%), Other (100%)

  21. Research Question 1 • Academic discipline and individual project use were not found to be significantly related • Uses individual projects: Business (74.7%), Vocational (69.2%), Health Occupations (57.1%), Arts (73.3%), Public Safety (80%), Continuing Education (50%), Social Sciences (40.9%), Hard Sciences (6.3%), Other (0%)

  22. Research Question 1 • Academic discipline and group project use were found to be significantly related • Uses group projects: Business (24%), Vocational (30.8%), Health Occupations (71.4%), Arts (53.3%), Public Safety (80%), Continuing Education (0%), Social Sciences (40.9%), Hard Sciences (6.3%), Other (0%)

  23. Research Question 1 • Academic discipline and problem-solving activity use were not found to be significantly related • Uses problem-solving activities: Business (44%), Vocational (23.1%), Health Occupations (28.6%), Arts (40%), Public Safety (20%), Continuing Education (50%), Social Sciences (27.3%), Hard Sciences (43.8%), Other (0%)

  24. Research Question 1 • Academic discipline and self-assessment use were not found to be significantly related • Uses self-assessment: Business (14.7%), Vocational (15.4%), Health Occupations (28.6%), Arts (20%), Public Safety (20%), Continuing Education (0%), Social Sciences (4.5%), Hard Sciences (0%), Other (0%)

  25. Research Question 1 • Academic discipline and other assessment use were not found to be significantly related • Uses other assessments: Business (6.7%), Vocational (7.7%), Health Occupations (0%), Arts (23.3%), Public Safety (0%), Continuing Education (0%), Social Sciences (18.2%), Hard Sciences (12.5%), Other (0%)

  26. Research Question 2 • Are there differences in perceived effectiveness of the assessment methods being used among individual instructors in determining if the course learning objectives have been met? • A chi-square test of frequencies was used • There was a statistically significant difference in the perceptions of effectiveness of all of the assessment methods with the exception of self-assessment and the category of other

  27. Research Question 2 • Figure 1: Bar Graph of the Perceived Effectiveness of Portfolio Assessment (least effective N = 7, somewhat effective N = 5, effective N = 12, more effective N = 13, most effective N = 22)
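As an illustration of the chi-square test of frequencies, the Figure 1 portfolio counts can be checked by hand against a uniform null (equal counts in every rating category). This is a sketch of the general technique using only the reported counts; the uniform null and the hand computation are assumptions about how such a test is typically run, not the study's actual analysis.

```python
# Reported portfolio effectiveness ratings (Figure 1): counts per category,
# in order: least, somewhat, effective, more, most effective.
counts = [7, 5, 12, 13, 22]

n = sum(counts)
expected = n / len(counts)  # null hypothesis: ratings spread evenly

# Pearson chi-square goodness-of-fit statistic: sum over categories of
# (observed - expected)^2 / expected.
chi2 = sum((obs - expected) ** 2 / expected for obs in counts)

df = len(counts) - 1  # 4 degrees of freedom for 5 categories
critical = 9.488      # chi-square critical value at alpha = .05, df = 4

print(f"chi2 = {chi2:.2f}, df = {df}, significant: {chi2 > critical}")
```

Here the statistic (about 14.81) exceeds the critical value, consistent with slide 26's finding that perceptions of portfolio effectiveness differed significantly. The same computation applies to the counts in Figures 2 through 11.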

  28. Research Question 2 • Figure 2: Bar Graph of the Perceived Effectiveness of True/False Tests (least effective N = 12, somewhat effective N = 19, effective N = 58, more effective N = 22, most effective N = 7)

  29. Research Question 2 • Figure 3: Bar Graph of the Perceived Effectiveness of Multiple-Choice Tests (least effective N = 6, somewhat effective N = 16, effective N = 61, more effective N = 49, most effective N = 23)

  30. Research Question 2 • Figure 4: Bar Graph of the Perceived Effectiveness of Short Answer Tests (least effective N = 2, somewhat effective N = 2, effective N = 28, more effective N = 49, most effective N = 23)

  31. Research Question 2 • Figure 5: Bar Graph of the Perceived Effectiveness of Essay Tests (least effective N = 0, somewhat effective N = 2, effective N = 10, more effective N = 41, most effective N = 41)

  32. Research Question 2 • Figure 6: Bar Graph of the Perceived Effectiveness of Discussion Questions (least effective N = 3, somewhat effective N = 5, effective N = 27, more effective N = 54, most effective N = 39)

  33. Research Question 2 • Figure 7: Bar Graph of the Perceived Effectiveness of Individual Projects (least effective N = 6, somewhat effective N = 0, effective N = 12, more effective N = 37, most effective N = 72)

  34. Research Question 2 • Figure 8: Bar Graph of the Perceived Effectiveness of Group Projects (least effective N = 4, somewhat effective N = 13, effective N = 15, more effective N = 29, most effective N = 15)

  35. Research Question 2 • Figure 9: Bar Graph of the Perceived Effectiveness of Problem- Solving Activities (least effective N = 4, somewhat effective N = 3, effective N = 8, more effective N = 30, most effective N = 43)

  36. Research Question 2 • Figure 10: Bar Graph of the Perceived Effectiveness of Self Assessment (least effective N = 11, somewhat effective N = 13, effective N = 16, more effective N = 10, most effective N = 3)

  37. Research Question 2 • Figure 11: Bar Graph of the Perceived Effectiveness of Other Assessment Methods (least effective N = 5, somewhat effective N = 4, effective N = 3, more effective N = 12, most effective N = 9)

  38. Research Question 3 • Are there differences in assessment methods used between those online instructors who received training in Internet course development and those who did not? • A chi-square two-way contingency table analysis was used to compare each assessment method individually with training
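The two-way contingency analysis can be sketched with the discussion-use result (slide 44: 70.2% of trained vs. 48% of untrained instructors use discussion). The cell counts below are reconstructed assuming 124 trained and 50 untrained respondents; that split is an inference from the reported percentages, not a figure stated in the slides, so treat the table as illustrative only.

```python
# Illustrative 2x2 table for discussion use by training status.
# Rows: trained, untrained; columns: uses discussion, does not.
# Counts assume 124 trained / 50 untrained respondents (reconstructed,
# hypothetical split consistent with the reported percentages).
observed = [[87, 37],   # 87/124 = 70.2% of trained instructors use discussion
            [24, 26]]   # 24/50  = 48.0% of untrained instructors do

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

# Pearson chi-square: sum over cells of (O - E)^2 / E,
# where E = row_total * col_total / grand_total.
chi2 = 0.0
for i in range(2):
    for j in range(2):
        e = row_totals[i] * col_totals[j] / grand
        chi2 += (observed[i][j] - e) ** 2 / e

critical = 3.841  # alpha = .05, df = (2-1) * (2-1) = 1
print(f"chi2 = {chi2:.2f}, significant: {chi2 > critical}")
```

Under these assumed counts the statistic (about 7.58) clears the critical value, in line with the slide-44 finding that training and discussion use were significantly related; the same table structure underlies each method-by-training comparison.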

  39. Research Question 3 • Training and uses portfolio were not significantly related • 16.1% of instructors who received training used portfolio • 22% of instructors who did not receive training used portfolio

  40. Research Question 3 • Training and true/false test use were not found to be significantly related • 55.6% of instructors who received training used true/false tests • 52% of instructors who did not receive training used true/false tests

  41. Research Question 3 • Training and multiple-choice test use were not found to be significantly related • 87.9% of instructors who received training use multiple-choice tests • 78% of instructors who did not receive training use multiple-choice tests

  42. Research Question 3 • Training and short-answer test use were not found to be significantly related • 47.6% of instructors who received training use short-answer tests • 50% of instructors who did not receive training use short-answer tests

  43. Research Question 3 • Training and essay test use were not found to be significantly related • 41.9% of instructors who received training use essay tests • 40% of instructors who did not receive training use essay tests

  44. Research Question 3 • Training and discussion use were found to be significantly related • 70.2% of instructors who received training use discussion • 48% of instructors who did not receive training use discussion

  45. Research Question 3 • Training and individual project use were not found to be significantly related • 71.8% of instructors who received training use individual projects • 70% of instructors who did not receive training use individual projects

  46. Research Question 3 • Training and group project use were not found to be significantly related • 36.3% of instructors who received training use group projects • 24% of instructors who did not receive training use group projects

  47. Research Question 3 • Training and problem-solving activity use were not found to be significantly related • 41.9% of instructors who received training use problem-solving activities • 30% of instructors who did not receive training use problem-solving activities

  48. Research Question 3 • Training and self-assessment use were not found to be significantly related • 16.1% of instructors who received training use self-assessment • 6% of instructors who did not receive training use self-assessment

  49. Research Question 3 • Training and other methods of assessment use were not found to be significantly related • 12.1% of instructors who received training use other methods of assessment • 8% of instructors who did not receive training use other methods of assessment

  50. Research Question 4 • Are there differences in the number of different types of assessments being used per course by each instructor? • A chi-square frequencies test was conducted
