
Cheryl Ng Ling Hui and Hee Jee Mei, Ph.D, Universiti Teknologi Malaysia



  1. Teacher Practices in Scoring Multiple-Choice Items, Interpreting Test Scores & Performing Item Analysis
Cheryl Ng Ling Hui and Hee Jee Mei, Ph.D, Universiti Teknologi Malaysia

  2. Introduction • Multiple-choice items are widely used to assess student learning; they feature prominently in public examinations and likewise in teacher-made tests. • Research Project – to develop a tool to assist teachers in scoring multiple-choice items, interpreting test scores and performing item analysis. • Thus, this preliminary needs analysis study was conducted…

  3. Aim and Purposes of the Study • Objectives: • Identify teachers’ current practices and the challenges they face in scoring multiple-choice items, interpreting test scores and performing item analysis; and • Ascertain the prevalence of OMR scanners and item analysis software in Malaysian secondary schools. • Aim: Findings from this preliminary study serve to inform the subsequent development of a software tool to aid teachers in these three aspects.

  4. Methodology • Research Design: Survey • Participants: 130 secondary school teachers serving in public schools across Malaysia. • Instrument: Questionnaire with 3 sections: Section A: Demographic Profile – gender, location and type of school, teaching experience Section B: Teacher Practices – scoring OMR answer sheets, interpreting test scores, performing item analysis Section C: OMR Scanner and Item Analysis Software – availability in school, willingness to invest in one, etc.

  5. Methodology Procedures: • Questionnaires were disseminated to various schools with the help of colleagues and friends. • An online version of the questionnaire was created using a free online survey tool and the link was sent to potential respondents. • The collected data were analysed using descriptive statistics.

  6. Respondents’ Demographic Profile • Number of respondents, N = 130 teachers • Number of schools represented = 74 schools • Teaching experience: 0–33 years

  7. Teachers’ Practices: Scoring • Teachers’ estimates of the amount of time (in seconds) needed to score one OMR answer sheet (assuming that there are 40 items), count the number of correct answers and convert the raw score to a percentage: Table 1 Means and Standard Deviations for Teachers’ Practices in Scoring

  8. Teachers’ Practices: Scoring • Teachers have to spend a substantial amount of time processing OMR answer sheets when they do it manually. Based on the mean of teachers’ estimates, one OMR answer sheet takes around 101.21 seconds (i.e. 1.69 minutes) to mark. 5 classes × 40 students = 200 OMR answer sheets; 200 sheets × 1.69 minutes ≈ 5 hours 38 minutes

  9. Teachers’ Practices : Scoring • They would need to spend even more time if they were to do the following (commented by individual teachers): • “Check if two answers were marked for one question” (3s) • “Make a note of items answered wrongly by students” (120s) • “Recheck for errors in marking” (20s) • “Key in the data” (60s)
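The manual steps above (checking for double-marked answers, counting correct responses, noting wrongly answered items, converting the raw score to a percentage) are exactly what a scoring tool would automate. A minimal sketch in Python, using a hypothetical answer key and student responses (the function name and data layout are assumptions, not the study's actual tool):

```python
# Minimal sketch of automated MCQ scoring (hypothetical data).
# Each response is a set of marked options; zero or multiple marks = invalid.

def score_sheet(responses, answer_key):
    """Return (raw_score, percentage, flagged_items) for one answer sheet."""
    raw_score = 0
    flagged = []  # items double-marked, unmarked, or answered wrongly
    for i, (marked, correct) in enumerate(zip(responses, answer_key), start=1):
        if len(marked) != 1:          # "two answers marked for one question"
            flagged.append(i)
        elif marked == {correct}:
            raw_score += 1
        else:
            flagged.append(i)         # note items answered wrongly
    percentage = 100 * raw_score / len(answer_key)
    return raw_score, percentage, flagged

# Hypothetical 5-item test (the study assumes 40 items per sheet)
key = ["A", "C", "B", "D", "A"]
student = [{"A"}, {"C"}, {"B", "C"}, {"D"}, {"B"}]
print(score_sheet(student, key))  # (3, 60.0, [3, 5])
```

Scoring one sheet this way takes milliseconds rather than the 101 seconds teachers estimated for manual marking, which is the efficiency argument the slides build toward.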

  10. School Requirements to Interpret Test Scores Mean = 3.09, S.D. = 1.406

  11. Teachers’ Practices – Interpreting Test Scores

  12. Challenges in Interpreting Test Scores • “I struggle with time constraints.” (106 respondents, 81.54%) • “I do not know how to do it.” (44 respondents, 33.85%) Other comments: • “Too much paper work.” • “Do not do it as I find the result irrelevant.” • “Do not face any challenges (not required to do it / the SU Peperiksaan, i.e. the school’s examination secretary, does it for teachers)”

  13. School Requirements to Perform Item Analysis Mean = 3.44, S.D. = 1.341

  14. Teachers’ Practices – Performing Item Analysis

  15. Challenges in Performing Item Analysis • “I struggle with time constraints.” (115 respondents, 88.46%) • “I do not know how to do it.” (42 respondents, 32.31%) Other comments: • “The information from item analysis is considered irrelevant by students.” • “Do not do it as I find the result irrelevant.” • “Do not face any challenges (not required to do it)”

  16. Prevalence of OMR Scanners & Item Analysis Software • Out of the 74 schools represented in this study, only 13 schools (18%) have invested in an OMR scanner and 16 schools (22%) provide teachers with item analysis software.

  17. Teachers’ Willingness to Invest in the Tool

  18. Teachers’ Willingness to Invest in the Tool • Among those who are willing to invest (N=84), the respondents gave a wide range of answers when asked how much they are willing to pay for such a tool. • Their answers range from RM5 (min) to RM3000 (max). • The most common response was RM100 (mode, 22 respondents). The median was slightly higher at RM150. • The mean was RM337.65, with a standard deviation of RM512.97.
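The descriptive statistics on this slide (min, max, mode, median, mean, SD) can be reproduced with Python's standard `statistics` module. The `amounts` list below is purely illustrative; the study's 84 raw responses are not available here:

```python
# Sketch of the willingness-to-pay summary statistics, computed with
# Python's statistics module. `amounts` is hypothetical illustrative data.
import statistics

amounts = [5, 100, 100, 100, 150, 200, 500, 3000]  # RM, illustrative only

summary = {
    "min": min(amounts),
    "max": max(amounts),
    "mode": statistics.mode(amounts),      # most common response
    "median": statistics.median(amounts),  # middle value of sorted data
    "mean": round(statistics.mean(amounts), 2),
    "sd": round(statistics.stdev(amounts), 2),  # sample standard deviation
}
print(summary)
```

Note how a single extreme response (RM3000 here, as in the study) pulls the mean well above the median and mode, which is why the slide reports all three.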

  19. Discussion • Teachers have to spend a substantial amount of time scoring OMR answer sheets, interpreting test scores and performing item analysis if they do it manually. • Time constraint is the greatest challenge that hinders teachers from interpreting test scores and performing item analysis. • Therefore, a tool to aid teachers in scoring OMR answer sheets would greatly reduce teachers’ burden and allow them more time to focus on enhancing test items and improving their teaching practice. • However, the tool should be made available to teachers at an affordable price range (RM100–RM350).

  20. Discussion • Interpreting test scores and performing item analysis are crucial aspects of the assessment process. • Interpreting test scores gives teachers insight into their students’ performance in the test. It also helps to highlight areas of weakness that need to be addressed. • Information from the item analysis is crucial for improving item quality. From the item analysis results, teachers can decide whether to retain, modify or discard test items. Functional and modified items could be stored for future use. (Gronlund and Linn, 1990; Reynolds, Livingston and Willson, 2010)
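The retain/modify/discard decision is conventionally based on two classical item-analysis statistics: the difficulty index p (proportion of students answering the item correctly) and the discrimination index D (proportion correct in the upper-scoring group minus that in the lower-scoring group, conventionally the top and bottom 27%). A minimal sketch, assuming dichotomously scored (0/1) item data; the function and data shape are illustrative, not the study's tool:

```python
# Sketch of classical item analysis: difficulty index p and
# discrimination index D (upper minus lower group). Hypothetical data.

def item_analysis(score_matrix):
    """score_matrix[s][i] is 1 if student s answered item i correctly, else 0.
    Returns one (difficulty, discrimination) pair per item."""
    n_students = len(score_matrix)
    n_items = len(score_matrix[0])
    # Rank students by total score, highest first
    ranked = sorted(range(n_students),
                    key=lambda s: sum(score_matrix[s]), reverse=True)
    k = max(1, round(0.27 * n_students))   # conventional 27% groups
    upper, lower = ranked[:k], ranked[-k:]
    results = []
    for i in range(n_items):
        p = sum(score_matrix[s][i] for s in range(n_students)) / n_students
        d = (sum(score_matrix[s][i] for s in upper) / k
             - sum(score_matrix[s][i] for s in lower) / k)
        results.append((round(p, 2), round(d, 2)))
    return results

# Hypothetical 4 students x 2 items
print(item_analysis([[1, 1], [1, 0], [0, 1], [0, 0]]))
```

Items with very high or very low p, or with D near zero (or negative), are the ones a teacher would flag for modification or removal; automating this is precisely what an item-analysis tool offers.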

  21. Discussion • Requirements to interpret test scores and perform item analysis vary from school to school. If teachers neglect to interpret test scores and perform item analysis, then the value of the assessment process is greatly reduced. • Suggestion: Teachers should be required to interpret test scores and perform item analysis as part of their professional practice. • However, they should be released from the burden of clerical work (i.e. manually scoring OMR answer sheets and keying in the data) which could be done effectively with the assistance of an OMR processing and item analysis tool.

  22. Discussion • In conclusion, there is a need to develop an affordable and teacher-friendly tool to help teachers to process OMR answer sheets and provide them with information on students’ performance in the test as well as the quality of the items.

  23. ~Thank you~ Cheryl Ng Ling Hui Universiti Teknologi Malaysia cherlinghui@gmail.com http://intoherworldofteaching.wordpress.com
