Developing Assessments
Christine Coombe, Dubai Men's College
Agenda
• Are you testwise?
• Cornerstones of assessment
• 10 things to remember about developing assessments
• Cornerstones case study (time permitting)
• Questions and comments
1. Assessment is an integral part of the teaching/learning cycle.
• It involves:
  • Planning
  • Development
  • Administration
  • Analysis
  • Feedback
  • Reflection
The circle of interrelationships
[Diagram: a cycle of interrelationships among standards/program objectives, approach, needs analysis, syllabus, students, materials, teaching, assessment, and analysis and feedback.]
2. Teachers are involved with several phases of assessment simultaneously.
• The overall nature of assessment necessitates this.
• Assessment development should be a collegial activity in which we benefit from contributions and feedback from faculty.
• It is not unusual to be analyzing a previous exam while writing questions for the next one.
• Prepare grading criteria and answer keys along with the test.
3. Keep the cornerstones in mind when developing assessments.
• Assessment and testing: many forms, same principles
• Guiding principles:
  • Validity
  • Reliability
  • Practicality
  • Washback
  • Authenticity
  • Transparency
  • Security
Validity
• Does the assessment instrument measure what it is supposed to measure?
• Appropriateness of the measurement for a particular purpose
• Types of validity: content, construct, face, and other types
Types of validity
• Content validity
  • Assessing what and how you teach
  • Assessment of course content with clear reference to goals and outcomes
  • Use of formats and tasks familiar to students
• Construct validity
  • "Fit" with the programme's underlying theories and methodology of language learning
• Face validity
  • Credibility or acceptance of a measurement instrument on the basis of its appearance
Reliability
• Reliability refers to the consistency of test scores:
  • The same test given at different times to the same students should produce similar results.
  • Raters or markers for a task should have consensus on grading.
• Factors that affect reliability:
  • Test factors: item construction, formats, test length
  • Administrative factors: setting, procedures, timing
  • Scoring factors: a clear marking key, intra- and inter-rater reliability
  • Affective factors: candidates' test-taking strategies, familiarity with formats
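Inter-rater consistency is one aspect of reliability that can be checked with very little arithmetic. Below is a minimal sketch, assuming two raters have each marked the same ten essays on a 0–5 band scale; the scores are invented for illustration, and percent agreement plus Cohen's kappa is just one common way to quantify agreement.

```python
# Minimal sketch: estimating inter-rater reliability for two markers who have
# each scored the same ten essays on a 0-5 band scale. The scores below are
# invented for illustration only.
from collections import Counter

rater_a = [3, 4, 2, 5, 3, 4, 1, 3, 4, 2]
rater_b = [3, 4, 3, 5, 3, 3, 1, 3, 4, 2]

n = len(rater_a)

# Simple percent agreement: how often the two raters award the same band.
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Cohen's kappa corrects that figure for agreement expected by chance.
counts_a = Counter(rater_a)
counts_b = Counter(rater_b)
expected = sum((counts_a[k] / n) * (counts_b[k] / n)
               for k in set(counts_a) | set(counts_b))
kappa = (observed - expected) / (1 - expected)

print(f"Observed agreement: {observed:.2f}")
print(f"Cohen's kappa:      {kappa:.2f}")
```

A kappa close to 1 indicates strong agreement once chance is taken into account; a low value is a signal that markers need the kind of training and moderation described on the next slide.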
Improving reliability
• Good test construction
  • Allow time for test development
  • Use specifications
  • Item writing, use of analysis, clear and unambiguous directions, editing
• Consistent administration
• Scoring procedures
  • Objective scoring when feasible; training and moderation for scorers of subjective sections
• Learner training and preparation
Practicality
• Availability of time and resources
• The assessment cycle
  • Adequate time for preparation, grading and analysis
  • Feedback is an important component
• Practicality is often the reason that teachers do not become involved in the assessment process
Washback
• Washback is the effect of assessment on teaching and learning.
• Negative washback: test-driven curricula, teaching to test formats and contents
• Positive washback: all stakeholders are aware of the learning goals; assessment marks progress and accomplishment towards these outcomes, measured in different ways
Authenticity
• Our aim is to prepare students to function in the real world.
• Whenever possible, assessment should mirror real world situations and contexts:
  • Formats and tasks
  • Authentic use of the target language
• Authenticity is motivating for students!
Transparency
• The availability of clear, accurate information to students about assessment
• Information should include:
  • What they have to do to succeed; the outcomes
  • Expected content and format
  • Time allocated for the task; deadlines
  • Weighting of the task
  • Grading criteria
  • Useful feedback for improvement
Security
• What's your policy on assessment security?
• Students:
  • Cheating, plagiarism or any other kind of intellectual dishonesty is forbidden.
• Staff:
  • There are clear security guidelines for all stages of assessment that must be followed.
  • There are severe consequences for breaches of security.
4. Specifications link learning outcomes with assessments.
• Use 'specs' to promote validity and reliability.
• Tailor your specs to the audience (i.e., student, teacher, item writer, administrator).
• Specs help us to be open about our rationale, procedures and results.
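To make the idea concrete, here is a minimal sketch of what one entry in a test specification might record for a single task. The field names, values and level of detail are illustrative assumptions, not a prescribed template.

```python
# Minimal sketch of one test specification entry, linking a learning outcome
# to the way it will be assessed. All fields and values are hypothetical.
reading_spec = {
    "outcome":      "Identify main ideas in a 400-word expository text",
    "skill":        "reading",
    "task_format":  "multiple choice, 4 options",
    "num_items":    8,
    "time_minutes": 15,
    "weighting":    "15% of the course grade",
    "source_texts": "unseen texts at the level of the course textbook",
    "scoring":      "1 mark per item, answer key provided to all markers",
}

# Print the spec as a simple checklist for item writers and reviewers.
for field, value in reading_spec.items():
    print(f"{field:>13}: {value}")
```

A spec written at this level of detail lets different item writers produce comparable versions of the same task, and lets students, teachers and administrators each be given the parts that are relevant to them.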
5. Employ multiple measures.
• No single type of assessment can give us all the information we need to accurately gauge a student's proficiency or level of ability/mastery.
• Use traditional tests along with alternative forms of assessment.
• Keep the percentage weighting of each assessment type 'low stakes', so that no single measure dominates the grade.
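As a back-of-the-envelope illustration of that last point, the sketch below spreads a course grade across several assessment types; the components, weights and scores are invented purely to show the arithmetic, not to recommend a particular scheme.

```python
# Minimal sketch: a course grade built from several assessment types so that
# no single measure is high stakes. All components, weights and scores are
# illustrative assumptions.
components = {
    "quizzes (average)":  {"weight": 0.15, "score": 78},
    "portfolio":          {"weight": 0.20, "score": 85},
    "oral presentation":  {"weight": 0.15, "score": 90},
    "project":            {"weight": 0.20, "score": 72},
    "end-of-course test": {"weight": 0.30, "score": 66},
}

# The weights should sum to 100% of the grade.
assert abs(sum(c["weight"] for c in components.values()) - 1.0) < 1e-9

final = sum(c["weight"] * c["score"] for c in components.values())
print(f"Final grade: {final:.1f}")
for name, c in components.items():
    print(f"  {name}: {c['weight']:.0%} of the grade")
```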
6. Test what has been taught, and how it has been taught.
• This is the basic concept of content validity.
• Only assess students on what you have covered in class or on content from the textbook that has been assigned.
• Use formats that students are familiar with and have practiced beforehand.
• Choose formats that are authentic, purposeful and mirror real life contexts.
7. Specify the material to be tested.
• Be transparent.
• Students need information on what they will be tested on, how they will be tested, and the criteria upon which they will be assessed.
• The more transparent you are in your assessment practices, the lower your students' test anxiety will be.
8. Provide timely feedback.
• For feedback to do any good, it must be provided in a timely manner.
• Tailor your feedback to the audience.
• Make feedback to students and administrators something they can actually use.
9. Make time to reflect on your assessments.
• The cycle does not stop with obtaining grades.
• Schedule time for analysis and reflection.
• Important information comes from learning how students performed on tests.
• Ask yourself questions like:
  • Did your assessment serve its purpose?
  • What improvements will you make for next time?
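One routine way to learn how students performed is a quick item analysis. The sketch below, run on a small invented set of results, computes a facility (difficulty) index and a simple top-group-minus-bottom-group discrimination index for each item; the data, group sizes and layout are assumptions for illustration only.

```python
# Minimal sketch: post-test item analysis on invented results.
# scores[s][i] is 1 if student s answered item i correctly, else 0.
scores = [
    [1, 1, 0, 1, 1],
    [1, 1, 1, 1, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
]

n_items = len(scores[0])
totals = [sum(row) for row in scores]

# Facility index: proportion of students answering each item correctly.
facility = [sum(row[i] for row in scores) / len(scores) for i in range(n_items)]

# Discrimination index: correct rate in the top third minus the bottom third,
# where students are ranked by total score.
ranked = [row for _, row in
          sorted(zip(totals, scores), key=lambda pair: pair[0], reverse=True)]
third = max(1, len(ranked) // 3)
top, bottom = ranked[:third], ranked[-third:]
discrimination = [
    sum(row[i] for row in top) / third - sum(row[i] for row in bottom) / third
    for i in range(n_items)
]

for i, (f, d) in enumerate(zip(facility, discrimination), start=1):
    print(f"Item {i}: facility={f:.2f}, discrimination={d:+.2f}")
```

Items that almost everyone gets right or wrong, or that fail to separate stronger from weaker students, are good candidates for revision before the next administration.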
10. Professional development in assessment benefits everyone.
• This session is just the tip of the iceberg.
• More sessions needed on topics like:
  • Item writing
  • Grading
  • Analysis
  • Feedback
• HCT's PD course in assessment
• Current Trends in English Language Testing Conference (CTELT), Nov 2010 at DMC
• 2nd Edition of Fundamentals of Language Assessment volume & manual
Cornerstones Case Study
• Use your materials as you work with others to review the case of Mrs. Wright and Mr Knott.
• Remember, help is an email away!
• www.christinecoombe.com
• Christine.coombe@hct.ac.ae