
Introduction to CAP Quality Survey

This training provides an overview of the Competence Assessment Programme (CAP) approach, including the theory behind it, the self-evaluation process, and the CAP quality measurement questionnaire. It covers topics such as competency assessment methods, validity and reliability, and additional quality criteria. The training also emphasizes the importance of transparency, acceptability, comparability, and fairness in assessment. Participants will engage in self-evaluation and group discussions to deepen their understanding of CAP.

Presentation Transcript


  1. An introduction to the CAP quality survey. Presenter name & time. Resource based on the work of Liesbeth Baartman.

  2. Content of the training
     • CAP: what is it, and the theory behind it
     • Quality: explanation and discussion
     • Self-evaluation process and experiences
     • CAP quality measurement questionnaire: short demonstration, instructions for filling out the survey

  3. What is a Competence Assessment Programme? [Slide diagram: a CAP draws on multiple assessment methods, e.g. open questions, practice-based assessment, portfolios, multiple-choice questions, oral exams, short-answer questions, essays, and seminars.]

  4. CAP: a mix of methods
     • Competency is too complex to be assessed by any single method.
     • Complementarity: various assessment methods are combined, and the resulting mix is suitable for measuring competencies.
     • “Traditional” and “new” methods are combined.
     • The mix depends on the context: formative and/or summative?
     However, a CAP is not always easy to describe and demonstrate in training.

  5. Miller’s ‘pyramid of competence’ (1990)

  6. Validity & reliability
     • Validity:
       • What exactly is validity?
       • No longer seen as a single integrated concept
       • Argument-based approach
     • CAP quality measurement:
       • Contents & level: are all competencies covered? Knowledge, skills, attitude + performance?
       • Assessment method/format: is it suitable for measuring the competency?

  7. Validity & reliability
     • Reliability:
       • Competence / performance varies depending on the task
       • Standardization does not necessarily lead to higher reliability
       • Relies on human judgment
       • Often qualitative information
     • Key: the result should be independent of the specific assessor or circumstances:
       • Instruction / training of assessors
       • Multiple assessors and different tests
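
  As an illustrative aside (not part of the original slides), one common way to check whether a result is independent of the specific assessor is to have two assessors rate the same set of performances and compute a chance-corrected agreement statistic such as Cohen's kappa. The sketch below uses hypothetical pass/fail ratings of eight student performances; the names and data are invented for illustration only.

```python
# Minimal sketch: chance-corrected agreement (Cohen's kappa) between two assessors.
# All ratings below are hypothetical; this is not part of the CAP questionnaire.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Return Cohen's kappa for two equally long lists of ratings."""
    n = len(ratings_a)
    # Observed agreement: proportion of performances rated identically.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement by chance, from each assessor's marginal frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(ratings_a) | set(ratings_b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

assessor_1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
assessor_2 = ["pass", "pass", "fail", "fail", "fail", "pass", "pass", "pass"]
print(f"Cohen's kappa = {cohens_kappa(assessor_1, assessor_2):.2f}")  # ≈ 0.47
```

  A kappa well below 1 would suggest that assessor instruction/training, or combining multiple assessors and tasks, is needed before results can be treated as assessor-independent.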

  8. Additional new quality criteria
     • Formative function:
       • Does the assessment provide good feedback?
       • Do students learn something from the test?
       • Does the test have a positive impact on learning?
       • Self- / peer-assessment
     • CAP quality measurement:
       • Meaningfulness / fitness for self-assessment / educational consequences

  9. Fitness for purpose: All competences are assessed by an appropriate assessment method/format.

  10. Reproducibility: A combination of multiple assessors, assessment tasks and assessment situations is used.

  11. TRANSPARENCY: The criteria and standards used by a CAP are clear and understandable to all stakeholders.

  12. ACCEPTABILITY: Do the stakeholders agree on the design and implementation of the assessment?

  13. Comparability: A CAP is conducted in a consistent and responsible way.

  14. Fairness: Everyone should be able to demonstrate his/her competence. “Everybody is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid.” (attributed to Albert Einstein)

  15. Fitness for self-assessment/reflection: The CAP stimulates self-reflection and the formulation of one’s own learning goals.

  16. MEANINGFULNESS: Assessment feedback should help one learn.

  17. COGNITIVE COMPLEXITY: The assessment methods should allow students to demonstrate different levels of cognitive skills and thinking processes

  18. AUTHENTICITY: The assessment tasks should reflect the professional tasks

  19. Educational Consequences: The CAP has a positive influence on teaching & learning.

  20. Time and cost: Feasibility of developing and implementing the CAP.

  21. Self-evaluation

  22. Self-evaluation: stages
     • Stage 1: explain the criteria and clarify the assessment (CAP)
     • Stage 2: individual evaluation via the CAP quality measurement
     • Stage 3: group interview / discussion

  23. Stage 1: info session. Goals:
     • Give a clear description of the CAP
     • Explain the quality criteria and indicators
     • Demonstrate the self-assessment tool (questionnaire)
     • Explain what quality of evidence is expected as proof to support your judgment

  24. Description of your CAP

  25. Participants need to indicate whether or not they agree with the statements. Explain what “suitability for purpose” means. Provide a piece of evidence for each statement to support what you indicated on the Likert scale.
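
  As an illustrative aside (not part of the original questionnaire), a completed item can be thought of as a Likert rating plus the piece of evidence that backs it up. The field names, criterion, statement, and evidence text below are hypothetical examples, not taken from the actual CAP quality measurement instrument.

```python
# Minimal sketch of one completed questionnaire item: a Likert rating plus
# supporting evidence. All names and texts are hypothetical examples.
from dataclasses import dataclass

LIKERT = {1: "strongly disagree", 2: "disagree", 3: "neutral",
          4: "agree", 5: "strongly agree"}

@dataclass
class QuestionnaireItem:
    criterion: str   # quality criterion the statement belongs to
    statement: str   # statement to agree or disagree with
    rating: int      # 1-5 on the Likert scale
    evidence: str    # evidence supporting the chosen rating

item = QuestionnaireItem(
    criterion="Transparency",
    statement="The criteria and standards of the CAP are clear to all stakeholders.",
    rating=4,
    evidence="Assessment criteria are discussed with students before the assignment starts.",
)
print(f"{item.criterion}: {LIKERT[item.rating]} (evidence: {item.evidence})")
```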

  26. What type of evidence is expected?
     • Relevant / representative?
     • Complete?
     • Both positive and negative?
     • Types of evidence: “That's obvious”, “I observed this with my students”, “We asked the students”
     • => Shared consensus counts as proof!

  27. Examples of evidence: “Before the start of the assignment, we will discuss the assessment with the students.” “I think everything is clear, because I never get questions.”

  28. More examples: “that we do not know”, “I receive only positive reactions from thesis supervisors”, “but they don’t like the assessment format”.

  29. Next time…
     • Discuss the interpretation of the CAP quality measurement
     • Prepare the group interviews
     • Carry out the group interviews
