Linking the conceptual framework to assessment and evaluation

Presentation Transcript


  1. Linking the conceptual framework to assessment and evaluation Kathe Rasch, Maryville University, St. Louis, MO

  2. Acknowledgements • This session is informed by the courageous and pioneering work of Mary Diez and Alverno College • Her willingness to share with all of us (including some of the ideas presented today) continues to inspire us to examine possibilities

  3. Wanted: A clear picture of the teacher • The conceptual framework • Provides a picture of the knowledge, skills, attitudes, values, and dispositions of the teacher/advanced professional you aim to develop • Needs to be clear to your faculty, students, and partners

  4. The conceptual framework • Can be tested against research, consensus documents like NBPTS propositions and standards, INTASC principles, and state standards • Encompasses what is most important to institutional mission • Articulates fundamental beliefs

  5. The conceptual framework • Is the basis for the design of assessment • Provides the framework for program evaluation

  6. Assessment and Evaluation • Assessment is concerned with “knowing students”: their abilities, knowledge, skills, etc. • Evaluation is about “knowing courses/programs/degrees,” etc. • Thus, assessment data can inform evaluation studies, not the other way around. • William Scott, University of Bath

  7. How does the conceptual framework help to devise an assessment system? • Knowledge, skills, and dispositions manifest themselves in behaviors • What matters most becomes clear • Developing teachers in relation to the framework takes time • The tenets of the framework cut across the programs

  8. And what the framework lays out… • NOTE: Knowledge, skills, and dispositions are: • Complex • Developmental, i.e., able to be extended throughout life • Relevant in multiple settings

  9. Assessing this complexity • Begin with the complexity • What are some of the most important abilities for your program? • Are they reflected in the conceptual framework? • How do you think that you develop them now? • What would you like to be able to do?

  10. For example, a central ability • You can use INTASC or your program outcomes to identify a broad ability • The ability to • Apply principles of human development and learning in the design of instruction • Use reflection to guide practice

  11. How can such an ability be developed and assessed across the program? • Where do you currently start building an understanding of human development? • How do you assess it? How might you want to assess it?

  12. How can an ability be developed across the courses/experiences of a program? • How do you build the ability to apply that understanding? • How do you assess it? How might you want to assess it?

  13. How can an ability be developed across the courses/experiences of a program? • What should prospective teachers/counselors/principals be able to do regarding this ability when they are ready to be licensed? • How do you assess it? How might you want to assess it?

  14. One example of such a progression (Alverno and Maryville) • Beginning life-span development course: “Take a Learner to Lunch” (analysis of an interview with a 4-year-old in relation to Piaget and Erikson) • Field experience logs: observations about groups of learners based upon expected physical, emotional, and intellectual characteristics for age

  15. One example of a progression • Methods courses: a case study on a student with reading/language delays • Student teaching: work sampling that includes instructional decisions and assessment data for learners at different developmental levels

  16. Conceptual Framework to Assessment System (Diez) • The difference between an assignment and an assessment is in the larger relationships implied in the latter • Etymological definition of assessment: “To sit down beside” • An assessment system requires, over time, successively more complex demonstrations of the target abilities.

  17. Assessment as learning • Public, explicit outcomes tied to the conceptual framework • Prompts/assignments/tasks that allow for the differentiation of performance • Mutually agreed upon criteria (tied to past experience and standards) that describe the expected performance

  18. Assessment as learning and measures of performance • Feedback from a range of assessors, who also have a chance to share their feedback • Self-assessment that develops and measures the learner’s knowledge of the skill and the performance

  19. For Example • The conceptual framework stresses the importance of the school developing democratic citizens • Long-term task: candidates are asked to assess differences in access to knowledge and resources across varying school districts in the county and to give their assessment of educational opportunity

  20. What does this assessment reveal? • Ability to gather information • Values and beliefs about resources • Knowledge of the community • Ability to synthesize the collective information gathered • Willingness to take risks and take a stand

  21. How might this elicit a range of performances? • Multiple ways that a student can participate • Complexity of evaluation • No clear right/wrong answers

  22. What criteria might describe this performance? • Shows the ability to gather and analyze data • Shows how candidates seek and use current information • Demonstrates candidates’ ability to pool information and work together • Demonstrates candidates’ understanding of important tenets of democracy

  23. About dispositions • We have no trouble identifying students who we believe have highly acceptable or highly unacceptable dispositions • It’s those in the middle… • We can describe observable behaviors that might suggest certain dispositions.

  24. About dispositions • We have to identify what they are and where they appear in the conceptual framework • Students must be able to self-assess

  25. And on to a system • How do we make it flow? • How is it usable for both candidates and program improvement? • How can we keep it true to the conceptual framework and serve the external systems of accountability? • How do we acknowledge individual candidate variations?

  26. And on to a system • How do we hone our ability to portray candidate and program development over time? • How do we ensure multiple measures and different modalities for students to demonstrate their proficiency? • And what about rigor? • What are the key decision points in the system?

  27. Whom do we involve? • Internal and external constituencies: • Candidates • Faculty • Partner school faculty • District partners • Arts and Sciences faculty

  28. On to fairness and consistency • Multiple measures require multiple evaluators and a long-term commitment to developing assessments • If evaluators don’t understand the framework in their own terms, it is hard for them to assess • Role of technology

  29. The program should build: • What we know about assessing the knowledge, skills, and dispositions that emerge from the framework • At the beginning • In development • At program exit • In light of P-12 student learning • Across time

  30. Sharing and debating • In program development • In program implementation • As candidate performance becomes more complex across time

  31. Putting the system into place (Fullan) • Change is a journey, not a blueprint • You can’t mandate what matters • Problems are our friends

  32. Should this help with program evaluation? • ABSOLUTELY! • If you don’t think about program evaluation at the same time, the efforts will be stymied.

  33. OK, so I have to go home… • You are developing a conceptual framework • Don’t assume that your current assessments are not valuable • Use what you have • Work to make it (and the results) more public, explicit, and consistent across the program

  34. Don’t be afraid to look at the real gaps • Look at progression with regard to rigor and complexity • Use your assessment data to look at how your candidates perform on the goals/outcomes you have set at different levels of the program

  35. Choose new assessments and the system carefully • Choose decision and data points carefully • Think about utility, feasibility, rigor, and opportunities for renewal in designing your system

  36. Resources to help • AACTE: the PETE project • INTASC: the INTASC Academies (July 2003 at Alverno; see www.ccsso.org) • NBPTS • Alverno: the Assessment Workshop (June 2003 at Alverno; see www.alverno.edu) • The Renaissance Group

  37. Final Thoughts • The conceptual framework is every institution’s most valued and guarded intellectual property • This journey must be for you, not because NCATE made you do it • Your faculty care about producing good teachers
