
College of Education & Human Development unit assessment system

This session provides an overview of the Unit Assessment System, including its implementation process, the importance of data analysis, and the annual program-level reports. Attendees will gain an understanding of the system's impact on decision-making and the College's expectations.


Presentation Transcript


  1. College of Education & Human Development unit assessment system. Step 1: Program-level

  2. Welcome • The Unit Assessment System as a decision-making framework • The importance of the changes we are implementing • Your leadership role • The College’s expectations

  3. Objectives • To provide an overview of the Unit Assessment System • To review the “big picture” – how and why we are implementing the Unit Assessment System • To describe the annual timeline for implementation of the system • To distribute and describe the Program Data Yearbooks • To describe the format and content of the annual program-level reports

  4. The Big Picture • A little history: NCATE 2003 → 2013 • Program-level experiences with SPAs and GMU’s APR • NCATE Unit Standard 2 • What a focused visit means, and why we are having one • Since 2011 • CEHD Reorganization • Divisions • API • Continuous improvement & core values • Unit Assessment System & strategic decision making for CI

  5. What is the Unit Assessment System? • An integrated decision-making framework that involves multiple levels of decision making

  6. What is the Unit Assessment System? • At the program level, APCs and program faculty review evidence in the Data Yearbook to inform decisions related to improving candidate performance on standards • Program decisions and strategic goals inform Division Directors’ decisions related to strategic goals, resources, and staffing • The above informs the Executive Team and the Dean on decisions related to resource allocation, organizational structures and processes, and strategic goals and objectives adopted for the unit.

  7. Timeline for the Unit Assessment System

  8. Activity: Using the Data Yearbook • What questions would you like to answer about your programs, based on a review of the following types of data?

  9. GALLERY WALK & (brief) BREAK

  10. Program review process • How well are candidates performing on each of your key assessments, across the calendar year? • How well are candidates performing on standards, across the calendar year? • What opportunities exist for continuous improvement of your program? • What objectives for improvement will you commit to as a program?

  11. Components of the data yearbook • For each category of evidence: • What is this evidence? • Where did it come from? • Suggested ways of using it • What might this evidence imply? • What other evidence might you use to triangulate?

  12. An action research approach

  13. Candidate information • Admissions data: # of applicants, # accepted or denied admission, etc. • Admissions is the “gateway” into your program • What do trends suggest? • Are you satisfied that your process yields desired outcomes? • Candidate demographics • Snapshot of diversity represented among candidates in your program
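
As an illustration of the kind of trend question this admissions data can answer, here is a minimal pandas sketch that tabulates applicants, acceptances, and acceptance rates by year. The file name and column names (admit_year, decision) are hypothetical placeholders, not fields from the actual Data Yearbook.

```python
import pandas as pd

# Hypothetical admissions extract; the file and column names below are
# placeholders, not the Data Yearbook's actual schema.
admissions = pd.read_csv("admissions.csv")  # columns: applicant_id, admit_year, decision

# Applicants, acceptances, and acceptance rate per year -- a quick look at
# whether the "gateway" into the program is trending as intended.
by_year = admissions.groupby("admit_year")["decision"].agg(
    applicants="count",
    accepted=lambda d: (d == "accepted").sum(),
)
by_year["acceptance_rate"] = by_year["accepted"] / by_year["applicants"]
print(by_year)
```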

  14. Assessment of candidate performance • Candidate performance on key assessments (by assessment “bin”) • Are candidates in our program demonstrating what they know and are able to do with consistency? • Are there specific standard elements on which candidates seem to excel, or to have difficulties? • What do the data suggest about assessment processes? • Candidate performance disaggregated • Data will be distributed in February • Candidate assessment of dispositions (2013)
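
A similarly hedged sketch of the disaggregation step: mean scores per assessment “bin” and standard element, then the same evidence split by demographic group. All column names here are assumptions for illustration, not the yearbook's real structure.

```python
import pandas as pd

# Hypothetical key-assessment extract; column names are placeholders.
scores = pd.read_csv("key_assessment_scores.csv")
# columns: candidate_id, assessment_bin, standard_element, score, demographic_group

# Mean score per assessment bin and standard element: which elements do
# candidates excel on, and which give them difficulty?
by_element = scores.pivot_table(
    index="assessment_bin",
    columns="standard_element",
    values="score",
    aggfunc="mean",
)

# The same evidence disaggregated by demographic group.
disaggregated = scores.groupby(["assessment_bin", "demographic_group"])["score"].mean()

print(by_element.round(2))
print(disaggregated.round(2))
```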

  15. Graduate and employer surveys • Mason graduate exit survey (May 2012 grads) • CEHD graduate exit survey • Satisfaction with various aspects of candidate experience at GMU and in your program • May help answer “why” questions • CEHD graduate follow-up survey (2013) • CEHD employer follow-up survey (2013)

  16. Internship, field experiences • Internship/supervisor qualifications • Internship/supervisor demographics • What are the characteristics of field supervisors? • How does this relate to the quality and diversity of field experiences? • Internship/field placement supervisor evaluations • What do candidates say about supervision? • Internship/field placement site characteristics • How diverse are placement sites?

  17. Faculty information • Faculty qualifications (by f/t, p/t, adjunct) • Does your program have a sufficient cadre of highly qualified instructors? • Faculty demographic characteristics • Are candidates taught by a diverse group of instructors? • Course evaluations (by on/off campus; f/t, p/t, adjunct) • Do candidates perceive teaching to be high quality? • Course syllabus review
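
To make the course-evaluation breakdown concrete, a small sketch comparing mean ratings by instructor status and campus; the rating scale and field names are assumed for illustration only.

```python
import pandas as pd

# Hypothetical course-evaluation extract; field names and the 1-5 rating
# scale are assumptions, not the College's actual instrument.
evals = pd.read_csv("course_evaluations.csv")
# columns: course_id, instructor_status (full-time/part-time/adjunct),
#          campus (on/off), overall_rating (1-5)

# Mean rating and response count by instructor status and campus, to see
# whether perceived teaching quality differs across these groups.
summary = evals.pivot_table(
    index="instructor_status",
    columns="campus",
    values="overall_rating",
    aggfunc=["mean", "count"],
)
print(summary.round(2))
```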

  18. Program accreditation • State accreditation matrix • Standards alignment crosswalk

  19. Report template • Part 1: Program goals • What, if any, goals and objectives did your program pursue during 2012? • Part 2: Candidate performance • How well are candidates performing on each key assessment, across the calendar year? • How well are candidates performing on standards, across the calendar year? • What evidence did you consult to support your conclusions?

  20. Report template • Part 3: Examination of program data • What do candidate admissions and demographic data tell you about the quality, quantity, and diversity of candidates? • What do candidate and employer surveys suggest about program efficacy? What, if any, areas represent a concern? • What does evidence related to internship and field experiences (if applicable) suggest about the quality, quantity, and diversity of placements? • What does evidence suggest about the quality, quantity, and diversity of faculty, including candidate evaluation of faculty teaching?

  21. Report template • Part 4: Program improvement objectives • What opportunities exist for improvement? • What are your program’s long-term (3-5 years) and/or short-term (1 year) goals and objectives? • What resources do you need to accomplish these? • Part 5: CI: Program assessments • What have you done to study assessment consistency? • What have you found as a result? • What changes have you made?

  22. What the program report is…& is not • The program report is… • Evidence of your program faculty’s examination of data related to continuous improvement; • A conduit for your program to communicate its accomplishments, goals and resource needs • A means for division directors & deans to learn from you about your program • The program report is not… • A repetition of the data presented in the yearbook

  23. Activity: What to do now? • How will you take this information back to program faculty? • What is your action plan? • How will evidence in the Data Yearbook help you answer the questions you posed earlier?

  24. Dates & times for support • Tuesday, February 26th, 1-2 pm • PhD, PE, Ed Psych, IOT, LT • Tuesday, February 26th, 2-3 pm • ELMS • Tuesday, March 5th, 11am-Noon • SPED • Tuesday, March 5th, Noon-1pm • APTDIE • PLUS – BY APPOINTMENT, IF NEEDED

  25. Questions The complaint department is currently closed. However, if you have questions… (Libby will be happy to answer them)
