Deconstructing Standard 2c
Angie Gant, Ed.D.
Truett-McConnell College
2c: Use of Data for Program Improvement
Sub-elements for meeting element 2c:
• Unit uses data to evaluate the three components.
• Unit uses data to make changes in the three components.
• Faculty have access to data/data systems.
• Data are shared with faculty/candidates to work toward improvement.
Handout Overview
• Rubric for determining scenario evaluations: Page 1
• CI Model (moving toward target): Page 2
• Scenarios: Pages 3-6
Sub-elements of Standard 2c (1)
Unit uses data to evaluate the three components.
“The professional education unit regularly and systematically uses data, including candidate and graduate performance information, to evaluate the efficacy of its courses, preparation programs, and clinical experiences.”
• 3 AFIs cited
Scenario One
The evidence room contained Program Advisory Committee schedules and minutes spanning three years. The documents outlined the specific dates on which the Committee analyzed and summarized Field Experience Observation data, faculty reviews, Principal Surveys, and Candidate GACE Performance data, as well as the recommendations the Committee made. During interviews, Program Advisory Committee members confirmed what was found in the evidence room. Members described specific changes in the assignment of field experience supervisors made after reviewing and analyzing student complaints, a review that occurs in May of each year. When asked how decisions were made regarding changes to the program, members described a recent meeting in which they determined the need for earlier emphasis on classroom management techniques in courses. Members discussed the process of analyzing results from Principal Surveys and Field Experience Observation Instruments, as well as Candidate Action Plans. One member explained that instruction addressing the creation of a classroom management plan now takes place in all junior-level courses.
List of Evidence (Scenario One)
• Analysis of Candidate Action Plans
• Analysis of Field Experience Observation Instruments
• Analysis of Principal Surveys
• Department Meeting Minutes (9/5/11)
• Department Meeting Minutes (10/2/11)
• Department Meeting Minutes (11/4/11)
• Department Meeting Minutes (12/1/11)
• Faculty Assessment Tables
• Program Advisory Committee Meeting Minutes (5/24/11)
• Program Advisory Committee Meeting Minutes (11/15/11)
Unacceptable, Acceptable, or Target?
Sub-elements of Standard 2c (2)
Unit uses data to make changes.
“The professional education unit analyzes preparation program evaluation and performance assessment data to initiate changes in the program and professional unit operations.”
• 2 AFIs cited
Scenario Two
The evidence room contained descriptions of the roles and responsibilities of unit faculty and P-12 partners in the administration and evaluation of the unit’s assessment system, but interviewees were unable to articulate their roles. The evidence room also housed three years of aggregated data from the Field Experience Observation Instrument. These data indicated that candidates struggled with incorporating technology. When faculty and clinical supervisors were asked about these data during interviews, no one seemed aware of any changes made to courses in response. When asked in interviews how decisions were made regarding changes in curriculum or assessment, members of the Program Advisory Committee struggled to answer.
List of Evidence (Scenario Two)
• Field Experience Observation Data (3 years)
• Field Experience Observation Form
• Program Advisory Committee Minutes (2/25/11)
• Roles and Responsibilities of Stakeholders
Unacceptable, Acceptable, or Target?
Sub-elements of Standard 2c (3)
Faculty have access to data/data systems.
“Faculty have access to candidate assessment data and/or data systems.”
• 0 AFIs cited
Scenario Three
The Institutional Report described the unit’s assessment system in detail, including dates for assessment completion and the roles of faculty, candidates, and other stakeholders in reviewing specific assessment data. During interviews, faculty and clinical supervisors articulated the changes made to processes and courses as a result of data reviews. In interviews, current and past candidates discussed their work on Student Study Teams. They described the process of analyzing results from faculty members’ evaluations of candidate performance over the past three years. These analyses revealed a weakness in candidates’ use of technology, and candidates had the opportunity to suggest specific ways to improve in this area. One of those improvements was the development of a new technology assessment, which is now completed during a pedagogy course (EDUC 3212). Faculty also described the value of the training on the Data Collection System and the Candidate Data Review, stating that they now have a better understanding of how their work fits into the bigger picture of candidate assessment.
List of Evidence (Scenario Three)
• Minutes from Faculty Training: Candidate Data Review (1/25/11)
• Minutes from Faculty Training: Data Collection System (3/30/11)
• Student Study Team Groups (List)
• Student Study Team Minutes (5/18/10)
• Student Study Team Minutes (5/20/11)
• Syllabus: EDUC 3212 (Pedagogy Methods)
• Teacher Education Program Timeline for Assessment (2010-2011)
• Teacher Education Program Timeline for Assessment (2011-2012)
• Technology Assessment (completed in EDUC 3212)
Unacceptable, Acceptable, or Target?
Sub-elements of Standard 2c (4)
Data are shared with faculty/candidates to work toward improvement.
“Candidate assessment data are regularly shared with candidates and faculty to help them reflect on and improve their performance and preparation programs.”
• 0 AFIs cited
Scenario Four
The evidence room contained minutes from the most recent Program Advisory Committee meeting, which revealed that the current order of coursework was not serving candidates well in the field and that a different sequence of classes would better support their field experiences. All stakeholders were made aware of the concerns, and the process for changing the order of coursework had begun. The evidence room also contained minutes from a department meeting held the previous year, which showed that GACE data had been examined. ECE candidates were consistently struggling to pass the language arts/social studies portion of the test, and an analysis of the sub-element data revealed that social studies was the area of concern. The social studies methods courses were revised to align more closely with the GACE frameworks. During interviews, current and previous candidates described the process by which their performance assessments were shared with them in End-of-Semester Conferences.
List of Evidence (Scenario Four)
• Candidate Coursework Order (current)
• Candidate Coursework Order (proposed)
• Department Meeting Minutes (8/1/10)
• End-of-Semester Conference Examples from two candidates
• End-of-Semester Conference Form
• Program Advisory Committee Meeting Minutes (10/12/11)
• Program Advisory Committee Members (including two candidates and two graduates)
• Syllabus: Social Studies Methods Course (prior to examination of GACE data)
• Syllabus: Social Studies Methods Course (revised)
Unacceptable, Acceptable, or Target?