Assessment Workshop – College of Education
Katherine Perez and Jacqueline Peña
Office of Planning and Institutional Effectiveness
January 2009
SACS Accreditation (http://www.sacs.org)
• Key SACS Deadlines:
  • September 10, 2009—Report due
  • March 8-12, 2010—Onsite Review
• Key IE Deadlines:
  • February 1, 2009
    • Fall 2008 SLO matrices (with results and use of results)
  • June 1, 2009
    • Spring 2009 SLO matrices (with results and use of results)
    • 08-09 PO matrices (with results and use of results)
    • 08-09 Operational Objectives (with results and use of results)

Why Focus on Assessment?
• Continuous improvement
• Institutional Effectiveness
• Sound, research-based and assessment-based decisions
Blank SLO Matrix 08-09
Link to the Mission: http://w3.fiu.edu/irdata/portal/inst_effectiveness
Clear SACS Guidelines
• Student learning outcomes (SLO) matrix
  • Undergraduate programs = 3 SLOs minimum
  • Graduate programs = 3 SLOs minimum
• Program outcomes (PO) matrix
  • All programs = 3 POs
• 1 outcome per page
• Align each outcome with an assessment, a set of results, and a clear use of results
• Reliable and valid assessment of each outcome
• Appropriate use of results to enhance student learning for the outcome
SLO Matrices – More SACS Guidelines
• There must be some difference among degree programs.
  • The BA and BS in chemistry cannot have the same SLOs and the same assessment procedures (i.e., artifacts).
• Similar programs?
  • Outcomes: Show different levels (Bloom's Taxonomy)
  • Assessments: Use different artifacts, rubrics, or criteria
• Principle 3.6.1
  • The institution's post-baccalaureate professional degree programs, master's and doctoral degree programs, are progressively more advanced in academic content than its undergraduate programs.
Writing the SLO
• Criteria:
  • Can be observed and measured
  • Relates to student learning toward the end of the program
  • Reflects an important concept
• Formula: Who + Action Verb + What
• Examples:
  • As Stewards of the Discipline, students will apply reading education pedagogical and content knowledge and skills in a K-12 learning environment.
  • Students will analyze and reflect on students' language abilities and develop appropriate lesson plans to address their specific language needs.
Technology SLO
• Principle 3.4.12 (Technology use)
  • The institution's use of technology enhances student learning and is appropriate for meeting the objectives of its programs. Students have access to and training in the use of technology.
• Required for all undergraduate programs
  • Optional for 2007-2008 SLO matrices
  • Required for 2008-2009 SLO matrices
Sample Technology Outcomes
• Too general:
  • Students will use information technology to gather and disseminate information.
• More specific:
  • Students will be able to effectively demonstrate information technology skills by locating and retrieving information on education topics and issues and published research in education and related fields.
  • Students will write and present a capstone project that requires the use of Word, Excel, PowerPoint, and information technology.
SLO Activity
• Graduate students will learn basic concepts in their field.
• Students will do an oral presentation.
• Students will communicate effectively in an oral format.
• Students will take all the courses in order.
• Graduate students will apply and analyze various statistical concepts in an appropriate quantitative study.
Assessing the SLO
• Artifact
  • Paper or project
  • Presentation or behavioral observation
• Data Collection / Collection of the Artifact
  • Where, when, how
  • Census versus sampling
  • Sampling technique (if applicable)
• Criteria
  • Minimum standards on a rubric/scale or the percentage of correct items
• Evaluation of the Artifact
  • Faculty panel or external evaluators (reliability)
  • Rubrics or embedded questions
Reporting the Results
• Summary of results
  • Direct measures
    • Test items
    • Performance as determined by rubrics
  • Indirect measures
    • Surveys and questionnaires
    • Interview and focus group data
• Format
  • Narrative
  • Tables or charts
• Every student learning outcome must have at least:
  • One set of results
  • One student learning improvement strategy (use of results)
Results Activity
• 75% of the students met the criteria for success.
• Our students passed the dissertation defense on the first attempt.
• All the students passed the national exam.
• 75% of the students (n=15) achieved a 3 or better on the 5 rubric categories for the capstone course research paper.
Master's Thesis – Results Table
Criteria: Students will achieve a 3 or better on a 4-point rubric on all five sections of the master's-level thesis.
Use of Results Activity
• Target met. Will continue to monitor.
• The faculty will meet and revise the three introductory courses.
• Students demonstrating difficulty writing research papers will be referred to the Academic Center for Excellence.
• A larger sample will be obtained.
• We will revise the rubric and have a calibration session with the faculty prior to evaluating the student papers each semester.
• A capstone course will be created that emphasizes research and thesis writing methods.
The Summary Page
• Overview and comparisons
  • Give an overview or summary of all the outcomes together
  • Discuss trends that you have seen over the years
• Explanations
  • Provide qualitative explanations for poor results or exceptionally high results
• Notes and documentation affecting results
  • Response rate (e.g., only 50% of the students completed the project)
  • Inter-rater reliability (e.g., two faculty members reviewed the artifacts and the inter-rater reliability was only 60%)
• Assessment improvement plans
  • Revise or create instruments (e.g., artifact, rubric)
  • Modify assessment methods (e.g., data collection, sampling, criteria, evaluation process)
Program Outcomes
• Program-level outcomes
  • Focus on student success (not student learning)
• Formula = Who + Action Verb + What
• Examples:
  • Graduates seeking employment in the field will find such employment within 6 months of graduation.
  • Candidates will pass the FTCE and score higher than the state average.
  • Program graduates will be satisfied with advising and mentoring services.
The IE Team (http://w3.fiu.edu/irdata/portal/inst_effectiveness)
• Marta Perez – Director – perezma@fiu.edu – 305-348-2733
• Maria Corrales – Coordinator – corrales@fiu.edu – 305-348-0459
• Katherine Perez – Coordinator – katherine.perez@fiu.edu – 305-348-1418
• Jacqueline Peña – Coordinator – jpena@fiu.edu – 305-348-1367
• Mayelin Felipe – Computer Specialist – mfelip01@fiu.edu – 305-348-0115
• Karla Felipe – Computer Specialist – kgarcia@fiu.edu – 305-348-0115
• Amanda Berhaupt – Graduate Assistant – aberh001@fiu.edu – 305-348-2731
• Randhir Kaur – Graduate Assistant – rkaur001@fiu.edu – 305-348-2731
Our Role at IE and in FIU
• Institutional Effectiveness
  • Improve student learning
    • Student, program, and operational levels
  • Improve the efficiency and effectiveness of the university
  • Assist the university with the assessment process
• Assessment Coordinators
  • Assist with the assessment process, including:
    • Education concerning assessment
    • Articulation of outcomes and assessments
    • Institutionalization of assessment practices
    • Translation of successful assessment work for the SACS world
    • Dissemination of assessment and accreditation information