
Social Work Education Assessment Project (SWEAP): Responding to the Challenges of Assessment



Presentation Transcript


  1. Social Work Education Assessment Project (SWEAP): Responding to the Challenges of Assessment Pre-Conference Workshop BPD Annual Conference Louisville, KY March 19, 2014

  2. Workshop Outline • Introductions • Introducing SWEAP / Goodbye BEAP • The SWEAP Team • Workshop Participants • Overview of Accreditation • History of Accreditation • Direct vs. Indirect Measures • Implicit Curriculum • Multiple Measures • Organizational Framework for Program Assessment • EPAS 2008 (& 2015) • Competencies • Characteristic Knowledge, Values & Skills • Practice Behaviors • SWEAP Instruments • Entrance • Exit • Alumni/Graduate • Employer • Curriculum Instrument (FCAI) • Field Instrument (FPPAI) • BREAK

  3. Workshop Outline (continued) • Unique Benefits of SWEAP • Tying SWEAP to Program Assessment • Linking SWEAP to EPAS • Matrix • Program Integration Example • Using SWEAP beyond EPAS • Sharing Assessment Ideas • Group Work • Navigating SWEAP • Website • Ordering • Raw Data Policy • Processing • Reports • Questions

  4. Introducing SWEAP • Was BEAP • Now SWEAP • Not just for undergraduate programs anymore • Can assess foundation year for graduate programs as well • Can be modified for program specific advanced year assessment

  5. The SWEAP Team • Kathryn Krase, LIU Brooklyn, kathryn.krase@liu.edu • Phil Ng, BEAP@PHILNG.NET • Patrick Panos, University of Utah, patrickpanos@gmail.com • Roy (Butch) Rodenhiser, Boise State University, RoyRodenhiser@boisestate.edu • Vicky Buchan, Colorado State University, victoria.buchan@colostate.edu • Brian Christenson, Lewis-Clark State College, blchristenson@lcsc.edu • Tobi DeLong Hamilton, Lewis-Clark State College, CDA, tadelong-hamilton@lcsc.edu • Ruth Gerritsen-McKane, University of Utah, ruth.gerritsen-mckane@socwk.utah.edu • Sarah Jackman, Sarah.jackman@socwk.utah.edu

  6. Introducing Workshop Participants • Your Name • Your Program • Your Role • Have you used SWEAP? • If so, which instrument(s)?

  7. Overview of Accreditation: History of CSWE Accreditation • What was the process like before BEAP? • How has it changed over time? • How do changes to EPAS impact it? • Move to competency-based educational standards in EPAS 2008

  8. Overview of Accreditation: Direct vs. Indirect Measures • Direct Measures • “Student products or performances that demonstrate that specific learning has taken place.” • Examples (SWEAP and Non-SWEAP) • Indirect Measures • “May imply that learning has taken place (e.g., student perceptions of learning) but do not specifically demonstrate that learning or skill.” • Examples (SWEAP and Non-SWEAP) • Which do you need and why?

  9. Overview of Accreditation: Implicit Curriculum • What is Implicit Curriculum? • “Educational environment in which the explicit curriculum is presented. It is composed of the following elements: the program’s commitment to diversity; admissions policies and procedures; advisement, retention, and termination policies; student participation in governance; faculty; administrative structure; and resources.” (EPAS, 2008) • How do you measure it? • Why should you measure it?

  10. Overview of Accreditation: Multiple Measures • The Importance of Multiple Measures • SWEAP alone is not enough

  11. Organizational Framework for Program Assessment: EPAS 2008 (& 2015) • Working under EPAS 2008 • EPAS 2015 underway… expected changes • Fewer competencies • Fewer practice behaviors

  12. Organizational Framework for Program Assessment: Competencies • EPAS 2.1—Core Competencies • Competency-based education is an outcome performance approach to curriculum design. • Measurable practice behaviors are composed of knowledge, values, & skills. • Programs need to demonstrate integration & application of competencies in practice with individuals, families, groups, organizations, and communities. • 10 competencies are listed along with a description of the characteristic knowledge, values, skills, & practice behaviors that may be used to operationalize the curriculum and assessment methods. • Programs may add competencies consistent with their missions and goals.

  13. Organizational Framework for Program Assessment: Competencies • 2.1.1—Identify as a professional social worker and conduct oneself accordingly. • 2.1.2—Apply social work ethical principles to guide professional practice. • 2.1.3—Apply critical thinking to inform and communicate professional judgments. • 2.1.4—Engage diversity and difference in practice. • 2.1.5—Advance human rights and social and economic justice. • 2.1.6—Engage in research-informed practice and practice-informed research. • 2.1.7—Apply knowledge of human behavior and the social environment. • 2.1.8—Engage in policy practice to advance social and economic well-being and to deliver effective social work services. • 2.1.9—Respond to contexts that shape practice. • 2.1.10—Engage, assess, intervene, and evaluate with individuals, families, groups, organizations, and communities.

  14. Organizational Framework for Program Assessment: Characteristic Knowledge, Values & Skills • Current focus on measuring practice behaviors. • Don’t forget about knowledge, values & skills.

  15. Organizational Framework for Program Assessment: Practice Behaviors • Multiple practice behaviors per competency • Each practice behavior MUST be measured for self-study/reaccreditation • TWO measures required for each practice behavior • At least one measure must be DIRECT

  16. SWEAP Instruments: What are they and how do I use them? • Entrance • Exit • Alumni/Graduate & Employer • Curriculum Instrument (FCAI) • Field Instrument (FPPAI)

  17. Entrance: Purpose • Provides demographic profile of entering students. • Completed at time of entrance into the program (program defined). • Provides overview of financial resources students are using or plan to utilize. • Provides employment status & background information regarding both volunteer and paid human service experience. • Helps track planned or unplanned changes in the profile of students in the program. • Helps evaluate the impact of policy changes, such as admissions procedures, over time.

  18. Entrance: Questions • Student tracking: ID number & survey completion date • Gender • Year in school • Overall GPA, GPA in major, & highest possible GPA at school • Length of current social work-related work experience (volunteer & paid) • Citizenship / length of residence in USA • Employment plans during social work education • Hours per week expected to work during education • Sources of financial aid expected • Language fluency • Expected date of graduation • Race/Ethnicity • Disabilities/Accommodation

  19. Exit: Purpose • Completed by students just prior to graduation. • Often administered in field seminar or capstone seminar. • Gathers feedback from students about their experiences while in the program. • Addresses: • Evaluation of curriculum objectives based on EPAS. • Post-graduate plans related to both employment and graduate education. • Demographic information to compare with the entrance profile.

  20. Exit: Questions • Educational Experience • Including implicit curriculum assessment • Current Employment • Employment-seeking activities • Current & anticipated Social Work Employment • Primary function & major roles • Post Graduate Educational Plans

  21. Exit: Questions (continued) • Students evaluate how well the program prepared them to perform practice behaviors • Professional Activities • Use of research techniques to evaluate client progress & use of program evaluation methodology • Personal Demographic Information • Gender, Citizenship, Language fluency, & Disabilities

  22. Alumni/Graduate: Purpose • Intended for completion two years after graduation • Standardized timing for administration is essential to create a reliable dataset for comparison over time. • Alumni evaluate how well the program prepared them for professional practice. • Alumni employed in social work and those not employed in social work are surveyed. • Also gathers information on current employment, professional development activities, and plans/accomplishments related to further education.

  23. Alumni/Graduate: Questions • Current Employment • Current Social Work Employment • Evaluation of preparation by the program in the 10 EPAS competency areas (using Likert-type scale) • Educational Activities • Professional Activities • Demographics

  24. Employer: Purpose • Intended for completion two years after graduation • Addresses both accreditation and university concern for feedback from the practice community. • Measures graduates’ preparation for practice based on the supervisor’s assessment. • Alumni/ae request that the employer complete the survey: • Addresses the primary concern of confidentiality. • Use of a student identifier allows connection to other instruments.

  25. Employer: Questions • Educational background of supervisor/employer • Twelve (12) items which evaluate alumni/ae proficiency in all EPAS competencies.

  26. BREAK TIME

  27. Foundation Curriculum Assessment Instrument

  28. Curriculum Instrument (FCAI): Purpose • Provides a pre/post test in seven major curricular areas of the foundation year. • Provides a direct measure to assist programs with evaluation of their curriculum. • Assists with identification of curricular areas that may need attention. • Provides national comparative data.

  29. Curricular Components

  30. Sample HBSE Question • The concept “person-in-environment” includes which of the following: • Clients are influenced by their environment • Clients influence their environment • Behavior is understood in the context of one’s environment • All of the above

  31. Sample Practice Question • Determining progress toward goal achievement is one facet of the _____ stage. • a. Engagement • b. Evaluation • c. Assessment • d. Planning

  32. Overview of FCAI Respondents

  33. Reliability Testing • Version 9 • Tested in two junior practice classes • Students tested twice, 2 weeks apart • Pearson’s correlation coefficient • r = .86
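
The test-retest design on this slide (the same students tested twice, two weeks apart, scores correlated with Pearson's r) can be sketched in code. All scores below are invented for illustration, not SWEAP data:

```python
# Hypothetical FCAI scores for eight students tested two weeks apart
time1 = [62, 71, 55, 80, 68, 74, 59, 66]  # first administration
time2 = [60, 73, 57, 78, 70, 72, 61, 64]  # second administration

def pearson_r(x, y):
    """Pearson's correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Test-retest reliability: values near the slide's r = .86 indicate
# students' scores are stable across the two administrations
r = pearson_r(time1, time2)
print(round(r, 2))
```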

  34. Item difficulty index • Overall difficulty or average should be around .5 (Cohen & Swerdlik, 2005) • FCAI = .523 (n=415) • “This is a very good difficulty level for the test. Not likely to misrepresent the knowledge level of test takers.”
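
The item difficulty index above is simply the proportion of test takers answering each item correctly, averaged across items. A minimal sketch with made-up response data (four students, four items):

```python
# Each row is one student's responses (1 = correct, 0 = incorrect)
responses = [
    [1, 0, 1, 1],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 0, 1],
]

n_students = len(responses)
n_items = len(responses[0])

# Difficulty of each item = proportion of students answering it correctly
item_difficulty = [sum(row[i] for row in responses) / n_students
                   for i in range(n_items)]

# Overall (average) difficulty; per Cohen & Swerdlik, ~.5 is desirable
overall = sum(item_difficulty) / n_items
print(item_difficulty, overall)
```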

  35. Reliability & Effect Size • Cronbach’s alpha = .784 • Effect Size d = 6.87
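
Both statistics on this slide can be computed from raw scores. Cronbach's alpha is k/(k-1) · (1 − Σ item variances / total-score variance), and Cohen's d is the pre/post mean difference over the pooled standard deviation. The item scores, means, and SD below are invented for illustration:

```python
import statistics

# Hypothetical item scores: rows = students, columns = items
scores = [
    [3, 4, 3, 5],
    [2, 2, 3, 3],
    [4, 5, 4, 5],
    [3, 3, 2, 4],
    [5, 4, 5, 5],
]

k = len(scores[0])  # number of items

# Sample variance of each item across students
item_vars = [statistics.variance([row[i] for row in scores]) for i in range(k)]

# Variance of each student's total score
totals = [sum(row) for row in scores]
total_var = statistics.variance(totals)

# Cronbach's alpha (internal consistency)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Cohen's d for a pre/post effect size (invented means and pooled SD)
pre_mean, post_mean, pooled_sd = 45.0, 60.0, 10.0
d = (post_mean - pre_mean) / pooled_sd

print(round(alpha, 3), d)
```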

  36. Current Data: 1/13 to 9/13 • Number of schools using the FCAI in 2013: 110 • Number of respondents in 2013: • Pre-test: 8432 • Post-test: 5369

  37. School X / National Comparison

  38. Overall Scores Pre-Post

  39. BSW Student Scores by Curricular Area One Program FCAI Entrance & Exit

  40. Expansion beyond BSW • Based upon CSWE assertions related to educational levels in social work education, we expanded testing to three additional groups: • MSW foundation students: • entering • exiting • Advanced standing students: • entering

  41. Points to keep in mind about the FCAI • Purpose of this instrument: to review and improve curriculum. • Programs will want to monitor scores over several years (or several cohorts) for trends. • The FCAI can be considered a measure of “value added” from program entry to exit. • Benchmarks can be set two ways: (a) by competency, (b) by overall score.
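
The two benchmarking approaches named on this slide (by competency vs. by overall score) could be checked programmatically. The curricular areas, scores, and threshold here are illustrative only, not SWEAP values:

```python
# Hypothetical mean FCAI scores (proportion correct) per curricular area
competency_scores = {
    "HBSE": 0.61,
    "Practice": 0.55,
    "Policy": 0.48,
    "Research": 0.52,
}

benchmark = 0.50  # illustrative program-set threshold

# (a) Benchmark by competency: flag each area against the threshold
by_competency = {area: score >= benchmark
                 for area, score in competency_scores.items()}

# (b) Benchmark by overall score: compare the average across areas
overall = sum(competency_scores.values()) / len(competency_scores)
meets_overall = overall >= benchmark

print(by_competency, round(overall, 2), meets_overall)
```

Note the two approaches can disagree: here the cohort meets the overall benchmark even though one area (Policy) falls below it, which is exactly why monitoring by competency surfaces curricular areas that need attention.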

  42. Field Practicum/Placement Assessment Instrument (FPPAI)

  43. Field Practicum/Placement Assessment Instrument (FPPAI) • Responds to need for a standardized field/practicum assessment instrument that measures student achievement of practice behaviors.

  44. Field Instrument (FPPAI) Piloting Phase • Initial piloting for BSW in May 2008 • Second pilot in Fall 2008/Spring 2009 • Third pilot in Fall 2009 • Reliability Analysis • Cronbach’s Alpha of 0.91 or higher in each practice behavior • Full implementation in BSW: Fall 2010 • Piloting in MSW: Spring-Fall 2012 • Full implementation in MSW (Foundation): Fall 2013

  45. Field Instrument (FPPAI) Methodology • 58 Likert-scale questions measuring practice behaviors linked to the EPAS 2008 competencies. • Qualitative feedback form for each domain available for program use. • Available online and in print format. • Individual program outcomes report with national comparisons for EPAS 2008 Competencies & Practice Behaviors, including CSWE benchmark reporting. • Can be used as a final field assessment or in a mid-test/post-test design.
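
Since the FPPAI's Likert items are linked to the EPAS 2008 competencies, a program might roll item ratings up to competency-level means for its outcomes report. The item-to-competency mapping and ratings below are invented for illustration, not the actual FPPAI items:

```python
# Hypothetical mapping of FPPAI-style Likert items to EPAS 2008 competencies
item_to_competency = {
    "item_01": "2.1.1", "item_02": "2.1.1",
    "item_03": "2.1.2", "item_04": "2.1.2",
}

# One student's field-instructor ratings on a 1-5 Likert scale
ratings = {"item_01": 4, "item_02": 5, "item_03": 3, "item_04": 4}

# Collect each competency's item ratings, then average them
grouped = {}
for item, comp in item_to_competency.items():
    grouped.setdefault(comp, []).append(ratings[item])
competency_means = {comp: sum(vals) / len(vals)
                    for comp, vals in grouped.items()}

print(competency_means)
```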

  46. Field Instrument (FPPAI) Scale

  47. Field Instrument (FPPAI) Current Status • 84 programs currently use this instrument • More than 3,132 administrations to date • National data comparisons are available • Cronbach’s Alpha reliability test of internal consistency at midpoint (pilot and first two years of testing): 0.969 • Cronbach’s Alpha reliability test of internal consistency at final (pilot and first two years of testing): 0.975

  48. BREAK TIME

  49. SWEAP Unique Benefits • Student demographics • Numerous data points for comparison • Explicit and Implicit curriculum assessment • Doesn’t end at graduation • Peer comparison by region, program type, auspice & nationally • …and more

  50. Tying SWEAP to Program Assessment: Linking SWEAP to EPAS • All instruments updated to reflect 2008 EPAS • All instruments will be updated and available as soon as 2015 EPAS goes into effect • Competency Matrix for 2008 EPAS (Handout)
