Does the Blueprint for Training Cover all the Training Bases?
Matthew J. Holcomb, Amy E. Zimmerman, Anya Mazur-Mosewicz, & Eric E. Pierson

Introduction

Doctoral graduates in professional psychology who wish to practice independently are required to pass the Examination for Professional Practice in Psychology (EPPP) as partial fulfillment of licensure requirements in each of the fifty states. The EPPP is an extensive, 225-item multiple-choice examination (25 of the items are experimental) covering a broad range of topics related to the study and practice of professional psychology. Doctoral programs in professional psychology are designed to prepare students to enter the field and, as part of that preparation, to pass the EPPP. At present, an estimated 83% of candidates, regardless of training background, pass the EPPP on their first attempt (DeMers, 2009). Although there are several possible explanations, clinical programs appear to have a better match between program objectives and test content.

Given that two thirds (66.56%; Norcross, Kohout, & Wicherski, 2006; Templer, Stroup, Mancuso, & Tangen, 2008) of those who take the EPPP are associated with clinical psychology Ph.D. programs, the EPPP questions are weighted toward that field. Despite this emphasis on clinical psychology, candidates from other fields, including school psychology, must pass the test in order to engage in independent professional practice (McGaha & Minder, 1993). A discrepancy in performance between candidate groups may suggest a need to (1) revise the content of the examination to match the training specialization of candidates, or (2) revise the training perspective of school psychology programs so that it fits the EPPP model and its requirements. The first of the present studies compares recent EPPP performance across school and clinical psychology programs. The second compares the proportion of text in Best Practices in School Psychology V (BP-V) devoted to each of the eight domains assessed by the EPPP with the proportion of examination questions in those domains, to determine whether BP-V, as an interpretation of the Blueprint, is likely to prepare students for the EPPP. Best Practices V is a uniquely important sample of school psychology curriculum material for two reasons. First, it has played a prominent historical role in the development and training of school psychologists (Fagan & Wise, 2007). Second, its editors intend it to act as an implementation of the NASP training model outlined in Blueprint III (Harrison & Prus, 2008).

Methodology/Results

Study One

The present study examined the average reported EPPP scores of Clinical and School Psychology programs across the country. Data were obtained from the Association of State and Provincial Psychology Boards (ASPPB), which maintained program-level score data from 1996-2006. An independent-samples t-test was conducted to analyze the difference between clinical and school psychology EPPP scores. Results indicate that Clinical Psychology programs average approximately nine points higher (8.636) on the EPPP than School Psychology programs. Further analysis revealed that approximately 13% of School Psychology programs had an average failing score on the EPPP during the data collection period, compared with only 4.9% of clinical programs.

Study Two

The present study examined the contents of the NASP Best Practices in School Psychology V chapter by chapter to determine the degree to which they match the content areas of the EPPP examination. Each BP-V chapter was first rated for EPPP content area; a random sample of 14 chapters (10%) was then re-rated and compared against the original ratings to check consistency. Agreement between the second ratings and the initial ratings, calculated through a frequency distribution, was 93.75%. Chi-square analyses indicated that two areas were significantly overrepresented in Best Practices relative to the EPPP standards: Ethical, Legal, and Professional Issues, χ²(1, N = 141) = 9.42, p ≤ .001, and Treatment and Intervention, χ²(1, N = 141) = 30.31, p ≤ .001. The content areas significantly underrepresented in Best Practices were Biology of Behavior, χ²(1, N = 141) = 16.03, p ≤ .001; Cognitive-Affective Bases of Behavior, χ²(1, N = 141) = 7.94, p ≤ .001; and Growth and Lifespan Development, χ²(1, N = 141) = 15.73, p ≤ .001.
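To make the two analyses concrete, the following Python sketch (a minimal illustration, not the authors' code) reproduces the Study One comparison from the group statistics reported in the Group Statistics table below and shows the structure of a single Study Two domain test. The chapter count of 35 and the domain weight of 0.15 in the chi-square portion are hypothetical placeholders, since the poster reports only the resulting test statistics.

    # Minimal sketch of the Study One and Study Two analyses using SciPy.
    # Values marked as placeholders are NOT taken from the poster.
    from scipy import stats

    # Study One: independent-samples t-test computed from the group
    # statistics reported in the table below (means, SDs, and Ns).
    # equal_var=False (Welch's test) assumes the reported F = 6.952,
    # Sig. = .009 reflects a test for unequal group variances.
    t_stat, p_value = stats.ttest_ind_from_stats(
        mean1=156.401, std1=7.8652, nobs1=304,  # Clinical Psychology programs
        mean2=147.765, std2=6.199, nobs2=98,    # School Psychology programs
        equal_var=False,
    )
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}, "
          f"mean difference = {156.401 - 147.765:.3f}")

    # Study Two: per-domain chi-square goodness-of-fit test with 1 df and
    # N = 141 chapters. Each EPPP content area is tested as the count of
    # BP-V chapters rated inside vs. outside the area against the share of
    # EPPP questions in that area. Both numbers below are placeholders; the
    # poster reports only the resulting chi-square statistics.
    n_chapters = 141
    chapters_in_domain = 35        # placeholder count for one content area
    eppp_proportion = 0.15         # placeholder EPPP weight for that area

    observed = [chapters_in_domain, n_chapters - chapters_in_domain]
    expected = [n_chapters * eppp_proportion, n_chapters * (1 - eppp_proportion)]
    chi2, p = stats.chisquare(f_obs=observed, f_exp=expected)
    print(f"chi-square(1, N = {n_chapters}) = {chi2:.2f}, p = {p:.4f}")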
Conclusions

The results of the present studies imply that the NASP model for training graduate students in school psychology overemphasizes some EPPP competency areas while underemphasizing others. Given the large number of school psychologists who practice outside the educational setting, this represents a growing obstacle to obtaining licensure.

Group Statistics and Independent t-Test Results for Program Differences

Program Type          N     Mean      SD
Clinical Psychology   304   156.401   7.8652
School Psychology     98    147.765   6.199
F = 6.952, Sig. = .009

Frequencies of Programs with a Mean EPPP Score Below 140

Program Type          Frequency   Percentage
Clinical Psychology   15          4.9%
School Psychology     13          13%

Chi-Squared Analysis of EPPP Standards (statistics reported under Study Two)

References

DeMers, S. (2009). Understanding the purpose, strengths, and limitations of the EPPP: A response to Sharpless and Barber. Professional Psychology: Research and Practice, 40(4), 348-353.
Fagan, T. K., & Wise, P. S. (2007). School Psychology: Past, Present, and Future. Bethesda, MD: National Association of School Psychologists.
Harrison, P. L., & Prus, J. S. (2008). Best practices in integrating Best Practices V content with NASP standards. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V (pp. 71-102). Bethesda, MD: NASP Publications.
McGaha, S., & Minder, C. (1993). Factors influencing performance on the Examination for Professional Practice in Psychology (EPPP). Professional Psychology: Research and Practice, 24, 107-109.
Norcross, J. C., Kohout, J. L., & Wicherski, M. (2006). Graduate admissions in psychology: II. Acceptance rates and financial considerations. Eye on Psi Chi, 10(3), 20-21, 32.
Templer, D., Stroup, K., Mancuso, L., & Tangen, K. (2008). Comparative decline of professional school graduates' performance on the Examination for Professional Practice in Psychology. Psychological Reports, 102(2), 551-560.

Presented at the 2010 National Association of School Psychologists Annual Convention, Chicago, Illinois.