
Mentor Self-efficacy and Perceived Program Support scale: M-SEPPS. Suzannah Vallejo Calvery, PhD


Presentation Transcript


  1. Mentor Self-efficacy and Perceived Program Support scale: M-SEPPS Suzannah Vallejo Calvery, PhD National Mentoring Summit January 25, 2013

  2. The Agenda • Down the Rabbit Hole: Lit Review and Design • Fun with Scales: Instrumentation • Psychometric Joy: Validity and Reliability • Back out of the Looking Glass: Implications and Applications

  3. The Big Question • Funding is increasingly focused on outcomes-based assessment and best practices • Only proven interventions are receiving the funding necessary to implement solutions.

  4. Down the Rabbit Hole: Does mentoring work? Best practices gleaned over time: • Match quality • Match length • Program infrastructure • Comparison of the 2002 and 2011 findings of DuBois et al.

  5. What about the Mentor? • Dyadic construct with a monadic research base • Best practices tied to mentor self-efficacy

  6. New Instrument Preparation & Validation: the M-SEPPS Instrument. Research Questions: • What are the psychometric properties of the proposed measure? • Are there significant differences between demographic groups?

  7. Fun with Scales • 1. Literature Review • 2. Item Construction* • 3. Pilot • 4. Item Refinement • 5. Data Collection • 6. Analysis: Assumptions, Exploratory Factor Analysis, Item analysis, Reliability estimation. * Bandura, 2006; Fowler, 2009; Gefen & Straub, 2005; Netemeyer, Bearden, & Sharma, 2003; Nunnally, 1978; Nunnally & Bernstein, 1994; Pett, Lackey, & Sullivan, 2003.
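
The analysis step above (assumption checks, exploratory factor analysis, item analysis, reliability estimation) can be sketched in code. The snippet below is a minimal illustration only, not the analysis actually run for the M-SEPPS: it assumes the item responses sit in a pandas DataFrame loaded from a hypothetical msepps_items.csv, and it uses the third-party Python package factor_analyzer for Bartlett's test of sphericity, the KMO measure, and the extraction.

# Minimal sketch of the analysis step (assumption checks + EFA).
# The file name, variable names, and the choice of the factor_analyzer
# package are illustrative assumptions, not taken from the presentation.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

items = pd.read_csv("msepps_items.csv")  # hypothetical item-response file

# Assumption checks: factorability of the correlation matrix.
chi_sq, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_total = calculate_kmo(items)
print(f"Bartlett chi-square={chi_sq:.2f}, p={p_value:.4f}, KMO={kmo_total:.2f}")

# Exploratory factor analysis with principal axis extraction and an
# oblique (oblimin) rotation, mirroring the design described on the slide.
efa = FactorAnalyzer(n_factors=5, method="principal", rotation="oblimin")
efa.fit(items)
print(efa.loadings_)              # pattern matrix, used for item analysis
print(efa.get_factor_variance())  # variance explained per factor

In a run like this, the loadings and the variance explained are what drive the item analysis and the decision about how many factors to retain.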

  8. Method • Participants • Original scale/item pool: • General self-efficacy • Personal teaching efficacy • Mentor/tutor self-efficacy • Program support

  9. Principal Axis Factoring • 104 participants in the remaining analysis • 18 total items • 3 latent constructs • Process: PAF (Pett, Lackey, & Sullivan, 2003; Tabachnick & Fidell, 2007) • Direct Oblimin rotation with a delta level of -.5* • 5 factors originally extracted, 3 retained. * Pett et al.
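
To make the extraction method concrete, here is a small pure-NumPy sketch of iterated principal axis factoring: squared multiple correlations as initial communality estimates, eigendecomposition of the reduced correlation matrix, and iteration until the communalities stabilize. It is a conceptual illustration only; rotation (including the Direct Oblimin delta setting mentioned above) is not shown, and the function name and fake data are assumptions of this sketch.

import numpy as np

def principal_axis_factoring(X, n_factors, n_iter=100, tol=1e-6):
    """Iterated principal axis factoring; returns unrotated loadings.

    X : (n_obs, n_items) array of item responses.
    Returns an (n_items, n_factors) loading matrix.
    """
    R = np.corrcoef(X, rowvar=False)
    # Initial communalities: squared multiple correlations.
    h2 = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
    for _ in range(n_iter):
        R_reduced = R.copy()
        np.fill_diagonal(R_reduced, h2)
        eigvals, eigvecs = np.linalg.eigh(R_reduced)
        # eigh returns ascending eigenvalues; keep the largest n_factors.
        idx = np.argsort(eigvals)[::-1][:n_factors]
        lam = eigvals[idx].clip(min=0)
        loadings = eigvecs[:, idx] * np.sqrt(lam)
        new_h2 = (loadings ** 2).sum(axis=1)  # updated communalities
        if np.max(np.abs(new_h2 - h2)) < tol:
            h2 = new_h2
            break
        h2 = new_h2
    return loadings

# Hypothetical usage mirroring the slide's dimensions:
# 104 respondents x 18 items, 3 retained factors.
rng = np.random.default_rng(0)
fake_items = rng.normal(size=(104, 18))
print(principal_axis_factoring(fake_items, n_factors=3).shape)  # (18, 3)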

  10. Psychometric Joy!

  11. Reliability Estimates • Factor correlations and factor alpha coefficients for the M-SEPPS scale • Per Research Question #2, original demographic variables: Age, Gender, Ethnicity, Level of education, Previous experience tutoring, Years tutoring • Age was the only demographic variable with significant differences between levels on Factors 2 and 3.
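
For readers who want to see how numbers like these are computed, the sketch below implements the standard Cronbach's alpha formula by hand and runs a one-way ANOVA of factor scores across age groups with SciPy. The data, variable names, and grouping are placeholders, not the study's data.

import numpy as np
from scipy.stats import f_oneway

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_obs, n_items) array of one factor's items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Placeholder data: random values stand in for real item responses,
# factor scores, and age groups.
rng = np.random.default_rng(1)
print(cronbach_alpha(rng.normal(size=(104, 6))))  # alpha for one 6-item factor

# One-way ANOVA of a factor's scores across age groups (Research Question #2).
factor2_scores = rng.normal(size=104)
age_group = rng.integers(0, 3, size=104)
groups = [factor2_scores[age_group == g] for g in np.unique(age_group)]
f_stat, p_value = f_oneway(*groups)
print(f"F={f_stat:.2f}, p={p_value:.3f}")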

  12. Back out of the Looking Glass: Limitations and Future Research • Limitations: • Sample size • Test-retest reliability • Scale redundancy • Next Steps: • CFA • Larger sample • Scale reduction

  13. And after that? Implications for practice • Program evaluation • Dynamic program assessment • Building support for implementation of best practices

  14. Thank you for attending. Q & A
