This session covers the SE AETC evaluation update: current response-rate statistics, the evaluation process in Caspio, which evaluation questions to standardize, and progress toward HRSA's goals and overall program quality. It reviews event categories, modality, and evaluation scenarios, and walks through the key questions behind the AETC logic model: the AETCs' contribution to HIV care access and quality, along with short-term, intermediate-term, and long-term outcomes, outputs, and inputs.
Evaluation Session Goals: Power Up
• Provide an HRSA/NEC update
• Describe SE AETC's current evaluation stats
• Demonstrate the evaluation process in Caspio
• Discuss which evaluation questions to include
• Solve the AETC world's evaluation issues
HRSA/NEC Update
• Yeah, I got nothin'
• No FOA yet
• No contract yet
• No evaluation plan
• But there is hope!
HRSA's Plans
• Select a contracted evaluation group
• Maintain close interaction
• The SE Project Officer is the NEC PO
• Years of QI experience
• She is open to evaluation suggestions from AETCs
• Freedom to develop our plan
Poll Question: So How Are We Doing?
What is the SE AETC evaluation rate (didactic, interactive, and preceptorships)?
• 44%
• 52%
• 68%
• 73%
• 81%
Evaluation Data
• Overall evaluation response rate: 81.03%
• Much improved from year 1 (58%)
• TA, Community of Practice, and Clinical Consultation removed from the metric
• Focusing on face-to-face training
• Confirming that series events are evaluated as required
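For context, here is a minimal Python sketch of how a response rate like the 81.03% above could be computed: evaluations received divided by attendees, counting only didactic, interactive, and preceptorship events and excluding TA, Community of Practice, and Clinical Consultation, as the slide describes. The field names and sample numbers are hypothetical, not the actual Caspio schema.

```python
# Hypothetical sketch: evaluation response rate over face-to-face training only.
INCLUDED_TYPES = {"Didactic", "Interactive", "Preceptorship"}

def response_rate(events):
    """events: iterable of dicts with 'type', 'attendees', 'evaluations' keys."""
    attendees = evaluations = 0
    for event in events:
        if event["type"] in INCLUDED_TYPES:  # TA, CoP, etc. fall outside the metric
            attendees += event["attendees"]
            evaluations += event["evaluations"]
    return evaluations / attendees if attendees else 0.0

sample = [
    {"type": "Didactic", "attendees": 40, "evaluations": 33},
    {"type": "Technical Assistance", "attendees": 5, "evaluations": 0},  # excluded
    {"type": "Preceptorship", "attendees": 3, "evaluations": 3},
]
print(f"Response rate: {response_rate(sample):.2%}")  # Response rate: 83.72%
```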
Level of Knowledge – Scale of 1 to 5
• Before an event: 3.3 – 3.7
• After an event: 4.2 – 4.6
• Change in knowledge: 0.65 – 0.95
• Average improvement in knowledge: 0.91
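A small sketch of the pre/post arithmetic behind these figures: average the "before" and "after" self-ratings on the 1-to-5 scale and take the difference. The sample ratings below are made up for illustration.

```python
# Illustrative pre/post knowledge-change calculation on a 1-to-5 scale.
from statistics import mean

def knowledge_change(responses):
    """responses: list of (before, after) self-ratings from one event."""
    before = mean(b for b, _ in responses)
    after = mean(a for _, a in responses)
    return before, after, after - before

event = [(3, 4), (4, 5), (3, 5), (4, 4)]  # hypothetical responses
pre, post, delta = knowledge_change(event)
print(f"Before: {pre:.1f}  After: {post:.1f}  Change: {delta:.2f}")
# Before: 3.5  After: 4.5  Change: 1.00
```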
Evaluation in Caspio
• Categories of events: modality determines the evaluation
• Online training
• Preceptorships (length and provider levels)
• Live training (one hour or full day)
Caspio Doesn't Change the Rules
All preceptorships should be evaluated:
• Eventually, 40+ hour preceptorships will have pre/post tests
• Evaluation at the end of the preceptorship, or in December/June
All didactic and interactive programs are evaluated:
• Exception: series events
• Evaluation at the end of the series (previously quarterly), or in December and June
Caspio Doesn't Change the Rules
Case conferences, live or webcast / Community of Practice:
• Recurring: evaluate in December and June
• Random case conferences: evaluated each time
Individual case conferences: PIFs/ERs only
Technical Assistance / Community of Practice:
• Evaluate at least two per 6-month period, or write a "story" in the narrative report describing the TA's impact
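To make the scheduling rules on the two slides above concrete, here is a hypothetical Python sketch mapping an event category to its evaluation schedule. The category names and return strings are illustrative assumptions; Caspio does not ship such a function.

```python
# Hypothetical encoding of the "rules don't change" evaluation schedule.
def evaluation_schedule(category: str, recurring: bool = False,
                        series: bool = False) -> str:
    if category == "Preceptorship":
        return "At end of preceptorship, or in December/June"
    if category in ("Didactic", "Interactive"):
        # Series events are the one exception to per-event evaluation.
        return ("At end of series, or in December/June" if series
                else "At every event")
    if category == "Case Conference":
        return "December and June" if recurring else "At every occurrence"
    if category in ("Technical Assistance", "Community of Practice"):
        return "At least two per 6-month period, or a narrative 'story'"
    return "Individual case conferences: PIFs/ERs only"

print(evaluation_schedule("Didactic", series=True))
# At end of series, or in December/June
```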
Evaluation Questions
What is the AETCs' contribution to HIV care access and quality, based on:
• ↑ workforce development (number and skills of practitioners offering HIV care)
• ↓ health disparities in access-to-care measures
• ↑ organizational/health-systems change (more capacity, higher-quality care)
• ↑ measures across the HIV care continuum: testing, prevention, PrEP, linkage to care, treatment, retention in care, viral suppression

AETC Combined Logic Model for Program Evaluation (Version 1.4, updated 19 April 2017):

Inputs: funding/capital resources; human resources; clinical and training experts; partnership networks; workplan; curricula; facilities

Throughputs/Activities: Core, MAI, PTP, IPE, CDC, and Border training and TA

Outputs: events (type of interaction, funding source, hours, etc.); trainees (demographics, specialty, role, race/ethnicity, types of patients seen, minority-serving, federally funded, novice, low-volume); documentation of PTP and IPE activities

Outcomes:
• Short-term, individual: ↑ knowledge, ↑ skills, ↑ attitudes
• Short-term, organizational: improved teams and organizations, e.g., creation of new P&Ps, quality improvement processes, ↑ workflow and efficiency, clarified roles on the team, ↑ links to experts and others
• Intermediate-term: ↑ HIV services provided, ↑ best-practice behaviors, implementation of new P&Ps, ↑ capacity to provide care, ↑ patient-centered IP care, ↑ access to information, ↑ training capability

Impact (long-term): ↑ access and quality; ↑ service delivery with improved HIV care continuum outcomes (↑ HIV testing, ↑ prevention of HIV, ↑ use of PrEP, ↑ linkage to care, ↑ PLWH receiving ART, ↑ adherence, ↑ retention, ↑ viral suppression); strong networks of care; ↓ stigma and ↓ disparities; ↑ quality of life for PLWH with ↓ morbidity and mortality; end of the HIV epidemic, where new infections are rare
Let's Agree on Evaluation Questions
• Training/presentation objectives will still be determined per event by each Partner
• Overall event and speaker evaluation questions will be the same across the region
• Which questions are most valuable?
• What information would be helpful to you?
• Samples are in the packet
Likert Scale / Fill in the Blank
• Let's agree on a 1–7 scale (unless the NEC still wants 1–5)
• Questions rating the presenter
• Questions on the event space and atmosphere
• Rating of the overall program
• Future impact and additional training needs
• Goal is to minimize the number of questions
Likert Scale
1. Strongly Disagree
2. Disagree
3. Slightly Disagree
4. Neither Agree nor Disagree
5. Slightly Agree
6. Agree
7. Strongly Agree
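If responses are exported as labels rather than numbers, a simple mapping like the one below converts the 7-point anchors to scores for averaging. This is an assumption for illustration, not part of any NEC specification; a 1–5 variant would drop the "Slightly" anchors.

```python
# Assumed label-to-score mapping for the 7-point Likert scale above.
LIKERT_7 = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Slightly Disagree": 3,
    "Neither Agree nor Disagree": 4,
    "Slightly Agree": 5,
    "Agree": 6,
    "Strongly Agree": 7,
}

responses = ["Agree", "Strongly Agree", "Slightly Agree"]  # hypothetical
scores = [LIKERT_7[r] for r in responses]
print(sum(scores) / len(scores))  # 6.0
```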
Rating Presenters
• The objectives of the training were clearly communicated at the beginning of the event.
• The slides or learning materials were clear and helpful.
• The content was well organized and clearly presented.
• The presenter was knowledgeable about the topic.
• The presenter was responsive to participants' questions and concerns.
• The presenter effectively answered questions asked by participants.
Rating Presenters
• The presenter encouraged audience participation (ARS, polling, role play, discussion).
• I would attend other sessions facilitated by this presenter.
Rating of Overall Program (NEC)?
• Overall rating of the training
• Satisfaction with opportunities to participate during the training
• My knowledge/skills on this topic before the training
• My knowledge/skills on this topic after the training
• My confidence to serve people with, or at risk for, HIV/AIDS after the training
Written Response/Impact
• What did you like most about this event?
• What was the single most important thing or takeaway that you learned from this session?
• Are you able to apply this knowledge or skill in your work setting? □Yes □No □NA
• How are you able to apply the techniques learned or information gathered in your work setting, or what barriers do you see to applying them?
Written Response/Impact
• Why did you choose to attend today's session?
• How could the session be improved (content, presentation format, time, length)?
• What would you change about this training?
Meeting Technology
• Please describe any technical issues you had with the online format.
• What suggestions do you have to improve the online experience?
• What opportunities did you have to actively participate in this webcast?
Additional Training Needs
• Please list any other HIV-related training needs.
• What additional HIV-related training do you feel would be most helpful for you?
• List any training topics you would like to see in the future.
• For webcasts: What other webcast topics would you attend?
CME/CNE/SW/Pharm Credit Information
(Must be completed if ANY credit was offered for this course)
• Was this activity free of commercial bias or influence? □Yes □No □NA
• If no, please explain . . .
• Any other credit questions?
Questions for Event Space and Atmosphere?
• Do we need these?
• People will provide comments in other sections
Thanks!
• More to come!