
  1. Public Health Learning Network Workforce Training Chartbook, February 2019. Prepared by: National Coordinating Center for Public Health Training. Common Metrics Reporting Year 2. Training Data: July 2017 – June 2018. Field Placement Data: All 2018 field placements ending by September 14, 2018.

  2. Table of Contents
  • Intro to the Common Metrics
  • Part 1: Characteristics of Training
  • Graphic 1. Quick Training Statistics
  • Graphic 2. Training Delivery Mode
  • Chart 1. Core Competencies
  • Part 2: Common Metrics Data: Overall
  • Chart 2. Common Metrics Question #1 – Improved Understanding
  • Chart 3. Common Metrics Question #2 – Identified Actions
  • Chart 4. Common Metrics Question #3 – Presented Clearly
  • Chart 5. Common Metrics Question #4 – Satisfied with Training

  3. Table of Contents
  • Part 3: Common Metrics Data: Delivery Mode
  • Chart 6. Improved Understanding by Delivery Mode
  • Chart 7. Identified Actions by Delivery Mode
  • Chart 8. Presented Clearly by Delivery Mode
  • Chart 9. Satisfied with Training by Delivery Mode
  • Part 4: Student Field Placements Data
  • Chart 10. Learning Objectives Met
  • Chart 11. Application
  • Chart 12. Relevant to Career
  • Chart 13. Working with Vulnerable Populations
  • Chart 14. Preceptor

  4. Table of Contents
  • Part 5: Comparative Analysis
  • Chart 15. Common Metrics Question #1 – Improved Understanding (Overall)
  • Chart 16. Common Metrics Question #2 – Identified Actions (Overall)
  • Chart 17. Common Metrics Question #3 – Presented Clearly (Overall)
  • Chart 18. Common Metrics Question #4 – Satisfied with Training (Overall)
  • Part 6: Key Findings
  • Three-Year Comparison Data
  • Hybrid Trainings
  • Student Field Placements

  5. Intro to the Common Metrics Note: Represents Common Metrics data from nine reporting regions. One region did not submit data.

  6. Part 1: Characteristics of Training

  7. Graphic 1 Quick Training Statistics 4,360 hours of training were offered across all regions (range of total training hours across regions = 97 – 1,385 hours). 70,586 training participants attended.* 1,081 unique training courses were offered. *Note: This figure is not limited to unique individuals and may include participants who have attended multiple trainings.

  8. Graphic 2 Training Delivery Mode 66% of participants attended trainings offered in a Self-paced Distance Learning format. Archived learning was the most common training format (502 trainings). 438 trainings took place in a Classroom-based setting. 57% of all training hours were delivered in a Classroom-based format. n = 1,076

  9. Chart 1 Core Competencies The most frequently addressed core competency across all trainings was Community Dimensions of Practice. Only 3% of courses addressed Financial Planning and Management. n = 1,074

  10. Part 2: Common Metrics Aggregate Data

  11. Chart 2 Question 1: Improved Understanding Half of training attendees strongly agreed with the statement “my understanding of the subject improved as a result of participating in this training.” n = 35,725

  12. Chart 3 Question 2: Actions to Apply Learning Nearly half of attendees strongly agreed with the statement “I have identified actions I could take to apply information learned in the training.” n = 35,951

  13. Chart 4 Question 3: Presentation Clarity Over half of attendees strongly agreed with the statement “the training information was presented in a way I could clearly understand.” n = 35,251

  14. Chart 5 Question 4: Overall Satisfaction Over half of attendees strongly agreed with the statement “I was satisfied with the training overall.” n = 35,581

  15. Part 3: Common Metrics By Delivery Mode

  16. Chart 6 Understanding by Delivery Mode Attendees in Hybrid trainings were the most likely to agree with the statement “my understanding of the subject improved as a result of participating in this training.” n = 3,329 n = 4,133 n = 11,307 n = 16,956

  17. Chart 7 Identified Actions by Delivery Mode Attendees in Hybrid trainings expressed the highest level of agreement with the statement “I have identified actions I will take to apply the information I learned.” n = 3,311 n = 4,131 n = 11,251 n = 17,258

  18. Chart 8 Presentation Clarity by Delivery Mode Attendees in Hybrid trainings expressed the highest level of agreement with the statement “the information was presented in a way I could clearly understand.” n = 3,344 n = 4,139 n = 10,955 n = 16,813

  19. Chart 9 Satisfaction by Delivery Mode Attendees in Hybrid trainings were the most likely to agree with the statement “I was satisfied with this training overall.” n = 3,368 n = 4,125 n = 11,368 n = 16,720

  20. Part 4: Student Field Placements Data

  21. Chart 10 Question 1: Learning Objectives Over 90% of student field placement participants agreed with the statement “my learning objectives were met.” n = 164

  22. Chart 11 Question 2: Application A strong majority (94%) of student field placement participants agreed with the statement “I identified actions to apply the information.” n = 163

  23. Chart 12 Question 3: Career 94% of student field placement participants agreed with the statement “information was relevant to my career.” n = 164

  24. Chart 13 Question 4: Vulnerable Populations 86% of student field placement participants agreed that the experience “increased interest in working with vulnerable populations.” n = 164

  25. Chart 14 Question 1: Preceptor 95% of student field placement preceptors agreed that “student learning objectives were met.” n = 106

  26. Part 5: Comparative Analysis

  27. Chart 15 Comparison Table Three-Year Comparison Data: “My understanding of the subject improved as a result of participating in this training.”

  28. Chart 16 Comparison Table Three-Year Comparison Data: “I have identified actions I will take to apply information I learned from this training in my work.”

  29. Chart 17 Comparison Table Three-Year Comparison Data: “The information was presented in ways I could clearly understand.”

  30. Chart 18 Comparison Table Three-Year Comparison Data: “I was satisfied with this training overall.”

  31. Part 6: Key Findings

  32. Key Findings: Three-Year Comparison (2016-2018)
  • Common Metric (CM) #1: Overall participant understanding (agree/strongly agree) was highest in 2018 (Year 3), with a shift from strongly agree toward agree responses.
  • CM #2: The share of participants who identified actions to apply in the workplace remained the same over all three years (85% agree/strongly agree).

  33. Key Findings: Three-Year Comparison (2016-2018)
  • CM #3: Overall participant clarity remained at 90% or above, with a strong lean toward strongly agree over agree responses (the margin favoring strongly agree over agree ranged from 20% to 11%).
  • CM #4: Overall participant satisfaction remained the same over all three years (89% agree/strongly agree), with a general shift from strongly agree toward agree responses.

  34. Key Findings: Three-Year Comparison (2016-2018)
  • The most frequently addressed core competency across all trainings during all years was Community Dimensions of Practice.
  • There was a significant increase in the number of total training participants each year: from 29,525, to 54,983, to 70,586 in Year 3.
  • The total number of unique trainings also increased: from 532, to 779, to 1,081 in 2018.

  35. Key Findings: Three-Year Comparison (2016-2018)
  • The total number of training hours was about three times as high in 2018 as in 2016 (4,360 versus 1,467; see the quick check below).
  • Self-paced distance learning (or archived learning) remains the most common training format across all years.
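  As a reference point, a quick arithmetic check of the “about three times” comparison, using the training-hour totals reported above:

  \[
  \frac{4{,}360 \text{ hours (2018)}}{1{,}467 \text{ hours (2016)}} \approx 2.97
  \]

  so 2018 offered just under three times the total training hours reported for 2016.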

  36. Key Findings: The Rise of Hybrid Trainings In 2017, those attending hybrid trainings were the most likely to agree with the statement “the information was presented in a way I could clearly understand,” a slight shift (2%) from classroom-based trainings, which led in 2016. In 2017, those who participated in hybrid trainings were also the most likely to agree with the statement “I was satisfied with this training overall.” In 2016, those who participated in both classroom-based and hybrid trainings were the most likely to agree, representing a slight decrease in classroom-based training outcome percentages. In the most recent year (July 2017 – June 2018), training outcomes for Hybrid trainings were highest across all four Common Metrics, with more than 90% Strongly Agree/Agree on all items.

  37. Key Findings: Student Field Placements (all 2018 student field placements ending by September 2018) Nearly all field placement participants felt their learning objectives for the placement were met. Similarly, most preceptors also agreed that the student field placement learning objectives were met.

  38. Funding Statement This project is supported by the Health Resources and Services Administration (HRSA) of the U.S. Department of Health and Human Services (HHS) under grant number UB6HP27435.

  39. For More Information
  Evaluation Team
  • Brittany Bickford, MPH, Senior Research and Evaluation Analyst, National Coordinating Center for Public Health Training, National Network of Public Health Institutes, bbickford@nnphi.org
  • Jennifer Edwards, PhD, GCIS, Principal Research Scientist, National Coordinating Center for Public Health Training, National Network of Public Health Institutes
  • Aaron Alford, PhD, MPH, PMP, Director, Research and Evaluation, National Coordinating Center for Public Health Training, National Network of Public Health Institutes
