Development of Logic Model & Performance Measures

Presentation Transcript


  1. Development of Logic Model & Performance Measures AHEC GRANTEE PRESENTATION APRIL 14, 2011 HRSA/Division of Workforce & Performance Management

  2. Steps in BHPr Performance Process • Develop Program Logic Models (Feb. 15 – March 31) • Develop Cluster Measures and Tables (March 1 – April 29) • Program and Grantee Reviews of Measures and Tables (April 14 – May 16) • Develop Draft Grantee Guidance (May 1 – 15) • Review and Revision of Guidance (May 16 – 30) • Obtain Input on Guidance from Grantees (May 1 – 30) • Finalize Guidance (May 30) • Prepare Final OMB Package (June 1 – July 15) • Obtain Approval from Advisory Groups (November 1 – 30)

  3. OMB Categories • Quantity – supply • Quality – competency, knowledge gained • Diversity – racial/ethnic diversity • Distribution – placement in underserved areas

  4. Program Description and Outcomes • Program level • Activity level • Individual level

  5. Elements of the proposed individual-level data collection • PII and contact information – collected only for individuals targeted for longitudinal study and only at graduation/program completion (MD/Residency, PA, APRN (PC NP?), Diversity program participants in structured programs, RN?, faculty development?) • Grant-based unique ID (organization-based unique ID; i.e., by university?) • SSN • Name • Current Address • Expected Address in one year (if different) • Phone numbers (home, cell) • E-mail, Facebook • Name and phone number for someone who will know where you can be contacted

  6. Elements of the proposed individual-level data collection • Data that rarely changes – collected on first contact/“enrollment” • Grant-based unique ID • Birthdate • Gender • Race (check all that apply) • Ethnicity • Parent’s income (broad categories) • Address where individual grew up (prior to 21st birthday) (“rural”)

  7. Elements of the proposed individual-level data collection • Annual data – activities – one record for each defined activity • Grant-based Unique ID • Activity (defined at the program/cluster level and specifics reported at the grant level) • Activity-related individual/immediate outcome • Levels of activities • Year-long training without separate activities – e.g., year 1 in medical or nursing school • Structured, time-limited activities – e.g., clinical rotations, CE courses, structured activities in diversity programs, etc.

  8. Elements of the proposed individual-level data collection • Activity categories/data • Clinical rotations – focus, location/type of facility, length, contact hours, etc. • Post-baccalaureate program • Saturday Academy • Formal mentoring • Etc.
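
  Slides 5–8 effectively describe a three-part record layout: demographics captured once at enrollment, one record per defined activity per year, and PII/contact details captured at completion for the longitudinal cohort. A minimal sketch of that layout in Python follows; the class names, field names, and types are illustrative assumptions, not a specified BHPr schema.

```python
# Illustrative sketch only: class and field names are assumptions drawn from
# slides 5-8, not a specified BHPr data dictionary.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional


@dataclass
class EnrollmentRecord:
    """Data that rarely changes, collected at first contact/enrollment (slide 6)."""
    grant_unique_id: str                              # grant-based unique ID (e.g., assigned by university)
    birthdate: date
    gender: str
    race: list[str] = field(default_factory=list)     # "check all that apply"
    ethnicity: str = ""
    parent_income_category: str = ""                  # broad categories only
    address_grew_up: str = ""                         # used to flag "rural" origin


@dataclass
class ActivityRecord:
    """Annual data: one record per defined activity (slide 7)."""
    grant_unique_id: str
    activity: str              # defined at program/cluster level, detailed at grant level
    immediate_outcome: str     # activity-related individual/immediate outcome
    activity_level: str        # e.g., "year-long training" or "structured, time-limited"


@dataclass
class CompletionContactRecord:
    """PII and contact info, collected only at graduation/program completion
    for individuals targeted for the longitudinal study (slide 5)."""
    grant_unique_id: str
    ssn: str
    name: str
    current_address: str
    expected_address_next_year: Optional[str] = None
    phone_home: Optional[str] = None
    phone_cell: Optional[str] = None
    email: Optional[str] = None
    alternate_contact: Optional[str] = None   # someone who will know how to reach the person
```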

  9. Program Name: Area Health Education Centers Program • Need(s): There is a shortage of high-quality primary health care to meet growing demand in the U.S. • Goal(s): Train a greater number of competent health care providers to better meet the growing demand for primary health care.

  10. Breakout Groups • Best means to describe program activities and outcomes • What is missing? • Categorize major elements • Brief summary report-out

  11. Get to Work!

  12. Proposed Common Quality Measures • The number of program participants demonstrating primary care (PC) competencies • Proportion of BHPr-supported trainees who receive training in medically underserved communities • Proportion of participants who receive a portion of their clinical training in primary care • Proportion of participants receiving training in PC focus areas • Proportion of participants demonstrating knowledge gain at the end of CE, as reflected in pre- and post-test scores • Proportion of participants receiving multiple modes of PC activities • Overall retention of participants in programs

  13. Proposed Common Quantity Measures • The number and percent of participants in career development/career enhancement/career advancement programs • The number and percent of program participants completing training who indicate their intent to practice as an HP • The number and percent of program participants completing training who indicate their intent to practice in primary care • The number and percent of program participants completing training who indicate their intent to practice in underserved areas • The number of CE offerings per topic/mode of training • The number of new trainees/slots/units

  14. Proposed Common Diversity Measures • The number/type/proportion of graduates/completers who are URM and/or disadvantaged • The number/type/percent of URM and disadvantaged participants/faculty • The number and percent of URM accepted into HP training programs • Increased retention rate of URM in BHPr programs • Increased retention rate of URM/faculty in HP school • The number and percent of URM and disadvantaged participants who indicate their intent to work in primary care and/or underserved areas • The number and percent of URM and disadvantaged participants receiving training in primary care and/or underserved areas

  15. Other Proposed Common Measures/Reporting • Distribution • Proportion of BHPr-supported HPs who enter practice in underserved areas • Infrastructure • # participants completing faculty development training • # PC AAU • Progress Report (describe accomplishments) • Describe evaluation activities • Explain how partnerships/leveraging activities have influenced how you conduct training activities (e.g., curriculum, enrollment, placements, etc.) • Educational innovations (e.g., innovative curricula) • Best practices • Dissemination of knowledge/strategies
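
  Most of the measures proposed on slides 12–15 reduce to counts and proportions over the individual-level records described earlier. Below is a minimal sketch, assuming participant-level data has already been flattened into simple dicts; the flag names trained_underserved, urm_or_disadvantaged, and completed are hypothetical, not proposed field names.

```python
# Illustrative sketch only: record layout and flag names are assumptions;
# the measures themselves come from slides 12 and 14.
from typing import Iterable


def proportion(records: Iterable[dict], flag: str) -> float:
    """Share of records where the given boolean flag is truthy."""
    records = list(records)
    if not records:
        return 0.0
    return sum(1 for r in records if r.get(flag)) / len(records)


# Toy participant-level data (one dict per participant).
participants = [
    {"id": "A01", "trained_underserved": True,  "urm_or_disadvantaged": False, "completed": True},
    {"id": "A02", "trained_underserved": True,  "urm_or_disadvantaged": True,  "completed": True},
    {"id": "A03", "trained_underserved": False, "urm_or_disadvantaged": True,  "completed": False},
]

# Quality measure (slide 12): proportion of supported trainees who receive
# training in medically underserved communities.
print(f"Trained in underserved communities: {proportion(participants, 'trained_underserved'):.0%}")

# Diversity measure (slide 14): number and percent of completers who are
# URM and/or disadvantaged.
completers = [p for p in participants if p["completed"]]
urm_completers = [p for p in completers if p["urm_or_disadvantaged"]]
print(f"URM/disadvantaged completers: {len(urm_completers)} "
      f"({proportion(completers, 'urm_or_disadvantaged'):.0%})")
```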
