
FY02 ASA Presentation Provide Quality and Organizational Development Services


Presentation Transcript


  1. FY02 ASA Presentation Provide Quality and Organizational Development Services Presented by: Antonio Rodriguez Assistant Director, Office of Quality Management Prepared by: Antonio Rodriguez, Amy Culbertson, Carmen Kaplan, Adele Egwu, Joshua Rose, Joe Wolski, and Gay Presbury Office of Research Services National Institutes of Health Draft 12 November 2002

  2. Table of Contents • Main Presentation • ASA Template (3) • Customer Perspective (4) • Customer Segmentation (5) • Customer Satisfaction (9) • Unique Customer Measures (12) • Internal Business Process Perspective (13) • Service Group Block Diagram (15) • Conclusions from Discrete Services Deployment Flowcharts (17) • Process Measures (18) • Learning and Growth Perspective (25) • Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data (27) • Analysis of Readiness Conclusions (28) • Unique Learning and Growth Measures (31) • Financial Perspective (35) • Unit Cost (36) • Asset Utilization (39) • Unique Financial Measures (42) • Conclusions and Recommendations (44) • Conclusions from FY02 ASA (45) • Recommendations (47) • Appendix (48)

  3. Customer Perspective

  4. Customer Perspective: Common Objective and Measure • Performance Objective: Increase understanding of customer base • Performance Measure: Customer segmentation charts for each Discrete Service

  5. Customer Segmentation for: Manage the ASA Initiative (DS1) • Total working hours = 2099 hrs. • ASA consulting hours = 728 Note: Based on OQM staff ASA consulting hours during Aug-Sept 2002.

  6. Customer Segmentation for Manage the ASA Initiative (DS1) (cont.) Note: Based on external consultant hours during April-Sept 2002.

  7. Customer Segmentation for Provide Consulting for Strategic Initiatives (DS2) Note: Based on OQM staff strategic initiative hours during Aug - Sept 2002. Total of 209 hours

  8. Customer Perspective: Common Objective and Measure • Performance Objective: Increase customer satisfaction • Performance Measure: ORS Customer Scorecard data collected for each Discrete Service

  9. Customer Satisfaction for Manage the ASA Initiative (DS1) • Methodology for assessment of FY02 ASA cycle • Conduct web survey at close of cycle (after the Performance Management Conference) in November • Customer segments to be surveyed: • ORSEC • ASA Team Leaders/Members • ASA Consultants • Customize the ORS Customer Scorecard • Add questions about how helpful various OQM activities were during FY02 • Open-ended comments on how to improve the process • Hold focus group with Team Leaders to discuss improvement ideas for FY03

  10. Customer Satisfaction for Provide Consulting on Strategic Initiatives (DS2) • Too few projects in FY02 to conduct a meaningful customer assessment • Methodology for FY03 • Administer surveys to customers as projects reach a logical closure point • Such as when a major deliverable occurs • If ongoing assistance is provided, conduct the assessment in the June-July time frame • Use the ORS Customer Scorecard

  11. Customer Perspective: Unique Objective and Measure • Performance Objective: Provide senior ORS leadership with organizational change and improvement data • Performance Measures: Data for the FY02 performance measures should be available from ASA team presentations at the Performance Management Conference • Report on the Common ORS Performance Measures • Status of the FY01 Recommendations for Improvement

  12. Internal Business Process Perspective

  13. Internal Business Process Perspective: Common Objective and Measure • Performance Objective: Increase understanding of processes • Performance Measures: • Block diagram of Service Group • Deployment flowcharts for each Discrete Service

  14. Manage ASA Initiative (Page 1): Deployment Flowchart for DS1 [Flowchart image with swimlanes for ORSEC, OQM, ASA Team Leaders/Members, Consultants, and ITB: OQM proposes the methodology for the new fiscal year; ORSEC approves the methodology and common measures; team leaders and members are selected and consultants assigned; template training is prepared, posted, and delivered; teams complete and submit templates; OQM reviews templates and posts feedback on the web.]

  15. Manage ASA Initiative (Page 2): Deployment Flowchart for DS1 (cont.) [Flowchart continues: teams attend additional training sessions and complete data collection plans; OQM gathers, analyzes, and distributes Customer Scorecard data; teams interpret learning and growth data and customer scorecard data, prepare presentations for ORSEC approval, and submit final presentations to OQM, which prepares conference materials.]

  16. Manage ASA Initiative (Page 3): Deployment Flowchart for DS1 (cont.) [Flowchart concludes: teams deliver presentations; ORSEC evaluates the presentations and provides feedback; the ASA process is evaluated and the ASA methodology for the new fiscal year is formulated.]

  17. Service Group Block Diagram • List in OBSF Service Hierarchy: DS1: Manage the ASA initiative; DS2: Develop performance measurement systems; DS3: Provide organizational development consultation; DS4: Process analysis and improvement; DS5: Develop and test new business approaches • Recommended List: DS1: Manage performance measurement and improvement; DS2: Provide consulting for strategic initiatives • Conclusion: Discrete Services need to be collapsed and reworded

  18. Service Group Emphasis: Hours of Service Delivered by Discrete Service Note: Based on OQM consulting hours for both DS1 & DS2 during Aug-Sept 2002.

  19. Deployment Flowchart Conclusions • Completed deployment flowchart for each discrete service (see Appendix) • Deployment flowchart for Manage the ASA Initiative (DS1) conclusions: • Process is very involved • Must elaborate on planning phase of the ASA process • Guidance for next FY cycle needs to be developed in collaboration with teams • Deployment flowchart for Provide Consulting on Strategic Initiatives (DS2) conclusions: • Demand has been sporadic • Many efforts so far have been short-term and loosely planned

  20. Internal Business Process Perspective: Common Objective and Measure • Performance Objective: Identify methods to measure processes • Performance Measure: Identify process measures for each discrete service • DS1: Percent of ASA Team Members attending trainings • DS1: Training evaluation scores • DS1: ASA Consultant hours by month • DS1: ASA Consultant hours by Team • DS1: OQM Staff hours by Team • DS1: Audience feedback on ASA presentations • DS2: Percent of projects with project plans

  21. Percentage of ASA Team Leaders/Members Attending ASA-related Trainings (DS1) • ASA Template: 76% • Process Mapping: 76% • Data Analysis: 55% • Financial Measures: 54% • BCI: 12% Note: Based on a total of 240 ASA Team Leaders and Members (N = 240).

  22. Participants’ Ratings of ASA Trainings (DS1) Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding.

  23. ASA Consultant Hours by Month (DS1) [Chart of consultant hours plotted by month]

  24. OQM Staff Hours by Team (DS1) (August & September) • Total hours spent with ASA Teams = 273 • Total hours spent with ALL ASA Teams = 455 • Total hours spent on ASA Initiative = 728 [Bar chart of OQM hours for each ASA team]

  25. ASA Consultant Hours by Team (DS1) [Bar chart of total consultant hours used by each of the 43 teams]

  26. Percent of Projects with Project Plans (DS2)

  27. Learning and Growth Perspective

  28. Learning and Growth Perspective: Common Objective and Measure • Performance Objective: Enhance quality of life for employees • Performance Measure: At the Service Group level • Turnover • Sick Leave Usage • EEO/ER/ADR complaints/cases • Awards/Recognition

  29. Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data • The Service Group has had little turnover • Sick leave usage is on the high end compared to the other groups • High use of sick leave by an employee on detail • Service Group members received awards for their accomplishments in FY02 • No EEO/ER/ADR cases

  30. Learning and Growth Perspective: Common Objective and Measure • Performance Objective: Maintain and enhance competencies for the future organization • Performance Measure: Analysis of Readiness at the Service Group Level

  31. Analysis of Readiness Conclusions • Q1: Right mix of KSAs • BSC principles and techniques • ABC expertise • Quantitative and qualitative analysis • Communication, training, consultation, and facilitation • Q2: OQM does not have enough people with the right skills to meet ORS-wide demand • Need to continue to use external consultants • Service Groups must increasingly take ownership of the process • Q3: Training needs of the Service Group • Obtain the skills listed above through contracts and development • Q4: Tools/materials needed to carry out the mission • Integrated data collection/analysis systems • Simulation software/expertise • Networking with other government agencies doing BSC

  32. Analysis of Readiness Conclusions (cont.) • Q5: Do we have the right tools/materials to carry out the mission? • Need more data collection systems to support PM efforts • Need to work with OBSF and service providers to develop and acquire ORS-wide data collection/analysis systems • Q6: Anticipated implications of not obtaining the right mix of KSAs/tools/materials? • Will not be able to continue to provide the expertise required to lead the transformation to a PM-based organization • At best, this will significantly slow the ORS change to a PM-based organization (PM = performance management)

  33. Learning and Growth Perspective: Unique Objective and Measure • Performance Objective: Invest in OQM staff development • Performance Measures: • Percent of employees with individual development plans (100%) • Percent adherence to training plans (86%)* * Some trainings were canceled.

  34. Learning and Growth Perspective: Unique Objective and Measure • Performance Objective: Invest in relevant tools (e.g., software, best practices) • Performance Measures: • Number of new tools obtained and applied by OQM

  35. Number of New Tools Obtained and Applied by OQM Note: Less than 5% of the budget was spent on tools in FY02.

  36. Learning and Growth Perspective: Unique Objective and Measure • Performance Objective: Develop a cadre of qualified consultants • Performance Measures: • Ratio of ASA consultants to ASA Teams • In FY01 the ratio was 1:16; in FY02 it is 1:8

  37. Financial Perspective

  38. Financial Perspective: Common Objective and Measures • Performance Objective: Minimize unit cost at a defined service level • Performance Measure: • DS1: Dollars per internal ASA consulting hour • DS1: Dollars per external ASA consulting hour • DS2: Dollars per internal strategic initiative consulting hour

  39. Calculation for Unit Cost • Internal ASA consultation/hour • Weighted rate method: sum (consultation hours of each OQM consultant x hourly rate) / total hours of OQM consultation • Budget recovery method: (total OQM budget – contract $) / hours of consultation • External ASA consultation (contractor)/hour • Sum (consultation hours of external consultant x hourly rate) / total hours of consultation • Strategic Initiative (SI) consultation/hour • Rate estimated by averaging the hourly rates of staff involved

  40. • DS1: Unit cost for external ASA consultation/hour: $200 • DS1: Unit cost for internal ASA consultation/hour • Weighted rate method: $30 • Budget recovery method: $52 • DS2: Unit cost for strategic initiative consultation/hour: about $46
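The two internal unit-cost methods from slide 39 can be sketched as below. All staff hours, hourly rates, and budget dollars in the example are hypothetical placeholders (OQM's actual budget figures are not given in the deck); only the formulas themselves come from the presentation.

```python
def weighted_rate(staff_hours):
    """Weighted-rate method: sum(hours_i * rate_i) / total consultation hours.

    staff_hours: list of (consulting_hours, hourly_rate) tuples,
    one per OQM consultant.
    """
    total_hours = sum(hours for hours, _ in staff_hours)
    total_cost = sum(hours * rate for hours, rate in staff_hours)
    return total_cost / total_hours


def budget_recovery_rate(total_budget, contract_dollars, consultation_hours):
    """Budget-recovery method: (total OQM budget - contract $) / consultation hours."""
    return (total_budget - contract_dollars) / consultation_hours


# Hypothetical inputs: three consultants and an illustrative budget split.
staff = [(300, 28.0), (250, 32.0), (178, 31.0)]
print(round(weighted_rate(staff), 2))                        # ~30.11/hr
print(round(budget_recovery_rate(500_000, 462_000, 728), 2)) # ~52.2/hr
```

The two methods diverge (roughly $30 vs. $52 on the slide) because budget recovery spreads the entire non-contract budget, including overhead and non-consulting time, over consultation hours only, while the weighted rate prices just the hours actually delivered.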

  41. Financial Perspective: Common Objective and Measures • Performance Objective: Maximize utilization of assets • Performance Measure: • DS1: Percent of time spent on ASAs relative to the total hours possible • DS2: Percent of time spent on strategic initiatives relative to the total hours possible

  42. Calculation for Asset Utilization (Staff hrs.) • Estimated asset utilization during a two-month period (Aug-Sept) when assistance logs were developed and completed by all OQM staff • Asset utilization calculation • DS1: 50% • DS2: 53% • Total hrs. accounted for: 704 (DS1) + 209 (DS2) + 160 (training) = 1073 • Estimated Total Asset Utilization: 1073/1800 ≈ 60% Note: 160 hrs were invested in staff training during this period.
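The slide's utilization arithmetic can be checked directly; all figures below are the ones stated on the slide.

```python
# Asset-utilization arithmetic from slide 42: hours accounted for across
# DS1, DS2, and staff training, divided by total possible staff hours.
ds1_hours = 704         # DS1 (ASA) consulting hours, Aug-Sept
ds2_hours = 209         # DS2 (strategic initiative) hours
training_hours = 160    # staff training during the period
available_hours = 1800  # total possible staff hours for the two months

accounted = ds1_hours + ds2_hours + training_hours  # 1073
utilization = accounted / available_hours
print(accounted, round(utilization * 100))          # 1073 and ~60%
```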

  43. Financial Perspective: Unique Objective and Measure • Performance Objective: Obtain required resources • Performance Measure: • Percent of budget approved (100% in FY01)

  44. Financial Perspective: Unique Objective and Measure • Performance Objective: Execute the budget • Performance Measures: • Percent of dollars executed versus planned: 100% in FY01 • Percent of dollars executed versus planned: 82% in FY02* * Funds were returned to OBSF to meet additional NIH security costs.

  45. Conclusions and Recommendations

  46. Conclusions from FY02 ASA • Customer Perspective • ORSEC is OQM’s primary customer for both Discrete Services due to implementation of ORS-wide initiatives • ASA external consultants provided most assistance to DES, DPS, and DIRS • ORS as an organization has made significant progress in implementing a common set of performance measures • Some of the recommendations made during the FY01 pilot have been implemented • Impact of the implementation has been difficult to quantify

  47. Conclusions from FY02 ASA (cont.) • Internal Business Process Perspective • ASA team training participation could have been better • Participant ratings of training were very high • Teams increased utilization of consultants as deadlines approached

  48. Conclusions from FY02 ASA (cont.) • Learning and Growth Perspective • Continue development of internal staff KSAs • OQM needs external consultant assistance in FY03 to meet ORS demands • Skills in performance management need to be developed throughout ORS • Need to continue to invest in tools and training • Financial Perspective • Established a baseline for unit cost • Internal and external consulting rates are not comparable because of differences in experience levels • Need to develop a better methodology to determine asset utilization
