
Using Metrics to Drive Research Administration Performance

Using Metrics to Drive Research Administration Performance. NCURA FRA Meeting, March 2013. Marcia L. Smith, Assistant Vice Chancellor, Research Administration, UCLA; Elizabeth H. Adams, Executive Director, Office for Sponsored Research, Northwestern University, Evanston Campus



  1. Using Metrics to Drive Research Administration Performance
  NCURA FRA Meeting, March 2013
  • Marcia L. Smith, Assistant Vice Chancellor, Research Administration, UCLA
  • Elizabeth H. Adams, Executive Director, Office for Sponsored Research, Northwestern University, Evanston Campus
  • Susan Lin, Assistant Controller, Extramural Funds, UCSF
  • Tracey Robertson, Director, Higher Education Consulting, Huron Consulting Group

  2. Agenda
  • Introduction
  • Benefits of Metrics
  • Developing Metrics
  • Determine Metrics to Track
  • Collecting Metrics
  • Discussion (All)

  3. Introductions

  5. Benefits of Metrics

  6. Benefits of Metrics

  6. Drive Performance
  • Motivate teams to achieve desired outcomes
    • Increasing transparency can trigger positive culture changes and improve outcomes.
    • Using metrics to monitor business processes improves accountability, so high performers can be recognized and bottlenecks addressed.
  • Define business processes and responsibilities
    • Implementing metrics requires an organization to identify its desired outputs, resulting in clearly defined business processes.
    • Metrics help identify operational bottlenecks, which can result from personnel having varied understandings of processes, roles, and responsibilities.
  • Monitor the impact of new processes
    • As new processes are implemented, metrics provide confirmation that the change is working.
  Benefits of Metrics

  7. Change Behavior
  • Manage stakeholder expectations
    • Metrics enable clear communication of process goals and current status to stakeholders.
    • Concrete information enables stakeholders to determine whether their needs are being met.
    • If an institution sets and communicates reasonable, clear goals, customer perception of performance can improve simply through better understanding.
  • Evaluate staff performance
    • Metrics allow leadership to objectively track staff contributions, both individually and collectively, against operational goals.
    • With metrics, personnel know their assessments are objective, why they received them, and how to improve.
  Benefits of Metrics

  8. Support Investments in Research Administration Infrastructure
  • Improve decision making and prioritization
    • Metrics provide leadership with insight into where attention and resources are needed.
    • Concrete information improves decision making and helps managers identify opportunities to improve compliance, financial management, and operational performance.
  Benefits of Metrics

  9. Developing Metrics

  10. Where to Begin?
  • Where are the most significant business pressures in your organization?
    • Strategic priorities
    • Compliance risk
    • Customer frustration
  • What data do you have access to?
    • What are the relevant systems? Start with enterprise systems and BI tools.
  • Once you have access, do you trust the data?
    • Do the datasets accurately describe processes?
    • You will likely need to strengthen data definitions, clean up data, change processes, and build better reporting tools.
  Developing Metrics

  11. Where to Begin?
  • What processes do you own, or can you influence?
    • Many research administration processes touch different teams and offices.
    • If you own the processes in your office, you can move much more quickly.
    • Reconfigure systems and/or modify processes to collect the data you want (e.g., capture handoffs).
    • There is an undeniable connection between data integrity and standard operating procedures.
  • Baseline data can be hard to develop, and to accept
    • Some staff may embrace the challenge; others may not.
    • Though change can be disruptive or controversial, some staff will feel relieved to know exactly what is expected of them.
    • Baseline data offers an invaluable opportunity to demonstrate improvement, which can be very positive and morale building.
    • If you embrace transparency, you can more easily expect it of others.
  Developing Metrics

  12. Where to Begin?
  • Anecdotal data complements empirical data
    • While indispensable to the story you want to tell about your organization, empirical data doesn't tell the whole story.
    • Don't forget to collect the personal stories, internally and externally.
    • Beyond providing important context, collecting different perspectives is valuable for building trust and relationships.
  Developing Metrics

  13. Determine Metrics to Track

  14. Identify the Greatest Opportunity
  Taking a focused approach will allow you to implement successful, lasting, and measurable improvements more quickly.

  15. UCLA Case Study
  • Pre-Award Metrics
  • Post-Award Metrics

  16. UCLA Case Study

  17. Pre-Award Metrics

  18. Proposal Submission

  19. Award Set-Up
  • Award Intake Process (Pilot)
  • Turnaround time for expedited awards improved by over 80% during the award setup pilot

  20. Award Set-Up
  • Award Intake Process (Current)
  • Full implementation January 2012
  • Award setup has slowed for expedited awards but is still 65% faster than the previous processing timeline

  21. Award Set-Up (Current)
  • The new process has identified hold-ups
  • Shaping policy and procedure decisions
  • Awards are processed 6 days faster when all internal documents are present
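Turnaround figures like those above come from simple date arithmetic over intake data. As a minimal sketch (the record fields `received` and `setup_complete`, and the sample dates, are hypothetical illustrations, not from the presentation), median setup turnaround and improvement against a baseline might be computed like this:

```python
from datetime import date
from statistics import median

# Hypothetical intake records: date the award was received and date setup finished.
awards = [
    {"id": "A1", "received": date(2012, 1, 3), "setup_complete": date(2012, 1, 10)},
    {"id": "A2", "received": date(2012, 1, 5), "setup_complete": date(2012, 1, 25)},
    {"id": "A3", "received": date(2012, 2, 1), "setup_complete": date(2012, 2, 6)},
]

def median_turnaround_days(records):
    """Median calendar days from award receipt to completed setup."""
    return median((r["setup_complete"] - r["received"]).days for r in records)

def pct_improvement(baseline_days, current_days):
    """Percent reduction in turnaround relative to a baseline period."""
    return 100 * (baseline_days - current_days) / baseline_days

current = median_turnaround_days(awards)   # 7 days for this sample data
print(pct_improvement(20, current))        # vs. a hypothetical 20-day baseline → 65.0
```

The same two functions support the before/after comparisons a pilot needs: compute the metric on a baseline window once, then track each reporting period against it.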

  22. Post-Award Metrics

  23. Post-Award Metrics

  24. On-Time Submission
  • On-time submission increased by 35% for invoices
  • On-time submission increased by 48% for reports, from a low of 14% at the start of FY10

  25. Backlog – Invoices and Reports • Backlogs have decreased by 64% since the start of FY10

  26. Extramural Funds 2012 Non-LOC Cost Reimbursable Invoicing Status

  27. Extramural Funds Fixed Price Billing Status

  28. Extramural Funds Unreimbursed Costs by Award Type

  29. Extramural Funds Cash Receipts and Expense

  30. Extramural Funds Accounts Receivable Aging: Balances Over 120 Days

  31. Extramural Funds Top 10 Sponsors with AR Balances Older Than 120 Days
  Notes (sponsors over $70K):
  • XXXX Sponsor: $586,495 paid on January 15, 2013; $1,218 approved for payment; $197,443 in the approval process at the sponsor. The delay is due to a transfer of the program between departments at the sponsor, which required a new contract.
  • XXXX Sponsor: Payment delayed due to a change in the sponsor's entity name, which requires new letters of agreement between the sponsor and UCSF. The new agreement was executed in January 2013.
  • XXXX Sponsor: Payment received on January 15, 2013.
  • XXXX Sponsor: The foundation intended to make a gift rather than entering into a research contract. We are in the process of changing the nature of the agreement from research to gift.
  • XXXX Sponsor: Payment delayed due to a missing PO number and difficulty identifying and resolving issues through the third-party Help Desk. The invoice has now been posted in the sponsor's "Aspen" system for payment.
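The aging report behind these two slides is a standard bucketing of open balances by days outstanding. As a hedged sketch (the invoice records, sponsor names, and dates are hypothetical, not UCSF data), an "over 120 days" view can be produced like this:

```python
from datetime import date

# Hypothetical open invoices: sponsor, outstanding amount, and issue date.
invoices = [
    {"sponsor": "Sponsor A", "amount": 586495.00, "issued": date(2012, 8, 1)},
    {"sponsor": "Sponsor B", "amount": 12500.00,  "issued": date(2013, 1, 20)},
    {"sponsor": "Sponsor C", "amount": 70200.00,  "issued": date(2012, 9, 15)},
]

# Standard AR aging buckets, in days outstanding; None means open-ended.
BUCKETS = [(0, 30), (31, 60), (61, 90), (91, 120), (121, None)]

def age_bucket(days):
    """Return the aging-bucket label for a number of days outstanding."""
    for low, high in BUCKETS:
        if high is None or days <= high:
            return f"{low}+" if high is None else f"{low}-{high}"

def aging_report(invoices, as_of):
    """Sum open balances into aging buckets as of a reporting date."""
    report = {}
    for inv in invoices:
        days = (as_of - inv["issued"]).days
        label = age_bucket(days)
        report[label] = report.get(label, 0.0) + inv["amount"]
    return report

report = aging_report(invoices, as_of=date(2013, 2, 1))
print(report["121+"])  # total balance older than 120 days for this sample
```

Filtering the same records to the `121+` bucket and grouping by sponsor yields the "Top 10 Sponsors" view directly.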

  32. Extramural Funds XXXX Sponsor's Fund Deficit Status

  33. Award Closeout
  Completion status compared to goal (operational goal: close within 150 days after the end date of the budget period)
  *June is a peak month, when many award budget periods end. Closeout of awards that expired in June is delayed because the Closeout Team is staffed to meet a workload of 100 to 200 expired awards per month. We will continue to explore options for flexing our resources to reduce the delays.

  34. FFR Statistics Overview: Completion Progress

  35. Effort Reporting Certification Timeliness & Completion

  36. Collecting Metrics

  37. Collecting Metrics
  Things to consider:
  • Audience for metrics
    • The effectiveness of metrics is greatly affected by the selection of recipients who will review them.
    • The wrong metrics, or the wrong audience, diminishes their value.
  • Data integrity
    • Is the data you are using to populate the metrics accurate?
    • Is the correct logic being used?
    • Establish standardized processes for entering data.
  • Interpretation of data
    • Are assumptions being made?
    • Know what each data field means; develop a data dictionary.
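The "develop a data dictionary" point above can be made concrete even without special tooling: a dictionary of field definitions doubles as a validation rule set. A minimal sketch (the field names and rules here are hypothetical illustrations, not from any particular enterprise system):

```python
# A minimal data dictionary: each metric field gets a type and a plain-language
# definition so everyone interprets the data the same way.
DATA_DICTIONARY = {
    "award_id":   {"type": str, "definition": "Unique identifier assigned at award intake"},
    "received":   {"type": str, "definition": "Date the fully executed award was received (YYYY-MM-DD)"},
    "setup_days": {"type": int, "definition": "Calendar days from receipt to completed setup"},
}

def validate(record):
    """Return a list of problems found when checking a record against the dictionary."""
    problems = []
    for field, spec in DATA_DICTIONARY.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], spec["type"]):
            problems.append(f"wrong type for {field}: expected {spec['type'].__name__}")
    return problems

good = {"award_id": "A1", "received": "2013-01-15", "setup_days": 7}
bad = {"award_id": "A2", "setup_days": "seven"}
print(validate(good))  # []
print(validate(bad))   # ['missing field: received', 'wrong type for setup_days: expected int']
```

Running every incoming record through such a check before it reaches a report addresses both data-integrity questions on the slide: inaccurate entries surface immediately, and the shared definitions keep interpretation consistent across offices.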

  38. Discussion
