Academic Senate and Faculty Participation in Operational Excellence Pre-Award 2010-2011
Robert Newcomer, PhD, UCSF Division Vice Chair, 2009-2011
Aims
• Present faculty concerns
• Present Senate recommendations:
  • In addition to the planned cluster option, add one that leaves pre-award with current high-performing units
  • Evaluate both models using the same standards
  • Include effects on post-award in the evaluation
  • Use the results of the evaluation to determine how to proceed, including whether to:
    • Select one model, or
    • Maintain both models concurrently
Faculty Members of the Operational Excellence Faculty Oversight Committee
• Ronald Arenson
• Claire Brindis
• Marcelle Cedars
• Deborah Grady
• Richard Jordan
• Thomas Kornberg
• Lisa Kroon
• Wendell Lim
• Geoffrey Manley
• Wendy Max
• Mervyn Maze
• David Morgan
• Bob Newcomer
• Neil Powe
Senate Ad Hoc Operational Excellence Work Group
• Steven Cheung, Chair of Academic Planning & Budget
• Roland Henry, Chair of the Research Committee
• David Gardner, Academic Assembly representative and former Senate chair
• Deborah Greenspan, Academic Assembly representative and former Senate chair
• Robert Newcomer, Senate Vice Chair
• With consultation from Stan Glantz, representing the Faculty Association
Consensus
• UC can improve management of grant pre- and post-award processes:
  • Wide variation in service quality and costs
  • Desire to improve service and reduce costs to the standard set by high-performing units
• Substantially shrinking C&G by moving signatory authority closer to grant preparation is a good idea:
  • Improve training
  • Streamline operations
  • Reduce costs
Concern: The Pre-Award Cluster Model is an Unproven Experiment
• Uncertainty arises from possible disruption to:
  • Effective existing pre- and post-award operations
  • Faculty and staff productivity
  • Faculty and staff morale
  • Staff retention
• Could compromise faculty's ability to compete for grants in a very challenging environment
• The current evaluation design does not include a meaningful comparison group
Concern: Cost Savings Not Assured
• The cost of Pre-Award cluster operations is based on estimates, not experience; there is less uncertainty in the actual costs of incremental change built on existing structures.
• Some departments/ORUs are already operating at or below the estimated unit-cost standards for OE Pre-Award clusters.
• The cost impact of separating pre- and post-award operations is not considered in the current plan.
• Senate recommendation: Leave efficiently operating units intact, with appropriately delegated signature authority, to provide a comparison for the OE clusters.
Concern: Funding Model
• No comprehensive funding model has been presented for pre- and post-award administration.
• Will financing come from chancellor and school contributions, indirect cost recovery taxation, departmental recharges, or project direct costs?
• Which funding source will bear the risk of a Pre-Award initiative that is more expensive than anticipated, particularly for departments currently operating below projected OE cluster costs?
• Departments may have increased costs due to post-award administration staffing needs.
• Senate recommendation: Departments should not be responsible for such increased costs.
Concern: Evaluation Design
• The current evaluation design and data elements are not sufficient to determine effectiveness and problems.
• Both pre-award and post-award processes and expenses need to be included in the design.
• Both the cluster initiative and an enhanced current-practice comparison group should be included.
• The initiative's pause, modify, and stop rules need clarification.
• The evaluation timetable does not leave room for program refinements between phases.
Senate Recommendations
• Move forward with two complementary models:
  • The OE cluster initiative
  • Delegation of signature authority to current high-performing units
• Evaluation:
  • Compare the two models
  • Include effects on post-award
  • Extend the Phase 1 evaluation to allow sufficient time for refining procedures and practices in both models
• Remain open to the possibility of using both models concurrently as a long-term solution