NIH Electronic Research Administration: Executive Officers Meeting October 3, 2000
Topics for Discussion Today
• What is Electronic Research Administration (eRA)
• Brief overview of the required systems
• New management structure for NIH enterprise systems
• Management of the Beast:
  • Initial diagnosis – resources – funding – opening up the process
  • Priority-setting process
• Give you a sample of the priorities and the estimated cost for NIH to have functional eRA systems to meet its stewardship and oversight responsibilities in funding and monitoring ~$13 billion of taxpayer money
• Brief overview of the website and communication strategy
• Get your comments and feedback
The NIH Extramural Process: receives, copies, ships, and stores at a minimum 164 million pages of paper in the business processes associated with competing and non-competing grants each year.*
*This is a minimum figure. It does not include duplicate files for council versus grants management, program staff file systems, or paper that moves internally under different IC practices. Nor does it account for the time spent managing, pulling, distributing, or finding files.
A Lot of Paper to Receive, Copy, Ship and Store*
*Does not include Committee Management, Review, Reports, Budget, Coding, and other processes within academic institutions or NIH. A stratified sample of 3,000 grants was used for the data. (A rough per-application figure is sketched below.)
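As a rough illustration only, the sketch below divides the quoted page total by the competing-application count cited later in this deck. It is not the stratified-sample methodology behind the original estimate; the constants are simply the figures quoted on these slides.

```python
# Back-of-envelope only: divide the quoted yearly page total by the number of
# competing applications to get a feel for scale. Non-competing awards also
# generate paper, so the per-application figure overstates reality.

TOTAL_PAGES = 164_000_000        # minimum pages received/copied/shipped/stored per year
COMPETING_APPLICATIONS = 46_000  # competing applications received in FY 2000

pages_per_application = TOTAL_PAGES / COMPETING_APPLICATIONS
print(f"~{pages_per_application:,.0f} pages per competing application")
```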
Four Basic Systems
• IMPAC I (legacy)
  • Began 1964
  • Many extension systems added over 25 years
• IMPAC II
  • Began 1996
  • Most modules deployed
  • Several in need of a life-cycle redesign
• Commons
  • Began 1996
  • 2 modules in full production: I-Edison and CRISP on the web
  • Most of the remaining systems not yet developed or integrated into IMPAC II
• Federal Commons
  • One face for applicants across all agencies
ERA Objective = Full Electronic Grants Administration
[Diagram: data flow between ~100,000 applicants / 2,000 grantee institutions worldwide and IMPAC II, the NIH transactional database (3,500 users, OD and the 25 NIH Institutes and Centers), via the Commons file (Fed. Commons, FADEX). Documents exchanged include scientific proposals, other support, project-specific assurances, certifications and assurances, periodic reports (progress, financial, inventions, women/minorities), final reports (progress, financial, inventions), post-award correspondence, assignment, summary statement, priority score, notice of grant award, grants policy statement, and application specifications.]
Modules by Functional & User Groups
[Diagram: module overview showing the shared Commons modules (Registration, Account Administration, Institutional Profile, Professional Profile) and the IMPAC II infrastructure & shared modules (GUM, API, People, RAE, UA).]
Modules by Functional & User Groups
[Diagram: the full eRA module map.]
• Shared Commons modules: Registration, Account Administration, Institutional Profile, Professional Profile, CRISP on the Web, I-Edison, E-SNAP, X-Train, Competitive Grant Application
• IMPAC II infrastructure & shared modules: GUM, API, People, RAE, UA
• IMPAC II modules: Committee Management, Grants Management, Review, Receipt/Referral/Assignment, Training Activities, IC Operations (Program), Electronic Council Book, CRISP Plus, Power View, Quick View
• eRA interface modules connect the Commons and IMPAC II sides
[The same eRA module map as the previous slide.]
To support the system:
• 25 machines
• Operating systems: Windows NT, Unix, Sun Solaris, Reliant, Tru64
• 28 COTS products
• Custom-designed software
• Web environment
• Client-server environment
What Kind of Volume Is Involved in FY 00?
A system to monitor and process the receipt, review (initial and council), funding, administration, and stewardship of grants:
• Receive ~46,000 competing (Types 1 & 2) grant applications
• Establish 3,166 meetings to review those applications
• Support the 128 meetings of the National Advisory Councils that provide the second level of review
• Award ~60,000 competing and non-competing grants
• Maintain historical records from 1973 to the present
• Report on trends and analyses for all grants awarded, internally and to the press, public, and scientific communities
The Matrix Must Be Built
[Diagram: the eRA matrix linking the Federal Commons, NBS, IMPAC II, the grantee community, and IC extension systems.]
Project Management Structure, March 2000
• IT Board of Governors
• CIO
• eRA Steering Committee: IC Director (chair); DDER, CIO, Project Manager, IC representative
• Project Manager
• Daily Operations Manager
• CIT Tech Support
• Multiple functional groups: CM, GMAC, RPC, EPMC
Project Management
[Diagram: project management balances users, customers, team & management across business management and technical management.]
What Resources Are Available to NIH eRA?
• 53 NIH FTEs (percentage split recomputed in the sketch below):
  • Legacy system (IMPAC I): 23.20 FTEs (43.7%)
  • IMPAC II: 20.75 FTEs (39.2%)
  • Commons: 3.35 FTEs (6.3%)
  • Federal Commons: 1.20 FTEs (2.3%)
  • I-Edison: 4.50 FTEs (8.5%)
• Contract dollars for FY 2000:
  • IMPAC II: $5.0 million
  • Commons: $1.0 million
  • Miterek (acceptance testing): $0.7 million
• IC extension systems? (Internet review, budget, council, and reporting tools)
• Other agencies?
• Outside community? (academic and private-sector coalitions)
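A minimal sketch (Python, not from the original deck) that recomputes the FTE percentage split from the raw counts above; the slide appears to truncate 43.77% to 43.7%, where rounding gives 43.8%.

```python
# Recompute the FTE percentage split from the raw counts quoted on the slide.
fte_by_system = {
    "IMPAC I (legacy)": 23.20,
    "IMPAC II": 20.75,
    "Commons": 3.35,
    "Federal Commons": 1.20,
    "I-Edison": 4.50,
}
total_fte = sum(fte_by_system.values())  # 53.0 NIH FTEs
for system, count in fte_by_system.items():
    # The slide shows 43.7% for IMPAC I (truncated); rounding gives 43.8%.
    print(f"{system:18s} {count:5.2f} FTEs  {count / total_fte:5.1%}")
```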
Questions That You Have Asked
• Are the contractors charging appropriately for services rendered? Yes:
  • Independent review (Martha Pine)
• How are costs being monitored?
  • A new cost model has been established; monthly vouchers map to the cost model
• Is the system technically sound? Yes:
  • Peer review of the system
  • IC Shared Design Group reviews of CDRs
  • Budget request for review of the software design process
Project Management
[Diagram: project management balances users, customers, team & management across business management and technical management.]
IMPAC II:
• New technology / more power / more complexity
• Expectations high
• Built on existing business processes and design
• Low resources affected:
  • Critical-path decision making to develop the modules
  • Sometimes collapsing functionality within a module
  • Minimal resources for outreach, communication, help desk, and training
• Lack of understanding of decision making and priority setting
• Enterprise not prepared for the migration of extension systems:
  • Insufficient staff time and knowledge to link and plan for the changes
IMPAC II (cont.):
• Integration of Commons data into IMPAC II
• Stability of the database
• Budget reports needed
• Communications and outreach
Report of the Peer Review of the NIH Commons, April 10, 2000: Help Needed to Assess Priorities
Effects of IT Review on NIH Commons
• Not financed sufficiently to meet objectives
• All development work stopped as of March 1, 2000:
  • Competitive Grant Application Process (CGAP)
  • Complex Non-Competing Award Process (CNAP)
  • Fellowships
  • Detailed designs complete
• Redesign E-SNAP with new business practices:
  • Changes in policy and in design
  • A user group with NIH and the extramural community needs to be established
• Focus efforts on:
  • Integration with IMPAC II
  • E-SNAP, X-Train, 194 data set transmission
  • Communications and outreach
Status of NIH Commons Deployment…as of July 10, 2000
• Operations and maintenance of existing interfaces have continued:
  • No major expansion of deployment
  • Only minimal enhancements
  • No increase in user support
• I-Edison = production deployment; business as usual:
  • ~240 grantee/contractor organizations registered
  • >90% of organizations that routinely submit reports
  • Submission via interactive web or datastream
  • ~9,000 inventions & 4,000 patents since Oct. '95
• CRISP = production deployment; business as usual with minimal enhancements:
  • ~30,000 CRISP queries per week
  • Records include 1985 through current awards
  • Promotion to 1973 through current awards to occur
Status of NIH Commons Deployment…as of July 10, 2000 (cont.)
• Registration/Profiles/Status = pre-production:
  • Existing interfaces maintained
  • No additional user support
  • 158 grantee organizations registered
  • 3,750 user accounts created
  • Average of ~30 logons/day
• e-SNAP via interactive web = FDP pilot deployment:
  • No active recruitment or increase in deployment
  • No additional user support
  • 15 grantee organizations participating
  • 185 e-SNAPs processed since July 1999
NIH ERA Deployment: 1998-2001
[Timeline chart, CY1999-CY2001, showing deployment status for Interagency Edison, CRISP on the Web, Commons Registration, Accounts Administration, Application/Award Status, Institutional & Professional Profiles, e-SNAP, Trainee Appointments (X-Train), Competitive Application, Fellowships, and Federal Commons. Status categories: FDP Pilot (65 grantee organizations), Pre-Production (~120-150 organizations), Full Production (open registration), Suspended Further Deployment, and Development Stopped prior to deployment; several items are on hold.]
Technical, Schedule, and Cost Performance are Opposing Forces that Require Compromise?
Overall Performance Measurements
• Appropriate funding
• Participatory planning / priority-setting process
• A gap of 15% is a red flag affecting the quality and scope of the project (G. Kapur, Center for Project Management, 1998); see the sketch below
• Establish and communicate a clear critical path; measure milestone and deliverable hit rates (within the next two months)
• Evaluate cost to date versus estimated cost to date
• Recognition and health of the project team
• Effectively manage issues
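A minimal sketch, not from the original material, of the cost check described above: compare cost to date against estimated cost to date and flag a gap above the 15% threshold cited on the slide. The function name and argument handling are assumptions for illustration.

```python
# Flag a project whose cost-to-date diverges from the estimate by more than
# the 15% red-flag threshold cited on the slide (G. Kapur, 1998).

def cost_gap_flag(actual_cost: float, estimated_cost: float,
                  threshold: float = 0.15) -> bool:
    """Return True when the relative cost gap exceeds the threshold."""
    if estimated_cost <= 0:
        raise ValueError("estimated cost must be positive")
    gap = abs(actual_cost - estimated_cost) / estimated_cost
    return gap > threshold

# Example: $6.0M spent against a $5.0M estimate is a 20% gap, so it is flagged.
print(cost_gap_flag(6.0e6, 5.0e6))  # True
```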
[The eRA module map repeated: shared Commons modules, IMPAC II infrastructure & shared modules, IMPAC II modules, and eRA interface modules; it serves as the base for the next two slides.]
Modules and Their Governing Groups
[Diagram: the eRA module map annotated with functional groups and user constituencies.]
• IMPAC II functional groups: CMO (Committee Management), GMAC (Grants Mgmt. Adv. Comm.), RPC (Review Prog. Comm.), CSR, TPC (Training Policy Comm.), EPMC (Extramural Mgmt. Comm.), Coalition of ICs
• Commons Advisory Group constituencies by module: CRISP on the Web (public, press, scientists); I-Edison (institutions, NIH); E-SNAP (Type 5, VPs for research); X-Train (T32, F32); Competitive Grant Application (applicants, VPs for research)
[The same diagram, with I2 user groups and Commons user groups added alongside the functional groups and the Commons Advisory Group.]
Project Management Team Structure
• Project Manager: John McGowan
• Daily Operations Manager: Jim Cain
• TPC Advocate: Wally Schaffer
• RPC Advocate: Eileen Bradley
• GMAC Advocate: Marcia Hahn
• PI Advocate
• CMO Advocate: Claire Benfer
• POPOF/Program Advocate: Christopher Beisel
• CIT Liaison & IC Tech Advocate: Donna Frahm
• Reports Advocate: Carol Martin
• Data Integrity: Belinda Seto
• Research Institution Advocate: George Stone
• ECB Advocate: Thor Fjellstedt
• IT Design/Architecture Advocate: Donna Frahm
• Receipt and Referral: Brent Stanfield
• OER Liaisons
• FDC/ROW Liaison: Jay Silverman
NIH IT Project Management Structure
• IT Board of Governors
• CIO
• eRA Steering Committee: IC Director (chair); DDER (Deputy Director for Extramural Research), CIO, Project Manager, IC representative
• Federal Commons: IAEGC, ECC, Federal Commons Working Group
• eRA Project Manager
• NIH Commons Advisory Group: academic institutions, private sector, NIH community, federal agencies
• Daily Operations Manager; CIT Tech Liaison
• Multiple functional groups: CM, GMAC, RPC, EPMC
• IT Shared Design Group
• eRA Project Management Team; IC representatives
What Do We Hope To Have?
• A clear:
  • Person responsible
  • Set of identified priorities
• A phasing process to:
  • Establish budgets and needs
  • Project future directions
  • Allow time for business/policy redesign and for the communication needed at the design stage to integrate with other systems (IC, NBS, institutions, Fed Commons) and their processes
• Projects designed and implemented with business plans, using the funds made available, with target milestones
Priority Setting Begins With the User Group
[Flow: eRA module user group → functional group → group advocate → project team → Steering Committee, which reviews and sets priorities.]
Recommendations and priorities are posted on the web; the community is informed through the newsletter. (A sketch of this flow follows below.)
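A minimal sketch, under invented names, of the flow shown above: a recommendation originates with a module user group, moves through the functional group, group advocate, and project team, and the Steering Committee reviews it, sets the priority, and the result is posted.

```python
# Illustrative model of the priority-setting flow; class, function, and field
# names are invented, not part of the eRA systems.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Recommendation:
    module: str
    description: str
    endorsements: list = field(default_factory=list)
    priority: Optional[int] = None   # set only by the Steering Committee

def route(rec: Recommendation) -> Recommendation:
    """Carry the recommendation through each stage ahead of the committee."""
    for stage in ("module user group", "functional group",
                  "group advocate", "project team"):
        rec.endorsements.append(stage)
    return rec

def steering_committee_review(rec: Recommendation, priority: int) -> Recommendation:
    rec.priority = priority          # reviewed and ranked; then posted on the web
    return rec

rec = steering_committee_review(
    route(Recommendation("Query and Reports", "Expand report options")), priority=1)
print(rec.module, rec.priority, rec.endorsements)
```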
Straw-man for Developing Priorities
• FY 2001 project plan submitted to the Steering Committee and the BOG
• Next, the project plan will be developed through the priority-setting process
• The current list is not prioritized but follows the cost model
Criteria to Rank Projects for Phasing and Funding
Criteria and maximum point values:
• Management Support: 15
• Business Risk of Not Doing It: 10
• Culture Change: 10
• Technical Risk: 10
• Scope of Change: 20
• Business Process Redesign: 15
• Business Model: 10
• Quality of Work Life: 10
• TOTAL: 100
Criteria need improvement. (A scoring sketch follows below.)
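A minimal scoring sketch, not part of the original plan, using the maximum point values above as per-criterion caps; the function name, the capping rule, and the example scores are assumptions for illustration.

```python
# Total a project's per-criterion scores, capping each score at the maximum
# point value from the slide; the caps sum to 100.

MAX_POINTS = {
    "Management Support": 15,
    "Business Risk of Not Doing It": 10,
    "Culture Change": 10,
    "Technical Risk": 10,
    "Scope of Change": 20,
    "Business Process Redesign": 15,
    "Business Model": 10,
    "Quality of Work Life": 10,
}
assert sum(MAX_POINTS.values()) == 100

def project_score(scores: dict) -> int:
    """Sum the scores over all criteria, capped at each criterion's maximum."""
    return sum(min(scores.get(criterion, 0), cap)
               for criterion, cap in MAX_POINTS.items())

# Hypothetical project: strong management support, broad scope of change.
print(project_score({"Management Support": 15, "Scope of Change": 18,
                     "Business Process Redesign": 12, "Technical Risk": 8}))  # 53
```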
Nomenclature for Requirements
• NR = new requirement with no budget
• NBR = new budgeted requirement (planned)
• NFR = new funded requirement (financed)
• OBR = original requirement, budgeted for; to be added in a software release when the money becomes available
• OFR = original requirement that has been funded; scheduled for release in software upgrades
• ST = special requirement established outside the normal priority-setting process
(An enum sketch of these codes follows below.)
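A minimal sketch, assuming Python and an invented class name, that encodes the nomenclature above so requirements can be tagged consistently; the is_funded helper is illustrative only.

```python
# Requirement status codes from the slide, encoded as an enum for tracking.

from enum import Enum

class RequirementStatus(Enum):
    NR = "new requirement with no budget"
    NBR = "new budgeted requirement (planned)"
    NFR = "new funded requirement (financed)"
    OBR = "original requirement, budgeted; added when money becomes available"
    OFR = "original requirement, funded; scheduled for a software release"
    ST = "special requirement set outside the normal priority-setting process"

def is_funded(status: RequirementStatus) -> bool:
    """Only funded requirements can be scheduled into a release."""
    return status in (RequirementStatus.NFR, RequirementStatus.OFR)

print(is_funded(RequirementStatus.OBR))  # False
```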
Advocates Took the Job Seriously • Cornell • Dartmouth • Emory • Florida • Fred Hutchinson Cancer Center • Michigan • MIT • North Carolina State • Northwestern • Penn State • St. Jude • Wisconsin
Development of Business Plans
eRA Requirement Business Plan (template):
• Group Advocate: Belinda Seto / Carol S. Martin
• Person Responsible: Belinda Seto
• Title: Data Integrity (cross-walk with Requirement #3)
• Priority Number: 1 (Query and Reports); Sub-priority Number: 1.1
• Check one: New / Original / Special
• Check one: Budgeted / Funded
• Time to complete; Cost: Year 1 / Year 2 / Year 3
• Objective
• Current business model and opportunities for redesign
• What will happen if this requirement is not fulfilled?
• How will this requirement affect other modules while being fulfilled and after integration?
• What policy issues need to be addressed before design, or how can the system be designed to adapt to future policy changes?
(A data-structure sketch of this template follows below.)
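A minimal sketch, with invented class and field names, of the template above as a data structure; the example values marked as placeholders are not from the deck.

```python
# Business-plan template fields from the slide, modeled as a dataclass so each
# eRA requirement can be tracked uniformly. Names are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class RequirementBusinessPlan:
    group_advocate: str
    person_responsible: str
    title: str
    priority: float                      # e.g. 1.1 for a sub-priority
    kind: str                            # "new", "original", or "special"
    funding: str                         # "budgeted" or "funded"
    time_to_complete_months: int
    cost_by_year: dict = field(default_factory=dict)  # year -> dollars
    objective: str = ""
    current_business_model: str = ""
    risk_if_not_fulfilled: str = ""
    effect_on_other_modules: str = ""
    policy_issues: str = ""

# Example entry using the template headings; kind, funding, and duration are
# placeholder values, not taken from the presentation.
plan = RequirementBusinessPlan(
    group_advocate="Belinda Seto / Carol S. Martin",
    person_responsible="Belinda Seto",
    title="Data Integrity (cross-walk with Requirement #3)",
    priority=1.1,
    kind="original",
    funding="budgeted",
    time_to_complete_months=12,
)
print(plan.title, plan.priority)
```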
It's No Joke: Without Proper Funding, NIH Is at Risk of Not Meeting Its Mandate*
*G.P.E.A. (NIH must be electronic by Oct. 2003); P.L. 106-107 (plan for a single government system by May 2001)
It's Unanimous: Query and Reports Are Number 1
• Reporting is complex:
  • Review needs at least 28 different reports
  • Grants needs at least 5
  • Budget and coding (see examples)
• Need different-level tools
• Need different report options:
  • Word, WordPerfect, PDF
  • Excel
  • Print