
Streamline, Standardize and Automate Statistical Data Processing - Case Study

Presentation Transcript


  1. Streamline, Standardize and Automate Statistical Data Processing - Case Study Andreas Hake April 14, 2014

  2. Business Context • In 2009 the international community identified important data gaps that needed to be addressed by the IMF and other international organizations • A report on these information gaps was prepared by FSB and IMF staff and endorsed by the G-20 Finance Ministers and Central Bank Governors in November 2009 (G-20 Data Gaps Initiative) • As a result, it is anticipated that the volume of data processed by the IMF Statistics Department will increase by a factor of four over the next five years • To cope with this significant increase, the IMF Statistics Department needs to redesign its business processes and extend the capabilities, scalability, accuracy, reliability and timeliness of its strategic business operations

  3. Goal: standardized business processes • The strategy strongly recommends a generalized, flexible and scalable approach that can be reused across statistical products • An exercise of this magnitude spans multiple years and affects people, processes and tools; the significant investment involved makes careful execution critical for achieving the desired results • Design principles: • Business process change, not IT tools implementation • Based on an enterprise data and metadata model • Reduction of manual steps through automation and standardization • Preparation of IT tools

  4. Organizational Specialization and Operational Independence [Diagram: Collection → Interface → Production → Interface → Dissemination, underpinned by Standards, Processes and Technology]

  5. Process Automation and Resource Allocation (To-be) [Diagram: Collection → Interface → Production → Interface → Dissemination, underpinned by Standards, Processes and Technology]

  6. Standard Production Process Template

  7. Goal: support increased demands and improve timeliness of data delivery • To validate the streamlining approach, it was recommended to implement the new processes on a pilot dataset • Two options were considered for the pilot: • A parallel run to compare and validate against an existing dataset; or • A new dataset • The first approach would be safer but could result in delays due to conflicting priorities, while the latter poses higher risk by relying solely on the new processes and tools

  8. Pilot selection • The Coordinated Portfolio Investment Survey (CPIS) dataset required a major change due to expanded coverage, which almost doubled its size • The change affected the full data process, including collection, processing and dissemination • Analysis confirmed that the existing tools and processes would not be able to deliver the desired results in the expected timeframe • Hence the expanded CPIS was taken up as the pilot implementation for the new streamlining exercise

  9. Pilot implementation • Support for increased data demand (data coverage expanded from approx. 17,000 to 34,000 series per country) • Reduced size of report collection forms • Just-in-time processing • Readiness to disseminate data in real time • Automation and easier data validation: the new implementation eliminated most manual steps by introducing automated workflows • Transparent workflow through data workflow dashboards • Performance improvements and access to business user tools
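
  The automated validation workflows referred to above can be pictured as a set of rule checks applied to each incoming country submission. Below is a minimal sketch in Python using pandas; the series codes, column names and rules are illustrative assumptions, not the actual IMF implementation.

```python
import pandas as pd

def validate_submission(df: pd.DataFrame) -> pd.DataFrame:
    """Run basic automated checks on one country submission.

    Illustrative checks only (completeness and sign); a production
    workflow would apply a much richer, configurable rule set.
    """
    issues = []

    # Completeness: every expected series code must be present.
    expected = {"A_T_T_T_USD", "L_T_T_T_USD"}  # hypothetical series codes
    for code in expected - set(df["series_code"]):
        issues.append({"series_code": code, "check": "missing_series", "value": None})

    # Sign check: reported positions should not be negative.
    for _, row in df[df["value"] < 0].iterrows():
        issues.append({"series_code": row["series_code"],
                       "check": "negative_value",
                       "value": row["value"]})

    return pd.DataFrame(issues, columns=["series_code", "check", "value"])

# Toy submission: one expected series is missing and one value is negative.
submission = pd.DataFrame({"series_code": ["A_T_T_T_USD"],
                           "value": [-5.0]})
print(validate_submission(submission))
```

  In a set-up like this, failed checks could feed the workflow dashboards mentioned above instead of requiring a manual review pass over each submission.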

  10. Pilot results - Collection

  11. Pilot results - Processing

  12. Submission status reports

  13. Work in Production - Validation: Charts, Cross-Database Comparisons, Diagnostic Summary, Detailed Diagnostics, OLAP Analytics, Metadata Integration
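
  The cross-database comparison listed above can be thought of as a join-and-diff between the new production database and a reference (e.g. legacy) extract. A minimal sketch follows, assuming two pandas DataFrames keyed by country, series code and period; the column names and tolerance are illustrative assumptions, not the actual toolset.

```python
import pandas as pd

def cross_database_diff(new_db: pd.DataFrame,
                        reference_db: pd.DataFrame,
                        tolerance: float = 0.5) -> pd.DataFrame:
    """Flag observations whose values differ between the two sources
    by more than an absolute tolerance (illustrative logic only)."""
    keys = ["country", "series_code", "period"]
    merged = new_db.merge(reference_db, on=keys, suffixes=("_new", "_ref"))
    merged["abs_diff"] = (merged["value_new"] - merged["value_ref"]).abs()
    return merged.loc[merged["abs_diff"] > tolerance,
                      keys + ["value_new", "value_ref", "abs_diff"]]

# Toy example: one observation diverges between the two databases.
new_db = pd.DataFrame({"country": ["US"], "series_code": ["A_T_T_T_USD"],
                       "period": ["2013"], "value": [105.0]})
reference_db = pd.DataFrame({"country": ["US"], "series_code": ["A_T_T_T_USD"],
                             "period": ["2013"], "value": [100.0]})
print(cross_database_diff(new_db, reference_db))
```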

  14. Pilot results – Dissemination

  15. Success indicators • Business ownership at end-user level • Strategic buy-in from senior management • Allocation of budgets for capital investment • Sentiment in the business community: "Can we be next?"

  16. Critical areas identified • The successful pilot implementation demonstrated the benefits of standardization and automation • Key areas to be addressed for full implementation: • Organizational structure • People change management • Outreach and communication • Governance • Plan overall at a high level, but in detail for the next six months • Share success and celebrate

  17. Next Steps • Establish steering group to oversee and govern the change process • Detail the overall high-level plan for the coming months • Adjust the organizational structure • Communicate, communicate, communicate…

  18. Thank You Andreas Hake | ahake@imf.org | +1 (202) 623 8130
