
Creating a New Business Model for a National Statistical Office in the 21st Century

This case study explores the development of a new business model for Statistics New Zealand, focusing on the organization, program, strategy, statistical metadata systems, and design and cultural issues. It highlights the lessons learned and provides insights into the transformation strategy implemented.


Presentation Transcript


  1. Statistics New Zealand’s Case Study: “Creating a New Business Model for a National Statistical Office in the 21st Century” Craig Mitchell, Gary Dunnet, Matjaz Jug

  2. Overview • Introduction: organization, programme, strategy • The Statistical Metadata Systems and the Statistical Cycle: description of the metainformation systems, overview of the process model, description of different metadata groups • Statistical Metadata in each phase of the Statistical Cycle: metadata produced & used • Systems and Design issues: IT architecture, tools, standards • Organizational and cultural issues: user groups • Lessons learned

  3. [Organisation chart, last updated 20/06/07: Statistics New Zealand’s structure under Government Statistician Geoff Bascand, with Deputy Government Statisticians and General Managers covering Macro-Economic, Environment, Regional & Geography Statistics; Industry & Labour Statistics; Social & Population Statistics; Business & Dissemination Services and the Chief Information Officer; Statistical & Methodological Services; Statistical Education & Research; Strategy & Communication; Corporate Services and the Maori Statistics Unit; Census 2011; and the Auckland and Christchurch offices, together with the individual business units, managers, HR account managers and executive assistants.]

  4. Business Model Transformation Strategy • A number of standard, generic end-to-end processes for the collection, analysis and dissemination of statistical data and information • Includes statistical methods • Covers the business process life-cycle (see the sketch after this slide) • Enables statisticians to focus on data quality and implement best-practice methods, with greater coordination and effective resource utilisation • A disciplined approach to data and metadata management, using a standard information lifecycle • An agreed enterprise-wide technical architecture
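
Purely as an illustration (not part of the original slides), the standard life-cycle could be pinned down as a simple enumeration of phases, using the phase names that appear in the Generic Business Process diagrams later in the deck, so that every collection is described against the same process:

```python
from enum import Enum

class ProcessPhase(Enum):
    """Phases of the generic end-to-end statistical business process
    (phase names taken from the information-framework slides)."""
    NEED = "Need"
    DESIGN_BUILD = "Design/Build"
    COLLECT = "Collect"
    PROCESS = "Process"
    ANALYSE = "Analyse"
    DISSEMINATE = "Disseminate"

def next_phase(phase: ProcessPhase) -> ProcessPhase:
    """Return the phase that follows in the standard lifecycle (wrapping to Need)."""
    phases = list(ProcessPhase)
    return phases[(phases.index(phase) + 1) % len(phases)]

# Example: a collection currently in Collect moves on to Process.
print(next_phase(ProcessPhase.COLLECT).value)   # "Process"
```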

  5. BmTS & Metadata The Business Model Transformation Strategy (BmTS) is designing a metadata management strategy that ensures metadata: • fits into a metadata framework that can adequately describe all of Statistics New Zealand's data and, under the Official Statistics Strategy (OSS), the data of other agencies • documents all the stages of the statistical life cycle, from conception to archiving and destruction • is centrally accessible • is automatically populated during the business process, wherever possible • is used to drive the business process (see the sketch after this slide) • is easily accessible by all potential users • is populated and maintained by data creators • is managed centrally
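
A rough sketch of those two ideas, metadata driving a processing step and operational metadata being populated automatically as the step runs; the store layout, keys and the edit step itself are illustrative assumptions, not Statistics New Zealand's actual design:

```python
from datetime import datetime, timezone

# A stand-in for the central metadata store described on the slide.
METADATA_STORE = {
    "collection/example/edit": {"tolerance": 0.05},  # hypothetical driving metadata
    "operational_log": [],                           # populated automatically below
}

def run_edit_step(values, key="collection/example/edit"):
    """Run a simple edit step driven by metadata, recording operational
    metadata about the run as a side effect."""
    tolerance = METADATA_STORE[key]["tolerance"]       # metadata drives the process
    flagged = [v for v in values if abs(v) > tolerance]
    METADATA_STORE["operational_log"].append({         # populated during the process
        "step": "edit",
        "run_at": datetime.now(timezone.utc).isoformat(),
        "records_in": len(values),
        "records_flagged": len(flagged),
    })
    return flagged

print(run_edit_step([0.01, 0.2, -0.07]))   # [0.2, -0.07]
print(METADATA_STORE["operational_log"])
```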

  6. A - Existing Metadata Issues • metadata is not kept up to date • metadata maintenance is considered a low priority • metadata is not held in a consistent way • relevant information is unavailable • there is confusion about what metadata needs to be stored • the existing metadata infrastructure is underutilised • there is a failure to meet the metadata needs of advanced data users • it is difficult to find information unless you have some expertise or know it exists • there is inconsistent use of classifications/terminology • in some instances there is little information about data: where it came from, the processes it has undergone, or even the question to which it relates

  7. B - Target Metadata Principles • metadata is centrally accessible • metadata structure should be strongly linked to data • metadata is shared between data sets • content structure conforms to standards • metadata is managed end-to-end in the data life cycle • there is a registration process (workflow) associated with each metadata element (see the sketch after this slide) • capture metadata at source, automatically • ensure the cost to producers is justified by the benefit to users • metadata is considered active • metadata is managed at as high a level as possible • metadata is readily available and useable in the context of the client's information needs (internal or external) • track the use of some types of metadata (e.g. classifications)
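
A minimal sketch of a registration workflow for a metadata element, together with usage tracking for classification-type elements; the state names and the ANZSIC06 example are assumptions for illustration, not the actual workflow or register:

```python
from dataclasses import dataclass, field

# Hypothetical registration states; the real workflow states are not given in the slides.
REGISTRATION_STATES = ["draft", "candidate", "approved", "superseded"]

@dataclass
class MetadataElement:
    """A metadata element with a simple registration workflow and usage tracking."""
    name: str
    element_type: str                        # e.g. "classification", "variable"
    status: str = "draft"
    used_by: list = field(default_factory=list)

    def advance(self) -> None:
        """Move the element to the next registration state, if any."""
        idx = REGISTRATION_STATES.index(self.status)
        if idx < len(REGISTRATION_STATES) - 1:
            self.status = REGISTRATION_STATES[idx + 1]

    def record_use(self, collection: str) -> None:
        """Track which collections use the element (e.g. a classification)."""
        self.used_by.append(collection)

anzsic = MetadataElement("ANZSIC06", "classification")
anzsic.advance()                              # draft -> candidate
anzsic.record_use("Business Indicators")
print(anzsic.status, anzsic.used_by)          # candidate ['Business Indicators']
```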

  8. How to get from A to B? • Identified the 10 key components of our information model • Service Oriented Architecture • Developed a Generic Business Process Model • Development approach moved from ‘stove-pipes’ to ‘components’ and ‘core’ teams • Governance – Architectural Reviews & Staged Funding Model • Re-use of components

  9. 10 Components within BmTS [component diagram]: 1. Input Data Store; 2. Output Data Store; 3. Metadata Store (Statistical Process Knowledge Base); 4. Analytical Environment; 5. Information Portal; 6. Transformations; 7. Respondent Management; 8. Customer Management; 9. Reference Data Stores; 10. Dashboard / Workflow. The diagram also shows Multi-Modal Collection (Web, CAI, E-Form, Imaging, Admin. Data), Raw Data, Clean Data, ‘UR’ Data, Summary Data, Aggregate Data, CURFs, RADL, INFOS, Output Channels, and the Official Statistics System & Data Archive.

  10. Statistics New Zealand Current Information Framework [diagram]: the Generic Business Process (Need, Design/Build, Collect, Process, Analyse, Disseminate) sits above a range of information stores held by subject area (silos), e.g. the Time Series Store (& INFOS), QMS, Ag, HES, the ICS Store and the Web Store, alongside a Metadata Store (statistical, e.g. SIM), a Reference Data Store (e.g. BF, CARS), a Software Register, a Document Register, and Management Information (HR & Finance) data stores.

  11. Statistics New Zealand Future Information Framework [diagram]: the same Generic Business Process (Need, Design/Build, Collect, Process, Analyse, Disseminate) operates over an Input Data Store (raw data, clean data, summary data, TS, ICS, WEB) and an Output Data Store (a confidentialised copy of the IDS, physically separated), supported by a Metadata Store (statistical/process/knowledge), a Reference Data Store, a Software Register, a Document Register, and Management Information (HR & Finance) data stores.

  12. CMF – gBPM Mapping

  13. Metadata: End-to-End • Need • capture requirements, e.g. usage of data, quality requirements • access existing data element concept definitions to clarify requirements • Design • capture constraints and basic dissemination plans, e.g. products • capture design parameters that could be used to drive automated processes, e.g. stratification • capture descriptive metadata about the collection, such as the methodologies used • reuse or create required data definitions, questions, classifications • Build • capture operational metadata about the selection process, e.g. the number in each stratum • access design metadata to drive the selection process (see the sketch after this slide) • Collect • capture metadata about the process • access procedural metadata about the rules used to drive processes • capture metadata, e.g. quality metrics
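
As an illustration of design metadata driving the selection process and of capturing operational metadata about it (the number selected in each stratum), the following sketch uses a toy frame and invented stratum sample sizes; none of the identifiers come from the actual systems:

```python
import random
from collections import Counter

# Hypothetical design metadata captured in the Design phase.
design_metadata = {"stratum_sample_sizes": {"small": 2, "medium": 2, "large": 1}}

# A toy frame: units tagged with their stratum.
frame = [
    {"unit": "U1", "stratum": "small"},  {"unit": "U2", "stratum": "small"},
    {"unit": "U3", "stratum": "small"},  {"unit": "U4", "stratum": "medium"},
    {"unit": "U5", "stratum": "medium"}, {"unit": "U6", "stratum": "large"},
]

def select_sample(frame, design):
    """Selection driven by design metadata; returns the sample plus operational
    metadata recording how many units were selected in each stratum."""
    sample = []
    for stratum, n in design["stratum_sample_sizes"].items():
        units = [u for u in frame if u["stratum"] == stratum]
        sample.extend(random.sample(units, min(n, len(units))))
    operational_metadata = {"selected_per_stratum": Counter(u["stratum"] for u in sample)}
    return sample, operational_metadata

sample, op_meta = select_sample(frame, design_metadata)
print(op_meta["selected_per_stratum"])   # Counter({'small': 2, 'medium': 2, 'large': 1})
```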

  14. Metadata: End-to-End (2) • Process • capture metadata about the operation of processes • access procedural metadata, e.g. edit parameters • create and/or reuse derivation definitions and imputation parameters (see the sketch after this slide) • Analyse • capture metadata, e.g. quality measures • access design parameters to drive estimation processes • capture information about quality assurance and sign-off of products • access definitional metadata to be used in the creation of products • Disseminate • capture operational metadata • access procedural metadata about customers • needed to support Search, Acquire, Analyse (incl. integrate), Report • capture re-use requirements, including the importance of the data and its fitness for purpose • Archive or Destruction: detail on the length of the data life cycle
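
A small sketch of reusing a derivation definition that is held as metadata rather than coded into each survey's processing system; the derivation name, inputs and operation are illustrative assumptions:

```python
# Hypothetical derivation definitions held centrally as metadata and reused
# across collections, rather than re-implemented inside each processing system.
derivation_definitions = {
    "total_income": {"inputs": ["wage_income", "other_income"], "operation": "sum"},
}

def apply_derivation(record: dict, name: str) -> dict:
    """Derive a new item on a record from its derivation definition (metadata)."""
    definition = derivation_definitions[name]
    if definition["operation"] == "sum":
        record[name] = sum(record[inp] for inp in definition["inputs"])
    return record

record = {"wage_income": 52000, "other_income": 3100}
print(apply_derivation(record, "total_income"))   # adds total_income = 55100
```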

  15. Metadata: End-to-End - Worked Example Question Text: “Are you employed?” • Need • Concept discussed with users • Check International standards • Assess existing collections & questions • Design • Design question text, answers & methodologies • Align with output variables (e.g. ILO classifications) • Data model, supported through meta-model • Develop Business Process Model – process & data / metadata flows • Build • Concept Library – questions, answers & methods • ‘Plug & Play’ methods, with parameters (metadata) the key • System of linkages (no hard-coding)
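
A minimal sketch of the Concept Library and ‘plug & play’ idea above, in which questions, answer sets, variables and methods are linked by identifiers rather than hard-coded into a survey; every identifier and entry here is hypothetical, not actual library content:

```python
# Illustrative concept library: everything is linked by identifier, nothing hard-coded.
concept_library = {
    "questions": {
        "Q_EMPLOYED": {"text": "Are you employed?", "answers": "A_YES_NO",
                       "variable": "V_LF_STATUS"},
    },
    "answer_sets": {"A_YES_NO": ["Yes", "No"]},
    "variables": {"V_LF_STATUS": {"classification": "ILO labour force status"}},
    "methods": {"M_EDIT_RANGE": {"parameters": {"min": 0, "max": 1}}},
}

def render_question(question_id: str) -> str:
    """Assemble a questionnaire item purely from the library's linkages."""
    q = concept_library["questions"][question_id]
    options = concept_library["answer_sets"][q["answers"]]
    return f"{q['text']} ({' / '.join(options)})"

print(render_question("Q_EMPLOYED"))   # Are you employed? (Yes / No)
```

Because the question is only a set of linkages, a different collection can reuse the same answer set, variable and methods simply by referencing the same identifiers.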

  16. Metadata: End-to-End - Worked Example Question Text: “Do you live in Wellington?” • Collect • Question, answers & methods rendered to questionnaire • Deliver respondents question • Confirm quality of concept • Process • Draw questions, answers & methods from meta-store • Business logic drawn from ‘rules engine’ • Analyse • Deliver question text, answers & methods to analyst • Search & Discover data, through metadata • Access knowledge-base (metadata) • Disseminate • Deliver question text, answers & methods to user • Archive question text, answers & methods

  17. Conceptual View of Metadata • Anything related to data, but not dependent on data = metadata • There are four types of metadata in the model: Conceptual (including contextual), Operational, Quality and Physical …defined by MetaNet

  18. Metadata Implementation: Dimensional Model [diagram]. A central FACT is linked to surrounding dimensions: standard classifications and standard variables; standard questions; survey, instruments and survey mode; and standard data definitions.
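
The fact-and-dimension idea could be sketched as a fact record holding keys into dimension tables; the dimension names, codes and values below are illustrative only, not the corporately agreed dimensions themselves:

```python
from dataclasses import dataclass

# Illustrative dimension tables keyed by code.
dimensions = {
    "classification": {"CITY:WGTN": "Wellington", "CITY:AKL": "Auckland"},
    "variable": {"V_RESIDENCE": "Place of usual residence"},
    "question": {"Q_WGTN": "Do you live in Wellington?"},
    "collection": {"CENSUS_2006": "Census 2006"},
}

@dataclass
class Fact:
    """A single statistical fact described entirely by keys into dimensions."""
    value: object
    classification: str
    variable: str
    question: str
    collection: str

fact = Fact(value=True, classification="CITY:WGTN", variable="V_RESIDENCE",
            question="Q_WGTN", collection="CENSUS_2006")

# Every dimension key on a fact should resolve to an entry in a dimension table,
# which is what makes the data integratable rather than pre-integrated.
for dim in ("classification", "variable", "question", "collection"):
    assert getattr(fact, dim) in dimensions[dim]
print("Fact is fully described by its dimensions.")
```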

  19. Architecture [diagram, showing the Information Portal, a metadata service layer, user access, reference data, classifications, and the Input Data Environment containing FACTs].

  20. [Diagram of the dimensions: fact definitions, versioning, time, questions & variables, hierarchies, units of interest, collections & instruments, and respondents.]

  21. Goal: Overall Metadata Environment

  22. Metadata: Recent Practical Experiences • Generic data model – federated cluster design • Metadata the key • Corporately agreed dimensions • Data is integratable, rather than integrated • Blaise to Input Data Environment • Exporting Blaise metadata • ‘Rules Engine’ (see the sketch after this slide) • Based around a spreadsheet • Working with a workflow engine to improve it (BPM based) • IDE Metadata tool • Currently spreadsheet based • Audience Model • Public, professional, technical – ‘system’ added
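
A rough sketch of what a spreadsheet-based rules engine can look like: the edit rules live in a sheet (an inline CSV stand-in here) and are interpreted at run time instead of being coded into the processing system; the rule columns, operators and thresholds are invented for illustration:

```python
import csv
import io

# Inline stand-in for the spreadsheet that holds the rules.
rules_sheet = io.StringIO(
    "rule_id,field,operator,threshold,action\n"
    "R1,turnover,>,1000000,flag\n"
    "R2,employees,<,0,reject\n"
)

OPERATORS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

def apply_rules(record: dict, sheet) -> list:
    """Return the actions triggered for a record by the rules in the sheet."""
    triggered = []
    for rule in csv.DictReader(sheet):
        value = record.get(rule["field"])
        if value is not None and OPERATORS[rule["operator"]](value, float(rule["threshold"])):
            triggered.append((rule["rule_id"], rule["action"]))
    return triggered

print(apply_rules({"turnover": 2_500_000, "employees": 12}, rules_sheet))
# [('R1', 'flag')]
```

Moving the same rules behind a workflow engine, as the slide describes, changes how the rules are orchestrated but not where they are defined.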

  23. SOA

  24. Standards & Models - The MetaNet Reference Model™ • A two-level model based on: • Concepts = basic ideas, the core of the model • Characteristics = elements and attributes that make concepts unique • Terms and descriptions can be adapted • Concepts must stay the same • Concepts should be distinct and consistent • Concepts have hierarchy and relationships
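
To make the two-level idea concrete, here is a minimal sketch in which concepts sit in a hierarchy and carry characteristics; the class and the example concepts are assumptions, not the MetaNet model's actual schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Concept:
    """A concept (fixed core of the model) with its characteristics
    (the elements/attributes that make it unique) and an optional parent."""
    name: str
    parent: Optional["Concept"] = None                     # concepts have hierarchy
    characteristics: dict = field(default_factory=dict)    # elements and attributes

question = Concept("Question", characteristics={"text": None, "answer_set": None})
residence_question = Concept(
    "Residence question",
    parent=question,
    characteristics={"text": "Do you live in Wellington?", "answer_set": "Yes / No"},
)

# Terms and descriptions can be adapted locally, but the underlying concept
# (its name and its place in the hierarchy) stays the same.
print(residence_question.parent.name, "->", residence_question.name)
```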

  25. [Diagram: a Collection (e.g. Census, frequency = 5 yearly) has Collection Instances (e.g. Census 2006); each instance has questionnaires (Questionnaire A, Questionnaire B) containing questions such as “Do you live in Wellington?”, “What is your age?” and “How old are you?”; each question maps to fact definitions (e.g. “Person lives in Wellington”, “Age of person”) with their classifications (e.g. Classification: CITY, Category: WGTN; Classification: NZ Island, Category: NTH ISL).]
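
The linkage in the diagram, from collection through collection instance, questionnaire and question down to fact definitions and classifications, might be sketched as a nested structure like this; the structure and identifiers are illustrative only:

```python
# Illustrative sketch of the collection -> instance -> questionnaire -> question
# -> fact definition -> classification linkage shown on the slide.
census = {
    "collection": {"name": "Census", "frequency": "5 yearly"},
    "instances": [{
        "name": "Census 2006",
        "questionnaires": [{
            "name": "Questionnaire A",
            "questions": [{
                "text": "Do you live in Wellington?",
                "fact_definition": "Person lives in Wellington",
                "classifications": [{"classification": "CITY", "category": "WGTN"}],
            }, {
                "text": "How old are you?",
                "fact_definition": "Age of person",
                "classifications": [],
            }],
        }],
    }],
}

# Walk the linkage from the collection down to the fact definitions.
for instance in census["instances"]:
    for questionnaire in instance["questionnaires"]:
        for q in questionnaire["questions"]:
            print(instance["name"], "|", q["text"], "->", q["fact_definition"])
```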

  26. Defining Metadata Concepts: Example

  27. How will we use MetaNet? • Used to guide the development of a Stats NZ model • Another model (SDMX) will be used for additional support where there are gaps • Provides the basis for consistency across systems and frameworks • Will allow for better use and understanding of data • Will highlight duplications and gaps in current storage

  28. Metainformation systems [diagram]: a Concept Based Model spanning SIM, CARS and the IDE, with other metadata stored in the Business Frame, survey systems, BmTS components, etc.; the concepts include Classifications, Classification Categories, Domain Values, Data Collections, Collection, Variables, Fact, Statistical Units, Response, Sample Design and Concordance.

  29. Metadata Users - External • Government • Public • External Statisticians (incl. international organisations)

  30. Metadata Users - Internal • Statistical Analysts • IT Personnel (business analysts, IT designers & technical leads, developers, testers etc.) • Management • Data Managers / Custodians / Archivists • Statistical Methodologists • External Statisticians (researchers etc.) • Architects - data, process & application • Respondent Liaison • Survey Developers • Metadata and Interoperability Experts • Project Managers & Teams • IT Management • Product Development and Publishing • Information Customer Services

  31. Lessons Learnt – Metadata Concepts • Apart from the 'basic' principles, metadata principles are quite difficult to get a good understanding of, and this makes communicating them even harder. • Everyone has a view on what metadata they need; the list of metadata requirements / elements can be endless. Given the breadth of metadata, an incremental approach to the delivery of storage facilities is fundamental. • Establish a metadata framework, best fitting your organisation, upon which discussions can be based; we have agreed on MetaNet, supplemented with SDMX.

  32. Lessons Learnt – BPM • To make data re-use a reality there is a need to go back to first principles, i.e. what is the concept behind the data item. Surprisingly, it can be difficult for some subject-matter areas to identify these first principles, particularly if the collection has been in existence for some time. • Be prepared for survey-specific requirements: the BPM exercise is absolutely needed to define the common processes and identify any survey-specific features that are genuinely required.

  33. Lessons Learnt – Implementation • Without significant governance it is very easy to start with a generic service concept and yet still deliver a silo solution. Ongoing upgrade of all generic services is needed to avoid this. • Expecting delivery of generic services from input- or output-specific projects leads to significant tensions, particularly in relation to added scope elements within fixed resource schedules. Delivering business services at the same time as developing and delivering the underlying architecture services adds significant complexity to implementation.

  34. Lessons Learnt – Implementation • A well-defined relationship between data and metadata is very important. The approach of directly connecting each data element, defined as a statistical fact, to its metadata dimensions proved successful because we were able to test and utilise the concept before the (costly) development of metadata management systems.

  35. Lessons Learnt – SOA • The adoption and implementation of SOA as a Statistical Information Architecture requires a significant mind shift from data processing to enabling enterprise business processes through the delivery of enterprise services. • Skilled resources, familiar with SOA concepts and application are very difficult to recruit, and equally difficult to grow.

  36. Lessons Learnt – Governance • The move from ‘silo systems’ to a BmTS-type model is a major challenge that should not be under-estimated. • Having an active Standards Governance Committee, made up of senior representatives from across the organisation (ours has the 3 DGSs on it), is very useful. This forum provides an environment in which standards can be discussed and agreed, and the Committee can take on the role of the 'authority to answer to' if need be.

  37. Lessons Learnt – Other • There is a need to consider the audience of the metadata. • Some metadata is better than no metadata - as long as it is of good quality. • Do not expect to get it 100% right the very first time.

  38. Questions?
