This work session paper discusses the quality metadata subsystem (SMS-QUALITY) of the Czech Statistical Office's Statistical Metadata System (SMS): its architecture, quality monitoring, quality assessment, and lessons learned. It highlights the importance of cross-sectional aspects and time coordination to ensure applicability for different types of statistics.
The Quality Metadata System in the Czech Statistical Office Work Session on Statistical Metadata (METIS), 10-12 March 2010, Geneva Czech Statistical Office Jitka Prokop
Content • Background • Quality Metadata System • Quality Monitoring • Quality Assessment • Lessons Learned • Conclusions
1. Background: Links between SMS subsystems • SMS-CLASS, SMS-VAR, SMS-TASKS, SMS-USERS, SMS-RESP, SMS-QUALITY, SMS-DISSEM, SMS-SERIES • Statistical task - a set of statistical activities needed to fulfil a user's request for statistical information. It can be composed of one or more statistical surveys or similar statistics.
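As a rough sketch of how a statistical task might relate to the other SMS subsystems, the hypothetical data model below shows a task composed of one or more surveys and holding references into the related subsystems. All class and field names are assumptions made for illustration, not the actual SMS schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Survey:
    """A single statistical survey belonging to a task (hypothetical model)."""
    survey_id: str
    name: str

@dataclass
class StatisticalTask:
    """A set of statistical activities fulfilling a user's request for
    statistical information; composed of one or more surveys."""
    task_id: str                                                   # key held in SMS-TASKS
    name: str
    surveys: List[Survey] = field(default_factory=list)
    variable_ids: List[str] = field(default_factory=list)          # references into SMS-VAR
    classification_ids: List[str] = field(default_factory=list)    # references into SMS-CLASS
    quality_attribute_ids: List[str] = field(default_factory=list) # references into SMS-QUALITY
```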
2. Quality Metadata System: SMS-QUALITY Architecture • Based on • The ESS concept • Quality criteria (e.g. relevance, accuracy) • Quality and performance indicators (e.g. respondent burden, cost) • The European Self-Assessment Checklist for Survey Managers (DESAP) • The Statistical Business Process Model • Applicable, to some extent, for different types of statistics.
2. Quality Metadata System: Quality Form Map (QFM) • A model of Q-attributes in a defined structure, approved by the top management; it includes all proposed Q-attributes and is intended to cover different types of statistics. • SW application for the QFM • Defines Q-attributes and/or links into other SMS subsystems • Enables manual input of values • Provides views into the DWH • Values of Q-attributes are stored in the DWH
2. Quality Metadata System: Levels of Q-attributes • Levels of Q-attributes relate to the content and stability of their values. • Most stable Q-attributes • Relate to the whole statistical task, survey, or phase of the process • e.g. key users, information on staff training, methodology • Q-attributes related to processing in a concrete period • e.g. unit response rate, extent of sample and frame, punctuality • Q-attributes related to a key statistical variable • In a generally defined breakdown • e.g. coefficient of variation, item response rate
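A minimal sketch of how the three levels of Q-attributes could be represented as a data structure; the enum values and field names are illustrative assumptions, not the actual SMS-QUALITY implementation.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class QAttributeLevel(Enum):
    """Three levels of Q-attributes, by content and stability of values."""
    TASK_LEVEL = 1       # most stable: whole task, survey or process phase
    PERIOD_LEVEL = 2     # related to processing in a concrete period
    VARIABLE_LEVEL = 3   # related to a key statistical variable, by breakdown

@dataclass
class QAttribute:
    code: str                        # e.g. "UNIT_RESPONSE_RATE" (hypothetical code)
    level: QAttributeLevel
    description: str
    breakdown: Optional[str] = None  # only meaningful at VARIABLE_LEVEL

# Example: a period-level attribute
unit_response_rate = QAttribute(
    code="UNIT_RESPONSE_RATE",
    level=QAttributeLevel.PERIOD_LEVEL,
    description="Share of responding units in the eligible sample",
)
```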
3. Quality Monitoring: Aspects • Quality monitoring covers • Collection of data / metadata • Calculation of Q-attributes • Checks against expected values / scales (future; comparisons) • Quality monitoring is based on • Input variables, incl. the "history of changes" • Variables specially designed for quality monitoring • Related to the questionnaire or interview • Related to concrete variables • Results of Q-attributes, produced • During a sub-process / phase of the statistical process • At the end of the sub-process / phase • After finishing the whole statistical process
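A minimal sketch of the "checks against expected values / scales" step, assuming a simple per-indicator tolerance range; the thresholds, indicator names and function name are hypothetical.

```python
from typing import Dict, Tuple

# Hypothetical expected ranges for selected Q-attributes (lower, upper bounds)
EXPECTED_RANGES: Dict[str, Tuple[float, float]] = {
    "unit_response_rate": (0.80, 1.00),   # assumed target: at least 80 % response
    "item_response_rate": (0.90, 1.00),
    "imputation_rate":    (0.00, 0.10),
}

def check_q_attribute(name: str, value: float) -> str:
    """Compare a computed Q-attribute with its expected range."""
    low, high = EXPECTED_RANGES[name]
    if low <= value <= high:
        return f"{name} = {value:.3f}: within expected range [{low}, {high}]"
    return f"{name} = {value:.3f}: OUTSIDE expected range [{low}, {high}] - review needed"

print(check_q_attribute("unit_response_rate", 0.76))
```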
3. Quality Monitoring: During the running of a sub-process • Collection of input data • for the calculation of Q-attributes • Q-attributes calculated by the SW, e.g. • Unit response rate for the whole population • Unit response rates for individual strata
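A minimal sketch of the unit response rate calculation, overall and per stratum, under the simple assumption that the rate is the number of responding units divided by the number of eligible units; the data layout is illustrative, not the CZSO's production code.

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def unit_response_rates(
    units: Iterable[Tuple[str, bool]]
) -> Tuple[float, Dict[str, float]]:
    """units: iterable of (stratum, responded) pairs for eligible sample units.
    Returns the overall unit response rate and the rate per stratum."""
    eligible = defaultdict(int)
    responded = defaultdict(int)
    for stratum, has_responded in units:
        eligible[stratum] += 1
        responded[stratum] += int(has_responded)
    per_stratum = {s: responded[s] / eligible[s] for s in eligible}
    overall = sum(responded.values()) / sum(eligible.values())
    return overall, per_stratum

# Example: three strata with different response behaviour
sample = [("A", True), ("A", False), ("B", True), ("B", True), ("C", False)]
overall, by_stratum = unit_response_rates(sample)
```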
3. Quality Monitoring: After the end of a sub-process • Based on input variables • Editing and imputation rates • Distinction of inputs, corrections and deletions • Recognition of the most edited items • Based on specific quality variables • Way (mode) of data collection • Type of contact with respondents • Aspects of individual imputations (e.g. with or without the respondent, technical mistakes)
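A minimal sketch of deriving editing and imputation rates and the most edited items from a log of edit actions; the action labels ("input", "correction", "delete", "imputation") and the log structure are assumptions made for illustration.

```python
from collections import Counter
from typing import Iterable, Tuple

def edit_statistics(edit_log: Iterable[Tuple[str, str]], n_items: int):
    """edit_log: iterable of (item_name, action) pairs, where action is one of
    'input', 'correction', 'delete', 'imputation'; n_items: total number of
    reported item values. Returns the editing rate, the imputation rate and
    the most edited items."""
    actions = Counter(action for _, action in edit_log)
    edited_items = Counter(
        item for item, action in edit_log if action in ("correction", "delete")
    )
    editing_rate = (actions["correction"] + actions["delete"]) / n_items
    imputation_rate = actions["imputation"] / n_items
    return editing_rate, imputation_rate, edited_items.most_common(5)
```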
3. Quality Monitoring: After the end of the process (or after the end of one run of the process) • Indicators on accuracy • Coefficient of variation • Item response rates • Final unit response rates • Imputation rates • Revisions • Timeliness, punctuality, comparability, coherence
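As a minimal sketch of two of the accuracy indicators listed above, the functions below compute the coefficient of variation of an estimate (standard error divided by the estimate) and an item response rate; this is a generic textbook formulation, not the CZSO's production script.

```python
from typing import Optional, Sequence

def coefficient_of_variation(estimate: float, standard_error: float) -> float:
    """CV of an estimate, usually reported in per cent: se / estimate * 100."""
    return standard_error / estimate * 100.0

def item_response_rate(values: Sequence[Optional[float]]) -> float:
    """Share of units that reported a value for a given item
    (None marks a missing answer)."""
    answered = sum(1 for v in values if v is not None)
    return answered / len(values)

# Example usage with illustrative numbers
cv = coefficient_of_variation(estimate=1_250.0, standard_error=31.5)   # ~2.5 %
irr = item_response_rate([10.2, None, 8.7, 9.9, None])                 # 0.6
```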
4. Quality Assessment: The Quality Assessment Guidance (QAG) • Purpose, applicability • Support for management of the process • Support for high-level decisions • Support for quality reporting • Self-assessment • Auditing • Levels - assessment of • Quantitative and qualitative results of Q-attributes • A statistical survey / statistical task
4. Quality Assessment: Ways of Assessment • Categorical assessment • Averages of individual results • Textual assessment and summary • Expert commentaries (on issues suggested in the QAG) • Strong aspects • Weaknesses • Proposal of concrete actions and priorities
4. Quality Assessment: Levels of Assessment of Quality Criteria • The structure of QA follows the ESS quality criteria and covers the following levels: • Key statistical variable in particular breakdowns. • Statistical variable as an aggregate (average) of the breakdowns, or the statistical survey. • Set of similar indicators, quality of a sub-process, quality sub-criterion or criterion. • The audited statistics as a whole.
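A minimal sketch of the categorical assessment described above: individual indicator results are mapped to numeric scores and averaged up the levels (indicators to criteria, criteria to the audited statistics as a whole). The scoring scale and the roll-up by simple averaging are assumptions made for illustration, not the prescribed QAG procedure.

```python
from statistics import mean
from typing import Dict

# Hypothetical categorical scale for individual indicator results
SCORES = {"good": 3, "acceptable": 2, "poor": 1}

def assess_criterion(indicator_results: Dict[str, str]) -> float:
    """Average the categorical results of the indicators under one quality
    criterion (e.g. accuracy)."""
    return mean(SCORES[result] for result in indicator_results.values())

def assess_statistics(criteria: Dict[str, Dict[str, str]]) -> Dict[str, float]:
    """Roll assessments up from indicators to criteria and to the whole
    audited statistics."""
    per_criterion = {name: assess_criterion(res) for name, res in criteria.items()}
    per_criterion["overall"] = mean(per_criterion.values())
    return per_criterion

# Example: two criteria with illustrative indicator results
summary = assess_statistics({
    "accuracy": {"coefficient_of_variation": "good", "imputation_rate": "acceptable"},
    "timeliness": {"punctuality": "good"},
})
```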
5. Lessons Learned: 1. Links to the phases of the SBP (Statistical Business Process) • Suggestions -> Collection -> Calculations -> Assessment -> Feedback and actions
5. Lessons Learned: 2. Cross-sectional aspects and 3. Time coordination • Cross-sectional aspects • Ensure applicability for different types of statistics • Avoid duplication; arrange links with other SMS subsystems • Involve experts: • the SMS-QUALITY project team • quality methodologists • subject-matter statisticians • ICT experts and members of other project teams • Time coordination • Design and implementation of SMS-QUALITY and the other SMS subsystems have to be mutually coordinated – Committee for the Redesign of the SIS and SMS
5. Lessons Learned: 4. SMS-QUALITY project team • Appointed by the top management • Suggests the schedule of SMS-QUALITY activities • Regularly reports to the top management • Designed the QF Map (the architecture of SMS-QUALITY) => • Proposes the SW application (content, functions), incl. updates • Coordinates implementation and cooperation among the activities of the involved experts
5. Lessons Learned: 5. Role of subject-matter statisticians and methodologists (a) Testing phase • Quality methodologists • Define Q-attributes and explanatory notes in the SW application • Test the SW applications • Administer SMS-QUALITY, the QF Map and the QAG • Provide scripts for the calculation of quality indicators • Suggest scales for quality assessment • Subject-matter statisticians • Provide data for tests • Adjust the scales for quality assessment • Manage routine quality monitoring and assessment
5. Lessons Learned: 5. Role of subject-matter statisticians and methodologists (b) Full implementation • Q-methodologists manage • Quality methodology updates (according to ESS developments) • Support for subject-matter statisticians • Administration of SMS-QUALITY, incl. updates • Subject-matter statisticians manage • Collection of Q-attributes (automatic or manual) • Quality assessment and self-assessment • Approval of the results of assessment
5. Lessons Learned: 6. Compliance with the ESS quality framework • SMS-QUALITY should be designed as a flexible tool that allows easy methodology updates, taking into account developments at the ESS level. • The application software should have the necessary flexibility concerning collection, monitoring and assessment procedures.
6. Conclusions • Development of SMS-QUALITY has been scheduled with the highest priority for the next two years. • Further progress in the development of SMS-QUALITY will depend • on the available human and financial resources, due to the fact that the same experts are involved both in the routine production of quality reports and in the development of SMS-QUALITY; • on the progress of the other SMS subsystems. • The development of quality methodology at both the national and international levels shall be taken into account.
Thank you for your attention. jitka.prokop@czso.cz