INSPIRE Annex II+III Data Specifications – Consultation & Testing Michael Lutz European Commission – Joint Research Centre, Institute for Environment & Sustainability, SDI Unit 30th June 2011
Annex II/III Roadmap Kick-off: 19-20.04.2010 Data Specification v1 (29.10.2010) Data Specification v2 (06.2011) Testing/Consultation (06-10.2011) Data Specification v3 (04.2012) Draft IR (09.2012) [Timeline chart: quarters I–IV of 2010, 2011 and 2012]
Consultation & Testing • Two separate but closely related activities • Common aim: report the experience gained back to the INSPIRE CT and TWGs • will be used to improve the data specifications for v3.0 • basis for the amendment of the legal act • Consultation: • Review of the data specification documents (v2.0) • Domain-specific aspects • Cross-thematic aspects (overlaps, gaps, inconsistencies) • Testing: • Test the feasibility of implementation and the fitness for purpose of the data specifications (v2.0) under real-world conditions • Provide the first test bed for interaction with and between the participating stakeholders (teaming up, exchange of experience)
Scope / content • Documents under consultation • D2.8.I.x Data Specification on <Theme Name> – Draft Guidelines: 24 PDF documents [+ GML Application Schemas] • Proposed updates to D2.5 and D2.7 • D2.9 INSPIRE O&M Guidelines (new document) • For reference • UML Model (svn, XMI, EAP, HTML): http://inspire.jrc.ec.europa.eu/index.cfm/pageid/2/list/datamodels • The “INSPIRE Data Specifications Cost-benefit considerations” document • INSPIRE Annex I testing summary report
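The GML Application Schemas mentioned above encode features according to the theme data models. As a hedged sketch of what such an encoding looks like, the snippet below builds a minimal INSPIRE-style feature instance with the standard library; the theme namespace, feature name and property names are illustrative assumptions, not taken from the actual Annex II/III schemas.

```python
# Minimal sketch of an INSPIRE-style GML feature instance.
# THEME_NS, SampleFeature and the property names are hypothetical.
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml/3.2"
THEME_NS = "urn:example:inspire:schema:SomeTheme"  # illustration only


def build_feature(local_id: str, namespace: str, geom_pos: str) -> ET.Element:
    """Build a feature with an INSPIRE-style identifier and a point geometry."""
    ET.register_namespace("gml", GML_NS)
    feature = ET.Element(f"{{{THEME_NS}}}SampleFeature")
    # INSPIRE external object identifier: localId + namespace
    ident = ET.SubElement(feature, f"{{{THEME_NS}}}inspireId")
    ET.SubElement(ident, f"{{{THEME_NS}}}localId").text = local_id
    ET.SubElement(ident, f"{{{THEME_NS}}}namespace").text = namespace
    # ETRS89 (EPSG:4258) is the geodetic reference system used by INSPIRE
    point = ET.SubElement(feature, f"{{{GML_NS}}}Point", srsName="EPSG:4258")
    ET.SubElement(point, f"{{{GML_NS}}}pos").text = geom_pos
    return feature


feature = build_feature("42", "https://example.org/ns", "45.0 7.0")
xml_text = ET.tostring(feature, encoding="unicode")
```

The real application schemas published with the v2.0 documents define the authoritative element names and structure; this sketch only illustrates the general pattern of an identifier plus geometry in a theme namespace.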
How to read the data specifications • Foreword • General Executive Summary • Theme-specific Executive Summary 1. Scope 2. Overview (incl. 2.2 informal description) 3. Specification scopes 4. Identification information 5. Data content and structure 5.1 Basic notions 5.2 – 5.x Application schemas (incl. UML diagrams and feature catalogues) 6. Reference Systems 7. Data Quality 7.1 DQ Elements 7.2 Minimum DQ Requirements 8. Metadata 8.1 Common MD elements 8.2 MD Elements for data quality 8.3 Theme-specific MD elements 8.4 Guidelines for common elements 9. Delivery (incl. Encodings) 10. Data Capture 11. Portrayal (incl. layers, styles) • Annex A: ATS • Annex B: Use cases • Other Annexes (e.g. examples)
How to read the data specifications • Watch out for: • Legal requirement • Implementation requirement • Implementation recommendation • Open issues (your input requested)
Overview • Feasibility testing (main focus of the testing) • measure the technical feasibility and effort of transforming existing data (e.g. from Member States’ organisations) into data compliant with the requirements and schemas proposed in the data specification documents • Fitness for purpose testing • assess the benefits of harmonised data specifications from an end-user or application point of view • Provide cost-benefit information • related to the testing • contextual / not directly related to the testing
Testing – Communication • Calls for participation • Initial set of information • Testing kick-off meeting • Dedicated presentations for the testing • Questions & answers session • INSPIRE Forum • Exchange of testing-related information • Source of relevant information • Communication platform • Uploading and sharing the results of the testing
Outcomes • From Testing • SDICs/LMOs: Testing report via WebForm • SDICs/LMOs: Comments delivered via XLS spreadsheet • From Consultation • SDICs/LMOs: Comments delivered via XLS spreadsheet • Both • INSPIRE CT: Import XLS comments into the DS issue tracking system • Testing: 20/6 – 21/10/2011 (Call for Testing published on 08/04/2011) • Consultation: 20/6 – 21/10/2011 (Call for Consultation published on 22/06/2011) [Workflow diagram: XLS comment templates and Testing Reports (WebForm) flow from SDICs/LMOs into the SDIC/LMO DB and the issue tracking system]
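The import step above (XLS comments into the DS issue tracking system) can be sketched as follows. This is a hedged illustration only: the column names ("Theme", "Section", "Comment", "Proposed change") are assumptions, and the real template and tracker may use a different layout; a CSV export is used here so the example stays self-contained.

```python
# Sketch: turn rows of an exported comment template into issue records.
# Column names are hypothetical; the actual XLS template may differ.
import csv
import io


def parse_comments(csv_text: str) -> list:
    """Read a CSV export of the comment template into issue-tracker records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    issues = []
    for row in reader:
        issues.append({
            "title": f'[{row["Theme"]}] {row["Section"]}',
            "body": row["Comment"],
            "proposal": row.get("Proposed change", ""),
        })
    return issues


sample = (
    "Theme,Section,Comment,Proposed change\n"
    "Soil,5.2,Attribute X is ambiguous,Clarify the definition\n"
)
issues = parse_comments(sample)
```

Keeping one structured record per comment makes it straightforward to group submissions by theme and section before they are triaged in the tracking system.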
Where you can find partners… for the testing
Thank you! http://inspire.jrc.ec.europa.eu/index.cfm/pageid/2
Purpose of the testing • Ensure the proposed data specifications are feasible • i.e. existing spatial data can be mapped and transformed to the proposed common INSPIRE target structure • Provide evidence that the data specifications are fit for purpose • as defined by use cases identified in the data specifications or by the testing participants • Identify costs and benefits • by comparing the current status with the scenario after INSPIRE implementation
Feasibility testing • Higher priority • Purpose • Measure the technical feasibility and effort of transforming existing data into data compliant with the requirements and schemas proposed in the data specification documents • Reference systems, data quality, dataset-level metadata, encoding and portrayal are also to be considered • Expectations • Identify areas where it is difficult or impossible to transform national data sets into the proposed INSPIRE schema • Provide the results of the transformations as a basis for fitness for purpose testing as well as for future INSPIRE implementation • Documentation of the transformation methodologies
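The core of the feasibility exercise is a schema mapping: aligning a national dataset's attributes with the target INSPIRE properties and recording what cannot be mapped. The sketch below illustrates that step under stated assumptions; all attribute and property names are hypothetical, chosen only to show how unmappable attributes surface for the testing report.

```python
# Sketch of the feasibility-testing mapping step.
# National attribute names and INSPIRE target properties are hypothetical.

def map_record(record: dict, mapping: dict) -> tuple:
    """Transform one national record; return (INSPIRE record, unmapped attrs)."""
    transformed, unmapped = {}, []
    for attr, value in record.items():
        target = mapping.get(attr)
        if target is None:
            # No target property: a candidate finding for the testing report
            unmapped.append(attr)
        else:
            transformed[target] = value
    return transformed, unmapped


# Hypothetical national attributes mapped to INSPIRE-style properties
mapping = {"NAAM": "geographicalName", "GEOM": "geometry"}
record = {"NAAM": "Example", "GEOM": "POINT(4 52)", "LOKAAL_TYPE": "A1"}
inspire_record, gaps = map_record(record, mapping)
```

Collecting the `gaps` list across whole datasets is one simple way to document exactly where a national model and the proposed INSPIRE schema diverge, which is the evidence the testing reports ask for.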
Fitness for purpose testing • Lower priority • Purpose • Assess the benefits of harmonised data specifications from an end-user or application point of view • Demonstrate the usefulness of spatial data compliant with the INSPIRE data specifications in real applications – the use cases • Address cross-theme, cross-border and cross-language aspects • Expectations • Collection of use cases identified by the testing participants • Documentation of the testing approach used • Testing outputs (applications, descriptions of tasks/processes)
Cost-benefit considerations • Purpose • Assess the feasibility and proportionality of the proposed INSPIRE spatial data specifications • Consider contextual aspects not directly related to the testing • Costs: mainly related to feasibility testing • Benefits: mainly related to fitness for purpose testing • Putting INSPIRE-compliant data into existing applications is highly appreciated, as it provides a sound basis for comparison • Repeating the same testing exercise with compliant and non-compliant data is extraordinarily valuable for assessing the potential benefits • Conclusions • Reporting and commenting on issues as well as on positive aspects of implementing the data specifications is expected