Scientific Information Management Approaches Needed to Support Global Assessments: The USEPA's Experience and Information Resources
Jeffrey B. Frithsen
National Center for Environmental Assessment
Office of Research and Development
U.S. Environmental Protection Agency
Washington, DC, USA
26 January 2001
Finding, sharing, and applying data and information for global assessments, now and in the future, will require implementation of good scientific information management approaches
GLOBAL ENVIRONMENTAL ASSESSMENTS
• Are multi-disciplinary: greater emphasis on integrating physical, ecological, and human data
• Include data collected at multiple spatial and temporal scales
• Involve multiple information sources
• Are completed by investigators distributed across geopolitical and organizational boundaries
INFORMATION DIVERSITY
• Environmental assessments typically include multiple types of information
• Most assessments are multi-disciplinary
• Greater emphasis on integrating human and ecological data
Challenge: manage many small pieces of information and a few very large data sets
MULTIPLE SCALES
• Environmental assessments typically include data collected at multiple spatial and temporal scales
[Figure: CENR Monitoring Framework (1996), spanning large-scale remote sensing, regional-scale monitoring studies, and site-specific intensive studies]
IM SYSTEM DIVERSITY
• Multiple scientific information management systems exist or are being developed
• Individual systems are optimized for program-specific needs
Challenge: develop and provide interoperability between systems and with reference databases
DISTRIBUTED WORK FORCE
• Research teams are distributed geographically and across organizations (international organizations, Federal agencies, Regions, States, Indian Tribes, NGOs, academia, etc.)
• Organizations use different approaches to scientific information management
• Resource limitations are significant
Challenge: link teams of people
SCIENTIFIC INFORMATION MANAGEMENT CHALLENGES
• Technical challenges: management of metadata, data, and the tools needed to complete assessments
• Management challenges: providing adequate resources and support for procedures
• "Cultural" challenges: encouraging sharing of data and protection of intellectual property rights
TECHNICAL CHALLENGES
• Help find relevant data and information in a distributed environment
• Provide descriptions (metadata) about that information sufficient to assess usability
• Provide access to environmental metadata, data, and other resources
ADDITIONAL TECHNICAL CHALLENGES
• Develop approaches and standards to facilitate data integration (see the sketch below)
• Enhance interoperability of data systems
• Provide access to analytic, modeling, and visualization tools in distributed environments
• Integrate multiple tools and models with data
Meeting these challenges requires going beyond metadata
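As one illustration of the data-integration challenge, the Python sketch below harmonizes records from two hypothetical monitoring systems that use different field names and units into a single common schema. The field names and values are invented for illustration and are not drawn from any EPA system.

```python
# A minimal sketch of one data-integration step: mapping records from two
# hypothetical monitoring systems onto a shared schema with common units.

KM_PER_MILE = 1.609344

# Records as they might arrive from two different systems (hypothetical fields).
system_a = [{"site_id": "MA-001", "stream_km": 12.4, "year": 1999}]
system_b = [{"station": "GL-17", "stream_miles": 8.1, "sample_year": 2000}]

def harmonize_a(rec):
    """Map a system-A record onto the shared schema (already metric)."""
    return {"site": rec["site_id"], "stream_km": rec["stream_km"], "year": rec["year"]}

def harmonize_b(rec):
    """Map a system-B record onto the shared schema, converting miles to km."""
    return {"site": rec["station"],
            "stream_km": rec["stream_miles"] * KM_PER_MILE,
            "year": rec["sample_year"]}

integrated = [harmonize_a(r) for r in system_a] + [harmonize_b(r) for r in system_b]
for row in integrated:
    print(row)
```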
MANAGEMENT CHALLENGES
• Commitment of adequate resources for systems development and operation
  • 10-20% of a project budget is needed for IM-related activities (NRC 1994)
• Support for related procedures
• Appropriate incentives for involvement by project participants
CULTURAL CHALLENGES
• Promote data sharing
• Provide protection for intellectual property rights
• Recognize that the publication of metadata and data is as important as the publication of results
FIFE COMMANDMENTS
• "Thou shalt make thy data available even unto thine enemies.
• Thou shalt release thy data from bondage.
• Thou shalt not covet thy neighbor's data until they've had a crack at them."
FIFE: The First ISLSCP Field Experiment. ISLSCP: International Satellite Land Surface Climatology Project
USEPA'S RESPONSE TO IM CHALLENGES
• Leverage information management technologies to support all aspects of the assessment process
• Adopt or develop approaches, standards, and procedures to maximize integration of data, data systems, models, and other analysis tools
• Integrate with ongoing national and international efforts
METADATA REPOSITORY
• USEPA has invested heavily in building a robust, web-based metadata repository called EIMS
• The repository contains descriptions (metadata) of data and uses the FGDC content standard (see the sketch below)
• The repository has been expanded to contain descriptions of models, documents, and projects
• The repository is linked to a national warehouse: the National Spatial Data Infrastructure
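The Python sketch below shows what a minimal metadata record might look like when built programmatically using element names from the FGDC Content Standard for Digital Geospatial Metadata (CSDGM). The content values are placeholders, not an actual EIMS entry.

```python
# A minimal sketch of a metadata record using FGDC CSDGM element names
# (citation, abstract, purpose, bounding coordinates). Values are placeholders.
import xml.etree.ElementTree as ET

record = ET.Element("metadata")
idinfo = ET.SubElement(record, "idinfo")

citeinfo = ET.SubElement(ET.SubElement(idinfo, "citation"), "citeinfo")
ET.SubElement(citeinfo, "origin").text = "Example monitoring program"
ET.SubElement(citeinfo, "pubdate").text = "2000"
ET.SubElement(citeinfo, "title").text = "Example stream condition data set"

descript = ET.SubElement(idinfo, "descript")
ET.SubElement(descript, "abstract").text = (
    "Indicator scores for wadeable streams collected with a probability-based design.")
ET.SubElement(descript, "purpose").text = "Support regional condition assessments."

# Spatial domain: bounding coordinates in decimal degrees.
bounding = ET.SubElement(ET.SubElement(idinfo, "spdom"), "bounding")
for tag, value in [("westbc", "-80.5"), ("eastbc", "-74.0"),
                   ("northbc", "42.5"), ("southbc", "36.5")]:
    ET.SubElement(bounding, tag).text = value

print(ET.tostring(record, encoding="unicode"))
```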
EIMS
• EIMS: Environmental Information Management System
• A system to capture, store, manage, and distribute information about environmental resources collected, developed, and used by the Agency and its regional, state, and private partners
www.epa.gov/eims
METADATA 20-YEAR RULE
• "Will someone 20 years from now, not familiar with the data or how they were obtained, be able to find data sets of interest and then fully understand and use the data solely with the aid of the documentation archived with the data set?"
• Committee on Geophysical Data, National Research Council, Solving the Global Change Puzzle, National Academy Press, 1991
EXAMPLES: US GLOBAL CHANGE RESEARCH
• Data and documents from the US Global Change Research Program (USGCRP) regional assessments:
  • Mid-Atlantic
  • Great Lakes
  • Gulf Coast
• Links to the USGCRP National Assessment and other regional assessments
EXAMPLES: USEPA EMAP PROGRAM
• EMAP: Environmental Monitoring and Assessment Program
• Uses ecological indicators and a probability-based sample design to estimate the spatial extent and condition of ecological resources (see the sketch below)
• Potential information for WWAP includes:
  • Indicator development approaches
  • Candidate indicators
  • Data for US aquatic resources
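To illustrate the design-based estimation idea behind a probability sample, the Python sketch below sums site weights (the stream length each sampled site represents) to estimate the extent of resource in each condition class. The sites, weights, and condition classes are invented for illustration, not EMAP data.

```python
# A minimal sketch of design-based estimation from a probability sample: each
# sampled site carries a weight (the extent it represents), and the estimated
# extent in a condition class is the sum of those weights. Numbers are illustrative.

sites = [
    {"site": "S1", "weight_km": 120.0, "condition": "good"},
    {"site": "S2", "weight_km": 95.0,  "condition": "poor"},
    {"site": "S3", "weight_km": 110.0, "condition": "good"},
    {"site": "S4", "weight_km": 80.0,  "condition": "fair"},
]

total_km = sum(s["weight_km"] for s in sites)
for cls in ("good", "fair", "poor"):
    extent = sum(s["weight_km"] for s in sites if s["condition"] == cls)
    print(f"{cls}: {extent:.0f} km ({100 * extent / total_km:.0f}% of sampled extent)")
```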
EMAP ASSESSMENT RESULTS
• The US aquatic resource assessment includes:
  • 3.6 million miles of streams (19%)
  • 41.7 million acres of lakes (40%)
  • 303 million acres of wetlands (3%)
  • 61,000 miles of coastline (6%)
  • 40,000 square miles of estuaries (72%)
POTENTIAL WWAP APPROACH
• Define assessment questions for biennial reports
• Identify data needed to address the assessment questions
• Define and develop indicators
• Define reference conditions (what's good) and decision thresholds (what's bad)
• Search across distributed metadata repositories
REFERENCE CONDITIONS
[Figure: histogram comparing the current distribution and the reference distribution of indicator scores; x-axis: indicator score (0-50), y-axis: number of streams]
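One common convention, assumed here for illustration rather than taken from the presentation, is to set the decision threshold at a percentile of the reference distribution and report how many streams fall below it. The Python sketch below uses the 25th percentile and invented indicator scores.

```python
# A minimal sketch of using a reference distribution to set a decision threshold:
# the threshold is the 25th percentile of reference-site indicator scores (an
# assumed convention), and current sites scoring below it are flagged.
import statistics

reference_scores = [32, 35, 38, 40, 41, 43, 45, 47, 48, 50]   # illustrative
current_scores   = [12, 18, 22, 28, 30, 33, 36, 39, 44, 46]   # illustrative

# Decision threshold: first quartile (25th percentile) of the reference distribution.
threshold = statistics.quantiles(reference_scores, n=4)[0]

flagged = [s for s in current_scores if s < threshold]
print(f"threshold = {threshold:.1f}")
print(f"{len(flagged)} of {len(current_scores)} current streams score below the threshold")
```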
GLOBALLY DISTRIBUTED METADATA REPOSITORIES
[Diagram: multiple metadata repositories (MR) linked to a World Water Resource Clearinghouse]
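The Python sketch below illustrates, with in-memory stand-ins, how a clearinghouse might fan a keyword query out to several distributed metadata repositories and merge the hits. Repository names, records, and fields are hypothetical; a real clearinghouse would query remote catalog services rather than local dictionaries.

```python
# A minimal, in-memory sketch of federated search across distributed metadata
# repositories: the "clearinghouse" sends the query to each repository and
# merges the hits. Repository names, records, and fields are hypothetical.

repositories = {
    "repo_north_america": [
        {"title": "Mid-Atlantic stream condition data", "keywords": ["streams", "indicators"]},
        {"title": "Great Lakes water levels", "keywords": ["lakes", "hydrology"]},
    ],
    "repo_europe": [
        {"title": "Danube basin water quality", "keywords": ["rivers", "water quality"]},
    ],
}

def search_repository(records, term):
    """Return records whose title or keywords mention the search term."""
    term = term.lower()
    return [r for r in records
            if term in r["title"].lower()
            or any(term in k.lower() for k in r["keywords"])]

def clearinghouse_search(term):
    """Fan the query out to every repository and tag each hit with its source."""
    hits = []
    for name, records in repositories.items():
        hits.extend({"repository": name, **r} for r in search_repository(records, term))
    return hits

for hit in clearinghouse_search("water"):
    print(hit["repository"], "-", hit["title"])
```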
CONTACTS
• Jeffrey B. Frithsen, Senior Ecologist, National Center for Environmental Assessment: frithsen.jeff@epa.gov
• Joel D. Scheraga, Director, USEPA Global Change Research Program: scheraga.joel@epa.gov
• EIMS: www.epa.gov/eims