Learn about the goals, approach, and accomplishments of the Central Coast Ambient Monitoring Program (CCAMP) and how their data can inform decision-making. Discover where CCAMP plans to go in the future.
CCAMP – Past, Present, and Future
Central Coast Ambient Monitoring Program
Central Coast Regional Water Quality Control Board
ABOUT CCAMP Central Coast Ambient Monitoring Program • Program goals and objectives • CCAMP Approach • What we have accomplished • What our data can tell us • Where we would like to go
Original CCAMP Program Goals • Determine status and trends of surface, ground, estuarine, and coastal water quality and associated beneficial uses • Provide water quality information to users in accessible forms to support decision making • Collaborate with other monitoring programs to provide effective and efficient monitoring
AB982 established SWAMP in 2000 • The Legislative Report defined questions and objectives based on Beneficial Use impairment: • Is it safe to swim? • Is it safe to drink the water? • Is aquatic life protected? • Is it safe to eat the fish and shellfish? etc.
Is there evidence that it is unsafe to swim? • Beneficial Use: Water Contact Recreation • Objective: Screen for indications of bacterial contamination by determining the percent of samples exceeding adopted and mandated water quality objectives • Monitoring Approach: Monthly monitoring for indicator organisms; compilation of other data • Assessment Limitations: CCAMP currently samples only for fecal and total coliform • Criteria (90% probability): • 10% over 400 MPN fecal coliform • 10% over 235 MPN E. coli • 10% over 104 MPN Enterococcus (bays and estuaries) • Fecal to total coliform ratio over 0.1 when total coliform exceeds 1000 MPN (bays and estuaries)
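As a rough illustration of this screening step (not the Board's assessment code), the sketch below computes the percent of samples exceeding one of the objectives listed above; the sample values are invented.

```python
# Hypothetical illustration of the exceedance screen described above.
# The threshold is one of the objectives listed on this slide; values are invented.

fecal_coliform_mpn = [120, 500, 80, 230, 900, 60, 150, 410, 90, 300]  # MPN/100 mL

def percent_exceeding(samples, threshold):
    """Return the percent of samples above a water quality objective."""
    exceedances = sum(1 for value in samples if value > threshold)
    return 100.0 * exceedances / len(samples)

pct = percent_exceeding(fecal_coliform_mpn, threshold=400)
print(f"{pct:.0f}% of samples exceed 400 MPN fecal coliform")

# The screening criterion flags a site when more than 10% of samples exceed
# the objective (the full assessment applies a 90% probability test).
if pct > 10:
    print("Site flagged for possible water contact recreation impairment")
```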
Scientific and Technical Planning and Review • SWAMP Scientific Planning and Review Committee (SPARC) • Statewide and regional review this year • Local input each year from “rotational” TAC
We now have… • Funding • Six years of data • Website and data tools • CCLEAN • Agricultural Monitoring Program • Stormwater monitoring coordination • Characterization reports
CCAMP Approach • Watershed rotation • Tributary-based site placement • Multi-indicator, “weight of evidence” • Integrator sites
Multiple lines of evidence • Monthly Conventional Water Quality • Benthic Invertebrate and Habitat Assessment • Tissue Bioaccumulation • Water Column Toxicity and TIEs • Sediment Toxicity and Chemistry
Watershed Rotation Each watershed area is monitored and evaluated every 5 years
Coastal Confluences Thirty-three coastal sites are monitored continuously above salt water influence
Robust Time Series: Unsafe to drink?
Web Site • Used extensively • When the server is down we receive “complaints” • SIMON • Templates for volunteer and ag websites • New mapping tools!
A few data uses… • Identifying problem areas for NPS activities • Ag monitoring program development • 303(d) listing, 305(b) assessment, TMDL development and compliance monitoring • Basin Plan standards evaluation • Statewide nutrient objectives work • Public interaction
CCAMP Tools • Data analysis • Quality Assurance • Biostimulatory Risk Index • Pesticide Risk Index • Index of Biotic Integrity
Biostimulatory Risk Index • Ranking includes: % algal/plant cover, Chlorophyll a, Nitrogen/phosphorus, Range of DO, Range of pH • Example ranks: Limekiln Creek (Rank = 0.0), Franklin Creek (Rank = 85.3)
Biostimulatory Risk Scores (shown as quartiles)
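The slides list the indicators that feed the ranking but not the scoring formula itself; as a hedged sketch, the snippet below min-max normalizes each listed indicator across sites and averages them into a 0-100 score. All site data are invented, and the result will not reproduce the actual CCAMP ranks shown above.

```python
# Hypothetical sketch only: the CCAMP scoring method is not detailed on these
# slides. This min-max normalizes each listed indicator across sites and
# averages them into a 0-100 score; all data values are invented.

sites = {
    "Limekiln Creek": {"algal_cover_pct": 2,  "chl_a": 0.5,  "nitrate": 0.1,  "do_range": 1.0, "ph_range": 0.2},
    "Franklin Creek": {"algal_cover_pct": 85, "chl_a": 40.0, "nitrate": 12.0, "do_range": 8.0, "ph_range": 1.5},
    "Example Creek":  {"algal_cover_pct": 30, "chl_a": 10.0, "nitrate": 3.0,  "do_range": 4.0, "ph_range": 0.8},
}
indicators = ["algal_cover_pct", "chl_a", "nitrate", "do_range", "ph_range"]

def normalized(value, values):
    """Scale a value to 0-1 across all sites (min-max normalization)."""
    lo, hi = min(values), max(values)
    return (value - lo) / (hi - lo) if hi > lo else 0.0

for name, metrics in sites.items():
    scaled = [normalized(metrics[i], [s[i] for s in sites.values()]) for i in indicators]
    score = 100.0 * sum(scaled) / len(scaled)
    print(f"{name}: biostimulatory risk score = {score:.1f}")
```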
Pesticide Risk Index • Based on pounds applied and chemical diversity • Pesticide applications are a predictor of toxicity (in press, Hunt and Anderson)
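The slide does not give the index formula; as a purely hypothetical sketch, the snippet below combines normalized pounds applied with the number of distinct chemicals applied into a single 0-100 score, with all values invented.

```python
# Hypothetical sketch only: the actual CCAMP pesticide risk index formula is
# not given on this slide. This combines normalized application volume with
# chemical diversity as one illustrative scoring scheme; values are invented.

def pesticide_risk_score(pounds_applied, chemicals_applied, max_pounds, max_chemicals):
    """Combine application volume and chemical diversity into a 0-100 score."""
    volume_term = pounds_applied / max_pounds            # fraction of the heaviest-use watershed
    diversity_term = chemicals_applied / max_chemicals   # fraction of the most diverse watershed
    return 100.0 * (volume_term + diversity_term) / 2.0

# Invented example values for two watersheds
print(pesticide_risk_score(pounds_applied=12000, chemicals_applied=35,
                           max_pounds=20000, max_chemicals=50))   # higher risk
print(pesticide_risk_score(pounds_applied=800, chemicals_applied=5,
                           max_pounds=20000, max_chemicals=50))   # lower risk
```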
Bioassessment • Biological endpoint for cumulative impacts • Growing regulatory use throughout the state and nation • Index of Biotic Integrity • Development of biocriteria for Region
5-year plan • Continually increase data use and utility (change detection, trends) • Publish findings and tools • “Institutionalize” data management approach; provide better support to partner organizations • Biocriteria and reference conditions • Augment monitoring efforts to support new office vision
Conservation Ethic • New RWQCB focus on riparian and wetland protection; pervious surface protection • CCAMP will help prioritize effort, document trends, and track progress.
New SWAMP Protocols • SWAMP physical habitat protocols provide a more complete picture of habitat quality • Canopy cover and complexity • Riparian width • In-channel habitat • Pebble counts etc…
California Rapid Assessment Method for Wetlands (CRAM) • Developed by a large multi-agency effort including CCC, EPA, SCCWRP, and SFEI • Piloted in the Morro Bay watershed • We hope to collaborate with CCC staff to use CRAM at our monitoring sites this season
CRAM sample map Lower Carmel River
Delineation of corridors and impervious surfaces using imagery (NOAA Coastal Change Analysis Program)
CCAMP Web Site www.CCAMP.org
Data Quality: Tips for getting it, keeping it, proving it! Central Coast Ambient Monitoring Program
Why do we care? • Program goals are key – what do you want to do with the data? • Data utility increases directly with quality • High quality data can support 303(d) listing decisions, regulatory decisions, management decisions • High quality data will be taken seriously • Usable data is well-documented data
Recent 303(d) listing guidance requires that data used to support a listing be associated with a quality assurance program plan
Quick Overview • Data quality objectives • Data validation and verification • Quality assurance documents • Other QA tools
Data and Measurement Quality Objectives • Basis of data quality determinations • Underlies the concept of “SWAMP Compatibility” • Should be defined in an approved Quality Assurance Program Plan
Basic QA terms describing data quality: • Precision • Accuracy and bias • Representativeness • Completeness • Comparability • Sensitivity
Precision vs. Bias • High precision, high bias: Inaccurate • Low precision, high bias: Highly Inaccurate! • High precision, low bias: Accurate! • Low precision, low bias: Inaccurate
Determining Precision • Precision is a measure of repeatability and consistency • Precision is calculated from sample replicates and/or splits • Precision is dependent on instrument or method, and variability from handling and the environment • You should be collecting at least 5% duplicates (or one per event) to be “SWAMP compatible”
Determining Precision • For multiple samples calculate Relative Standard Deviation (RSD): RSD = (s / x̄) * 100 = % deviation of measurements, where s is the standard deviation and x̄ the mean • For pairs of samples calculate Relative Percent Difference (RPD): RPD = [(X1 – X2) / ((X1 + X2) / 2)] * 100 = % deviation of measurements, where X1 is the larger value • SWAMP RPD requirement is typically 25%
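For a concrete sense of these two statistics, here is a small Python sketch of both calculations; the replicate and duplicate values are invented.

```python
from statistics import mean, stdev

def relative_standard_deviation(samples):
    """RSD = (standard deviation / mean) * 100, for replicate measurements."""
    return stdev(samples) / mean(samples) * 100.0

def relative_percent_difference(x1, x2):
    """RPD = |x1 - x2| / ((x1 + x2) / 2) * 100, for a duplicate pair."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

# Invented replicate nitrate results (mg/L) from one sampling event
replicates = [1.02, 0.98, 1.05, 1.01]
print(f"RSD: {relative_standard_deviation(replicates):.1f}%")

# Invented field duplicate pair
rpd = relative_percent_difference(1.02, 1.20)
print(f"RPD: {rpd:.1f}%  (SWAMP requirement is typically <= 25%)")
```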
What’s the difference between a field duplicate and a field split? Field duplicates: collected side by side, a combined measurement of environmental variability and sampling and laboratory error Field splits: collected from a composite sample, factors out environmental variability. A measure of variability associated with sampling and laboratory error. May be sent to different laboratories to measure interlab variability.
Bias and Accuracy: how to hit the target! • Calibrate against standard reference material • Calibrate against an “expert” • Spike sample with a known concentration
Calculating Accuracy • SWAMP requires that the measured value be between 80% and 120% of the known value • PD = [(X1 – X2) / X1] * 100 = % deviation of the measurement from the reference, where X1 is the known value
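As a rough sketch with invented values, the accuracy check can be expressed as a percent deviation from a known standard plus a comparison against the 80–120% window:

```python
def percent_deviation(known, measured):
    """PD = (known - measured) / known * 100."""
    return (known - measured) / known * 100.0

def within_swamp_accuracy(known, measured):
    """True when the measured value falls between 80% and 120% of the known value."""
    recovery = measured / known * 100.0
    return 80.0 <= recovery <= 120.0

# Invented standard reference check: known 10.0 mg/L, instrument reads 9.3 mg/L
print(percent_deviation(10.0, 9.3))      # 7.0% deviation from the reference
print(within_swamp_accuracy(10.0, 9.3))  # True: 93% recovery is within 80-120%
```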
Calibrating instruments • To track equipment accuracy, keep an instrument calibration log • Instrument drift is determined by calibrating pre- and post-sampling; the RPD should be within DQO requirements • Be sure to record the instrument ID in your data records; in some cases bias can be adjusted out of the data if it can be shown to be consistent
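A drift check can reuse the RPD formula from the precision slide; the calibration readings below are invented.

```python
# Hypothetical drift check: re-read a calibration standard before and after
# sampling and compare the two readings with the RPD formula shown earlier.
pre_reading = 7.02    # invented DO probe reading against a standard, pre-sampling
post_reading = 6.61   # same standard re-read after the sampling run

drift_rpd = abs(pre_reading - post_reading) / ((pre_reading + post_reading) / 2.0) * 100.0
print(f"Instrument drift RPD: {drift_rpd:.1f}%")  # compare against the DQO requirement
```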
Blanks • Assessing contamination • Different types for different purposes: field, equipment, travel, lab • SWAMP suggests a CWQ field blank during periodic field audits; if performance is inadequate, increase frequency to 5% • Blank results must be < Minimum Detection Limit
Representativeness • Think through your sampling design: what are you trying to show? • Consider: statistical adequacy, seasonality, time of day, spatial variability