This project overview summarizes data inputs (emissions and monitoring data), modeling, analysis, and results, covering regional extinction, attribution results, and major findings for WRAP Class I areas. The report structure, deliverables, and primary data inputs are also detailed.
Attribution of Haze Project Update
Stationary Sources Forum and Implementation Workgroup Meeting, Phoenix
December 14, 2004
Joe Adlhoch - Air Resource Specialists, Inc.
Outline • Project Overview • Data Inputs • Emissions Data • Monitoring Data • Modeling • Analysis and Results • Regional Extinction and Emissions • Attribution Results • Initial Groupings • State Impacts on Class I Areas • Regional Assessment - Fire • Major Findings and Next Steps
Introduction • WRAP Strategic Plan (2003) • ARS contracted to integrate and synthesize available 2002 monitoring, modeling, and emissions data to identify: • Geographic source areas of emissions that contribute to impairment at federal and tribal Class I areas • Mass and species distributions of emissions by source category within each geographic source area • The amount of natural and anthropogenic emissions affecting, and the associated visibility impact at, each Class I area
AoH Phase I Report Structure • Summary Report (Hardcopy and Web-based) • Supporting Reports (Web-based) • WRAP Data Centers
AoH Phase I Web Page http://wrapair.org/forums/aoh/ars1/AoH_Summary_Report_Draft.pdf
Deliverables • Prepared 96 regional/state emissions maps • Prepared data summaries for 85 sites • Reviewed and analyzed 2002 attribution results for over 100 federal and tribal Class I areas • Determined regional groupings of source impacts across the WRAP region • Developed project web page to support findings • Prepared draft summary report
Primary Data Inputs • Emissions Inventories (EI) • Tracks pollution estimates by source category and aerosol species • Spatial variation and source strength in EIs affect monitoring data and model results • Confidence in these data is medium • Monitoring Data • Snapshot of aerosol pollution at a given location • Confidence in these data is high • Modeling • Allows us to estimate the transformation, transport, and fate of emissions in the atmosphere • Model performance is tested by comparisons to monitoring data
AoH interim 2002 EI Emissions Data Set • EPA 2002 NEI not useable for this purpose → WRAP facilitated development of “interim” 2002 emissions • Point • Area • Mobile (On-Road & Non-Road) • Road Dust (Paved & Unpaved) • Fire • Windblown Dust • Biogenics • Modeling Domain Boundary Conditions
AoH interim 2002 EI Point and Area Sources • Geographic extent: US States, Canada and Mexico • Emissions outside the WRAP and CENRAP regions are from the other RPOs and the EPA • Prepared by E.H. Pechan and Associates
AoH interim 2002 EI Mobile Sources • Emissions were estimated for the 1996 base year and four future years – 2003, 2008, 2013, 2018 • Road Dust (2002 interpolated between 1996 and 2018) • 2003 Mobile Emissions used for “interim” 2002 • On-road (EPA MOBILE6 and PART5) • Off-road (EPA NONROAD2000) • One known limitation: • EPA has updated the NONROAD model twice in the last 3 years – NONROAD2000 emission estimates are noticeably higher than the NONROAD2004 estimates • California provided mobile source emissions estimates • Geographic extent: US States, Canada and Mexico • Emissions outside the WRAP region are from the other RPOs and the EPA • Prepared by ENVIRON
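Road dust is noted above as interpolated to 2002 between the 1996 and 2018 inventory years; a minimal sketch of that linear interpolation arithmetic, using hypothetical placeholder values rather than AoH numbers:

```python
def interpolate_year(value_start, value_end, year_start, year_end, target_year):
    """Linearly interpolate an emissions value between two inventory years."""
    fraction = (target_year - year_start) / (year_end - year_start)
    return value_start + fraction * (value_end - value_start)

# Hypothetical statewide road dust PM2.5 totals (tons/year), for illustration only
road_dust_1996 = 12000.0
road_dust_2018 = 15300.0
road_dust_2002 = interpolate_year(road_dust_1996, road_dust_2018, 1996, 2018, 2002)
print(f"Interpolated 2002 road dust estimate: {road_dust_2002:.0f} tons/year")
```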
AoH interim 2002 EI Fire Emissions • Actual 2002 wildland fire and prescribed fire emission inventories • Agricultural fire emissions estimates were from Section 309 work • Specific location, date, size and fuel loading for each fire event. • Geographic extent: WRAP region • Some emissions outside the WRAP region were included, but were incomplete • Prepared by Air Sciences, Inc.
AoH interim 2002 EI Biogenics and Windblown Dust • Data Inputs: • Land Use/Land Cover (BELD3, NALCC, NLCD) • Windblown Dust Soil Characteristics (STATSGO; Soil Landscape of Canada; Intl. Soil Reference and Information Centre) • Meteorological Data (2002 36-km MM5) • Agricultural Data (BELD3, RUSLE2, CTIC) • Emissions Estimation Models: • Biogenic Emissions Inventory System (BEIS3) • WRAP Windblown Dust Emissions Model • EI covers the full modeling domain • Prepared by the Regional Modeling Center
AoH interim 2002 EI Modeling Domain Boundary Conditions • Boundary conditions derived from annual simulation of the GEOSCHEM global chemical-transport model. • GEOSCHEM by Daniel Jacob at Harvard • Key modeled species and grid information were mapped from “GEOSCHEM” to CMAQ species by Daewon Byun at University of Houston. • Earlier simulations used a seasonal average from GEOSCHEM. • AoH final simulations use an annual 2002 simulation with 3-hour time steps. • CMAQ initial conditions created by running a “spin-up” period from Dec 17-31, 2001.
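A rough sketch of what the global-model-to-CMAQ species mapping step can look like; the species pairs and values below are illustrative placeholders, not the actual GEOSCHEM-to-CMAQ crosswalk used for AoH:

```python
# Illustrative subset of a global-model-to-CMAQ species crosswalk (names are placeholders)
species_map = {
    "O3": "O3",        # ozone maps one-to-one
    "SO2": "SO2",
    "SO4": "ASO4J",    # global sulfate assigned to a CMAQ accumulation-mode sulfate species
    "NOx": "NO2",      # simplified: lump NOx into NO2 for the boundary file
}

def map_boundary_species(global_fields):
    """Rename global-model concentration fields to CMAQ boundary-condition species names."""
    return {cmaq_name: global_fields[gc_name]
            for gc_name, cmaq_name in species_map.items()
            if gc_name in global_fields}

# Example: one 3-hourly set of boundary concentrations from the global model (made-up values)
bc = map_boundary_species({"O3": 35.0, "SO2": 0.4, "SO4": 0.2, "NOx": 0.1})
print(bc)
```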
AoH interim 2002 EI WRAP Annual 2002 Emissions • WRAP-wide annual 2002 emissions shown by pollutant: NOx, SO2, NH3, PM2.5, PMC
Monitoring Data Set • IMPROVE – National program to monitor atmospheric aerosols in mandatory federal and tribal Class I areas (CIAs) • Speciated aerosol samples collected every 3 days • Data are used to calculate visibility impairment expressed as extinction, deciviews, or visual range (see the sketch below): • Tied to Regional Haze Rule requirements • Some sites have > 15-year histories
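As an illustration of how speciated IMPROVE concentrations translate into extinction, deciviews, and visual range, a minimal sketch based on the original IMPROVE extinction algorithm of that era (the coefficients are the standard published values; the concentrations and f(RH) value are made-up examples):

```python
import math

def improve_extinction(amm_sulfate, amm_nitrate, organic_mass, elemental_carbon,
                       fine_soil, coarse_mass, f_rh):
    """Approximate aerosol light extinction (Mm^-1) from speciated concentrations (ug/m^3):
    hygroscopic species are scaled by f(RH), and 10 Mm^-1 is added for Rayleigh scattering."""
    return (3.0 * f_rh * amm_sulfate +
            3.0 * f_rh * amm_nitrate +
            4.0 * organic_mass +
            10.0 * elemental_carbon +
            1.0 * fine_soil +
            0.6 * coarse_mass +
            10.0)  # Rayleigh (clean-air) scattering

# Made-up example concentrations for a single sample day
b_ext = improve_extinction(amm_sulfate=1.2, amm_nitrate=0.4, organic_mass=1.0,
                           elemental_carbon=0.2, fine_soil=0.5, coarse_mass=3.0, f_rh=2.5)
deciview = 10.0 * math.log(b_ext / 10.0)   # haze index in deciviews
visual_range_km = 3912.0 / b_ext           # Koschmieder relation, km
print(f"b_ext = {b_ext:.1f} Mm^-1, {deciview:.1f} dv, visual range = {visual_range_km:.0f} km")
```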
IMPROVE Aerosol Sampler Systems • Sampler module diagram; measured species include sulfur (reported as sulfate), fine soil, nitrate, sulfate (backup), organic carbon, elemental carbon, and coarse mass
WRAP includes 116 (of 156) Mandatory Federal Class I Areas • WRAP Tribal Class I Areas: • Spokane Tribe • Northern Cheyenne Tribe • Fort Peck Tribes • Confederated Salish and Kootenai Tribes • Yavapai-Apache Nation • Hualapai Tribe (pending)
Example sites: Spokane Indian Reservation (WA), Redwood NP (CA), Sequoia NP (CA), Sawtooth W (ID)
AoH Modeling • Regional dispersion modeling • Trajectory modeling
Regional Scale Modeling • WRAP Regional Modeling Center • http://pah.cert.ucr.edu/aqm/308/ • CMAQ model runs using 2002 “interim” EIs and MM5 data • CMAQ – EPA-developed model for regional analysis • Tagged Species Source Apportionment (TSSA) • Use “Tagged Species” tracers to track chemical transformations and deposition across domain • Add source type tracers for key species and for defined regions and source categories • Contribution results at each receptor site – no need for aerosol samplers to be present
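A conceptual sketch of the tagged-tracer bookkeeping (the tag names and concentrations are hypothetical, not actual TSSA output): each tracer carries the portion of a species at a receptor grid cell attributable to one tagged source region/category, and the tags plus any untracked remainder sum to the modeled total.

```python
# Hypothetical tagged sulfate concentrations (ug/m^3) at one receptor grid cell
tagged_sulfate = {
    "CO_point": 0.30,            # Colorado point sources
    "WY_point": 0.22,            # Wyoming point sources
    "CO_mobile": 0.08,
    "boundary_conditions": 0.15,
    "other_untracked": 0.10,
}

total = sum(tagged_sulfate.values())
contributions = {tag: 100.0 * value / total for tag, value in tagged_sulfate.items()}

for tag, pct in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"{tag:22s} {pct:5.1f}% of modeled sulfate")
```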
Traced Area: WRAP Modeling Domain • Each state is distinguished by a unique number in the source area mapping file
Trajectory Regression Analysis (TRA) • Desert Research Institute • Part of WRAP Causes of Haze Assessment (COHA) • http://coha.dri.edu/index.html • Meteorological back trajectories run for 2000 – 2002 to determine flow patterns for each IMPROVE site • TRA finds the best fit between the time air spends over a defined area (source region) and the air quality parameter measured at an IMPROVE site • Contribution results at each IMPROVE site • No results for unmonitored CIAs • Monitored locations must have sufficient data
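A minimal sketch of the regression idea behind TRA, assuming residence times have already been tabulated by source region for each sample day (all numbers are synthetic): the fitted coefficients relate hours spent over each region to the concentration measured at the IMPROVE site, and coefficient times mean residence time gives a rough contribution estimate.

```python
import numpy as np

# Synthetic example: rows = sample days, columns = hours of back-trajectory
# residence time over three source regions before arrival at an IMPROVE site
residence_hours = np.array([
    [12.0,  3.0,  1.0],
    [ 8.0,  6.0,  2.0],
    [ 2.0, 10.0,  4.0],
    [ 5.0,  5.0,  8.0],
    [14.0,  1.0,  0.5],
])
measured_sulfate = np.array([1.7, 1.6, 1.2, 1.4, 1.8])  # ug/m^3, synthetic

# Least-squares fit: concentration ~ sum(coef_region * residence_time_region)
coefs, *_ = np.linalg.lstsq(residence_hours, measured_sulfate, rcond=None)

mean_contrib = coefs * residence_hours.mean(axis=0)
for region, contrib in zip(["Region A", "Region B", "Region C"], mean_contrib):
    print(f"{region}: ~{contrib:.2f} ug/m^3 estimated average contribution")
```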
Back Trajectory Residence Time Summaries – 20% worst sulfate days (2000 – 2002) • Great Sand Dunes, Colorado: W, SW, and SE directions show the highest residence times • Craters of the Moon, Idaho: NW, W, and S directions show the highest residence times
Source Region Grouping - Example • Used for comparison of TSSA and TRA results • Boundary states (inner circle) • U.S. regions (outer circle) • International (Can., Mex.) • Other (ocean, gulf, boundary conditions, unknown or not able to attribute)
Modeling Uncertainties • TSSA • Errors and uncertainties in gridded meteorological data • Emissions inventories are uncertain and in some cases incomplete • TSSA is a new application in CMAQ • 36 km grid resolution is too coarse to resolve near-field effects • TRA • Statistical technique – has associated uncertainty limits • Based on EDAS back trajectories – uncertainty increases the farther back in time a trajectory extends from its end point • “Edge effect” for CIAs or source regions near the boundary of a state
Integrated Analysis and Results • Weight of evidence approach: • Less confident in any single analysis • Multiple, independent analyses are necessary to gain more confidence in findings • Integrated analysis looked at: • Accuracy and reliability of EIs, monitoring data, model results • Geographic source regions for SO4 and NO3 • TSSA – Point and Mobile emissions • TRA – Did not distinguish between source categories • Logical groupings of CIAs exist based on attribution of these pollutants
Weight of Evidence Approach – Considerations • TSSA Results • Supported by other analyses? • What if contradicted by other analyses? • TRA Results • Noisy data – statistical interpretation required • Farther source regions must be larger to compensate for increasing uncertainties in longer trajectories • Do we see possible “edge effects”? • Monitoring Data • Reasonably accurate and certain measurements • As a snapshot, can’t demonstrate cause and effect • Emissions • EIs are estimates, not directly measured • Do EI inputs support attribution results?
2002 WRAP Aerosol/Species Extinction • Aerosol extinction • Sulfate extinction • Nitrate extinction • Organics extinction
Sample State Emissions Summary for NOx • State map with 36 x 36 km gridded emissions • Brief text description of NOx • Breakdown of state-wide NOx emissions by source type
Attribution Summary for Rocky Mountain NP • Colorado emissions are a major contributor in both methods • Border states impact ROMO in both methods (49-67%) • Other US regions also contribute in both methods (16-27%) • Largest differences between TSSA and TRA seen in attribution from Colorado and Wyoming
Attribution Summary for Mesa Verde NP • New Mexico emissions are the major contributor in both methods • Border states impact MEVE in both methods (57-80%) • Other US regions also contribute in both methods (6-16%) • Largest differences between TSSA and TRA seen in attribution from New Mexico and Arizona – May be “edge effect”
“Edge Effect” Explanation for Sulfate TRA Results • Trajectory points every 3 hours may not accurately represent high emission source regions near the edges of states • Therefore, TRA results may miss or underestimate the impact from these regions
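A contrived numerical illustration of that sampling issue (the trajectory path, speed, and source-region strip are invented): with endpoints only every 3 hours, a narrow high-emission region near a border can fall entirely between sampled points and contribute zero apparent residence time.

```python
import numpy as np

# Idealized 48-hour back trajectory moving east at a constant 0.05 deg/hour,
# crossing a narrow (0.1 deg wide) high-emission strip near a state border
hours = np.arange(0, 48, 1.0)
longitude = -109.5 + 0.05 * hours
in_strip = (longitude > -108.15) & (longitude < -108.05)

exact_hours = in_strip.sum()                 # hourly positions: 1 hour in the strip
sampled_hours = in_strip[::3].sum() * 3      # 3-hourly endpoints, each counted as 3 hours

print(f"Hourly sampling:   {exact_hours} h of residence time in the strip")
print(f"3-hourly sampling: {sampled_hours} h (the strip is missed entirely here)")
```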
Initial Grouping of CIAs by Sulfate and Nitrate Source Attribution • 24 groupings • Based on source region attribution and species signal strength and similarity • Groupings similar for SO4 and NO3
Available Attribution Information • More information is available for some species than others • Credibility of results depends on how well all categories of information agree
Attribution Matrix – Source Regions (States, etc.) by Class I Areas • Contributions (%) to CIAs by Source Regions: • Which source regions affect CIAs? • Which CIAs do source regions affect?
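A minimal sketch of how such a matrix can be read in both directions; the Class I area names echo earlier slides, but the percentages are hypothetical, not AoH results:

```python
# Hypothetical contribution matrix: rows = Class I areas, columns = source regions (%)
attribution = {
    "Rocky Mountain NP": {"CO": 40, "WY": 15, "Other US": 25, "International": 5, "Other": 15},
    "Mesa Verde NP":     {"NM": 35, "AZ": 20, "Other US": 25, "International": 8, "Other": 12},
}

def regions_affecting(area):
    """Which source regions affect a given Class I area, largest first?"""
    return sorted(attribution[area].items(), key=lambda kv: -kv[1])

def areas_affected_by(region):
    """Which Class I areas does a given source region affect, and by how much?"""
    return {area: row[region] for area, row in attribution.items() if region in row}

print(regions_affecting("Rocky Mountain NP"))
print(areas_affected_by("Other US"))
```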
Regional Assessment - Fire • A. The difference between the CMAQ-modeled visibility impacts with and without all fire emissions. This run pair is designed to isolate the effect of all fire emissions on visibility from the effect of all other emissions sources in the model. • B. The difference between the CMAQ-modeled visibility impacts with “natural” fire emissions only and with all fire emissions. This run pair is designed to isolate the effect of “anthropogenic” fire emissions on visibility from the effect of all other emissions sources in the model, including “natural” fire. • C. The difference between the CMAQ-modeled visibility impacts with all fire emissions and with “anthropogenic” fire emissions only (natural fire removed). This run pair is designed to isolate the effect of “natural” fire emissions on visibility from the effect of all other emissions sources in the model, including “anthropogenic” fire.
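A minimal sketch of the differencing arithmetic behind assessments A, B, and C, assuming a modeled extinction value is available for each run at a given Class I area (the values are made up):

```python
# Hypothetical CMAQ-modeled extinction (Mm^-1) at one Class I area for four runs
runs = {
    "all_fire":          42.0,   # base case: all fire emissions included
    "no_fire":           35.0,   # all fire emissions removed
    "natural_fire_only": 38.0,   # anthropogenic fire removed
    "anthro_fire_only":  39.5,   # natural fire removed
}

impact_all_fire     = runs["all_fire"] - runs["no_fire"]            # A: total fire impact
impact_anthro_fire  = runs["all_fire"] - runs["natural_fire_only"]  # B: anthropogenic fire impact
impact_natural_fire = runs["all_fire"] - runs["anthro_fire_only"]   # C: natural fire impact

print(f"A. All fire:           {impact_all_fire:+.1f} Mm^-1")
print(f"B. Anthropogenic fire: {impact_anthro_fire:+.1f} Mm^-1")
print(f"C. Natural fire:       {impact_natural_fire:+.1f} Mm^-1")
```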
A. Fire Assessment – Visibility Impact of All Fire Emissions
B. Fire Assessment – Visibility Impact of Anthropogenic Fires