CLARREO: Science Value Matrix
Bruce Wielicki
CLARREO Science Meeting, July 8, 2010
Decadal Survey defines CLARREO
NOAA CLARREO
• CERES (Clouds and the Earth's Radiant Energy System)
• TSIS (Total and Spectral Solar Irradiance Sensor)
NASA CLARREO
• Solar reflected spectra: SI-traceable relative uncertainty of 0.3% (k=2)
• Infrared emitted spectra: SI-traceable uncertainty of 0.1 K (k=3)
• Global Navigation Satellite System Radio Occultation (GNSS-RO): SI-traceable uncertainty of 0.1 K (k=3)
CLARREO is a Cornerstone of the Climate Observing System
Decadal Survey defines NASA CLARREO
Societal Benefits
Enable knowledgeable policy decisions based on internationally acknowledged climate measurements and models through:
• Observation of highly accurate, long-term climate change trends
• Use of the long-term climate change observations to test and improve climate forecasts
Science Objectives
Make highly accurate and SI-traceable decadal change observations sensitive to the most critical but least understood climate radiative forcings, responses, and feedbacks:
• Infrared spectra to infer temperature and water vapor feedbacks, cloud feedbacks, and decadal change of temperature profiles, water vapor profiles, clouds, and greenhouse gas radiative effects
• GNSS-RO to infer decadal change of temperature profiles
• Solar reflected spectra to infer cloud feedbacks, snow/ice albedo feedbacks, and decadal change of clouds, radiative fluxes, aerosols, snow cover, sea ice, and land use
• Serve as an in-orbit standard to provide reference intercalibration for broadband CERES, operational sounders (CrIS, IASI), and imagers such as VIIRS, AVHRR, and geostationary imagers
A Mission with Decadal Change Accuracy Traceable to SI Standards
CLARREO Science Requirements: Process and Pre-Phase A Team
• NRC Decadal Survey: original science community input and mission
• Requirements further developed over 3 years of science studies:
  • 2 open science community workshops (3 days each)
  • 4 science team meetings (2 to 3 days each)
  • Weekly telecons for science review and input
• Pre-Phase A Science Team:

Organization | Role | Relevant Expertise
NASA Langley | Mission Lead | FIRST/CERES/CALIPSO/SAGE, RS/IR intercalibration
NASA Langley | IR Inst Lead | RS/IR orbit sampling, RS fingerprinting, radiative transfer models
NASA Langley | Climate Obs | IIP for IR instrument
Harvard Univ. | IR Science/Inst | INTESSA, IR spectra fingerprinting, decadal change accuracy
Harvard Univ. | GNSS-RO | RO science, sampling, instruments, SI traceability, IR IIP
Univ. Wisconsin | IR Science/Inst | SHIS/CrIS/AIRS, IR intercalibration, SI traceability, IR IIP
GSFC/GISS | RS Lead/RS Inst | MODIS/VIIRS/APS/SeaWiFS lunar cal, SI traceability
CU-LASP | Solar Cal/RS Inst | SORCE/TSIS, RS SI traceability, IIP for RS instrument
NIST | SI traceability | IR and RS standards, SIRCUS, HIP, LUCI lunar cal
JPL | GNSS-RO/IR | AIRS/GNSS-RO
Univ. Maryland | IR orbit sampling | Diurnal sampling studies
GFDL/Berkeley | Climate Models | CLARREO climate OSSEs (Observing System Simulation Experiments)
UK NPL/Imperial College | International | TRUTHS RS SI traceability, GERB, IR interferometers

A diverse expertise science team to set requirements
CLARREO and Climate Science
[Diagram: CLARREO climate change "calibration first" approach — climate benchmarks, climate processes, and inter-calibration]
Science Value Metrics
• Science Value of a Science Objective = Science Impact * Trend Accuracy * (Record Length)^0.5 * Verification * Risk (a simple combination of these factors is sketched below)
• Science Impact
  • Uniqueness of the CLARREO contribution
  • Importance of the science objective to reducing climate change uncertainties
• Accuracy
  • Accuracy in decadal change trends
• Climate Record Length
  • Sqrt(record length) reduction in noise from natural variability
• Verification
  • SI-traceable calibration verification
  • Independent instruments, analysis, observations
• Risk
  • Technological, budget, schedule, flexibility of mission options
Instrument Absolute Accuracy set for < 20% Trend Accuracy Degradation
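As a hedged illustration only (the function name and example inputs below are illustrative; the numbers are drawn from values quoted elsewhere in this deck, with a nominal risk factor of 1.0 assumed), a minimal Python sketch of the multiplicative metric:

```python
def science_value(science_impact, trend_accuracy, record_length_yr,
                  verification, risk):
    """Science Value = Impact * Trend Accuracy * sqrt(Record Length) * Verification * Risk."""
    return (science_impact * trend_accuracy * record_length_yr ** 0.5
            * verification * risk)

# Illustrative inputs: cloud-feedback impact weight 4, V_a = 1.0 at F_a = 0.2,
# 13-year record at 75% probability, V_v = 2 (two overlapping instruments),
# and an assumed nominal risk factor of 1.0.
value = science_value(4.0, 1.0, 13.0, 2.0, 1.0)
print(f"Relative science value: {value:.1f}")
```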
Science Value Factors Multiplicative
• Science Value of a Science Objective = Science Impact * Trend Accuracy * (Record Length)^0.5 * Verification * Risk
• Why not additive instead of multiplicative?
  • If there is no climate science impact, there is no value, regardless of accuracy
  • If there is poor accuracy in trends, the value of the entire mission is reduced
  • If the record length is too short, all benchmarks will be lost in short-term climate noise. Example: perfect, high-impact data for one month
  • If accuracy cannot be verified, the science and societal impact is reduced
  • If a high-risk approach is taken, the chance of success is reduced
• All of these factors tend to act multiplicatively on mission value
• The primary questions are:
  • Are we missing any key factors?
  • Do we have the power of each factor correct? Linear? Other?
Science Impact
• Select Science Objectives
• Recall our long discussions about whether to use our measurements or climate variables as science objectives:
• Measurements:
  • IR spectral radiance
  • RS spectral nadir reflectance
  • RO Doppler shift, refractivity
• Climate variable decadal change whose information content is contained in our measurements:
  • temperature profile, water vapor profile
  • reflected and emitted fluxes
  • water vapor feedback, lapse rate feedback, cloud feedback
• Information content can come through either:
  • Spectral fingerprinting of decadal change
  • Reference intercalibration of other instruments (CrIS, CERES, VIIRS)
Measurements vs Climate Variables
Science Impact
• Measurements: Arguments For:
  • Most direct use of the SI-traceable measurements
  • Can be directly compared to climate-model-predicted spectral change in the future (e.g. our climate OSSEs)
  • Easier communication of goals to the engineering team
• Measurements: Arguments Against:
  • Obscures the relationship of the observations to climate variable goals and assessment reports (IPCC)
  • Multiple climate science goals in a single measurement, so composite weights will be needed
  • Less clear for combined RO/IR or IR/RS goals
  • Not GCOS climate variables
  • No other satellite missions use this approach
  • Looks too instrument focused and not science focused
  • Doesn't communicate as well to NASA HQ, OMB, OSTP
Science Impact
• Climate Variable Decadal Change: Forcing/Response/Feedback
• Arguments For:
  • Easiest to relate to IPCC, GCOS, and other assessment reports
  • Communicates more easily to HQ, OMB/OSTP
  • Can be compared to climate model predictions (current and future)
  • More direct relation to science goals (forcing, response, feedback)
  • Easier to accommodate multi-instrument objectives (e.g. IR/RS)
• Arguments Against:
  • Less clear relation to SI-traceable measurements
  • Less clear communication to the engineering community
May 2009 Science Team Meeting Science Objectives: Climate Forcing and Response
May 2009 Science Team Meeting Science Objectives: Climate Feedback
Science Impact
[Diagram of Earth's climate system: blue = CLARREO solar reflected spectra science, red = CLARREO IR spectra & GNSS-RO science. Labeled elements: temperature, water vapor, clouds, radiation, snow/ice cover, greenhouse gases, surface albedo, cloud feedback, water vapor/lapse rate feedback, snow/ice albedo feedback (Roe and Baker, 2007).]
• 50% of CLARREO Science Value is in Reflected Solar Spectra
• 50% of CLARREO Science Value is in Infrared Spectra & GNSS-RO
• 100% of CLARREO Science Value is in the Accuracy of the Data
Science Impact
• Climate sensitivity uncertainty is driven by uncertain feedbacks: a factor of 3 uncertainty in the response to doubled CO2
• Climate change response: temperature profile, water vapor profile, cloud properties, surface albedo (snow, sea ice, land cover)
• Radiative forcings: verify greenhouse gas infrared radiation effects; aerosol advances come from the Glory APS and NRC ACE missions
[Figure: Effective Climate Forcings (W/m²), 1750-2000]
Science Impact Metric
• Forcing, response, and feedback uncertainties are weighted equally
• Try to avoid a long list of climate variables: it lacks focus
• For feedbacks: total value of 7.5; use a weighting factor proportional to uncertainty (e.g. IPCC (2007), Soden and Held (2006), Bony et al., etc.)
  • Cloud feedback total weight: 4: largest uncertainty for climate sensitivity
  • Water vapor/lapse rate weight: 2: uncertainty is 1/2 of cloud feedback and accounts for the negative correlation of lapse rate and water vapor feedbacks
  • Snow/ice albedo feedback weight: 1.5: uncertainty is 1/3 of cloud feedback
• For responses: total sum of value is 9 (~ equal to feedbacks)
  • Cloud radiative flux and cloud properties response weight: 4
  • Temperature/water vapor profile response weight: 4
  • Note that for responses, Temp/W.Vapor = Cloud Fluxes/Properties, to give added weight to the value of temperature and water vapor trends
  • Vegetation index weight: 1
• Snow/ice response not currently included: include it and reduce others?
• Global net flux change currently not included: Trenberth paper? Add?
Science Impact
• Forcing weights: total value is 8 for a full climate observing system, but following our APS/ACE team discussions and review, CLARREO only achieves a weight of 2 out of 8; the rest comes from APS/EarthCARE/ACE
• Total uncertainty in radiative forcing is given a weight of 8, just like feedbacks and responses
• The largest uncertainties are the aerosol indirect effect and direct effect
• Aerosols: the CLARREO contribution is set to a value of 1.5 out of 8 for the value of improved calibration of VIIRS (includes the CLARREO ability to determine VIIRS polarization sensitivity with scan angle) for aerosol forcing (not as accurate as the APS or ACE missions)
• Land albedo forcing: the CLARREO contribution is set to a value of 0.5 out of 8 for improved calibration of VIIRS surface albedo
• The rest of the aerosol value is assumed to come from aerosol information provided by other missions: e.g. APS, EarthCARE, and ACE
• Add greenhouse gas radiative forcing verification? Weight?
• Total current value is 2 out of a possible 8 (the weight sums are collected in the sketch below)
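To make the arithmetic explicit, a small Python sketch that simply sums the weights listed on the last two slides (the grouping and variable names are mine, not from the presentation):

```python
# Science impact weights as listed above.
feedback_weights = {"cloud": 4.0,
                    "water_vapor_lapse_rate": 2.0,
                    "snow_ice_albedo": 1.5}                     # total 7.5
response_weights = {"cloud_flux_and_properties": 4.0,
                    "temperature_water_vapor_profiles": 4.0,
                    "vegetation_index": 1.0}                    # total 9.0 (~ equal to feedbacks)
forcing_weights  = {"aerosol_via_VIIRS_calibration": 1.5,
                    "land_albedo_via_VIIRS_calibration": 0.5}   # CLARREO gets 2.0 of a possible 8

for name, weights in [("feedbacks", feedback_weights),
                      ("responses", response_weights),
                      ("forcings (CLARREO share)", forcing_weights)]:
    print(name, sum(weights.values()))
```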
Fall 2010 Baseline: IR-RS-RO, repeat in 3 to 6 months
Framework ties to decadal change observation accuracy requirements
Value Metric: Decadal Change Accuracy
• Degradation of accuracy of an actual climate observing system relative to a perfect one (fractional error F_a in accuracy) is given by:
  F_a = (1 + Σ f_i^2)^(1/2) - 1, where f_i^2 = σ_i^2 τ_i / (σ_var^2 τ_var)
  for linear trends, where σ is standard deviation, τ is autocorrelation time, σ_var is natural variability, and σ_i is one of the CLARREO error sources.
• Degradation of the time to detect climate trends relative to a perfect observing system (fractional error in detection time F_t) is similarly given by:
  F_t = (1 + Σ f_i^2)^(1/3) - 1
• For small values of F_a, F_t ≈ (2/3) F_a
• CLARREO Level 1 Requirement: F_a = 0.2 (20%), F_t = 0.15 (15%)
• Science Value Equation Accuracy Metric: V_a = 1.2 / (F_a + 1) (sketched below)
• Example: a factor of 2 loss in accuracy is a factor of 2 loss in science value
• Do we need a different power law on this metric? Different form? Logic for selection?
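A minimal sketch of these accuracy relations, assuming the error budget can be passed as a simple list of (σ_i, τ_i) pairs; the numerical inputs are hypothetical, not CLARREO requirements:

```python
def accuracy_degradation(error_sources, sigma_var, tau_var):
    """Return (F_a, F_t) given CLARREO error sources and natural variability.

    error_sources: iterable of (sigma_i, tau_i) pairs (e.g. absolute calibration,
                   orbit sampling, instrument noise).
    """
    sum_f2 = sum((s**2 * t) / (sigma_var**2 * tau_var) for s, t in error_sources)
    F_a = (1.0 + sum_f2) ** 0.5 - 1.0          # fractional loss in trend accuracy
    F_t = (1.0 + sum_f2) ** (1.0 / 3.0) - 1.0  # fractional increase in detection time
    return F_a, F_t

# Hypothetical error budget
F_a, F_t = accuracy_degradation([(0.05, 1.0), (0.03, 0.5)], sigma_var=0.1, tau_var=1.0)
V_a = 1.2 / (F_a + 1.0)   # science value accuracy metric from the slide
print(F_a, F_t, V_a)      # F_t is roughly (2/3) * F_a for small values
```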
Goal of within 20% accuracy of an ideal climate observing system is critical!
• Example for temperature trends
• CLARREO accuracy goals are optimal cost/value
• High confidence critical for policy decisions
• CLARREO accuracy designed to provide that confidence
Climate Change Accuracy is Critical to Making Difficult Policy Decisions
Value Metric: Length of Record
• Decadal change trend accuracy (Leroy et al., 2008):
  [(δm)^2] = 12 (Δt)^-3 (σ_var^2 τ_var + σ_meas^2 τ_meas), so (δm) ~ (Δt)^-3/2
• Accuracy increases as (Δt)^-1 as a result of increasing anthropogenic trend signals over time ("baseline"). This benefit holds as long as the initial CLARREO record is at least 5 years long, i.e. gaps can be tolerated. It would not hold for a very short initial CLARREO record (e.g. 1 month or 1 year).
• Accuracy further increases as (Δt)^-1/2 as a result of time averaging of noise from natural variability: directly related to the total CLARREO climate record length over multiple missions
• Value metric for length of record: V_t = (Δt)^0.5, where Δt is the CLARREO record length expected with 75% probability of occurrence (sketched below)
Anthropogenic trend accuracy increases with climate record length
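A short sketch of the Leroy et al. (2008) relation and the record-length value metric; the input values and units are placeholders for illustration only:

```python
def trend_uncertainty(delta_t_yr, sigma_var, tau_var, sigma_meas, tau_meas):
    """Standard deviation of a linear trend over a record of length delta_t (years)."""
    var_dm = 12.0 * delta_t_yr ** -3 * (sigma_var**2 * tau_var + sigma_meas**2 * tau_meas)
    return var_dm ** 0.5          # scales as delta_t**-1.5

def record_length_value(delta_t_yr):
    """V_t = sqrt(record length expected with 75% probability)."""
    return delta_t_yr ** 0.5

for years in (5, 10, 20):
    dm = trend_uncertainty(years, sigma_var=0.1, tau_var=1.0, sigma_meas=0.03, tau_meas=1.0)
    print(years, round(dm, 4), round(record_length_value(years), 2))
```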
Value Metric: Length of Record
• Summary of launch, satellite, and instrument reliability determination
• Probability of survival on orbit as a function of time:
  • Launch vehicle success rate: 97% (mature launch vehicle such as Delta 2)
  • Spacecraft survival for 3 years: 95% (98.3% survivability per year)
  • Instrument survival for 3 years: 90% (96.6% survivability per year)
• These reliability values are moderate and typical of Class C missions: Class D missions are lower reliability, Class B missions (Terra, Aqua, NPOESS) are higher reliability
• Uses selective redundancy of components (e.g. electronics)
• Engineering experience with hundreds of missions shows that survivability over time for spacecraft and instruments is roughly a constant survivability per year, i.e. P(n) = S^n, where P(n) is the likelihood of surviving n years on orbit and S is the survivability per year for the spacecraft or instrument.
Value Metric: Length of Record
• Most spacecraft and instrument failures are caused by electronics failures
• Failures of launch, spacecraft, and instruments are independent:
  P(n) = S_l * (S_s)^n * (S_i)^n, where the subscripts indicate survival probability for launch, spacecraft, and instrument
• Example: the probability of one IR spectrometer surviving 5 years on orbit for one CLARREO spacecraft is then: P(5) = S_l * (S_s)^5 * (S_i)^5 = 75%; n(P=75%) = 5 yrs (the arithmetic is sketched below)
• Probability of 2 spectrometers surviving 5 years on orbit for one spacecraft: P(5) = S_l * (S_s)^5 * (S_i)^5 * (S_i)^5 = 53%; n(P=75%) = 3 yrs
• Probability of at least 1 IR spectrometer surviving 5 years with two satellites on orbit, each with an IR spectrometer (i.e. total redundancy), is one minus the probability that neither survives, i.e. 1 - (1 - P(n))^2: P(5) = 1 - (1 - S_l * (S_s)^5 * (S_i)^5)^2 = 94%; n(P=75%) = 13 yrs
• Probability of both IR spectrometers surviving 5 years with 1 IR spectrometer on each of 2 satellites: P(5) = (S_l * (S_s)^5 * (S_i)^5)^2 = 0.75^2 = 56%; n(P=75%) = 2.5 yrs
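A minimal sketch of this survivability arithmetic using the per-year rates quoted on the previous slide (launch 97%, spacecraft 98.3%/yr, instrument 96.6%/yr); the simple P(n) = S^n model reproduces the ~75%, ~94%, and ~56% cases, while the project's more detailed reliability analysis may differ slightly:

```python
S_LAUNCH, S_SPACECRAFT, S_INSTRUMENT = 0.97, 0.983, 0.966   # per-year survival rates

def p_survive(years, n_instruments=1):
    """Probability that a launched spacecraft and its instrument(s) survive `years`."""
    return S_LAUNCH * S_SPACECRAFT ** years * S_INSTRUMENT ** (years * n_instruments)

p_single = p_survive(5)                      # one IR spectrometer, one spacecraft (~0.75)
p_either = 1.0 - (1.0 - p_survive(5)) ** 2   # at least one of two single-instrument satellites (~0.94)
p_both   = p_survive(5) ** 2                 # both single-instrument satellites survive (~0.56)
print(round(p_single, 2), round(p_either, 2), round(p_both, 2))
```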
Value Metric: Length of Record
• The climate record length for any CLARREO science objective is then the number of years that the required instrument or instruments for that science objective, as well as the supporting spacecraft, will survive with 75% likelihood
• Record length will be reduced for any science objective that requires more than one instrument, whether they are on one spacecraft or two
  • Most science objectives can be achieved with one instrument
  • Cloud feedback requires both IR and RS spectrometers
• The current science value matrix assumes that the requirement for on-orbit verification by an independent instrument requires one year of overlap
• If the length of the climate record is defined as only when two IR or two RS instruments are operating, it will shorten from 13 years to 3 years
• For one of 2 IR and one of 2 RS surviving: 8-year record length at 75%
• The cost of the mission will increase with reliability, but so will science value
• Record length will increase with increased instrument & spacecraft reliability
• Climate-focused missions need a more rigorous focus on mission length; to date only CERES, SeaWiFS, and CLARREO have seriously looked at this as a system with multiple instruments and spacecraft involved
Verification Metric
• For climate change, verification is a key element for high confidence in use of the data for climate research and for societal decisions
• While CLARREO is leading this effort, it is an issue for all climate missions
• March Geneva workshop: climate and metrology researchers
• 3 key characteristics of internationally accepted SI standards:
  • Peer review of the observations, methodology, and uncertainty analysis
  • Open documentation
  • Comparison of independently derived observations, methodology, and uncertainty
• Verification has several levels for CLARREO instrument SI traceability:
  • Ground calibration and verification
  • In-orbit calibration and verification
  • Verification with a single instrument (e.g. blackbody emissivity)
  • Verification using multiple copies of the same instrument (e.g. IR to IR)
  • Verification using independent instruments (e.g. CLARREO RS and TRUTHS RS, aircraft underflights, CLARREO RS use of lunar and solar cal targets compared to independent SI-traceable TSIS spectral solar irradiance, and NIST lunar irradiance from high-altitude balloon experiments)
• Verification against independent instruments is most like the metrology standards
Verification Metric
• Verification of uncertainties in the entire chain from SI to decadal change:
  • Orbit sampling (e.g. predict, then verify, error)
  • Spectral fingerprinting
  • Reference intercalibration (e.g. CLARREO to CrIS and IASI, CrIS to IASI)
  • Retrieval bias stability (e.g. optimal climate change retrieval methods may differ from optimal weather or instantaneous process mission retrieval methods)
• The current verification factor V_v is very simple:
  • V_v = 2 if there is at least 1 year of overlap for 2 identical CLARREO instruments
  • V_v = 1.5 for mid-IR verification by aircraft underflights with only 1 IR instrument
  • V_v = 1.0 for far-IR verification if only 1 IR instrument (aircraft verification difficult)
• The likelihood of achieving 1 year of overlap on orbit (see the survivability discussion in the record length section) is used to weight the values between the cases above (one possible reading is sketched below)
• Currently not accounting for any higher verification factors for fully independent in-orbit verification such as CLARREO RS vs TRUTHS RS, because TRUTHS is only a proposed mission. This could change in the future.
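One possible reading of the "weight the values between the cases" bullet above (an assumed interpretation, not a formula stated in the slides) is an expectation over the overlap probability:

```python
def expected_verification_factor(p_overlap, v_overlap=2.0, v_no_overlap=1.5):
    """Blend V_v between the two-instrument-overlap and single-instrument cases.

    p_overlap: probability of achieving >= 1 year of overlap on orbit.
    v_no_overlap defaults to the mid-IR aircraft-underflight case (1.5);
    use 1.0 for the far-IR case.  The blending form itself is an assumption.
    """
    return p_overlap * v_overlap + (1.0 - p_overlap) * v_no_overlap

print(expected_verification_factor(0.9))   # hypothetical 90% chance of 1-year overlap
```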
Risk Metric
• Risk has currently been evaluated by the engineering team only for the instrument/spacecraft/launch vehicle risks, which usually dominate mission risk analysis for a program
• Overall, the engineering team evaluates the risks of the RS and IR instruments as roughly equivalent. Neither is pushing a low-TRL technology requirement. The risk of the RO instrument is also low.
• Risks including technical, schedule, cost, and programmatic risk are typically handled by the project separately from science value (e.g. NASA SP-2007-6105 Rev 1)
• The current science value matrix has not used risk as a discriminator for overall mission design, but this could be included in the future based on:
  • Desire to combine project risk and science value in an overall metric
  • Changing launch vehicle risks
  • Experience with breadboard developments changing technology risks
  • Desire to explicitly include science risks as separate from technology risks
• The risk metric should continue to be examined through Phase A
Example Mission Scenarios
• The following charts give examples of applying the strawman Science Value Matrix approach to a range of mission designs.
IR-RS-RO in 2017, repeat in 2020
98% vs fall 2010 baseline, verification reduced, record length increased
IR-RS-RO single s/c, minimum mission
47% vs fall 2010 baseline: greatly reduced verification and record length
IR-RO in 2017, 3-yr gap, IR-RS-RO in 2020
65% vs fall 2010 baseline, verification reduced, record length reduced
IR-RO in 2017, repeat in 2020
38% vs fall 2010 baseline, solar science eliminated, verification & record length reduced
Example of Mission Options & Science Value
A way to more quantitatively map mission options to science value…
Science Values in Matrix Structure
• The CLARREO science objectives
• CLARREO science impact for each objective relative to uncertainties in climate forcing, response, and feedbacks
• The climate trend accuracy that CLARREO can achieve relative to a perfect climate observing system, including effects of absolute calibration, orbit sampling, and instrument noise
• The effect of climate record length, and therefore mission reliability in launch, spacecraft, and instrument design life, as well as mission launch schedules
• The ability to verify the accuracy of calibration in orbit
An integrated view of the ties between science value and mission design.