Trends in Data Processing for Space Science Missions
James L. Green and Rick Burley
Space Science Data Operations Office, GSFC
Science Data Processing Workshop, February 27, 2002
Outline
• Characteristics of previous space science missions
• Current trends in space science missions
• New approaches for new missions
• End-to-end data flow analysis
• IMAGE data processing system
• Summary
Characteristics of Previous Space Science Missions
• Space science data processing systems are intimately tied to flight missions
• Mission types:
  • Space physics: exclusively Principal Investigator (PI) missions
  • Astrophysics: PI missions plus a number of observatory-class missions
• Projects did not have full life-cycle budget responsibility
  • Mission operations (MO) were performed by a service organization with a separate budget
  • This ensured that the science operations systems were kept separate from MO
• Development costs, not MO&DA costs, drove the mission design
  • Project philosophy was to "fix problems on the ground"
• End-to-end mission data system trade studies were rarely performed
Example #1: Previous Mission Data System
• Projects did not consider the full effect of mission design decisions on the ground data system
• Astrophysics mission launched in the early '90s
  • Low Earth orbit (~90 min. period)
  • Observations of the faintest objects were as long as 4-5 hours
  • Must make non-contiguous observations of the same object over multiple orbits
• Result: extensive ground system problems and workarounds
  • Science scheduling system became complicated and labor intensive to execute
  • Science data processing system had to accommodate all types of observations
Lack of Ground Data System Trade Studies
• Scientific data processing system functions:
  • Group the observations by object for non-contiguous targets
  • Calibrate the data and format the result (FITS)
• Analysis of the software system components:
  • Science algorithms: 11%
  • Data management: 78%
  • Orbit/aspect + miscellaneous: ~11%
  • 306K lines of code
  • ~100 MY effort (estimated) over 7 years
• A higher-orbit vs. ground-system software development trade study could have saved costs
  • New requirement: plan all observations of targets to be contiguous, making operations simpler
  • Personnel and hardware/software system development and maintenance costs would be significantly reduced
  • The mission could operate longer for less (plan on mission extensions!)
Mission and Science Operations Manpower Estimate
• These functions were spread out over 5 or 6 buildings
• Separate staff for each function and no cross training
• Systems depended on serving a large number of missions before they became cost effective
Current Trends in Space Science Missions
• Types of space science missions:
  • Observatory class: Hubble, Chandra, ...
  • PI class: Explorers (e.g., IMAGE), Solar Terrestrial Probes, ...
• Regardless of mission type, the science community expects:
  • Rapid access to all mission data products and services
  • Short or no proprietary periods
  • Better data sets in the long-term archive
• Cost-capped missions must cover total mission costs
  • Forced to find simpler, more cost-effective ways to satisfy more requirements
New Approaches for New Missions
• Use lessons learned from previous missions
• Include the ground data system in the overall mission design and perform the appropriate trade studies
  • Consider the effect of operations on the end-to-end data system
• Assume success and plan on MO into the extended mission phase
  • Recognize that a large operations staff over time is expensive
  • Automate where possible, reducing operations staff
  • Co-locate facilities and functions; provide cross training
• Recognize software development costs
  • Maximize the use of standard formats and software reuse
End-to-End Data Flow Analysis
• Data requirements
  • Define the users of the data:
    • Flight operators (housekeeping)
    • Investigators (science data)
    • Engineers (housekeeping)
    • Other scientists (archived data)
    • Public (science results)
  • What data products are needed for which users?
    • Time scale for delivery (real-time or later)
    • Quality of the data and associated documentation
    • Other issues (security, proprietary rights, ...)
• Characterize the data flow to each user (a sketch follows below)
  • Determine what processing is needed and where it is to be done
  • Identify what associated data is required
  • Form and format of all data produced
  • Determine the transport or delivery protocol
  • Determine any derived requirements
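A minimal sketch of how this user/product characterization might be captured as a machine-readable table, so delivery timescales, formats, and processing locations can be reviewed as part of a trade study. All names, products, and numbers below are illustrative assumptions, not taken from any actual mission document.

```python
# Sketch: capture end-to-end data flow requirements as data so that trades can
# be reviewed programmatically. All entries are hypothetical examples.
from dataclasses import dataclass

@dataclass
class DataProduct:
    name: str             # e.g. "Level-0 telemetry"
    users: list           # who needs it
    latency_hours: float  # required delivery timescale after a pass
    fmt: str              # delivery form/format
    processed_at: str     # where the processing is done

PRODUCTS = [
    DataProduct("housekeeping (real-time)", ["flight operators"], 0.0, "raw frames", "MOC"),
    DataProduct("Level-0 science", ["investigators"], 48.0, "UDF", "SMOC"),
    DataProduct("browse products", ["other scientists", "public"], 48.0, "CDF", "SMOC"),
    DataProduct("archive volume", ["long-term archive"], 24.0 * 30, "CDF", "NSSDC"),
]

def derived_requirements(products):
    """List every (user, product, latency, format, location) tuple --
    the starting point for identifying driving requirements."""
    for p in products:
        for u in p.users:
            yield u, p.name, p.latency_hours, p.fmt, p.processed_at

if __name__ == "__main__":
    for row in derived_requirements(PRODUCTS):
        print(row)
```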
End-to-End Flow Process (Cont'd.)
• Evaluate the approach (a simple cost trade is sketched below)
  • Determine the cost and the schedule
  • Identify driving requirements (most cost/most time)
• Develop alternative scenarios and refine
  • System improvements
    • Standards (interface, transport, form/format)
    • Software reuse
    • Commercial off-the-shelf software
    • Hardware technology migration strategies
  • Process improvements
    • Co-locate functions (SOC/MOC, product generation, etc.)
    • Distribute functions (government, university, industry, etc.)
    • Multi-mission infrastructure
  • Requirements reduction
  • Other trades (e.g., orbits)
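To make the "determine the cost and schedule, then develop alternative scenarios" step concrete, here is a toy scenario comparison with invented numbers. It only illustrates the shape of such a trade (development cost plus staffing level times mission duration), not any real mission's figures.

```python
# Toy end-to-end cost trade: compare alternative operations scenarios.
# All dollar figures, staffing levels, and scenario names are invented.

def lifecycle_cost_k(dev_cost_k, ops_staff, cost_per_person_k, mission_years):
    """Development cost plus operations staffing cost over the mission life ($K)."""
    return dev_cost_k + ops_staff * cost_per_person_k * mission_years

scenarios = {
    "separate MOC/SOC, manual ops": lifecycle_cost_k(8000, 12, 150, 5),
    "co-located SMOC, automated":   lifecycle_cost_k(9500, 3, 150, 5),
}

for name, cost_k in sorted(scenarios.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost_k / 1000:.1f}M over 5 years")
```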
Imager for Magnetopause-to-Aurora Global Exploration (IMAGE)
Facts in brief:
• Launched: March 25, 2000
• Launch vehicle: Delta
• Launch site: Vandenberg AFB
• Orbit: 1000 km x 7 RE altitude polar orbit
• Telemetry: 2.2 Mbps via DSN 34m
• Mission duration: two years (prime) plus three more years extended
Science objectives:
• Identify the dominant mechanisms for injecting plasma into the magnetosphere on substorm and magnetic storm timescales
• Determine the directly driven response of the magnetosphere to changes in the solar wind
• Discover how and where magnetospheric plasmas are energized, transported, and subsequently lost during substorms and magnetic storms
IMAGE is the first of the MIDEX series, which features a capped MO&DA cost.
http://image.gsfc.nasa.gov/
IMAGE Requirements Summary
• Instruments: three neutral atom imagers covering 10 eV to 200 keV; FUV Wideband Imaging Camera, Spectrographic Imager, and Geocorona photometers; EUV imager; Radio Plasma Imager
• Uplink of consolidated science commands once per week (nominally)
• Data processing:
  • Level-0 processing for each science instrument and housekeeping
  • Browse products for each science instrument and housekeeping
  • Emphasis on quick data processing and distribution
• Data distribution:
  • All data is public - no proprietary period
  • Level-0 and browse product data available via a world-wide-web site within 48 hours after each DSN pass
  • All data delivered to NSSDC for permanent archive and public distribution
• The IMAGE mission has been designed to be low maintenance.
Cost-Effective Elements of the IMAGE Data System
• Put mission operations and science data processing into one facility, with one staff performing all required functions
  • Science and Mission Operations Center (SMOC)
• IMAGE planned for automation based on the requirement of minimizing mission operations
  • Planned for lights-out operations
  • Selective data retransmission was automated
  • 8 x 5 shifts only
• Large margins simplify operations
  • Data volume, command storage, thermal, power, ...
• SERS (s/c emergency response system) pages the operator when limits are violated in a sustained way or when health & safety limits are violated (a monitoring sketch follows below)
• Maximize the use of standards (including network protocols)
  • Used the I&T-based tools for mission ops (ASIST, FEDS)
• Use existing science correlative tools (CDAWeb, SSCWeb)
  • Science and ancillary mission data served from the open archive
  • Engineering/housekeeping data used to perform all standard trend analysis
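The SERS idea above (page an operator only on sustained or health & safety limit violations rather than staffing a console around the clock) can be roughly sketched as below. The parameter names, limits, thresholds, and paging hook are all assumptions for illustration; this is not a reproduction of the real SERS system.

```python
# Rough sketch of lights-out limit monitoring in the spirit of SERS.
# Parameter names, limit values, and the paging hook are hypothetical.
from collections import defaultdict

LIMITS = {"battery_temp_C": (-10.0, 35.0), "bus_voltage_V": (26.0, 34.0)}
SUSTAINED_SAMPLES = 3                   # consecutive out-of-limit samples that trigger a page
HEALTH_AND_SAFETY = {"bus_voltage_V"}   # violations here page immediately

_violations = defaultdict(int)

def page_operator(message):
    # Stand-in for the real paging/notification mechanism.
    print(f"PAGE: {message}")

def check_sample(name, value):
    lo, hi = LIMITS[name]
    if lo <= value <= hi:
        _violations[name] = 0
        return
    _violations[name] += 1
    if name in HEALTH_AND_SAFETY or _violations[name] >= SUSTAINED_SAMPLES:
        page_operator(f"{name}={value} outside [{lo}, {hi}] "
                      f"({_violations[name]} consecutive samples)")

# Example housekeeping stream: a slow temperature creep and one voltage drop.
for v in (30.0, 36.2, 36.5, 36.9):
    check_sample("battery_temp_C", v)
check_sample("bus_voltage_V", 24.5)
```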
IMAGE Ground Data System Overview
• Data system development budget less than 10% of the cost of similar ground systems
• Schedule: 36-month development/test schedule
• Spacecraft Control Team (SCT) approach within the SMOC
  • Flight operations: command management, data acquisition, health & safety of the s/c
  • Mission planning
  • Ground data processing (levels 0 & 1)
  • System administration
• Operations staff:
  • All staff brought on board before spacecraft I&T (~1 year prior to launch)
  • 4-person Spacecraft Control Team through launch & early orbit
  • 2.5-person Spacecraft Control Team through end of mission
  • 0.3-person DSN scheduling
• Total mission operations cost: $450K/year (excludes DSN costs)
IMAGE Science Data Processing System
• Uses IDL scripts for automated data processing (a simplified pipeline is sketched below)
  • Allowed PIs to independently develop their algorithms and integrate them into the SMOC software system
• Processed data files stored on an FTP/web server
  • No commercial database systems needed
  • Engineering data also processed
• Twice-daily processing (2 DSN passes)
  • Quick-look products
  • Daily processing forwarded to the NSSDC
  • NSSDC loads the data into CDAWeb daily
• SMOC maintains ~2 months of processed data online
  • No pushing of quick-look data to investigators: they look at it at their convenience
  • No project-dependent archive needs to be maintained
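The IMAGE pipeline itself was written in IDL; the sketch below only illustrates the overall pattern this slide describes (per-pass level-0 and browse product generation, staging on a public FTP/web area, daily forwarding to NSSDC) in Python. Every function, file name, and directory here is a hypothetical placeholder.

```python
# Illustrative per-DSN-pass processing loop (the real system used IDL scripts).
# Instrument list is taken from the IMAGE payload; paths and file contents are
# placeholders standing in for the instrument teams' algorithms.
import datetime as dt
from pathlib import Path

STAGING = Path("staging")            # stand-in for the public FTP/web area
NSSDC_OUT = Path("nssdc_outbox")     # stand-in for the daily delivery to NSSDC
INSTRUMENTS = ["LENA", "MENA", "HENA", "FUV", "EUV", "RPI", "housekeeping"]

def process_pass(pass_time: dt.datetime):
    """Produce level-0 and browse files for one DSN pass and stage them publicly."""
    STAGING.mkdir(exist_ok=True)
    produced = []
    for inst in INSTRUMENTS:
        stem = f"im_{inst.lower()}_{pass_time:%Y%m%d_%H%M}"
        for suffix in (".lv0", "_browse.cdf"):          # level-0 plus quick-look product
            f = STAGING / (stem + suffix)
            f.write_text("placeholder product\n")       # real pipeline: instrument IDL code
            produced.append(f)
    return produced

def daily_forward(files):
    """Copy the day's products to the NSSDC delivery area (loaded into CDAWeb daily)."""
    NSSDC_OUT.mkdir(exist_ok=True)
    for f in files:
        (NSSDC_OUT / f.name).write_bytes(f.read_bytes())

if __name__ == "__main__":
    files = process_pass(dt.datetime(2001, 6, 1, 12, 0))   # one of the two daily passes
    daily_forward(files)
    print(f"staged and forwarded {len(files)} products")
```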
IMAGE Data Flow
• Science data products were generated in standard formats
  • Universal Data Format (UDF) for low-level telemetry data
  • Common Data Format (CDF) for higher-level and "browse" products (a CDF-writing sketch follows below)
• NSSDC provided software to IMAGE to bundle its data products for archiving
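As an illustration of writing a browse product in CDF, here is a small sketch using the present-day spacepy.pycdf package (which requires the NASA CDF C library to be installed). This is not the software NSSDC supplied to IMAGE, and the file, variable, and attribute names are assumptions.

```python
# Sketch: write a simple "browse" product as a CDF file with spacepy.pycdf.
# Requires the NASA CDF library; variable and attribute names are illustrative.
import datetime as dt
import os
from spacepy import pycdf

fname = "im_euv_browse_20010601.cdf"
if os.path.exists(fname):
    os.remove(fname)                  # pycdf will not overwrite an existing file

cdf = pycdf.CDF(fname, '')            # '' => create a new, empty CDF
cdf.attrs['Project'] = 'IMAGE'        # global attributes (assumed names)
cdf.attrs['Data_type'] = 'browse'

epochs = [dt.datetime(2001, 6, 1) + dt.timedelta(minutes=10 * i) for i in range(6)]
cdf['Epoch'] = epochs                                 # time tags
cdf['EUV_counts'] = [100, 120, 95, 130, 110, 105]     # hypothetical browse parameter
cdf['EUV_counts'].attrs['UNITS'] = 'counts'
cdf['EUV_counts'].attrs['DEPEND_0'] = 'Epoch'
cdf.close()
```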
Coordinated Data Analysis Web (CDAWeb)
• CDAWeb is a publicly accessible data "browsing" service (plots and listings) that provides simultaneous access to the multi-mission data needed to understand global interactions, structure & dynamics (a retrieval sketch follows below)
• Part of the NSSDC archival system
• Unique joining of data:
  • NASA: ACE, FAST, IMAGE, IMP-8, Polar, SAMPEX, Wind
  • Other US: GOES, LANL
  • International: Geotail, SOHO, Ulysses, Interball, Cluster
• Continuing growth in popularity
  • An international community of users
  • Service mirrored in Europe and Japan
• Large community of users in 2001:
  • ~330 distinct (heavy) users made >100 requests
  • ~1080 distinct (regular) users made >10 requests
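For readers who want to pull CDAWeb data programmatically today, the SPDF's cdasws Python package can be used roughly as sketched below. This interface post-dates the 2002 talk, and the dataset ID, variable name, and return structure shown are assumptions to be checked against the current cdasws documentation and CDAWeb catalog.

```python
# Sketch of querying CDAWeb through the SPDF cdasws package (post-dates this
# talk). Dataset and variable names are examples, not verified identifiers.
from cdasws import CdasWs

cdas = CdasWs()

# List datasets for the IMAGE mission group held by CDAWeb.
datasets = cdas.get_datasets(observatoryGroup='IMAGE')
for ds in datasets[:5]:
    print(ds['Id'], '-', ds['Label'])

# Fetch one (example) variable over a short interval;
# get_data returns a (status, data) tuple.
status, data = cdas.get_data('IM_K0_EUV', ['IMAGE'],
                             '2001-06-01T00:00:00Z', '2001-06-01T06:00:00Z')
print(status)
```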
Growth in CDAWeb Database
• Now serves 420,000 files and 1300+ science parameters
• Contains ~6 years of simultaneous data from >100 datasets on every day
Satellite Situation Center Web (SSCWeb)
• Multi-mission spacecraft positions, both definitive and predictive, presented in a science framework to answer science questions
• ~30 current space physics missions maintained in the database
• Geophysical coordinate systems, geophysical models, conjunctions among s/c & ground stations
• Products include graphics & listings
• Supports science planning & data understanding
• Web-accessible to the international science community
• SSCWeb finds & characterizes times of interesting s/c configurations (a conjunction-finding sketch follows below)
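A toy illustration of the kind of geometric question SSCWeb answers: given time-tagged positions for two spacecraft, find the intervals when they are close enough (here, within an invented radial distance) to count as a "conjunction". The orbits and threshold are fabricated for the example; SSCWeb itself uses its own orbit database, coordinate systems, and geophysical models.

```python
# Toy conjunction search: the sort of spacecraft-configuration query SSCWeb
# answers. Orbits and the separation threshold below are fabricated.
import numpy as np

RE = 6371.0                      # Earth radius, km
t = np.arange(0, 24, 0.1)        # hours

# Hypothetical circular equatorial orbits (km) for two spacecraft.
sc_a = 6.6 * RE * np.c_[np.cos(2 * np.pi * t / 24), np.sin(2 * np.pi * t / 24)]
sc_b = 4.0 * RE * np.c_[np.cos(2 * np.pi * t / 12), np.sin(2 * np.pi * t / 12)]

separation = np.linalg.norm(sc_a - sc_b, axis=1)
close = separation < 3.0 * RE    # invented "conjunction" threshold

for hour, sep in zip(t[close], separation[close]):
    print(f"t = {hour:4.1f} h  separation = {sep / RE:4.1f} RE")
```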
IMAGE Engineering Data
• Over 150 instrument and s/c parameters are written as CDF files and treated just like the science data
• Processed daily, delivered to the NSSDC for archiving, and loaded into CDAWeb
• CDAWeb is used to perform all standard trend analysis (a simple trend-fit sketch follows below)
• Sustaining engineering can be done anywhere; only access to the internet is needed
• No specialized system needed!
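Because the housekeeping parameters are served as ordinary CDF variables through CDAWeb, trend analysis reduces to fitting the retrieved time series. A minimal example is below, with fabricated daily values standing in for a real housekeeping parameter.

```python
# Minimal trend analysis on a housekeeping parameter, as could be run anywhere
# with internet access once the CDFs are pulled from CDAWeb. Values are fabricated.
import numpy as np

days = np.arange(0, 365)
battery_temp = 20.0 + 0.004 * days + np.random.default_rng(0).normal(0, 0.3, days.size)

slope, intercept = np.polyfit(days, battery_temp, 1)
print(f"trend: {slope * 365:.2f} deg C per year (intercept {intercept:.1f} deg C)")
```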
Summary
• The entire mission data system needs to be part of the mission design and a part of any mission trade study
• Science data processing should be success oriented: automate it, test it, implement it, and assume it works
  • Scientists look at the data daily, so if it doesn't work we know about it immediately
  • QC is done by the scientists; no operations personnel are used
• Look at all aspects of consolidation, cross training, and automation
  • Use current technology for automation based on a cost-benefit analysis
• Every year the mission is extended saves money for doing science
Workshop Goals
• Explore ways to deliver data to users in a more timely, cost-effective manner
• Reduce the development time and costs associated with getting data from missions to all users
• Share methodologies across multiple projects
• Activities:
  • Explore new methodologies
  • Exchange new ideas for innovative approaches
  • Examine lessons learned
  • Identify areas for collaboration
• Topics:
  • Off-the-shelf science data processing software tools
  • Improved and innovative cost-effective approaches to processing & distribution
  • Common suites of ground data processing system designs and features
  • Improvements toward maximizing science data return
  • Challenges of on-board processing