
TRENDS ANALYSIS OVERSIGHT



Presentation Transcript


  1. TRENDS ANALYSIS OVERSIGHT
     Dan Schwartz
     Development and Operations Science Support / Technical Support Team
     12 January 2004
     CXC Users Committee
     SOT/DAS

  2. OUTLINE
     • MISSION STATEMENT
     • OVERSIGHT ISSUES
     • IMPLEMENTATION PLANNING

  3. MISSION STATEMENT
     “(7) Monitoring and Trends Analysis – The CUC recommends that the CXC provide a higher level of oversight to the SI performance monitoring and trending analysis program to ensure that appropriate performance quantities are being tracked.”
     GOAL: Identify ANY changes in instrument performance before they occur. (Walkthrough example: ACIS gain)

  4. ISSUES
     Technical:
     • Identification of specific potential changes
     • Identification of indicators for each such change
     • Calculation of quantities to provide the indication (the trended quantity)
     • Assessment of the sensitivity of the trended quantity to instrument change
     • What quantities must be determined more precisely (than currently), so that we are sensitive to any changes? (See the sketch below.)
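As an illustration of the last two bullets above, here is a minimal sketch of fitting a trended quantity and asking whether the current measurement precision is sensitive enough to detect a drift. The quantity, cadence, noise level, and 3-sigma threshold are assumptions for illustration only, not the CXC's actual trending software.

```python
import numpy as np

# Hypothetical monthly measurements of a trended quantity
# (e.g., a gain-like ratio); real trending data would come
# from the telemetry/calibration archives instead.
months = np.arange(24)                      # time in months
gain = 1.000 - 0.0004 * months + np.random.normal(0, 0.002, months.size)

# Fit a linear trend and estimate the uncertainty on the slope.
coeffs, cov = np.polyfit(months, gain, 1, cov=True)
slope, slope_err = coeffs[0], np.sqrt(cov[0, 0])

# A simple sensitivity figure: the smallest secular drift (per month)
# detectable at ~3 sigma given the current measurement precision.
min_detectable_drift = 3.0 * slope_err

print(f"fitted drift: {slope:+.5f} per month (+/- {slope_err:.5f})")
print(f"smallest drift detectable at 3 sigma: {min_detectable_drift:.5f} per month")
if abs(slope) > min_detectable_drift:
    print("trend is significant: flag for instrument-team review")
```

If the drift that matters scientifically is smaller than the minimum detectable drift, the quantity must be determined more precisely, which is exactly the question posed by the final bullet.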

  5. ISSUES CONT…
     Software:
     • Database review
     • Report generation
     • Correlations and predictions (see the sketch below)
     Organizational:
     • Who is the monitor of each quantity?
     • How does the software get developed?
     • How do we educate CXC to use the existing tools?
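For the “Correlations and predictions” software item, a minimal sketch of what a report-generation step might compute: correlate a trended quantity against a candidate driver and extrapolate it forward. The driver, the data values, and the six-month horizon are hypothetical; a real implementation would pull from the trending database reviewed above.

```python
import numpy as np

# Hypothetical inputs: a slowly rising driver (e.g., a temperature)
# and a trended quantity that responds to it. These arrays stand in
# for a query against the trending database.
time_months = np.arange(36)
driver = 20.0 + 0.05 * time_months
quantity = 1.0 - 0.002 * driver + np.random.normal(0, 0.001, time_months.size)

# Correlation between the trended quantity and the driver.
r = np.corrcoef(driver, quantity)[0, 1]

# Simple prediction: fit the quantity vs. time and extrapolate 6 months.
slope, intercept = np.polyfit(time_months, quantity, 1)
predicted = slope * (time_months[-1] + 6) + intercept

print(f"correlation with driver: r = {r:+.2f}")
print(f"predicted value 6 months out: {predicted:.4f}")
```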

  6. IMPLEMENTATION PLAN
     • Team is drawn from DOSS, CAL, SI IPI teams, FOT, project-wide
     • Divide by subsystems:
       • ACIS
       • HRC
       • LETG
       • HETG
       • HRMA
       • ACA
       • Other PCAD (e.g., IRU, fid lights and transfer, mounting stability, FSS)
       • Other spacecraft equipment (e.g., reaction wheels, jitter stability, optical bench, focus and translation, EPHIN)
       • Other infrastructure (e.g., ACE, GOES, internet connectivity)
     • List the quantities which may change
     • Make two rank orders for each subsystem:
       • Likelihood of occurrence
       • Effect of occurrence
     • Group assesses what can/must/should be monitored (see the sketch after this list)
     KEY POINT: One person responsible for each identified quantity.
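A minimal sketch of how the two rank orders might be combined into a single watch-list priority, with one named owner per quantity. The candidate quantities, the 1-5 scores, and the likelihood-times-effect rule are placeholders for illustration, not CXC assessments.

```python
# Hypothetical risk-ranking step: each subsystem team scores candidate
# quantities by likelihood and by effect of a change; the product gives
# a simple priority order. Scores and quantities below are placeholders.
candidates = [
    # (subsystem, quantity, likelihood 1-5, effect 1-5, responsible person)
    ("ACIS", "gain drift",       4, 5, "TBD"),
    ("HRC",  "QE decline",       3, 4, "TBD"),
    ("ACA",  "CCD dark current", 5, 3, "TBD"),
]

ranked = sorted(candidates, key=lambda c: c[2] * c[3], reverse=True)
for subsystem, quantity, likelihood, effect, owner in ranked:
    print(f"{subsystem:5s} {quantity:18s} priority={likelihood * effect:2d} owner={owner}")
```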
