Gamma-ray Large Area Space Telescope
GLAST Large Area Telescope: Instrument Flight Software
WBS: 4.1.7.9
Presenting for the FSW group: Gunther Haller, Stanford Linear Accelerator Center
Manager, Electronics, DAQ & FSW; LAT Chief Electronics Engineer
haller@slac.stanford.edu, (650) 926-4257
Content
• Overview
• Peer CDR-Review RFA Status
• Interface
• Requirements
• Mapping of Requirements/Functions/Tasks/Packages
• Boot
• Event-Filtering
• File/Object Management
• Development
• Verification
LAT FSW – Part of DAQ Subsystem
FSW is an integral part of the data acquisition (DAQ) subsystem and is managed, budgeted and scheduled as part of the DAQ subsystem.
• Front-end electronics: TKR (MCM), CAL (AFEE), ACD (FREE)
• 16 Tower Electronics Modules
  • DAQ electronics module (DAQ-EM)
  • Power supplies for tower electronics
• Global-Trigger/ACD-EM/Signal-Distribution Unit (GASU)*
• 3 Event-Processor Units (2 + 1 spare)
  • Event processing CPU
  • LAT Communication Board
• Spacecraft Interface Units*
  • Spacecraft Interface Board (SIB): spacecraft interface, control & data
  • LAT control CPU
  • LAT Communication Board (LCB): LAT command and data interface
• Power-Distribution Unit (PDU)*
  • Spacecraft interface, power
  • LAT power distribution
  • LAT health monitoring
* Primary & secondary units shown in one chassis
FSW Organization Chart
• Project Manager (IPM): W. Althouse
• Electronics & FSW Manager: G. Haller
• Performance & Safety Assurance: D. Marsh
• DAQ: M. Huffer
• FSW Lead: J. J. Russell
• FSW Test/QA Oversight: S. Sawyer
• Configuration Manager: A. P. Waite
• Thermal Control: J. Swain
• Boot & S/C Interface: D. Wood
• Algorithms: J. J. Russell
• LAT Configuration: J. Swain
• Software Architectures: A. P. Waite
• I&T Support: C. Brune
• Test Executive: S. Maldonado
• Front End Simulators: O. Saxton
• Cmd & Telem Database: B. Davis
• RAD750 Processor: R. Caperoon
LAT FSW Team Heritage
• Small, effective group
• Very experienced
  • HEP: >50 man-years
  • FSW: >20 man-years
• Successful track record
• Leads are developers
• Leads are scientists
• Employ highly interactive development process
• All members are expert in LAT architecture, able to contribute in many areas
• Independent oversight provided by systems engineering
• Produce fully documented design
• Process allows/requires software to be in use from early subsystem development/testing to full LAT verification
Changes Since PDR
• Processor selection
  • BAE RAD750 has become baseline processor
• Number of processors has been determined
  • 2 SIUs (1 active, 1 cold spare)
  • 3 EPUs (2 active, 1 cold spare)
• SIU and EPU crates now look alike
• Interface to SSR has become part of GASU
• Some SIU code has migrated to EPU or common code
Review RFA Status Summary
• Delta-PDR software-related RFAs (Requests for Action)
  • "Determine the need date for processor down-select based on software design impact"
    • Have selected and placed order for BAE RAD750
  • "Finalize the flight-software management plan and test plan"
    • Flight software management plan (LAT-MD-00104) and flight software test plan (LAT-TD-00786) released and in Cyberdocs
• Peer-CDR software-related RFAs
  • Generated 12 software-related RFAs
  • 9 responses accepted
  • 3 responses need more work
  • Listing in appendix
Requirements (Example)
• Example requirements
• Full listing of requirements appears in appendix
• Released in Cyberdocs: LAT-SS-00399
SIU FSW External Interfaces
• Ground / SC commands / uploads: via 1553 (SRS 5.3.1.1)
• Telemetry to SC: via 1553 (SRS 5.3.1.1)
• SC ancillary/attitude data: via 1553 (SRS 5.3.1.1)
• LAT repoint request to SC: via 1553 (SRS 5.3.1.1)
• SC TimeTone: via 1553 (SRS 5.3.1.1)
• GRB telecommand from GBM: via 1553 (SRS 5.3.1.1)
• Telemetry to SSR: via LCB (SRS 5.3.1.4)
• Communications to EPU: via LCB (SRS 5.3.3.2)
• Communications from EPU: via LCB (SRS 5.3.3.2)
• Command/response CMDs to LAT HW (includes configuration data): via LCB (SRS 5.3.3.1)
• Command/response data: via LCB (SRS 5.3.3.1)
• Immediate trigger from GBM: discrete (SRS 5.3.1.2)
• 1 PPS time hack from SC: discrete (SRS 5.3.1.2)
• Boot status outputs (2 levels, i.e. 2 bits): discrete (SRS 5.3.1.3)
• TCS heater control signals: PCI (SRS 5.3.1.3)
• PDU / GASU power-on signals: PCI (SRS 5.3.1.3)
EPU FSW External Interfaces
• SC ancillary/attitude data: via LCB (SRS 5.2.1.1)
• SC TimeTone: via LCB (SRS 5.2.1.1)
• Event data: via LCB (SRS 5.2.1.3)
• Processed events to SSR: via LCB (SRS 5.3.1.4)
• Communications to SIU: via LCB (SRS 5.2.1.1)
• Communications from SIU: via LCB (SRS 5.2.1.1)
• 1 PPS time hack from SC: discrete (SRS 5.3.1.2)
FSW ICDs
• FSW interfaces to SC managed by Spectrum Astro
  • 1553 Bus Protocol ICD
  • SC-LAT ICD
• FSW interfaces within the LAT (SIU/EPU, LCB, SIB, TCS, PDU, CRU, EBM, TEM, GEM, AEM, FES, and the GCFE/GTCC/GCRC/GAFE/GTRC/GCCC/GTFE/GARC front-end chips) are detailed in the documents below
• All released or in release process
  • LAT-SS-01539, LAT-SS-00606, LAT-SS-00860, LAT-TD-01547, LAT-SS-01546, LAT-SS-01545, LAT-SS-01543, LAT-SS-00605, LAT-SS-01825, LAT-TD-01544, LAT-SS-00176, LAT-SS-00238, LAT-SS-00363
FSW Introduction to Terms
• Requirements: high-level statements of specific characteristics or capabilities necessary to the FSW
• Functions: categories or broad areas of capability (functionality) that the FSW must implement to satisfy the requirements
• Tasks: concurrent processes executing on the processor that perform the required functions of the FSW (equivalent to the concept of threads)
• Packages: logical organization of actual code into groupings of files and associated data for documentation, testing and compilation; packages provide specific services or carry out specific functions that are building blocks of tasks
• Releases: specific collections of packages (or partially implemented packages) that compile properly and execute to implement a specific subset of the total defined FSW tasks, providing a subset of the overall FSW functions that satisfy a subset of the FSW requirements; the full LAT FSW release must satisfy all FSW requirements
• Testing: each release must be tested to verify that it does what it was designed to do and that the design satisfies the intended requirements; the full LAT FSW release must be shown to satisfy all FSW requirements
SIU Functions
• Boot (SRS 5.3.4.1)
• Command processing and distribution (SRS 5.3.4.2)
• Telemetry management (SRS 5.3.4.4)
• Time, attitude and ancillary data processing (SRS 5.3.4.3.4-6)
• Configuration of LAT (SRS 5.3.4.6)
• Health, status and safety monitoring
  • Housekeeping and low-rate science (SRS 5.3.4.8)
  • Software watchdog (SRS 5.3.2.1)
  • Load shed, safe mode, SAA (SRS 5.3.4.3.7-9, 5.3.4.12)
• File upload/download management (derived, SRS 5.3.4.2)
• Calibration and diagnostics (SRS 5.3.4.7)
• Mode control (SRS 5.3.4.5)
• Thermal control system (SRS 5.3.4.13)
• Instrument physics
  • GRB processing (SRS 5.3.4.3.1-3, 5.3.4.9-10)
  • Summary analysis, statistics (derived, SRS 5.2.2.3, 5.3.4.11)
EPU Functions
• Boot (SRS 5.2.2.1)
• Command receipt management (SRS 5.2.2.2, derived)
• Telemetry management (derived)
• Health, status and safety monitoring
  • Software watchdog (SRS 5.2.1.2)
• Calibration and diagnostics (derived)
• Instrument physics
  • Event filtering (SRS 5.2.2.4)
  • Filter configuration (SRS 5.2.2.5, 5.2.2.6)
SIU Tasks, Functions, Requirements
• Correlates SIU functions with the tasks that perform them
EPU Tasks, Functions, Requirements
• Correlates EPU functions with the tasks that perform them
FSW Task Framework
• Fundamental construct for LAT FSW is master/slave tasks
  • Master running in SIU
  • Slaves running in SIU, in EPUs, or in both
• Communications between a master and its slaves are full-duplex
• Slave tasks may have multiple inputs
  • E.g. a slave task receiving instrument data as well as messages from its master task
  • The slave will have two input queues, with priority given to messages from the master task
• Master tasks may also have multiple inputs
  • Needed to achieve connectivity back to the spacecraft
  • A master task will also have two input queues, one from the slave(s) and one from the spacecraft 1553 dispatch, with priority given to the 1553 messages
• Structure of masters and slaves can be replicated as often as necessary to accomplish all the functions required of FSW (a queue-handling sketch follows below)
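A minimal sketch of the two-queue slave pattern described above, written against the standard VxWorks message-queue API (msgQLib). The queue sizes, message layout and handler names are illustrative assumptions, not the flight code:

```c
/* Sketch of a slave task with two input queues: the master queue is
 * drained first (master messages have priority), then the task blocks
 * briefly on instrument data.  Handlers are hypothetical stand-ins. */
#include <vxWorks.h>
#include <msgQLib.h>
#include <sysLib.h>

#define MAX_MSGS    64
#define MAX_MSG_LEN 128

static MSG_Q_ID masterQ;  /* messages from the master task (priority) */
static MSG_Q_ID dataQ;    /* instrument data from the LCB Rx service  */

extern void handleMasterMsg(const char *msg, int len);  /* hypothetical */
extern void handleDataMsg(const char *msg, int len);    /* hypothetical */

void slaveTask(void)
{
    char buf[MAX_MSG_LEN];
    int  n;

    masterQ = msgQCreate(MAX_MSGS, MAX_MSG_LEN, MSG_Q_FIFO);
    dataQ   = msgQCreate(MAX_MSGS, MAX_MSG_LEN, MSG_Q_FIFO);

    for (;;)
    {
        /* Drain all pending master messages first (non-blocking) */
        while ((n = msgQReceive(masterQ, buf, MAX_MSG_LEN, NO_WAIT)) != ERROR)
            handleMasterMsg(buf, n);

        /* Then wait briefly on instrument data, so master messages
           are never starved for long */
        n = msgQReceive(dataQ, buf, MAX_MSG_LEN, sysClkRateGet() / 10);
        if (n != ERROR)
            handleDataMsg(buf, n);
    }
}
```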
FSW Architecture with Task Framework
[Diagram: LAT FSW task architecture. On the spacecraft side, 1553 Rx/Tx services and discretes (1 PPS interrupt, GBM interrupt, routed to RAD750 PIDs) feed the SIU. The SIU hosts the software watchdog plus master tasks (SC Att./Time, Instr. Phys., File/Object, HSK, Primitive), each linked by message queues to slave tasks in the SIU and in the EPU(s). LCB Rx/Tx services carry event data assembled on the Event Builder (EB) input side, command/response traffic through the Command/Response Unit (CRU), and telemetry to the Solid State Recorder through the EB output side. The EB and CRU are elements of the GASU.]
Description of Master Tasks
• SC Att./Time deals with dispatching the seven messages per second from the spacecraft
  • 5 attitude
  • 1 time-tone
  • 1 ancillary (containing orbit information as well as status info)
• Instr. Phys. master deals with all instrument-data-related processing
  • May execute different code depending on operating mode
  • GRB detection and performance monitoring in normal mode
  • Other algorithms in calibration/diagnostics modes
• File/Object master deals with all file upload/copy/delete/… processing
• HSK master deals with accumulating and examining housekeeping
  • Acquires information from SIU (self), EPUs, electronics hardware
  • Provides monitoring and alarming
  • Outputs telemetry
• Primitive (or immediate) master deals with the very primitive LAT configuration command set
FSW Packages
• FSW partitioned into functional blocks, then tasks, based on the SRS
• Tasks are then mapped into packages, the fundamental unit of the code management system
• Package development
  • Detailed design elements (algorithms, finite-state diagrams, logic flows, etc.) and development notes are generated on a per-package basis
  • Design information is stored in a Software Development Folder (SDF) which accompanies each package
  • Contents of SDF are version controlled alongside the package's code using the code management system
  • As the software matures, design descriptions from the SDFs evolve along with the code to provide a complete set of detailed design documentation
  • Unit tests are developed and code-managed within the package
Package Descriptions
[Table: package sizes by category: common code (SIU and EPU), SIU-specific code, EPU-specific code, boot code, test and verification code]
• Grand totals: 91,140 LOC with 26,235 LOC contingency (~30%)*
* Contingency algorithm is described in appendix; see next slide for discussion of contingency
Gamma-ray Large Area Space Telescope
Design & Development Verification Program
Boot
• Principal requirements: SRS 5.2.2.1, 5.3.4.1
• Boot document: LAT-TD-001806-04
• Boot proceeds in two stages
  • Primary boot (from on-board SUROM)
  • Secondary boot (from EEPROM on SIB board)
[Diagram: CPU crate (SIU or EPU). RAD750 board: SUROM (256 kB), 750 CPU, SDRAM (128 MB), bridge chip. SIB board: EEPROM bank 0 (reserved for secondary boot) and EEPROM bank 1, both managed by TFFS software; discrete I/O; 1553 communications to SC (not used by EPU boot). LCB: communications to SIU (not used by SIU boot).]
Primary Boot
• CPU reset from SUROM
• Run bridge-chip initialization procedure
  • Set initial watchdog timeout
  • Map out SDRAM, SUROM and PCI I/O spaces
• Enable processor L1 instruction cache
• Disable interrupts
• Memory test SDRAM
  • Memory test (all 0's, all 1's, checkerboard) (runs from ROM/cache)
• Start primary boot shell (now using RAM resources)
  • Enable processor L1 data cache
  • Configure PCI bus
  • Configure 1553 device (SIU) or LCB device (EPU)
• Go into command loop (sketched below)
  • Initial command timeout for automatic start
  • Poll for new commands
  • Send housekeeping telemetry
  • Reset watchdog timer
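A minimal sketch of the primary-boot command loop just listed. Primary boot runs from SUROM with no RTOS, so every helper and the timeout value below are hypothetical stand-ins for SUROM code, not real LAT FSW interfaces:

```c
/* Sketch of the SUROM command loop: poll for commands, emit
 * housekeeping, kick the watchdog, and auto-start the RTOS if no
 * command arrives within the initial timeout. */
#include <stdint.h>

extern int      pollForCommand(void *cmd);   /* 1553 (SIU) or LCB (EPU) */
extern void     executeCommand(const void *cmd);  /* incl. "command start",
                                                     which loads the RTOS  */
extern void     sendHousekeepingPacket(void);
extern void     resetWatchdogTimer(void);
extern uint32_t elapsedTicks(void);
extern void     loadAndExecuteRtos(void);    /* hands off to secondary boot */

#define INITIAL_CMD_TIMEOUT 1000u            /* illustrative value */

void primaryBootLoop(void)
{
    uint8_t  cmd[64];
    uint32_t start = elapsedTicks();

    for (;;)
    {
        /* Automatic start: no command within the initial timeout means
           normal flight operation, so load and execute the RTOS */
        if (elapsedTicks() - start > INITIAL_CMD_TIMEOUT)
            loadAndExecuteRtos();            /* does not return */

        if (pollForCommand(cmd))
            executeCommand(cmd);             /* ground-commanded path */

        sendHousekeepingPacket();
        resetWatchdogTimer();
    }
}
```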
Primary Boot Command Processing Parse Upload Packets Parse Operational Command RTOS Execute Command Operational Command Packet Received Upload Packet Received Record Time Information Poll 1553 Remote Terminal Prepare Next HKP Telemetry Packet SIANCILLARY Packet Received Last HKP Packet Sent SIANCILLARY Packet Received Command Start Telecommand Received Last HKP Packet Sent Load and Execute RTOS Poll 1553 Remote Terminal / Initial Command Timeout Timeout - No Command Message Received Initialization Startup
Secondary Boot
• Secondary boot functions
  • Inflate (ZLIB algorithm) VxWorks image to prepared memory location
  • Branch to VxWorks entry point
• Execute secondary boot script to run application code
  • Inflate (ZLIB algorithm) and link application code modules from EEPROM
  • Call application initialization functions
• The system is running (a sketch of the inflate-and-branch step follows)
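A hedged sketch of the inflate-and-branch step, using zlib's public one-shot uncompress() call. The image locations, sizes and entry-point handling are assumptions for illustration; the real values come from EEPROM bank 0 and the prepared SDRAM load region:

```c
/* Sketch: inflate the deflated VxWorks image into SDRAM, then branch
 * to its entry point. */
#include <zlib.h>

typedef void (*entry_fn)(void);

/* Hypothetical locations/lengths */
extern unsigned char *eepromImage;    /* deflated VxWorks image in EEPROM */
extern unsigned long  eepromImageLen;
extern unsigned char *loadRegion;     /* prepared SDRAM memory location   */
extern unsigned long  loadRegionLen;

int secondaryBoot(void)
{
    uLongf destLen = loadRegionLen;

    /* Inflate (ZLIB) the VxWorks image to the prepared location */
    if (uncompress(loadRegion, &destLen, eepromImage, eepromImageLen) != Z_OK)
        return -1;                    /* fall back to the boot shell */

    /* Branch to the VxWorks entry point at the start of the image
       (object-to-function pointer cast is target-specific) */
    ((entry_fn)(void *)loadRegion)(); /* does not return */
    return 0;
}
```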
Boot Status
• Development environment at NRL (Dan Wood)
  • Prototype version RAD750
  • JTAG programming environment
  • Engineering version SIB (access to 1553 and EEPROM)
• Recently added manpower: Brian Davis, Ray Caperoon
• Boot code progress: [status table]
Event Filtering
• Principal requirement: SRS 5.2.2.4
• Numerology (a back-of-envelope check follows below)
  • Event size: ~1 kB
  • Physics signal: ~10 Hz (whole orbit)
  • Background: ~2 kHz (orbit min) to ~10 kHz (orbit max)
  • Orbit average: ~6 kHz trigger rate (6 MB/sec)
  • Allowable data rate to SSR: ~35 kB/sec
  • Filter rejection efficiency required: ~99.6%
  • Filter must keep up with maximum rate: ~100 µs/event (orbit max)
• Status
  • 98.4% background rejection achieved at 14 µs/event, executing on RAD750 (see previous Boot Status slide)
  • How to get from 98.4% to 99.6%, and when?
    • Implement final set of cuts being used by ground software
    • Investigate data compression techniques (more sophisticated than ZLIB)
    • Target date for completion: EM2 release
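A back-of-envelope check of the numerology above, using only the round numbers quoted on the slide (the slide's ~99.6% presumably folds in packaging overheads beyond this simple ratio):

```c
/* Sketch: recompute the required rejection and per-event CPU budget
 * from the slide's round numbers. */
#include <stdio.h>

int main(void)
{
    double event_kB    = 1.0;      /* ~1 kB/event                 */
    double avg_rate_hz = 6000.0;   /* orbit-average trigger rate  */
    double max_rate_hz = 10000.0;  /* orbit-max background rate   */
    double ssr_kBps    = 35.0;     /* allowable data rate to SSR  */

    /* Fraction of events the filter may pass, orbit-averaged */
    double pass = ssr_kBps / (avg_rate_hz * event_kB);
    printf("required rejection: %.2f%%\n", 100.0 * (1.0 - pass)); /* ~99.4% */

    /* CPU budget per event at the orbit-max rate */
    printf("per-event budget:   %.0f us\n", 1e6 / max_rate_hz);   /* ~100 us */
    return 0;
}
```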
File/Object Management
• Principal requirement: SRS 5.3.4.2.4
• Adopting a file system and TFFS reduces object management to file management
  • File system provided by VxWorks
  • TFFS (True Flash File System)
    • Balances writes across EEPROM memory
    • Maps out bad EEPROM memory locations ("bad blocks")
  • File uploads go to RAM disk first and are then (by command) committed to EEPROM (sketched below)
[Diagram: CPU crate. Hardware view (SIB): EEPROM bank 0 (holding the secondary boot code) and EEPROM bank 1, both managed by TFFS software; EEPROM driver; other SIB functions. Software view (RAD750): applications (code written by LAT FSW) make POSIX file calls into VxWorks; DosFs (FAT16 file system, VxWorks base product) sits over the RAM disk and over TFFS (VxWorks layered product).]
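A sketch of the upload-then-commit flow, using only the POSIX file calls the diagram shows applications making. The mount points /ram0 and /tffs0 are hypothetical names, not the flight configuration:

```c
/* Sketch: commit a previously uploaded file from the RAM disk to the
 * TFFS-managed EEPROM volume. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int commitUpload(const char *name)
{
    char    src[64], dst[64], buf[512];
    ssize_t n;
    int     in, out, status = 0;

    snprintf(src, sizeof src, "/ram0/%s", name);   /* hypothetical mounts */
    snprintf(dst, sizeof dst, "/tffs0/%s", name);

    if ((in = open(src, O_RDONLY, 0)) < 0)
        return -1;
    if ((out = open(dst, O_WRONLY | O_CREAT | O_TRUNC, 0644)) < 0) {
        close(in);
        return -1;
    }

    /* Underneath DosFs, TFFS balances these writes across the EEPROM
       and maps out bad blocks */
    while ((n = read(in, buf, sizeof buf)) > 0) {
        if (write(out, buf, (size_t)n) != n) { status = -1; break; }
    }

    close(in);
    close(out);
    return status;
}
```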
FSW Resource Usage: Current Estimates
• Principal requirement: SRS 5.4.3
Development Environment
• Embedded system
  • Processor / operating system: BAE RAD750 / VxWorks
  • Toolset (Wind River Systems):
    • Language: C
    • Development platform: Sun / Solaris
    • Compiler / linker / binutils: GNU cross-compiler suite
    • Debugger: CrossWind
• Host system
  • Processor / operating system: Sun / Solaris or Intel / Linux
  • Toolset (host simulation or cooperating processes):
    • Language: C
    • Development platform: Sun / Solaris or Intel / Linux
    • Compiler / linker / binutils: GNU compiler suite
    • Debugger: GDB / DDD
  • Toolset (test executive and scripting):
    • Python / XML / MySQL / Qt / Perl
• Other tools
  • Requirements management: DOORS
  • Code / configuration management: CMX / CMT / CVS
  • Autogeneration of documentation: Doxygen
  • Documentation: Microsoft Office suite (also Adobe / FrameMaker, etc.)
Software Development Approach
• Software lifecycle model
  • Iterative / incremental development model
  • Multiple builds, with increased capability in each build
  • Regression testing on each build
• Requirements flowdown, analysis, review
  • Flowdown from program and system specs
  • Peer reviews
• Design and code inspections / review
  • Top-level design review
  • Detailed design reviews and code inspections on a per-release basis
  • Continuous cycle of development and test
• Code management
  • Formal control through the CMX / CMT / CVS toolchain
• Configuration management
  • Formal control through project management tools
  • Cyberdocs
  • Non-conformance reporting system
• Independent quality assurance and test oversight manager
  • Reviews test plans, procedures, scenarios, data
  • Reports directly to LAT QA, systems engineering
Software Design for Safety
• The software safety environment
  • Software cannot damage hardware (hardware protects itself)
  • Reprogrammable on orbit (except for primary boot code)
• The software safety philosophy during development
  • Leverage the fact that software cannot damage hardware
  • Make unexplained conditions "fatal but not serious" and reboot
    • Decreases complexity
    • Increases reliability / robustness
    • Immediate and graceful exit quickly identifies code weaknesses
    • Improves efficiency for producing reliable / robust final code
• On a case-by-case basis, develop recovery strategies
  • Not recoverable and CPU compromised: stay with reboot strategy
    • Always attempt to save a block of information describing the fault condition in a known fixed memory location so that it can be picked up and sent to ground after the reboot (sketched below)
  • Not recoverable but CPU integrity good: report to ground and await intervention
  • Fully recoverable: perform recovery action, continue operation
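A sketch of the "fatal but not serious" path: write a fault record to a known fixed memory location, then reboot. The address, record layout and reboot hook are assumptions for illustration (VxWorks provides a reboot() call in rebootLib, hidden here behind a stand-in):

```c
/* Sketch: save a fault block at a fixed address, then reboot.  The
 * record is assumed to survive a warm reboot in SDRAM so the
 * housekeeping task can telemeter it to ground after restart. */
#include <stdint.h>
#include <string.h>

#define FAULT_BLOCK_ADDR 0x00001000u   /* hypothetical fixed location */
#define FAULT_MAGIC      0xFA17B10Cu   /* marks a valid record        */

typedef struct {
    uint32_t magic;       /* FAULT_MAGIC if the record is valid */
    uint32_t taskId;      /* task that hit the fault            */
    uint32_t faultCode;   /* what went wrong                    */
    uint32_t pc;          /* where it went wrong                */
    char     note[48];    /* short human-readable description   */
} FaultRecord;

extern void latReboot(void);  /* stand-in for the VxWorks reboot() call */

void fatalButNotSerious(uint32_t task, uint32_t code, uint32_t pc,
                        const char *note)
{
    FaultRecord *rec = (FaultRecord *)(uintptr_t)FAULT_BLOCK_ADDR;

    rec->magic     = FAULT_MAGIC;
    rec->taskId    = task;
    rec->faultCode = code;
    rec->pc        = pc;
    strncpy(rec->note, note, sizeof rec->note - 1);
    rec->note[sizeof rec->note - 1] = '\0';

    latReboot();   /* unexplained condition: reboot rather than recover */
}
```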
FSW Fault Detection
• Hardware fault detection
  • Run bridge-chip built-in test
  • Examine checksums on SC communications
  • Detect missing messages from SC
  • Look for parity errors on LAT internal communications
  • Check housekeeping of LAT voltages, currents and temperatures
• Software fault detection
  • Keep CPU housekeeping metrics: memory usage, idle time
  • Enforce the software watchdog
    • All registered tasks must regularly report progress in order for the software watchdog to reset the hardware watchdog (see the sketch below)
• Instrument data fault detection
  • Monitor low-rate science (counter) readings
  • Compare instrument configurations read out at the beginning and end of data collection runs (must agree)
  • Examine single-event data for correct format, completeness
  • Check single-event data for physics consistency
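A sketch of the software-watchdog rule above: the hardware watchdog is kicked only if every registered task has made progress. The counter-based heartbeats and the hardware interface are illustrative assumptions:

```c
/* Sketch: each registered task bumps its heartbeat; the watchdog task
 * resets the hardware watchdog only when all heartbeats advanced. */
#include <stdint.h>

#define MAX_TASKS 16

static volatile uint32_t heartbeat[MAX_TASKS]; /* bumped by each task */
static uint32_t          lastSeen[MAX_TASKS];
static int               nRegistered;

extern void kickHardwareWatchdog(void);  /* hypothetical HW interface */

/* Called from each registered task's main loop */
void swWdReport(int taskSlot)
{
    heartbeat[taskSlot]++;
}

/* Called periodically by the software-watchdog task */
void swWdCheck(void)
{
    for (int i = 0; i < nRegistered; i++) {
        if (heartbeat[i] == lastSeen[i])
            return;              /* a task stalled: let the HW watchdog fire */
        lastSeen[i] = heartbeat[i];
    }
    kickHardwareWatchdog();      /* all registered tasks made progress */
}
```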
Development Process
• Initial design effort
  • Define hardware interfaces and architecture
  • Build stable development infrastructure
  • Generate high-level requirements (SRS) that capture scope of project
  • Generate high-level design that captures basic architecture and interfaces
• For each FSW release
  • Generate detailed design of new functionality
  • Employ iterative design/code/test process to converge on the detailed design ("little spirals")
    • Allows experienced developers to proceed more rapidly to explore the design parameter space, discover issues, and resolve them
    • Ultimately produces a more optimal design than one selected in advance based only on analysis and limited data
  • Extensive documentation of resulting code is produced as it is built
  • Iterative process is a continuous rapid-prototyping cycle that supports higher productivity and a higher-quality final product
[Plot: qualitative example of activity vs. time for 3 major spirals, alternating between architecture/design and code/test effort]
Breakdown of Development Cycles
• Milestones
  • CDR: 4/29/03
  • EM1 code release: 7/1/03
  • EM2 peer review: 10/1/03
  • EM2 code release: 3/1/04
  • FU peer review: 4/1/04
  • FU code release: 9/1/04
  • FU release to I&T: 10/1/04
• Cycles (each spanning design/develop, develop/test and system-level test phases across 2003–2004)
  • EM1 cycle (single tower, single CPU)
  • EM2 cycle (multi-tower, single CPU)
  • FU cycle (all)
• Phase definitions
  • Design/develop: start design, code small prototypes; no hardware available, only descriptions
  • Develop/test: code and test against real hardware; take snapshot at end (i.e. define release)
  • System-level test: test against system-level test scenarios; release to I&T at end
EM1 FSW Release
Goal: Demonstrate single-tower, single-CPU operation
• Hardware
  • 1 partially populated tower
  • 1 Tower Electronics Module
  • 1 COTS CPU (VME) with Ethernet, serial port, LCB
• Software
  • Interfaces (other than VxWorks)
    • LCB command/response
    • LCB event acquisition
  • TEM configuration setting and read-back
    • Read/write all TEM/TKR/CAL registers
  • Format and export event data from tower
  • Charge injection calibration
    • Inject a known charge signal directly into the (TKR, CAL) electronics in lieu of the detector output
    • Read the resulting event data output
  • Collect TEM housekeeping and LRS data
• In parallel
  • Filter development and testing
  • Boot, 1553 development
• Status: development complete against preproduction electronics, with the exception of LCB support (using VME I/O communications boards instead); deployed to field in I&T test stands
EM1 FSW Architecture
[Diagram: EM1 configuration. A host system plays the "spacecraft," with the 1553 Rx/Tx services emulated over Ethernet. The SIU hosts the software watchdog, master tasks (Instr. Phys., HSK, Primitive) and their slave tasks, connected by message queues; LCB Rx/Tx services (emulated over Ethernet) connect to a single tower in the LAT "instrument."]
EM1 Function/Task/Package Mapping
• SIU functions
  • Command distribution
  • Telemetry management
  • Configuration of (subset of) LAT
  • Health, status monitoring
  • Mode control
• "EPU" functions
  • Event acquisition and formatting
EM1 FSW Packages (1 of 2)
• GNAT – Physical I/O & protocol to command/response fabric
  • Controls access to the physical layer of the command/response protocol
  • Percentage of final package needed for EM1: 100%
• GCFG – Configuration of front-end electronics
  • Configures the LAT electronics by sending commands to the various boards and their associated registers
  • Read-back of the configuration is also supported here
  • 50% needed (70% planned); only need TEM-specific code
• SOP – SIU event output package
  • Attaches auxiliary data and packages events in CCSDS format for output to SSR
  • 30%
• HSK – Housekeeping and low-rate science
  • Handles housekeeping and low-rate science data
  • 50%; need infrastructure and ability to handle 40 TEM telemetry points
• MCP – Mode control
  • Handles run control
  • 50%
EM1 FSW Packages (2 of 2)
• SDF – Frameworks
  • New feature in top-level design to uniformly handle communications needs across all major functional blocks and shelter the application developer from dealing with task-to-task communications
  • 100%
• SWD – Software watchdog
  • Monitors activity in other tasks
  • 100%
• LIO – LAT Communication Board I/O
  • Hardware interface for all LAT internal communications
  • 100%
• PBS – Processor basic services
  • Resource allocation and management tools
  • 100%
• PCI
  • Provides PCI interface
  • 100%
• CCSDS – Format CCSDS packets
  • Used to wrapper events (see the header-packing sketch after this list)
  • 100%
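A sketch of what the CCSDS package's event wrappering amounts to: packing the standard 6-byte CCSDS space-packet primary header. The field widths follow the public CCSDS standard; the APID and sequence handling shown are illustrative, not the LAT telemetry definitions:

```c
/* Sketch: pack the 6-byte CCSDS primary header (big-endian). */
#include <stdint.h>

void ccsdsPrimaryHeader(uint8_t hdr[6], uint16_t apid, uint16_t seq,
                        uint16_t dataBytes)
{
    /* version=0 (3 bits) | type=0, telemetry (1 bit)
       | secondary header flag=0 (1 bit) | APID (11 bits) */
    uint16_t w0 = apid & 0x07FF;
    /* sequence flags=3, unsegmented (2 bits) | sequence count (14 bits) */
    uint16_t w1 = (uint16_t)(0xC000 | (seq & 0x3FFF));
    /* packet data length field = number of data bytes minus 1 */
    uint16_t w2 = (uint16_t)(dataBytes - 1);

    hdr[0] = (uint8_t)(w0 >> 8); hdr[1] = (uint8_t)w0;
    hdr[2] = (uint8_t)(w1 >> 8); hdr[3] = (uint8_t)w1;
    hdr[4] = (uint8_t)(w2 >> 8); hdr[5] = (uint8_t)w2;
}
```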
EM2 FSW Release
Goal: Demonstrate multi-tower, single-CPU operation with 1553 interface
• Hardware
  • Multiple towers (real or FESs)
  • Multiple TEMs
  • GASU: Command Response Unit (CRU), Event Builder Module (EBM), ACD Electronics Module (AEM), Global Trigger Module (GEM)
  • 1 COTS "SIU/EPU" CPU (cPCI) with Ethernet, serial port, SIB, LCB
• Software
  • All of EM1 functionality
  • Multiple-tower capabilities
  • AEM configuration
  • AEM event acquisition
  • Capability to inject marker events into event streams to provide notice of filter parameter changes
  • LAT mode transitions
    • Engineering and safe modes
  • LAT spacecraft interface
    • 1553
    • Command and telemetry
  • File management system
  • Charge injection calibration
• In parallel
  • Filter development and testing
  • Boot development and testing
EM2 FSW Architecture “Spacecraft”(SIIS or SBC) Event Builder (EB) output side. The EB is an element of the GASU. LAT “Instrument” To SSR To SIU “SolidStateRecorder” Spacecraft Interface Unit Other Tasks 1553 Rx service Software Watchdog 1 PPSInterrupt GBM Interrupt Discretes(to RAD750 PIDs) LCB Rx service Multiple TowersorFront EndSimulators Masters Slaves 1553 Q Q SC Att./Time SC Att./Time Q Q Q Instr. Phys. Instr. Phys. Q Q Q Q File/Object File/Object Q Q Q HSK HSK Q Q Primitive Q Q Q 1553 Tx service LCB Tx service Q From SIU Event Assembly Event Builder (EB) input side. The EB is an element of the GASU. Command/Response Unit (CRU). The CRU is an element of the GASU.
EM2 Function/Task/Package Mapping
• SIU functions: EM1 plus
  • Extended LAT configuration
  • Extended health, status monitoring
  • SC Att./Time message processing
  • File management
  • Mode control
• "EPU" functions: EM1 plus
  • Event filtering
  • Charge injection calibration (all subsystems)
EM2 FSW Packages (page 1 of 2)
• All of the EM1 100% packages, plus
• FMP – File management package
  • 70% (use with RAM disk only)
• EDP – EPU event dispatch
  • 100%
• GCFG – Configuration of front-end electronics
  • Complete to 100%
• SOP – SIU event output package
  • Complete to 100%
• HSK – Housekeeping and low-rate science
  • Handles housekeeping and low-rate science data
  • 80%; handle all hardware test points in EM2; monitor CPU metrics
• CHP – CPU housekeeping
  • Generates CPU metrics (memory, idle time)
  • 100%
EM2 FSW Packages (page 2 of 2)
• MCP – Mode control
  • Handles run control
  • 50%
• EFP – Event filtering
  • Filters out background events
  • 100%
• GPS – Global Positioning System
  • Handles GPS time hack/message/LAT time correlation
  • 100%
• LCP – LAT command handling
  • Dispatch of 1553 messages
  • 100%
• CO1553 – 1553 driver
  • Interface to 1553 hardware
  • 100%
• ZLIB – Data deflate/inflate
  • Compresses/decompresses files
  • 100%
Full LAT FSW Release
Goal: Demonstrate full LAT operation (multi-tower, multi-CPU, SC interfaces)
• Hardware
  • All towers / FESs / TEMs
  • ACD FES
  • GASU: Command Response Unit (CRU), Event Builder Module (EBM), ACD Electronics Module (AEM), Global Trigger Module (GEM)
  • Multiple engineering RAD750s
  • SIIS
• Software
  • All of EM2 functionality
  • Boot and startup operations
  • LAT hardware power control
  • Thermal control system
  • Multiple-processor capabilities
    • CPU-to-CPU communications
    • Scatter/gather synchronization by SIU
    • EPU configuration by SIU
  • Spacecraft message processing
    • Attitude, time, ancillary data
  • Event filter operation
  • Transient detection and reporting
Full LAT FSW Architecture
[Diagram: identical in structure to the earlier "FSW Architecture with Task Framework" slide, now realized in full: SIU and EPU(s) each run a software watchdog, with masters and slaves connected by message queues; 1553 Rx/Tx services to the spacecraft; discretes (1 PPS interrupt, GBM interrupt) to the RAD750 PIDs; LCB Rx/Tx services for event assembly (EB input side), command/response (CRU) and telemetry to the SSR (EB output side). The EB and CRU are elements of the GASU.]