
Algorithm Verification for AATSR







  1. Algorithm Verification for AATSR Andrew R. Birks Rutherford Appleton Laboratory Chilton, Didcot Oxfordshire OX11 0QX UK ENVISAT Calibration Review: 12 September 2002

  2. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  3. Overview • The Algorithm Verification Plan • Preliminary Issues • Changes to IPF in early commissioning • Data for Algorithm Verification

  4. AATSR Operational And Reference Processors • Operational Processor (OP) • Component of ENVISAT Ground Segment: conforms to Ground Segment architecture • Procured by ESA from Industry • Processing Algorithms (Detailed Processing Model) specified by RAL and based on ATSR-2 processing • Reference Processor (RP) • Independent implementation of DPM developed at RAL • Source of Test Data Sets for acceptance testing of Operational Processor • Algorithm Test Platform • Stand-alone Processor

  5. Processor Acceptance Testing • Test Data Sets for AATSR OP acceptance • generated by the AATSR Reference Processor (RP) • based on pseudo-AATSR data derived from ATSR-2 source packets • OP Acceptance Testing made significant use of intermediate breakpoints • RP was tested against ATSR-2 data products • testing to date may not fully address new elements not present in ATSR-2 processing • limited range of test conditions

  6. Objectives • To verify that • the algorithms used by the OP work correctly when presented with real AATSR data • the AATSR products are being correctly generated • To verify, and if necessary regenerate, auxiliary data files used by AATSR OP • Algorithm Verification is distinct from instrument commissioning activities (described in AATSR Commissioning Plan) • Algorithm Verification logically precedes Product Validation (described in AATSR Validation Plan), which is concerned with the geophysical validation of the products

  7. Basis of Verification Plan • Verification primarily based on scrutiny of AATSR Products • Plan does not assume availability of RP data • In fact RP data was available and has been used • Processor breakpoints not routinely available • Plan provides for use of ATSR-2 data if available • Not all Level 1B processing modules can be verified directly in this way since intermediate results do not appear in products • Problems will appear indirectly • Breakpoints required for direct verification • Operational processor breakpoints may also be required to investigate specific problems should they arise

  8. Verification environment • AATSR platforms (Sun Ultra running SunOS 5.6) • aatsr: original RP platform • aatsr2: engineering data system platform • solstice: development machine • IDL tools implemented on aatsr and solstice • Level 1B verification tools developed from RP testing utilities • gbtr_comp: displays and compares channel/view images • cloud clearing verification • display of confidence flags • ENVIVIEW • Used for header verification and inspection of single fields • Available on any machine supporting JAVA

  9. Visible Calibration Issues • Issues arising prior to or outside the scope of algorithm verification proper • VISCAL algorithm did not work with real AATSR data because of undocumented differences in VISCAL monitor sampling between ATSR-2 (basis of test data) and AATSR • Solution: New algorithm (update to DPM) and new PC1 file • VC1 File: difference in units of visible calibration coefficients between test and pre-launch INS file (traceable to differences in specification of ATSR-2 and AATSR VISCAL gains) • Interim solution - new VC1 file. This will be resolved automatically when the new calibration algorithm becomes operational

  10. Other Preliminary Issues • Scan jitter: IPF wrongly treated scans affected by scan mirror jitter as invalid, resulting in missing scans on images • Solution: IPF corrected • I/R calibration: Errors in pre-launch INS file (BB temperature conversion parameters) led to erroneous calibration and associated log file errors • Solution: New INS file • Browse algorithm: issues of visual appearance • Modifications proposed

  11. Changes to IPF in early commissioning • NRT data supplied for algorithm verification processed by IPF Version 5.01.02 • Scan jitter error corrected (V5.02.00) • Modified Browse algorithm (V5.02.00) • Histogram equalisation removed from daytime algorithm • Modifications to night-time algorithm concept • New VISCAL algorithm (pending)

  12. Data for Algorithm Verification

  13. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  14. PRODUCT STRUCTURE • All ENVISAT Products conform to the following structure: • Main Product Header (MPH) - Identical format for all ENVISAT Products • Specific Product Header (SPH) - Product specific. Contains Data Set Descriptors (DSD) that point to the subsequent data sets • One or more Data Sets: these may be: • Annotation Data Sets (ADS) - containing auxiliary data • Measurement Data Sets (MDS) - containing instrument data • Granule: the minimum length of an orbit segment used as an increment for definition of floating scenes • For AATSR, 1 granule = 32 image rows
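The MPH and SPH described above are ASCII records of `KEY=value` lines. As a minimal sketch (not part of the original talk), header parsing can be illustrated as below; the field names and values are illustrative examples, not the full ENVISAT specification:

```python
# Minimal sketch: parse an ENVISAT-style ASCII header (MPH/SPH) into a dict.
# The sample product name and size below are made-up illustrations.
def parse_header(text):
    fields = {}
    for line in text.splitlines():
        key, sep, value = line.partition("=")
        if sep:  # keep only well-formed KEY=value lines
            fields[key.strip()] = value.strip().strip('"')
    return fields

sample = 'PRODUCT="ATS_TOA_1P_EXAMPLE"\nTOT_SIZE=+0000002383<bytes>'
mph = parse_header(sample)
```

A real verification tool would additionally enforce the fixed field layout and record lengths; this sketch only shows the keyword=value convention.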

  15. Format and Header Verification • Verify that: • MPH is present and has the expected size • SPH is present and has the expected size • All ADS and MDS types as specified in the IODD are present • SPH contains one DSD corresponding to each data set present • SPH contains all required reference and spare DSDs • Total product size is consistent with the sum of its component data sets, as defined by the Data Set Descriptors • Specified MPH field contents have realistic magnitudes • Specified SPH field contents have realistic magnitudes • Each product type independently verified

  16. Format verification • IDL tool verifies that: • First line of MPH starts with correct keyword string ‘PRODUCT=’ • First line of SPH starts with correct keyword string. (This verifies indirectly that the length of the MPH is correct.) • The SPH length is correct • The number of DSDs is correct as specified in the IODD • DSD parameters are internally consistent • File size calculated by the procedure should equal that specified by the operating system directory listing • This test utility was run successfully on each of the supplied files • No errors were reported, and the calculated file size matched that given by the directory listing in each case
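Two of the checks above (the `PRODUCT=` keyword test and the size-consistency test) can be sketched as follows. This is not the original IDL tool; the byte sizes used in the example are illustrative assumptions:

```python
def check_mph_start(first_line):
    # The first MPH line must begin with the keyword string 'PRODUCT='
    return first_line.startswith("PRODUCT=")

def check_product_size(mph_size, sph_size, ds_sizes, file_size):
    # The total file size must equal MPH + SPH + the DS_SIZE field of
    # every Data Set Descriptor (all sizes in bytes)
    return mph_size + sph_size + sum(ds_sizes) == file_size
```

The second function mirrors the slide's test that the calculated file size equals the size reported by the operating system directory listing.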

  17. Specific Product Header • Detailed scrutiny of product SPH has revealed two minor discrepancies: • The auxiliary temperatures are misplaced • The geolocation mid-values (first_mid_lat, …, last_mid_long) are not compliant with the DPM

  18. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  19. Level 1b Processing Architecture • [Diagram: Level 0 data and auxiliary data are processed by the radiometric calibration, geolocation & resampling, and cloud clearing & surface classification modules to produce Level 1b data]

  20. GBTR PRODUCT STRUCTURE (1)

  21. GBTR PRODUCT STRUCTURE (2)

  22. GBTR PRODUCT STRUCTURE (3)

  23. Verification of ADS • Summary Quality ADS • Geolocation ADS • Scan Pixel x and y ADS • Solar and Viewing Angles ADS (nadir and forward view) • Scan and Pixel Number ADS • VISCAL GADS • Products to date do not include the VISCAL GADS • Verification cannot take place until products using the new VISCAL algorithm become available

  24. Summary Quality ADS • ADS contains packet validation words • Verification requires that • numerical values are within allowed limits • non-zero fields correspond to actual packet errors • Inspection of SQADS from L1B product shows • Only non-zero fields corresponded to scan jitter • Other invalid scans, visible in the images, are not flagged in the SQADS

  25. Geolocation ADS • ADS includes latitudes and longitudes for image tie points • Verification procedure: • Calculate distance between pairs of tie points using exact geodetic formula • Confirm that these distances correspond to the correct grid spacing • Along-track comparison: • compare calculated interval between adjacent rows with first differences of scan y co-ordinate • Across-track comparison • Compare calculated intervals with nominal grid spacing
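The plan calls for an exact geodetic (ellipsoidal) distance formula; as a simplified sketch only, the same check can be illustrated with a spherical haversine approximation. The nominal spacing and tolerance values passed in are assumptions for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2, r=6371000.0):
    """Spherical great-circle distance in metres between two points given
    in degrees. An approximation to the exact geodetic formula the
    verification plan specifies."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + \
        math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_spacing(points, nominal_m, tol_m):
    """Confirm consecutive tie points are nominal_m apart, within tol_m.
    points is a list of (lat, lon) pairs in degrees."""
    return all(abs(haversine_m(*a, *b) - nominal_m) <= tol_m
               for a, b in zip(points, points[1:]))
```

On an ellipsoid the discrepancies would differ slightly; the slide's conclusion that residuals stay within numerical precision presupposes the exact formula.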

  26. Geolocation results • In each direction the discrepancies between the calculated and expected values lie within the bounds expected on the basis of the numerical precision • Verification satisfactory

  27. Scan Pixel x and y ADS • ADS contains linear, x and y co-ordinates for instrument tie pixels • Confirm that: • The x co-ordinate varies monotonically and in the correct range from one end of the record to the other • For a given scan, the y co-ordinate shows a minimum (nadir view) or maximum (forward view) in the swath centre. These should differ by approximately 900 km • The y co-ordinate increments by 1 granule (about 32 km) between consecutive tie scans
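The three checks on this slide can be sketched as follows. The tolerances (50 km on the view separation, 500 m on the granule increment) are illustrative assumptions, not values taken from the verification plan:

```python
def verify_scan_xy(x, y_nadir, y_forward, sep_m=900_000, tol_m=50_000):
    """Check one tie-scan record: x monotonic across the scan, and the
    nadir-view y minimum ~900 km from the forward-view y maximum."""
    steps = [b - a for a, b in zip(x, x[1:])]
    monotonic = all(s > 0 for s in steps) or all(s < 0 for s in steps)
    sep_ok = abs(abs(max(y_forward) - min(y_nadir)) - sep_m) <= tol_m
    return monotonic and sep_ok

def check_granule_increment(y_tie, granule_m=32_000, tol_m=500):
    """y should advance by about one granule (~32 km) between
    consecutive tie scans."""
    return all(abs((b - a) - granule_m) <= tol_m
               for a, b in zip(y_tie, y_tie[1:]))
```

The example values in the test mirror the representative results quoted on the next slide (separation 928,742 m; mean increment 32,185 m).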

  28. Scan pixel x and y results • Representative results for individual scans: • x co-ordinate • range +267,076 to -267,832 (nadir view) • range -281,343 to +267,683 (forward view) • y co-ordinate • extrema of typical scan separated by 928,742 m • mean increment for selected records 32,185 m; variations about mean consistent with expected errors • No discrepancies against formal test criteria • Discrepancy between time of final ADS record and sensing stop time suggests final record may be written incorrectly

  29. Solar and Viewing Angles ADS • Azimuth and elevation of sun and of satellite as seen from tie pixels • Verify that azimuth and elevation angles show physically realistic values and variation around orbit • Quantitative results not yet available • Investigation shows that the IPF is not implementing cosmetic fill of data gaps correctly. (Cosmetic fill is required to ensure that the cloud clearing modules do not use invalid angles in data gaps.)

  30. Scan and Pixel Number ADS • Information to relate the image pixels to the instrument scans from which they came • Provided to allow (partial) reconstruction of ungridded products • Pixel number should vary systematically along the record, within the correct range defined by the n/f view pixel selection map • Scan number should increment by 32 ± 1 between consecutive records • Inspection has identified two deviations from the DPM affecting these ADS • The cosmetic fill of nadir scan numbers defined by the DPM appears to be absent. (This is required to guarantee the correct time tagging of AST product cells in Level 2 processing) • Each ADS appears to start with a supernumerary record of zeros
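The scan-number increment test, and a detector for the supernumerary record of zeros noted above, can be sketched like this (record layout simplified to a flat list of values):

```python
def check_scan_numbers(scan_numbers, step=32, tol=1):
    # Scan number should increment by 32 +/- 1 between consecutive records
    return all(abs((b - a) - step) <= tol
               for a, b in zip(scan_numbers, scan_numbers[1:]))

def leading_zero_record(records):
    # Flag the supernumerary all-zero first record observed in these ADS
    return bool(records) and all(v == 0 for v in records[0])
```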

  31. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  32. Verification of MDS • Image Plane • Confidence Words • Land and Cloud Flags

  33. Image Plane Consistency • Full resolution image products have been inspected using a product comparison utility. Side-by-side comparisons have included • Comparisons of different channels and views within the same product • Comparisons of IPF product with the corresponding AATSR Reference Processor (RP) product for the same orbit • Visual appearance of images consistent between channels and views within the same product • Visual agreement between IPF and RP products good • Good agreement between processors where exception values (pixel saturation or zero values) flagged

  34. Further MDS Comparisons • Brightness temperature histograms of infra-red pixels show good agreement between processors • Visible channel reflectivities could not be compared because IPF products used the pre-launch VC1 file • Coastline superposition shows some mis-registration of forward and nadir views • This is to be expected because the IPF is still using the pre-launch CH1 file • Geolocation will be corrected when the new CH1 file containing instrument misalignment parameters becomes available • No unexpected artefacts

  35. Treatment of Invalid Scans • IPF products show scans affected by jitter as missing. This is corrected in IPF Version 5.02.00 • Invalid scans are infrequent, but differences are apparent between the IPF and Reference Processor in their treatment • e.g. At one point a missing scan (exception value -3) appears in the IPF product but not in the RP product • In two other cases the treatment appears to differ between processors • It is not clear at present whether this indicates an error in either processor. Further investigation is required

  36. Confidence Words • Preliminary inspection of the confidence words shows no obvious discrepancies or artefacts • At night, noise in the visible channels cannot be distinguished from pixel exception values, and can be masked into the confidence flags • This is not a deviation from the DPM, and so is strictly outside the scope of algorithm verification. However, consideration should be given to modifications to address this issue

  37. Land and Cloud Flags • Cloud clearing algorithm was exhaustively verified during IPF development, so major problems are not expected in this area • Availability of RP products permits direct comparison of land and cloud flags between processors using cloud viewer utility • Discrepancy in medium / high level test at start of orbit 1165 product. This has been traced to the absence of cosmetic fill in the solar angles ADS

  38. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  39. Auxiliary File Tuning Level 1 • Level 1B Processor Configuration Data File ATS_PC1_AX • Revised file required for new VISCAL algorithm • Initial revision supplied with test data for new algorithm; second revision envisaged when new algorithm becomes operational • Instrument Data File ATS_INS_AX (Telemetry Conversions) • Errors in BB conversion parameters identified; new version supplied • General Calibration Data File ATS_GC1_AX • Derived from pre-launch calibration - no updates currently envisaged

  40. Auxiliary File Tuning Level 1B • Level 1B Characterisation Data File ATS_CH1_AX • Update required to include instrument misalignment parameters • This will also include revised AOCS values • May include modified regridding tolerance values • Cloud LUT Data File ATS_CL1_AX • No update currently envisaged • Visible Calibration Data File ATS_VC1_AX • Not part of algorithm verification. New file supplied to correct scaling errors in pre-launch file, but will be replaced in due course by routine operational updates

  41. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  42. Browse Product • Three-colour image product at reduced resolution • Daytime browse product based on Ionia scheme using the 11 micron channel plus two visible channels • Night-time algorithm is based on the 11 micron brightness temperature image alone • Post-launch modifications to algorithm • IPF Version 5.01.02 implements the DPM algorithm • IPF Version 5.02.00 implements modified algorithm • histogram equalisation removed from day-time algorithm • simplification of night-time algorithm

  43. Browse Product Verification • Browse product from the IPF Version 5.01.02 available • Format verified; no discrepancies • Verification of MDS contents premature until products from the revised processor become available • Browse LUT Data File ATS_BRW_AX • No update currently envisaged, pending final form of new algorithm

  44. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  45. Conclusions on L1b Data Quality • Format tests have been applied to all products, and quantitative and qualitative tests have been applied to the Level 1b products • No format errors found • No major or blocking errors identified to date • Inspection of the Measurement Data Sets indicates that the instrument data is of good quality • No major unexpected artefacts or anomalies identified to date • Open issue of treatment of invalid scans • Detailed scrutiny of the product headers and Annotation Data Sets (ADS) has revealed a number of minor discrepancies • Problem reports have been raised and are under review

  46. Level 1b Outstanding Issues • The following Level 1b verification tests are still incomplete • Quantitative verification of solar and viewing angles • Comprehensive checks of cloud flags according to the Verification Plan • Verification of confidence flags for a complete orbit • The following awaits data from the new VISCAL algorithm • VISCAL GADS Verification • The following requires results using the revised CH1 file • Full verification of geolocation

  47. Level 1b Product Verification • Overview: The Algorithm Verification Plan • Format and Header Verification • Verification of ADS • Verification of MDS • Auxiliary File Tuning • Browse Product • Conclusions on L1b Data Quality • Proposals for enhancements

  48. Exception flags • Pixel exception values are small negative values • At night, noise in the visible channels can mimic exception values • Causes confidence flags to be wrongly set • Potential solution: modify exception values • Internally - solves problem with confidence words • Externally - resolves ambiguity
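The ambiguity this slide describes can be sketched as below. The missing-scan code -3 is quoted earlier in this talk; the remapped value is a hypothetical illustration of the proposed fix, not the operational AATSR encoding:

```python
# Exception codes are small negative pixel values; -3 (missing scan) is
# cited earlier in the talk, the mapping below is otherwise illustrative.
EXCEPTION_CODES = {-3: "missing scan"}

def is_exception(pixel_value):
    return pixel_value in EXCEPTION_CODES

# At night, a noisy visible-channel count can also take the value -3,
# so is_exception() cannot tell noise from a genuine exception.
# Proposed remedy: move exception codes well outside the noise range.
REMAPPED_CODES = {-3: -30000}  # hypothetical remapped value

def remap(pixel_value):
    return REMAPPED_CODES.get(pixel_value, pixel_value)
```

Remapping internally resolves the confidence-word masking; exposing the remapped codes externally removes the ambiguity for users, as the slide proposes.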

  49. Regridding Scheme • Original ATSR scheme modified for AATSR • Requirement driven by need to reduce size of scan/pixel number ADS (ADS #1, #2) • These ADS permit reconstruction of original (ungridded) pixel distribution on scans • Modifies destination grid point of some pixels to allow reconstruction over granule • Unfilled pixels in forward view caused by interaction of x - y geolocation errors with regridding algorithm changes • Modify x - y geolocation algorithm to reduce errors

  50. Level 2 Product Verification • Format verification has been applied to all Level 2 products received: no format errors found • Full resolution product ATS_NR_2P (GST) • Preliminary inspection of MDS with image viewer • ADS not yet inspected (contents should agree with corresponding Level 1 product ADS) • Averaged Products ATS_AR_2P (AST) and ATS_MET_2P (Meteo) • Preliminary inspection with display tool and ENVIVIEW
