
Software Benchmarking Results V. Husson


Presentation Transcript


  1. Software Benchmarking Results (V. Husson)

  2. Benchmark Report Card (Orbit Comparisons)

  3. Benchmark Report Card (Residual and Correction Comparisons)

  4. Benchmark Report Card (SINEX File Comparisons)

  5. Orbit Definitions
     • Orbit A - Nominal Model (initial orbit, NOTHING adjusted during the run)
     • Orbit B - Fixed EOP and Station Coordinates (iterated orbit, ONLY the orbit is adjusted)
     • Orbit C - Final Orbit (ALL adjusted: orbit, station positions, biases, EOP)
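
  The three runs differ only in which parameter groups are adjusted. A minimal Python sketch of that configuration (the dictionary layout and names are illustrative, not taken from the benchmark software):

     # Illustrative only: which parameter groups each benchmark run adjusts.
     ORBIT_RUNS = {
         "A": {"orbit": False, "station_coords": False, "biases": False, "eop": False},  # nominal model
         "B": {"orbit": True,  "station_coords": False, "biases": False, "eop": False},  # orbit only
         "C": {"orbit": True,  "station_coords": True,  "biases": True,  "eop": True},   # final solution
     }

     def adjusted_parameters(run_id):
         """Return the parameter groups estimated in a given benchmark run."""
         return [name for name, flag in ORBIT_RUNS[run_id].items() if flag]

     print(adjusted_parameters("B"))   # ['orbit']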

  6. Summary of Solutions

  7. Radial Comparisons (Orbit A & B) [plots of Orbit A and Orbit B radial differences: CRL-CSR, IAAK-CSR, JCET-CSR, NERC-CSR, AUSLIG-CSR, NASDA-CSR]

  8. Radial Comparisons (Orbit A & B)
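
  The radial comparisons reduce each pair of orbit solutions to the component of their position difference along the geocentric radial direction. A hedged sketch, assuming both ephemerides are available as arrays of positions (meters) at common epochs:

     import numpy as np

     def radial_difference(pos_a, pos_b):
         """Radial component of (pos_b - pos_a) for two ephemerides sampled
         at the same epochs; positions are (N, 3) arrays in meters."""
         pos_a = np.asarray(pos_a, dtype=float)
         pos_b = np.asarray(pos_b, dtype=float)
         radial_unit = pos_a / np.linalg.norm(pos_a, axis=1, keepdims=True)
         return np.sum((pos_b - pos_a) * radial_unit, axis=1)

     # Example: RMS radial agreement between two centers' Orbit B solutions.
     # dr = radial_difference(csr_positions, jcet_positions)
     # print("radial RMS [m]:", np.sqrt(np.mean(dr**2)))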

  9. GEODYN Radial Analysis (JCET vs GEOS)

  10. Radial Comparisons (Orbit C)

  11. Radial Comparisons (Orbit C)

  12. Radial Comparisons (Orbit C)

  13. Radial Discontinuities (Orbit C)

  14. Radial Discontinuities (Orbit B)

  15. Radial Discontinuities (Orbit A)
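
  A radial discontinuity can be quantified at the boundary between consecutive arcs: evaluate both arcs at the common boundary epoch and take the radial component of the jump. A minimal sketch, assuming each arc can be interpolated to an arbitrary epoch (the interpolator callables here are placeholders):

     import numpy as np

     def radial_discontinuity(arc1_interp, arc2_interp, boundary_epoch):
         """Radial jump (meters) between two consecutive arcs at their common
         boundary epoch. Each interpolator maps an epoch to an (x, y, z) position."""
         p1 = np.asarray(arc1_interp(boundary_epoch), dtype=float)
         p2 = np.asarray(arc2_interp(boundary_epoch), dtype=float)
         radial_unit = p1 / np.linalg.norm(p1)
         return float(np.dot(p2 - p1, radial_unit))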

  16. Residual Comparisons
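
  The residual comparisons difference the observed-minus-computed values that each center reports for the same normal points. A sketch of that bookkeeping, assuming each center's residual file has already been parsed into an {epoch: residual} mapping (the file format itself is not shown):

     def residual_differences(center_a, center_b):
         """Difference per-normal-point residuals (meters) from two analysis
         centers, matched by observation epoch; unmatched epochs are skipped."""
         common = sorted(set(center_a) & set(center_b))
         return {epoch: center_a[epoch] - center_b[epoch] for epoch in common}

     # Example with toy values (epochs in seconds of day, residuals in meters):
     a = {100.0: 0.0123, 160.0: -0.0051, 220.0: 0.0004}
     b = {100.0: 0.0121, 160.0: -0.0050, 310.0: 0.0099}
     print(residual_differences(a, b))   # approximately {100.0: 0.0002, 160.0: -0.0001}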

  17. Residual Comparisons (Orbit A)

  18. Residual Analysis of 1st Pass (Orbit A)

  19. GEODYN Residual Analysis [plots for Orbits A, B, and C: JCET - ASI, GEOS - ASI, JCET - GEOS]

  20. GEODYN Residual Analysis (Midnight Crossing)

  21. NASDA Residuals (Orbit B): large residuals on the first 3 normal points on Nov 1

  22. Residual Comparison (Orbit B)

  23. Residual Comparison (NERC)

  24. Refraction Analysis

  25. Refraction Analysis

  26. NASDA/IAAK Refraction Analysis

  27. NASDA Refraction Analysis

  28. IAAK Refraction Analysis

  29. Refraction Analysis of 1st pass

  30. CSR Refraction Analysis

  31. CSR Refraction Analysis

  32. GEODYN Refraction Analysis

  33. DGFI Refraction Analysis
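
  Refraction differences between centers are most informative when viewed against elevation, since the tropospheric delay grows steeply toward the horizon. A hedged sketch of that binning (the bin edges and input arrays are illustrative, not from the benchmark analysis):

     import numpy as np

     def refraction_diff_by_elevation(elev_deg, refr_a, refr_b,
                                      bin_edges=(10, 20, 30, 45, 60, 90)):
         """Mean refraction-correction difference (meters) between two centers,
         binned by elevation angle in degrees."""
         elev_deg = np.asarray(elev_deg, dtype=float)
         diff = np.asarray(refr_a, dtype=float) - np.asarray(refr_b, dtype=float)
         stats = {}
         for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
             mask = (elev_deg >= lo) & (elev_deg < hi)
             if mask.any():
                 stats[(lo, hi)] = float(diff[mask].mean())
         return stats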

  34. Center of Mass Corrections
     • Everyone is using 0.251 meters for LAGEOS.
     • The CoM corrections in JCET's V4 .cor files are in error: the files state a CoM of 0.252 m, but 0.251 m was actually used.
     • Software changes may be necessary to accommodate system-dependent LAGEOS CoM corrections.
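
  The center-of-mass correction refers the measured range from the retroreflector array's effective reflection point to the satellite's center of mass; the benchmark uses a single 0.251 m value for LAGEOS. A sketch of how a station-dependent table could replace that constant (the table keys and values below are placeholders, not real system-dependent corrections):

     # Single nominal value used in the benchmark for LAGEOS (meters).
     LAGEOS_COM_NOMINAL = 0.251

     # Hypothetical station-dependent table (placeholder values).
     LAGEOS_COM_BY_STATION = {"7090": 0.251, "7105": 0.251}

     def com_corrected_range(measured_range_m, station_id=None):
         """Refer the measured range to the satellite center of mass by adding
         the CoM correction (station-dependent if a table entry exists)."""
         com = LAGEOS_COM_BY_STATION.get(station_id, LAGEOS_COM_NOMINAL)
         return measured_range_m + com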

  35. Relativity Correction

  36. Relativity Corrections

  37. Relativity Corrections

  38. Relativity Corrections
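
  The relativity correction compared here is, in the usual SLR formulation, the general relativistic (Shapiro) light-time delay caused by the Earth's gravity field. A sketch of the standard one-way expression, with rounded values for GM and c:

     import math

     GM_EARTH = 3.986004418e14   # m^3/s^2
     C_LIGHT = 299792458.0       # m/s

     def shapiro_range_correction(r_station_m, r_satellite_m, rho_m):
         """General relativistic (Shapiro) one-way range correction in meters.
         r_station_m, r_satellite_m: geocentric distances of station and satellite;
         rho_m: station-to-satellite distance."""
         s = r_station_m + r_satellite_m
         return (2.0 * GM_EARTH / C_LIGHT**2) * math.log((s + rho_m) / (s - rho_m))

     # Rough LAGEOS-like geometry: the correction is at the few-millimeter level.
     print(shapiro_range_correction(6.371e6, 1.227e7, 7.0e6))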

  39. SINEX File Comparisons (parameters and unknowns)

  40. Range Bias Comparison (Orbit C)

  41. Height Comparisons (Orbit C)

  42. Height Comparisons (Orbit C) with Range Bias removed
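
  Because an unestimated range bias is largely absorbed by the station height, slides 41 and 42 look at heights before and after the range bias is removed. One simple diagnostic of that coupling, as a hedged sketch (the exact bias-to-height mapping depends on pass geometry and is not modeled here):

     import numpy as np

     def height_bias_correlation(height_diff_m, range_bias_diff_m):
         """Correlation between per-station height differences and range-bias
         differences (both in meters); a strong correlation suggests the bias
         is being absorbed into the height estimate."""
         h = np.asarray(height_diff_m, dtype=float)
         b = np.asarray(range_bias_diff_m, dtype=float)
         return float(np.corrcoef(h, b)[0, 1])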

  43. Lessons learned
     • Need to specify minimum resolution of parameters to be compared
     • Need clearer definition of model standards

  44. Future Software Modifications that may require benchmark testing
     • Station-dependent CoM corrections
     • Use bias file for a priori biases
     • Multi-color data capability
     • Weight data based on #obs/bin

  45. Recommendations
     • QC your own files before you submit your solution.
     • Report all range corrections in the residual file to at least 0.000001 meter (i.e. 0.001 millimeters); see the formatting sketch below.
     • Verify whether any problems found in the benchmark will impact the corresponding POS/EOP solution(s).
     • Put benchmarking presentations on-line, ASAP.
     • Distribute findings to ACs not in attendance, ASAP.
     • In the POS/EOP pilot project, submit at least one solution with the .orb and .res files to ensure problems identified in the benchmark do not “sneak” back in.
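
  The resolution recommendation amounts to keeping six decimal places when corrections are written in meters; a tiny Python illustration (the variable name is arbitrary):

     # Writing a correction with micrometer (0.000001 m) resolution keeps the
     # recommended 0.001 mm level in the residual files.
     correction_m = 0.0123456789
     print(f"{correction_m:.6f}")   # 0.012346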

  46. What’s next
     • What analysis should be performed that has not yet been performed?
     • Establish pass/fail criteria for the report card.
     • Test for time-of-flight and epoch rounding/truncation issues (see the sketch below).
     • Do we need to modify our modeling requirements?
     • Should we test and isolate any particular types of models (e.g. range bias estimation, along-track acceleration)?
     • Should we expand the dataset to include LAGEOS-2 and/or Etalon, or other satellites?
     • SP3 format for orbits (are we ready?)
     • Separate orbit and software benchmarking?
     • Document and distribute results.
     • List action items.
     • Anything else?
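
  For the time-of-flight rounding item, the size of the effect is easy to bound: a truncation of the two-way time of flight maps into range through the speed of light. A small sketch (the example time of flight is a made-up LAGEOS-like value):

     # Rough size of the range error introduced by rounding a two-way
     # time of flight to a fixed number of decimal places.
     C_LIGHT = 299792458.0  # m/s

     def range_error_from_tof_rounding(tof_s, decimals):
         """One-way range error (meters) from rounding a two-way time of flight."""
         rounded = round(tof_s, decimals)
         return 0.5 * C_LIGHT * abs(tof_s - rounded)

     # Example: rounding a ~40 ms time of flight to 12 decimal places
     # (1 picosecond) can shift the one-way range by up to ~0.075 mm.
     print(range_error_from_tof_rounding(0.040000000000437, 12))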
