
PLATO Data Processing Methods for Planetary Transits & Star Oscillations

Explore the onboard processing modes and functions studied for the PLATO mission (PPLC study, February 2009): smearing correction, weighted mask photometry, kinematic aberration correction, jitter correction, and more. Understand the impact of the various corrections on star observations and data quality, and dive into the necessity of jitter correction, PSF determination, sky background modelling, the onboard processing architecture, and how the star samples are dimensioned and processed onboard.





Presentation Transcript


  1. PLATO (PLAnetary Transits & Oscillations of stars): onboard data treatment. PPLC study, February 2009. On behalf of Reza Samadi, for the PLATO data treatment team.

  2. Onboard processing modes

  3. Main functions of onboard/onground processing
  • Observation mode:
    • smearing correction
    • weighted mask photometry / aperture photometry
    • kinematic aberration correction
    • jitter correction
  • Configuration mode:
    • measuring/modelling the PSF
    • measuring/modelling the sky background

  4. Smearing correction
  • Normal telescopes: sampling 25 s; integration time 23 s; readout time 2 s
  • Fast telescopes: sampling 2.5 s; integration time 2.25 s; transfer time 0.25 s
  • measure the smearing thanks to the overscan rows
  • subtract it from each image
  [Figure: CCD layout (overscan rows, CCD registers) and before/after images of the correction]
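A minimal Python/NumPy sketch of the column-wise overscan subtraction described above. The function name and the simple per-column mean of the overscan rows are illustrative assumptions, not the flight algorithm (which would, for instance, account for the different integration and transfer durations).

```python
import numpy as np

def correct_smearing(image, overscan):
    """Subtract the column-wise smearing estimated from the overscan rows.

    image    : 2D array (rows x cols), raw frame.
    overscan : 2D array (n_overscan_rows x cols), rows exposed only during
               the frame transfer, so they contain the smear alone.
    """
    # To first order the smear is the same in every row of a given column,
    # so averaging the overscan rows gives a per-column estimate.
    smear_per_column = overscan.mean(axis=0)          # shape (cols,)
    return image - smear_per_column[np.newaxis, :]    # broadcast over rows

# Toy example: a 10x5 frame with a constant smear of 3 ADU per column.
rng = np.random.default_rng(0)
frame = rng.poisson(100, size=(10, 5)).astype(float) + 3.0
overscan = np.full((2, 5), 3.0)
clean = correct_smearing(frame, overscan)
```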

  5. Weighted mask photometry: minimizing the impact of confusion (cf. CDF study)
  • the target star can be polluted by a neighbouring star
  • to avoid confusion: use of a weighted mask
  • weights = integral of the PSF over each pixel
  • need to know the PSF
  • normal aperture photometry to be used for brighter stars
  [Figure: PSF of an mV = 11 target star with a nearby faint star, and the corresponding weighted mask (~90 % of the flux)]
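A minimal sketch of the weighted-mask flux estimate, assuming the PSF has already been integrated over each pixel of a small window centred on the target (the product of the PSF determination described later). The normalisation and the plain weighted sum are illustrative assumptions.

```python
import numpy as np

def mask_weights_from_psf(psf_on_pixels):
    """Weights = integral of the PSF over each pixel, normalised to 1."""
    return psf_on_pixels / psf_on_pixels.sum()

def weighted_mask_flux(window, weights):
    """Weighted-mask photometry: weighted sum of the pixel values."""
    return np.sum(weights * window)

# Toy example: Gaussian PSF on a 7x7 window around an mV ~ 11 target.
y, x = np.mgrid[0:7, 0:7]
psf = np.exp(-((x - 3.0)**2 + (y - 3.0)**2) / (2 * 1.2**2))
weights = mask_weights_from_psf(psf)
window = 1e4 * psf / psf.sum() + np.random.default_rng(1).normal(0, 5, (7, 7))
flux = weighted_mask_flux(window, weights)
```

With such a weight profile, pixels that carry little of the target's PSF (and may be dominated by a neighbouring star) contribute little to the flux estimate, which is how the confusion is limited.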

  6. Differential kinematic aberration
  PLATO:
  • large field of view: 42°
  • pixel size: 12.5" (14.3")
  • the effect is much more important than for CoRoT
  • star displacements over 1 month: ~7 pixels (worst case)
  • this would induce an unacceptable decrease of the flux
  • thermoelastic variations of the telescope pointing direction can also induce star displacements
  Mitigation (see the sketch below):
  - update the mask position frequently, to avoid flux loss
  - this introduces a periodic perturbation whose impact must be limited
  - update every hour (tbc); the hourly update is entirely predictable
  - less frequent updates for the telescope (thermoelastic) variations
  [Figure: flux vs. time for an mV = 11 star over 5 months]
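Since the kinematic displacement is entirely predictable from the spacecraft orbital velocity, the mask re-centring can be a simple scheduled check. A minimal sketch, assuming the predicted star positions are supplied externally; the 0.5-pixel threshold and the hourly call are illustrative choices (the slide quotes an hourly update, tbc).

```python
def update_mask_origin(x_mask, y_mask, x_pred, y_pred, threshold_px=0.5):
    """Re-centre the integer-pixel mask on the predicted star position
    whenever the predicted drift exceeds `threshold_px`.

    x_pred, y_pred : star position (pixels) predicted for the current epoch
                     from the known orbital velocity (kinematic aberration)
                     and, less frequently, from thermoelastic pointing drifts.
    """
    if max(abs(x_pred - x_mask), abs(y_pred - y_mask)) > threshold_px:
        return round(x_pred), round(y_pred)    # move the mask to the new pixel
    return x_mask, y_mask                      # keep the current mask position
```

Because the update epochs are known in advance, the small periodic perturbation they introduce can be modelled and its impact limited.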

  7. Is jitter correction at all necessary?
  • CoRoT: 0.25" rms + orbital components
  • PLATO: specified 0.2" rms, with reference to the photon noise
  • PRNU does not seem to be a problem
  • depending on the jitter noise level and nature, the perturbation can be important or negligible
  • for bright stars the contribution can be important if the jitter is ~0.5" rms or more
  • aperture photometry results in negligible perturbations

  8. Jitter correction
  • from the knowledge of the PSF, we can predict the perturbation induced by any displacement (Fialho et al. 2007, PASP); see the sketch below
  • this method also corrects for differential aberration
  • the presence of polluting sources can be accounted for in the correction surface
  • accurate knowledge of the star displacements (x, y) is needed
  • an accurate PSF is needed
  [Figure: PSF, mask, and the resulting surface for the jitter correction]
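A minimal sketch of the correction-surface idea, in the spirit of Fialho et al. (2007) but not their actual implementation: the relative flux expected through the fixed mask is precomputed on a grid of displacements, and the measured flux is divided by the value corresponding to the measured (x, y) offset. SciPy is used here purely for the sub-pixel shift, and a nearest-grid-point lookup stands in for the interpolation a real implementation would use.

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift

def correction_surface(psf, mask, offsets):
    """Relative flux seen through the fixed mask when the star (PSF) is
    displaced by (dx, dy) pixels, normalised to the undisplaced case."""
    f0 = np.sum(mask * psf)
    return {(dx, dy): np.sum(mask * subpixel_shift(psf, (dy, dx), order=3)) / f0
            for dx, dy in offsets}

def jitter_corrected_flux(flux, dx, dy, surface):
    """Divide the measured flux by the predicted perturbation for the
    measured displacement (nearest precomputed offset, for simplicity)."""
    key = min(surface, key=lambda k: (k[0] - dx) ** 2 + (k[1] - dy) ** 2)
    return flux / surface[key]

# Toy example: Gaussian PSF, binary mask, displacements up to +/- 0.5 px.
y, x = np.mgrid[0:11, 0:11]
psf = np.exp(-((x - 5.0)**2 + (y - 5.0)**2) / (2 * 1.5**2))
mask = (psf > 0.05 * psf.max()).astype(float)
grid = [(dx, dy) for dx in np.linspace(-0.5, 0.5, 5) for dy in np.linspace(-0.5, 0.5, 5)]
surf = correction_surface(psf, mask, grid)
print(jitter_corrected_flux(0.98 * np.sum(mask * psf), 0.25, -0.1, surf))
```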

  9. PSF determination (configuration mode)
  Assumptions, for each telescope:
  • the PSF varies slowly across the field of view
  • we have N (= 1000) reference stars available, with associated image time series
  • we have a functional form of the PSF as a function of K parameters ai (e.g. width, skewness, etc.):
    PSF(x, y) = f_{a1, a2, ..., aK}(x - x0, y - y0)
  For each star and each telescope:
  • we constrain the parameters using the image time series; the fitted parameters ai(j) are then considered as a function of the position [x0(j), y0(j)] of star j
  • a 2D polynomial interpolation is then performed to derive the values of the parameters at any position across the field of the telescope (see the sketch below)
  • however, the PSF can depend on the star colour => 3D polynomial interpolation (x, y, colour)?
  • procedure to apply at TBD frequency (once a week?)
  • the PSF is used to calculate the mask weights and the jitter correction surface
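The per-star PSF fit depends on the chosen functional form, so it is omitted here; the sketch below only illustrates the second step, the 2D polynomial interpolation of one fitted parameter ai(j) across the field. The polynomial degree and the toy "width" parameter are assumptions.

```python
import numpy as np

def fit_parameter_surface(x0, y0, a_fitted, deg=2):
    """Least-squares fit of a 2D polynomial a(x, y) to the parameter values
    obtained from the N reference stars located at (x0, y0)."""
    terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1 - i)]
    design = np.column_stack([x0**i * y0**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(design, a_fitted, rcond=None)
    return terms, coeffs

def parameter_at(terms, coeffs, x, y):
    """Interpolated value of the PSF parameter anywhere in the field."""
    return sum(c * x**i * y**j for (i, j), c in zip(terms, coeffs))

# Toy example: a PSF 'width' that grows towards the corners of the field,
# sampled at N = 1000 reference-star positions (normalised coordinates).
rng = np.random.default_rng(2)
x0, y0 = rng.uniform(-1, 1, 1000), rng.uniform(-1, 1, 1000)
width = 1.0 + 0.3 * (x0**2 + y0**2) + rng.normal(0, 0.01, 1000)
terms, coeffs = fit_parameter_surface(x0, y0, width)
print(parameter_at(terms, coeffs, 0.5, -0.5))
```

A colour dependence would add a third coordinate to the same kind of fit, which is the 3D (x, y, colour) interpolation raised above.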

  10. Sky background determination (configuration mode)
  • set 400 background windows per telescope (100 per CCD)
  • collect a long enough time series of background measurements
  • the background is modelled using a 2D polynomial fit (see the sketch below)
  • the sky background level can then be estimated at any position, hence for all stars in the FOV
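A minimal sketch of the 2D polynomial background model, assuming a robust (median) level is taken per background window before the spatial fit; the degree-2 polynomial and the median are illustrative choices, not the baseline algorithm.

```python
import numpy as np

def background_model(win_x, win_y, win_series, deg=2):
    """Fit a 2D polynomial to the background-window measurements.

    win_x, win_y : window positions on the CCD (pixels).
    win_series   : array (n_windows, n_samples); the temporal median of each
                   window gives a robust background level.
    Returns a callable giving the background level at any (x, y).
    """
    level = np.median(win_series, axis=1)
    terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1 - i)]
    design = np.column_stack([win_x**i * win_y**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(design, level, rcond=None)
    return lambda x, y: sum(c * x**i * y**j for (i, j), c in zip(terms, coeffs))

# Toy example: 100 windows on one CCD, background with a smooth gradient.
rng = np.random.default_rng(3)
wx, wy = rng.uniform(0, 4500, 100), rng.uniform(0, 4500, 100)
series = 50.0 + 0.002 * wx[:, None] + rng.normal(0, 1, (100, 200))
background_at = background_model(wx, wy, series)
print(background_at(2250.0, 2250.0))
```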

  11. Onboard processing dimensioning: star samples
  • Sample P1: mV < 9.6 - 11.15; noise level < 27 ppm/h
    • 10 000 stars: photometry @ 50 s, centroids @ 600 s
    • subset: N = 1000 reference stars, mV = 8.6 - 9.6, individual light curves
    • sub-images (imagettes): n = 400 stars @ 25 s sampling
  • Sample P2: mV < 12; noise level < 80 ppm/h
    • 20 000 stars @ 600 s
    • oversampled: 400 stars @ 50 s sampling
  • Sample P3 (P4): 4.75 < mV < 7.3; noise level < 27 ppm/h
    • 500 (1 000) stars @ 50 s
    • subset: 100 stars, centroids @ 2.5 s
    • sub-images (imagettes): m = 100 @ 50 s
  • Sample P5: mV < 13.5; noise level 80 ppm/h; no centroids measured
    • 80 000 stars @ 600 s
    • oversampled: 1000 stars @ 50 s
  • Background windows: 400

  12. Onboard processing architecture
  • 1 DPU per telescope + 1 ICU (+ 1 redundant)
  • case 1: perform the onboard average
  • case 2: downlink all individual LCs
  • a trade-off is needed very soon!

  13. Normal telescope DPU processing

  14. Normal telescope data flow and TM volume

  15. Normal telescope data flow and TM volume

  16. Fast telescope data flow and TM volume

  17. Total TM rates
  • case 1: perform the onboard average
  • case 2: downlink all individual LCs

  18. Case 1 vs. Case 2 trade-off
  • Case 1: only the 1000 LCs from Sample P1 are downlinked: 31 Gb/day (with compression)
  • Case 2: all LCs are downlinked: 71 Gb/day (with compression)
  • Case 1: jitter correction to be done onboard! Outlier discarding and LC averaging to be done onboard (see the sketch below). Strong constraints on the onboard processing, no replay possible.
  • Case 2: jitter correction can be done on ground! Outlier discarding and LC averaging done on ground.
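For Case 1 the onboard software has to discard outliers and average the individual light curves of each star before downlink. The exact criteria are part of the trade-off; the sketch below is only a generic robust (MAD-based) clipped mean across telescopes.

```python
import numpy as np

def average_light_curves(lcs, n_sigma=5.0):
    """Combine the individual light curves of one star (one per telescope),
    discarding outlying points before averaging.

    lcs : array (n_telescopes, n_samples).
    """
    med = np.median(lcs, axis=0)
    mad = np.median(np.abs(lcs - med), axis=0)
    sigma = 1.4826 * mad + 1e-12               # robust scatter per time sample
    good = np.abs(lcs - med) < n_sigma * sigma # mask of points kept
    return np.nanmean(np.where(good, lcs, np.nan), axis=0)
```

In Case 2 this step moves to the ground segment, where it can be replayed with different settings.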

  19. Onboard processing H/W dimensioning
  • CPU for one DPU: LEON processor at 100 MHz
  • CPU occupation rate = 40%

  20. Open issues
  • Trade-off between Case 1 and Case 2: Case 2 is preferred, but can we afford to downlink 71 Gb/day of science data?
  • Pointing performance? Level and nature of the jitter? Is jitter correction needed?
  • Exact threshold in magnitude between weighted photometry and aperture photometry?
  • Model for the PSF?
  • Resolution required for the jitter correction?
  • Resolution required for the calculation of the weighted mask?
  • Photometry of the saturated stars? Down to which magnitude?
  • Calculation of the barycentre: thresholding? simple mask?
