OULE3 - Weak Lensing WP • Managers: • Jean-Luc Starck, Filipe Abdalla (implementation) • Benjamin Joachimi, Reiko Nakajima (validation) • ~25 people (~23 implementation, ~7 validation) • Mainly to raise questions and discussions!
The WL WP – validation and implementation: • Definition, development and testing of the algorithms for: • the computation of the tomographic (redshift-space) 2-pt (requirement) and 3-pt (goal) shear-shear, shear-galaxy, galaxy-magnification and shear-magnification correlation functions and power spectra/bispectra. Effects of masking, intrinsic alignments and nulling should be taken into account (see the estimator sketch below). • the computation of the continuous (spherical-harmonic) shear-shear and shear-galaxy power spectrum and bispectrum (goal). Effects of masking, intrinsic alignments and nulling should be taken into account. • a likelihood code to include all potential cosmology-model-independent errors on the shear-shear, shear-galaxy, galaxy-magnification and shear-magnification 2-pt correlation functions and power spectra (requirement). • the computation of the tomographic 2D mass, convergence and potential maps (requirement). Effects of masking should be taken into account. • the computation of the tomographic 3D mass, convergence and potential maps, as well as 3D maps as a function of redshift (goal). Effects of masking should be taken into account.
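To make the first bullet concrete, here is a minimal, brute-force sketch of a real-space shear-shear correlation function estimator for a single tomographic bin pair. The catalogue layout (ra, dec, e1, e2 arrays) is hypothetical, and a production code (e.g. a tree code such as Athena) would add per-galaxy weights, masking corrections and curved-sky geometry; this only pins down the quantity to be delivered and validated.

```python
import numpy as np

def xi_pm(ra, dec, e1, e2, theta_edges_arcmin):
    """Brute-force xi_+ / xi_- for one tomographic bin pair (catalogue with itself).
    Flat-sky, small-angle approximation; no weights, no mask correction."""
    x = np.radians(ra) * np.cos(np.radians(dec))    # flat-sky coordinates [rad]
    y = np.radians(dec)
    edges = np.radians(np.asarray(theta_edges_arcmin) / 60.0)
    nbins = len(edges) - 1
    num_p, num_m, npairs = np.zeros(nbins), np.zeros(nbins), np.zeros(nbins)

    for i in range(len(ra) - 1):
        dx, dy = x[i+1:] - x[i], y[i+1:] - y[i]
        theta = np.hypot(dx, dy)
        phi = np.arctan2(dy, dx)                    # position angle of each pair
        c2, s2 = np.cos(2 * phi), np.sin(2 * phi)
        # tangential/cross components relative to the line joining the pair
        et_i, ex_i = -(e1[i] * c2 + e2[i] * s2), e1[i] * s2 - e2[i] * c2
        et_j, ex_j = -(e1[i+1:] * c2 + e2[i+1:] * s2), e1[i+1:] * s2 - e2[i+1:] * c2
        b = np.digitize(theta, edges) - 1
        ok = (b >= 0) & (b < nbins)
        np.add.at(num_p, b[ok], (et_i * et_j + ex_i * ex_j)[ok])
        np.add.at(num_m, b[ok], (et_i * et_j - ex_i * ex_j)[ok])
        np.add.at(npairs, b[ok], 1.0)

    npairs = np.maximum(npairs, 1.0)
    return num_p / npairs, num_m / npairs           # xi_plus(theta), xi_minus(theta)
```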
Other relevant WPs for the WL crowd • Definition and testing of the mask: • algorithms for the computation of the selection functions in the photometric Wide and Deep galaxy surveys (requirement) • algorithms for the computation of the selection functions in the spectroscopic Wide and Deep galaxy surveys (requirement) • algorithms for the computation of the luminosity function in different bands and of the stellar mass function, and the corresponding error analysis (goal) • Testing of the algorithms for the computation of the selection functions using mock photometric and spectroscopic surveys (requirement) • Testing of the algorithms for the computation of the luminosity and stellar mass functions (goal)
Computation of the power spectra and correlation functions • From the SWG discussion yesterday: • there are many codes which produce a power spectrum and a correlation function • lots of codes are publicly available, and the correlation function codes are better developed (Athena etc.); however: • no power spectrum code has been used on data yet! • need to assess the accuracy of such codes: are they good enough for Euclid? • need to see if there are better ways of measuring the anisotropy spectrum: talk by Rassat: 2D vs 3D vs spherical Bessel • measurement of the Cls via maximum likelihood? Is a maximum likelihood accurate under the assumption that the field is not Gaussian? Does that introduce biases that matter for the final DE/cosmology requirements? • measurement of the noise properties of galaxies -> Poisson shot noise? Correlated noise? What is the l_{max} needed? (see the pseudo-Cl sketch below) • How to deal with PDFs? Is using the photo-z bins ideal? What about shear PDFs?
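As a concrete anchor for the shot-noise question, a naive full-sky pseudo-Cl estimate of a galaxy overdensity map with Poisson shot-noise subtraction might look like the sketch below (assuming healpy; the random catalogue is a placeholder). Masking and the resulting mode coupling are deliberately ignored here, which is exactly where the accuracy assessment for Euclid is needed.

```python
import numpy as np
import healpy as hp

nside = 256
npix = hp.nside2npix(nside)

# placeholder catalogue: uniform random positions standing in for real galaxies
ngal = 100_000
rng = np.random.default_rng(1)
ra = rng.uniform(0.0, 360.0, ngal)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, ngal)))

# bin the galaxies onto a HEALPix map and form the overdensity field
pix = hp.ang2pix(nside, ra, dec, lonlat=True)
counts = np.bincount(pix, minlength=npix).astype(float)
delta = counts / counts.mean() - 1.0

# naive (full-sky, unmasked) power spectrum estimate
cl = hp.anafast(delta, lmax=3 * nside - 1)

# Poisson shot noise of the overdensity field: C_shot = 1 / nbar = 4*pi / N_gal
cl_signal = cl - 4.0 * np.pi / ngal
```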
Intrinsic alignments: nulling • Data products need to be science-ready: • Intrinsic alignments have to be dealt with at the data level (nulling) and probably on the modelling side as well, by taking into account the cross-correlation functions. • There should be overlap between the SWGs and OULE3 here. • position-shear cross-correlation • position-position will give magnification. • nulling of IA in the data -> LE3 • modelling is probably SWG ground. • Estimators for the nulled power spectra? Have they been defined, and are they optimal? Problems with photo-z errors at this stage? (see the weight-construction sketch below) • Can we filter the maps and create nulled maps?
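As a rough illustration of where nulling-weight construction and photo-z precision enter, the sketch below builds weights over the background bins that cancel the lensing efficiency at a chosen foreground bin. It is deliberately simplified (sharp tomographic bins at purely illustrative comoving distances, flat geometry, a single nulling condition per foreground bin) and is not the full Joachimi & Schneider construction.

```python
import numpy as np
from scipy.linalg import null_space

# illustrative comoving distances to the tomographic bin centres [Mpc/h]
chi = np.array([800.0, 1500.0, 2200.0, 2900.0, 3500.0])

def nulling_weights(i, chi):
    """Weights over background bins j > i chosen such that the lensing efficiency
    of a lens at the foreground bin i is cancelled, so the GI intrinsic-alignment
    term sourced at bin i drops out of the weighted spectra (simplified condition)."""
    chi_bg = chi[i + 1:]
    g = (chi_bg - chi[i]) / chi_bg              # lensing efficiency at the lens plane
    return null_space(g[None, :])               # columns: weight vectors with w . g = 0

W = nulling_weights(0, chi)
# nulled spectra: Pi_q(l) = sum_j W[j, q] * C_l^{0, j}, with j running over bins > 0
```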
Mass maps and intrinsic alignments • Many standard mass-map reconstruction methods: standard aperture-mass methods, and recently wavelet methods (see the sketch below). • Mass maps have to be reliable, e.g. for mass calibration of clusters. They should not be just pretty pictures. • What is the bias from intrinsic alignments on the mass maps? What is the bias from the different reconstruction methods? What are the requirements on the bias on our maps? • Is there a method that takes account of intrinsic alignments? Currently I think not! • Is it possible to devise one with the maps of the cross-correlations?
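For reference, one of the standard reconstruction methods alluded to above is the Kaiser-Squires inversion; a minimal flat-sky sketch follows. It is only meant to fix the quantity whose biases (intrinsic alignments, masking, method-dependent effects) need to be quantified; a production map-maker would handle masks, noise and the curved sky.

```python
import numpy as np

def kaiser_squires(gamma1, gamma2):
    """Flat-sky Kaiser-Squires inversion of a gridded shear field into E-mode
    (convergence) and B-mode maps. No masking, smoothing or noise treatment."""
    ny, nx = gamma1.shape
    l1 = np.fft.fftfreq(nx)[None, :]
    l2 = np.fft.fftfreq(ny)[:, None]
    l_sq = l1**2 + l2**2
    l_sq[0, 0] = 1.0                         # avoid 0/0; the l=0 mode is zeroed below

    gamma_hat = np.fft.fft2(gamma1 + 1j * gamma2)
    # kappa_hat = conj(D) * gamma_hat, with D = (l1^2 - l2^2 + 2i l1 l2) / l^2
    kappa_hat = ((l1**2 - l2**2) - 2j * l1 * l2) / l_sq * gamma_hat
    kappa_hat[0, 0] = 0.0                    # the mean convergence is unconstrained

    kappa = np.fft.ifft2(kappa_hat)
    return kappa.real, kappa.imag            # E-mode map, B-mode map (null test)
```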
Likelihood codes • These will include the data covariance (OU ground) and the model covariance (SWG ground). • It is possible that these are independent, but also possible that they are significantly correlated. • In the latter case, close collaboration is needed in this area between the SWG and the OUs. • Structures for this need to be defined and outlined, to be agreed... • Need to define the inputs for such a code: are the data and model covariances independent or not? (see the likelihood sketch below) • If we have a set of nulled spectra, how do we write a weighted likelihood?
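A minimal sketch of the inputs such a likelihood code might expect, under the assumption that the data (OU) and model (SWG) covariances are independent and can simply be added; if they turn out to be correlated, a joint covariance would have to replace the sum. All names are placeholders.

```python
import numpy as np

def gaussian_loglike(data_vec, model_vec, cov_data, cov_model):
    """Gaussian log-likelihood for a stacked 2-point data vector,
    assuming independent (hence additive) data and model covariances."""
    cov = cov_data + cov_model
    resid = data_vec - model_vec
    chi2 = resid @ np.linalg.solve(cov, resid)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (chi2 + logdet + resid.size * np.log(2.0 * np.pi))
```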
WL validation – Work Package tasks • Validation: check that implemented code meets requirements on both statistical & systematic errors • Required tasks: • Validate algorithm for 2-point real-space statistics of shear-shear, shear-galaxy, and possibly (photometric) galaxy-galaxy correlations (to measure magnification effects) • Validate algorithm for 2-point Fourier-space statistics • Validate algorithm for tomographic 2D mass/potential maps • Validate algorithm for covariances/full likelihoods of all 2-point statistics • Goals: • Validate algorithm for 3-point statistics of shear and galaxy correlations • Validate algorithm for 3D mass/potential maps
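As a toy example of what "meets requirements on statistical errors" could mean in practice, the sketch below compares a measured shear power spectrum against the known input of a simulation, using the Gaussian cosmic-variance plus shape-noise error per multipole. The tolerance, noise model and inputs are placeholders; the real validation criteria must come from the requirements flowdown.

```python
import numpy as np

def validate_cl(cl_measured, cl_input, ell, fsky, nbar_sr, sigma_eps, tol_sigma=3.0):
    """Toy per-multipole validation: pull of the measured C_l against the input
    theory, with Gaussian variance 2/((2l+1) fsky) * (C_l + N_l)^2 and shape-noise
    spectrum N_l = sigma_eps^2 / nbar_sr (sigma_eps per ellipticity component)."""
    noise = sigma_eps**2 / nbar_sr
    var = 2.0 / ((2.0 * ell + 1.0) * fsky) * (cl_input + noise) ** 2
    pull = (cl_measured - cl_input) / np.sqrt(var)
    return np.all(np.abs(pull) < tol_sigma), pull
```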
WL validation – Systematics validation for shear statistics • Stellar shape/position – shear correlations: • test for PSF effects, masking, background subtraction • → partly/completely done by OU-SHE? • Nulled shear statistics as a function of survey time, solar aspect angle, solar radiation, dithers, etc.: • test for PSF effects, CTI, background subtraction • → need to avoid cosmology dependence due to cosmic variance in the uncertainty of residuals! • Shear statistics in solar-system/galactic/detector coordinates: • test for PSF effects, CTI, zodiacal light, masking • Inhomogeneity/anisotropy/parity-dependent statistics: • test for PSF effects, CTI, photometric zero-points • → ask the Theory SWG for fundamental limits of inhomogeneity etc. in exotic cosmological models?
WL validation – Systematics validation for mass maps etc. • B-mode maps • → cosmology-dependent at Euclid accuracy • E-B map correlation (sketch below) • → cannot be used if parity-violating cosmologies remain viable • Consistency relations between convergence and shear (?) • Null tests with 2 maps computed from different galaxy samples • Further validation tests needed for: • 3-point statistics • 3D mass maps • Galaxy-galaxy lensing • Magnification (both size and number-density modifications) • Covariances (and cross-covariances?) for all probes
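One possible shape for the E-B map-correlation null test mentioned above: bin the flat-sky cross power of the reconstructed E-mode and B-mode maps and check that it is consistent with zero. This is only a sketch for a square, unmasked patch; the caveat about parity-violating cosmologies concerns the interpretation, not the estimator itself.

```python
import numpy as np

def flat_sky_cross_power(map_a, map_b, nbins=20):
    """Azimuthally binned flat-sky cross power of two maps, e.g. kappa_E x kappa_B.
    Assumes a square, unmasked patch; uniform bins in |l| (grid units)."""
    ny, nx = map_a.shape
    cross = (np.fft.fft2(map_a) * np.conj(np.fft.fft2(map_b))).real / (nx * ny) ** 2
    l1 = np.fft.fftfreq(nx)[None, :]
    l2 = np.fft.fftfreq(ny)[:, None]
    l = np.hypot(l1, l2)
    edges = np.linspace(0.0, l.max(), nbins + 1)
    idx = np.digitize(l.ravel(), edges) - 1
    power = np.bincount(idx, weights=cross.ravel(), minlength=nbins + 1)[:nbins]
    counts = np.bincount(idx, minlength=nbins + 1)[:nbins]
    return edges, power / np.maximum(counts, 1)
```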
WL validation – Open issues • Need an S/N requirement on the shear power spectrum to set cosmology-independent requirements on the validation of statistical errors? • Need new science requirements on tolerable errors on the errors of cosmological parameters? What about systematic effects on power spectrum / correlation function covariances? • Who validates the validation code? → the Implementation WP? • Who validates the SDC code? → the Validation WP? • Should the final data product be a shear power spectrum or an ellipticity power spectrum? Or: are intrinsic alignments a systematic or a cosmological signal? • The systematics testing is going to be a joint project with OU-SHE → coordinate with the OU-SHE systematics WP
WL validation – Plans for the next ~1 year • Locate & recruit expertise/manpower/code/simulations available within the Euclid ground segment and the SWGs • Identify effects in the data / steps in the pipeline that require validation at the 2-point-statistics level (in coordination with OU-SHE) • Identify the key validation algorithms for all required data products • Requirements flowdown from LE0 to statistical validation requirements • Systematics requirements flowdown from LE1 to systematic validation requirements • Set requirements on the simulations/mock surveys needed for testing the validation algorithms, as well as the WP Implementation algorithms, with the validation code (e.g. number of realisations, systematics included)
Simulations: short term • One possibility is to create simulations with simple Gaussian assumptions and the correct amount of correlations. • Some codes available: Manchester/UCL • *) Fast simulation code • - by Michael Brown (2011, MNRAS, 410, 2057) / independent version at UCL • - contains IA • - fast, but not based on N-body sims; a simple Gaussian realization of shear and IA • - combine with a Cl code to produce input model power spectra • A problem is that these will not have the correct non-Gaussianities. • Is it a short task to create log-normal simulations of the Cls? Should we go directly to N-body? • log-normal code at UCL, but no IA and no cross-correlations. • stepwise -> MINI simulations slides from Jean-Luc. Adding in a progressive manner: IA, masking, correlated noise/mask, non-Gaussian density fields... (a minimal Gaussian-realization sketch follows below)
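To indicate how small the first step could be, here is a minimal flat-sky Gaussian realization of a convergence field from an input angular power spectrum. The spectrum used in the usage line is a purely illustrative power law, not a Euclid model; IA, cross-correlations, masking and non-Gaussianity would be added progressively, as in the MINI-simulation plan.

```python
import numpy as np

def gaussian_field(cl_func, npix=512, pix_arcmin=1.0, seed=0):
    """Flat-sky Gaussian realization of a field with angular power spectrum cl_func(l).
    Single field only: no IA, no cross-correlations, no mask."""
    rng = np.random.default_rng(seed)
    pix_rad = np.radians(pix_arcmin / 60.0)
    freq = 2.0 * np.pi * np.fft.fftfreq(npix, d=pix_rad)
    l = np.hypot(freq[None, :], freq[:, None])
    # colour white Gaussian noise by sqrt(C_l); the 1/pix_rad factor fixes the units
    white_hat = np.fft.fft2(rng.standard_normal((npix, npix)))
    field = np.fft.ifft2(white_hat * np.sqrt(np.maximum(cl_func(l), 0.0)) / pix_rad)
    return field.real

# purely illustrative power-law spectrum
kappa_map = gaussian_field(lambda l: 1e-9 * (np.maximum(l, 10.0) / 100.0) ** -1.2)
```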
Simulations wish list, mid/long term: • Have a few sets of simulations that can: • be understood • have the right number of ‘degrees of freedom’ • some lower-level simulations • be as flexible as possible, • i.e. simulations for 3D might be very different. • hopefully a small set with increasing complications. • All simulations, even if made in-house, should pass by OU-SIM and the simulations SWG, if only for a sanity check. • Ultimately we want an end-to-end simulation from OU-SIM / the Simulations SWG
Gathering of current codes • A preliminary list of software is already being gathered: • Cl codes (at UCL, Edinburgh, ManUni, ...) • correlation function codes (Athena (Kilbinger), Hilbert) • three-point code by Jarvis available • 2D and 3D mapping codes • (Patrick Simon, Saclay, Edinburgh, ...) • talk by Rassat: 3DEX • the list needs to be comprehensive at some point... • BUT probably a task for later, when we have sims in place
Conclusions • Outlined the data products • Outlined some caveats on each task • Outlined the validation activities and tasks • Next steps are: • * to define the document, links and sub-tasks in much more detail. This will be one of our aims over the next year. • * to discuss the mocks needed for the tasks above to be performed to the needed accuracy. • -> we want these to come out of the splinter session.