The LaSilla/QUEST Variability Survey: A Progress Report. Charles Baltay, Yale University, February 7, 2012
Supernova Survey Started • Started the supernova search Dec 7, 2011 • Started using reference images from last year, so we were finding supernovae of all ages • Making new reference images as we go; now using new references east of the galaxy • Finding interesting candidates, which are listed in the Archival Data Base on the web
Supernova Survey Started • Started the supernova search Dec 7, 2011 • So far we have had very limited spectroscopic follow-up facilities (almost none in January) • Nevertheless, so far we have 8 spectroscopically confirmed supernovae (6 Type Ia, 1 Type Ic, 1 Type II) • Of these we are following 4 with the Swope telescope in 6 filters (uBVriz) • 3 have between 24 and 40 points on their light curves; the 4th we are just starting • Mark Phillips is following these in the IR on the DuPont 2.5 m
Survey running smoothly since Sept 2009 • Searched for TNOs and RR Lyrae stars, built reference frames • Scanned ~1500 square degrees every night • Broadband 4000 to 7000 A filter • 2 or 3 exposures per night of each area • Typically 60 sec, also some 180 sec exposures • Close to daily cadence • Covered close to 20,000 sq degrees a varying number of times
Choice of filters for LaSilla • At Palomar we had (still have) three filter sets: • Johnson UBRI • Gunn rizz or griz • RG610 red filter • Fringes in point-and-track exposures are a bother in the I, i, and z filters (not a problem in driftscan) • Designed and built a broadband filter, ~4000 A to 7000 A (close to g+r)
Seeing at LaSilla (all nights that the dome was open) • 60 sec exposures: FWHM 1.7 arcsec • 180 sec exposures: FWHM 2.0 arcsec • Depth limit for 60 sec exposures: 21 on dark nights, 20 with bright moon
Data Reduction • Berkeley subtraction program (Peter Nugent), to be run daily by Yale on LBL computers for supernova discovery, using data transferred to LBL in close to real time; now running, still fine-tuning parameters • Yale pipeline, using the DeepSky front end and SExtractor, for transient and TNO searches and variability studies, run daily at Yale using data transferred to Yale in close to real time; this has been running for the last year and a half • DeepSky COADD pipeline, run at LBL, for the deep survey
Subtraction for SNE-20100413 (images): new image, reference image, and new minus reference.
La Silla Transient Search flowchart: the Scheduler (Supernova Plan, TNO Plan) drives the QUEST Camera; raw data go to the Data Archive and Pre Processor, then to the Yale Pipeline and the Subtraction Search Program; after a Data Quality Check, detections enter the Scanning Data Base, then the Screening Data Base, and finally the Archival Data Base, with the OARICAL Classifier and various catalogs used for classification.
LaSilla/QUEST flowchart (components): Scheduler, Telescope, Camera.
The Screening Data Base • Used by a human screener to reject remaining bad subtractions and other artifacts and to select credible transient candidates • Has a web based interface, can be accessed from anywhere in the world • Screening is presently done by Ellie Hadjiyska at Yale and Uli Feindt (Bonn); we could use more help later • A click of a button automatically passes credible candidates on to the Archival Data Base
The Archival Data Base • Has a web based interface accessible to all members of the consortium • Has three kinds of "pages" • Main list of candidates: one line for each candidate selected by the candidate selection program (OARICAL later) and passed by the human screener • History page: all QUEST observations both before and after discovery, as well as all follow-up images and spectra • Details page: detailed comments and other information • The data base where all collaborators can choose the candidates they wish to follow up
Procedures and Data Bases • A first version of all of the procedures and data bases is now in place • The first few months will be the start of a learning curve • Continue to tune the parameters of the subtraction program to improve the supernova finding efficiency. We have been at this for a few months; the finding efficiency is now around 80%. • Implement the OARICAL classifier • Continue improving the data bases • Supernova candidates are available on the web
Typical numbers at this point • Output of candidates from SExtractor run on the subtracted images • Millions per night • Crude cuts (magnitude error less than 0.5, no bad SExtractor flags, etc.) to put candidates on the Scanning Data Base • Typically thousands to tens of thousands per night • Quality cuts to send candidates to the human screener (Screening Data Base) • Typically around 500 per night • Human screening removes remaining bad subtractions and artifacts, passes good candidates to the Archival Data Base • Typically about a dozen per night
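To make the staged filtering concrete, here is a minimal, hypothetical Python sketch of the crude-cut stage; the dictionary keys are placeholders, and only the magnitude-error and SExtractor-flag cuts come from the slide.

```python
# Hypothetical sketch of the crude-cut stage of the candidate funnel.
# Only the cuts themselves (mag error < 0.5, no bad SExtractor flags)
# come from the slides; the field names are illustrative placeholders.

def crude_cuts(detections):
    """Reduce the millions of raw SExtractor detections per night to the
    thousands that go onto the Scanning Data Base."""
    return [d for d in detections
            if d["mag_err"] < 0.5      # stated cut on magnitude error
            and d["flags"] == 0]       # no bad SExtractor flags

# Toy example: two detections, the second fails the flag cut
dets = [{"mag_err": 0.12, "flags": 0}, {"mag_err": 0.30, "flags": 4}]
print(crude_cuts(dets))   # -> only the first detection survives
```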
Quality Cuts to send Candidates to Human Screener • Magnitude error ≤ 0.17 (S/N ~ 6) • 1.5 ≤ FWHM ≤ 5 arcsec • Number of 2σ undershoots ≤ 6 • Number of 3σ undershoots ≤ 2 • Symmetry parameter ≤ 10 • Semimajor axis ≥ 0.7 arcsec • Semiminor axis ≥ 0.65 arcsec • Remove variable stars, i.e. candidates with a point-like image on the reference image that is • less than 3 arcsec from the candidate • with FWHM consistent with the seeing on that frame to within 2 arcsec
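As an illustration only, these cuts could be expressed as a single predicate like the sketch below; the field names and the variable-star veto inputs are assumptions, not the pipeline's actual column names, while the thresholds are those listed above.

```python
# Sketch of the quality cuts that promote a candidate to the Screening
# Data Base. Thresholds are taken from the list above; the dictionary
# keys are illustrative, not the pipeline's actual column names.

def passes_quality_cuts(c, nearby_ref_star=None, seeing_fwhm=None):
    if c["mag_err"] > 0.17:                  # magnitude error <= 0.17 (S/N ~ 6)
        return False
    if not (1.5 <= c["fwhm"] <= 5.0):        # FWHM in arcsec
        return False
    if c["n_undershoot_2sigma"] > 6 or c["n_undershoot_3sigma"] > 2:
        return False
    if c["symmetry"] > 10:                   # symmetry parameter
        return False
    if c["semimajor"] < 0.70 or c["semiminor"] < 0.65:   # arcsec
        return False
    # Variable-star veto: a point-like image on the reference frame
    # within 3 arcsec, with FWHM consistent with the seeing to 2 arcsec.
    if nearby_ref_star is not None and seeing_fwhm is not None:
        if (nearby_ref_star["sep"] < 3.0
                and abs(nearby_ref_star["fwhm"] - seeing_fwhm) < 2.0):
            return False
    return True

example = {"mag_err": 0.10, "fwhm": 2.1, "n_undershoot_2sigma": 1,
           "n_undershoot_3sigma": 0, "symmetry": 3.0,
           "semimajor": 1.2, "semiminor": 0.9}
print(passes_quality_cuts(example))   # -> True
```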
Fake Supernovae in Raw Data • Inserted fake supernovae to tune the candidate selection cuts and estimate supernova finding efficiencies • Started with Sloan galaxies with z ≤ 0.1 • Moved an isolated star image on top of the galaxy in the raw data • Normalized the fake magnitude to a Type Ia ten days before peak at the same z as the galaxy • Probability of placing a fake on a galaxy proportional to the total light in the galaxy • Distribution of the distance of the fake from the center of the galaxy follows the light distribution of the galaxy
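A rough sketch of how such fakes might be generated follows; the host selection and radial placement follow the rules above, while the exponential light profile, the SN Ia absolute magnitude of -19.3, and the 1 mag "ten days before peak" offset are illustrative placeholder values, not the survey's actual template or code.

```python
import numpy as np

# Hypothetical sketch of fake-supernova placement. Field names, the
# exponential disk approximation to the galaxy light profile, and the
# SN Ia template numbers are placeholders for illustration only.

rng = np.random.default_rng(0)

def pick_galaxy(galaxies):
    """Choose a host with probability proportional to its total light."""
    flux = np.array([g["total_flux"] for g in galaxies])
    return galaxies[rng.choice(len(galaxies), p=flux / flux.sum())]

def fake_position(galaxy):
    """Draw an offset from the galaxy center following its light profile,
    here approximated by an exponential disk of the catalogued scale radius."""
    r = rng.exponential(galaxy["scale_radius_arcsec"])
    theta = rng.uniform(0.0, 2.0 * np.pi)
    return r * np.cos(theta), r * np.sin(theta)

def fake_magnitude(distance_modulus, m_peak_abs=-19.3, rise_offset=1.0):
    """Apparent magnitude of a Type Ia ~10 days before peak at the host's
    redshift; the absolute magnitude and rise offset are placeholder values."""
    return m_peak_abs + rise_offset + distance_modulus
```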
Cut on magnitude error (plot): magnitude error distributions for fake supernovae and for all candidates.
Cut on Full Width Half Maximum (plot): FWHM (arcsec) distributions for fake supernovae and for all candidates.
Efficiency for Fakes vs. Magnitude (plot): efficiency (%) vs. magnitude.
Efficiency of Fakes vs. Distance from Galaxy (plot): efficiency (%) vs. distance from the center of the galaxy (arcsec).
Selecting Candidates for followup • Start with the Archival Data Base. • We expect this to have about a dozen entries per night, too many to follow up. • Look at the two columns labelled • Type: assigned by the human scanner for "good subtractions" • Supernova: looks like there is a galaxy nearby • Variable: reference frame has a star-like image in the same place • Asteroid: nothing on the reference frame, two scans within a few minutes • Unknown: none of the above, but a good subtraction • Catalog Type: assigned after looking at catalogs • Asteroid • Star, AGN • Galaxy nearby • Unknown: cannot be found in any catalog
Selecting Candidates for followup • Start with the Archival Data Base. • Veto the candidate if its Catalog Type is Asteroid, Star, or AGN • Veto if the History page shows a previous observation with similar magnitude many days before discovery • Select if the (Type, Catalog Type) pair is one of: Supernova/galaxy, Supernova/unknown, Variable/galaxy, Asteroid/galaxy, Unknown/galaxy, Unknown/unknown • We expect that this will get us down to a few candidates to follow up per night
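For concreteness, the veto and select rules above can be written as a small predicate. This is a sketch only: the string labels mirror the Type and Catalog Type values on the slide, and the flag used for the history-page veto is an assumed placeholder.

```python
# Sketch of the follow-up selection rules listed above.
# Labels follow the Type / Catalog Type values on the slide;
# "old_detection_similar_mag" is an assumed flag for the history-page veto.

SELECT_PAIRS = {
    ("Supernova", "galaxy"),
    ("Supernova", "unknown"),
    ("Variable",  "galaxy"),
    ("Asteroid",  "galaxy"),
    ("Unknown",   "galaxy"),
    ("Unknown",   "unknown"),
}

def select_for_followup(cand):
    # Veto on catalog type
    if cand["catalog_type"] in ("Asteroid", "Star", "AGN"):
        return False
    # Veto if the history shows a similar-magnitude detection long before discovery
    if cand.get("old_detection_similar_mag", False):
        return False
    # Otherwise select only the listed (Type, Catalog Type) combinations
    return (cand["type"], cand["catalog_type"]) in SELECT_PAIRS

print(select_for_followup({"type": "Supernova", "catalog_type": "galaxy"}))  # True
```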
Some statistics • Searched Dec 7, 2011 to Jan 4, 2012 • 27 nights, posted 86 candidates, ~3.2 per night • Jan 5 to Jan 19 • No spectroscopy follow-up, so we stopped searching and concentrated on getting reference images east of the galaxy • Jan 21 to now (Feb 3) • 13 nights, posted 44 candidates, ~3.4 per night • Of the 130 candidates posted • 16 had spectroscopic follow-up • 8 were spectroscopically confirmed supernovae
Initial Plans for the Survey • Run the supernova search 9 months of the year, September 1 to May 31 • Rolling search with a 2 or 3 day cadence • Two observations a night separated by ~1 hour to eliminate asteroids, airplanes, cosmic rays, etc. • 60 second exposures • 1000 to 1500 sq degrees (twice) a night • Broad 4000 to 7000 A filter • Schedule areas of the sky that can be followed for at least 60 days (a program to do the scheduling exists)
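As a toy illustration of the 60-day criterion (not the actual scheduling program), one can ask how long a field stays within an assumed hour-angle window of the anti-solar point; the 4-hour half-width is an assumption, and declination, twilight, and airmass are ignored.

```python
# Toy sketch of the "observable for at least 60 more days" criterion.
# The anti-solar RA advances ~24 h of RA per 365.25 days; a field is
# taken to be observable near midnight while it lies within an assumed
# +/-4 h window of the anti-solar RA. Declination, twilight, and
# airmass are ignored, and the field is assumed currently observable.

RA_DRIFT_H_PER_DAY = 24.0 / 365.25   # anti-solar RA drift (hours of RA per day)
WINDOW_H = 4.0                       # assumed half-width of the visibility window

def days_remaining(field_ra_h, antisolar_ra_h):
    """Days until the field drifts out of the western edge of the window."""
    lead = (field_ra_h + WINDOW_H - antisolar_ra_h) % 24.0
    return lead / RA_DRIFT_H_PER_DAY

def schedulable(field_ra_h, antisolar_ra_h, min_days=60):
    return days_remaining(field_ra_h, antisolar_ra_h) >= min_days

# Example: with the anti-solar point at RA 6 h, a field at RA 8 h stays
# observable for roughly (8 + 4 - 6) / 0.0657 ~ 91 days, so it is scheduled.
print(schedulable(8.0, 6.0))   # -> True
```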
Who will do what • Survey scheduling: Ellie Hadjiyska, Yale • LaSilla telescope and camera operations: Dave Rabinowitz, Yale • Run the subtraction program: Nan Ellman and Dave Rabinowitz, Yale • Continue tuning the subtraction program: Peter Nugent, LBL, and Dave Rabinowitz, Yale • Run the program to select candidates: Dave Rabinowitz, Yale • Implement and run the OARICAL classifier: Josh Bloom, Berkeley • Human scanning of candidates: Ellie Hadjiyska, Yale, and Uli Feindt, Bonn • Keep the Archival Data Base up to date: Ryan McKinnon, Yale
Followup Responsibilities • SNIFS spectrometer on the Hawaii 2.2m: light curves from spectra taken every 3-4 days; Supernova Factory (Greg Aldering, Saul Perlmutter et al.) • Infrared with the Las Campanas 2.5m DuPont telescope: CSP, Mark Phillips et al. • Imaging on the Las Campanas 1m Swope telescope: light curves from imaging in several color filters, scheduled by Ellie Hadjiyska, Yale • Data analysis: Mark Sullivan, Oxford, or Kowalski et al., Bonn (to be decided) • PESSTO: spectra on the La Silla NTT telescope