This document outlines the prerequisites for analysis, including calibration constants, software framework, simulation data, online trigger, filter and reconstruction algorithms, monitoring facility, data quality assurance, and organization of analysis effort.
Prerequisites for Analysis (in addition to a detector that is producing data)
• calibration constants
• software framework
• simulation and simulated data
• functioning online trigger
• filter and reconstruction algorithms
• monitoring facility
• data quality assurance
• organization of analysis effort
• equivalent of CERN “Yellow Book” theoretical discussion of possible analysis topics
Doug Cowen/High-Level Analysis Preparation/SAC Meeting
Prerequisites for Analysis, with status
• calibration constants (underway)
• software framework (ready)
• simulation and simulated data (almost ready)
• functioning online trigger (basic trigger ready)
• filter and reconstruction algorithms (basic ones ready)
• monitoring facility (almost ready)
• data quality assurance (underway)
• organization of analysis effort (underway)
• equivalent of CERN “Yellow Book” theoretical discussion of possible analysis topics (underway)
Organization of Analysis Effort
• (Gary Hill will describe fully)
• Organization:
  • analysis coordinator (Gary)
  • analysis working groups with appointed leaders
  • overall collaboration analysis strategy
  • internal dissemination of results
  • defined analysis and publication approval mechanism
Data Quality Assurance
• Perform tests to demonstrate that the detector is producing data of sufficient quality for physics analysis
  • right after deployment: “Commissioning”
  • thereafter on a regular basis: “Verification”
• Using in-situ flashers and down-going muons, check things like
  • timing, gain, linearity, saturation response
  • geometry
  • down-going muon (atm. μ) angular distribution
  • pointing resolution (using IceTop)
  • energy resolution (using standard candle)
Some Verification results
Intra-string Timing & Geometry (via Flashers)
Method
• flash each DOM on a string
• look at the earliest hit time on the DOM immediately above the flashed DOM
• plot t − t0
Learn from results
• mean of earliest hit (ns): timing + geometry of each DOM
• RMS of earliest hit (ns): timing resolution of each DOM
Results (string 38)
• mean earliest-hit spacing close to the 77 ns expected from c/n; roughly the same difference is seen for string 21, probably due to some combination of scattering, the threshold-crossing algorithm, and a possible turn-on delay of the LEDs
• almost all top-48 DOMs look good; need to understand the outliers (probably just statistics/fitting issues)
• data for the bottom-12 DOMs are at Pole
[Plots: mean and RMS of earliest hit (ns), string 38]
Michelangelo D'Agostino/UC-B
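The expected inter-DOM light travel time and the per-DOM mean/RMS bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not IceCube software: the constants (a round group refractive index, the nominal 17 m DOM spacing) and all function names are our assumptions.

```python
import statistics

# Illustrative constants (assumed round numbers, not official values)
C_VAC = 0.2998       # speed of light in vacuum, m/ns
N_GROUP = 1.36       # assumed group refractive index of deep ice
DOM_SPACING = 17.0   # nominal vertical DOM spacing, m

def expected_travel_time(spacing_m=DOM_SPACING, n=N_GROUP):
    """Light travel time (ns) between adjacent DOMs at speed c/n."""
    return spacing_m * n / C_VAC

def earliest_hit_stats(earliest_hits_ns):
    """Mean and RMS of the earliest-hit times for one DOM over many flashes."""
    return statistics.fmean(earliest_hits_ns), statistics.pstdev(earliest_hits_ns)
```

With these assumed numbers, 17 m at c/n comes out near the 77 ns quoted on the slide; the per-DOM mean then carries the timing-plus-geometry information, and the RMS the timing resolution.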
String 29 (JDR Cable)
• Deployment measurements suggested anomalous DOM spacing
• 16-18-16-... DOM spacing clearly visible on string 29, in contrast to string 38
[Plots: mean of earliest hit (ns) vs. DOM for strings 29 and 38; 77 ns expected from c/n]
Michelangelo D'Agostino/UC-B
Intra-string Timing & Geometry (via Flashers), continued
• String 29: 16-18-16-... DOM spacing clearly visible
[Plots: mean and RMS of earliest hit (ns), string 29; 77 ns expected from c/n]
Michelangelo D'Agostino/UC-B
Muon Occupancy Study
Method
• look at occupancy for 9-string muon data, requiring 8 hits per string to mimic the single-string triggering condition
• compare different strings to see if there are significant efficiency or ice-property differences
Geometry file:
string   dust-layer minimum   depth relative to string 39
21       −113 m               not available
29       −113 m               +2.7 m
38       −105 m               +11.4 m
39       −111 m               not applicable
40       −110 m               −0.5 m
49       −109 m               +1.6 m
50       −103 m               not available
59       −107 m               not available
Dawn Williams/Penn State
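The single-string trigger mimic above amounts to a per-string hit-count cut on each event. A hypothetical sketch (the function name and data layout are ours, for illustration only):

```python
from collections import Counter

def strings_passing_trigger(hit_string_ids, min_hits=8):
    """IDs of strings with at least `min_hits` hits in one event,
    mimicking the single-string triggering condition."""
    counts = Counter(hit_string_ids)
    return {s for s, n in counts.items() if n >= min_hits}
```

Per-string occupancies accumulated over many such events are then what gets compared between strings.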
IceTop-InIce Coincidence Timing
• For events with ≥ 8 hit DOMs and 1 hit station on the surface, for each hit InIce DOM look at
  • 2D plot: (T_icetop − T_inice) vs. (D_surface − D_inice)
  • histograms: residual = (ΔT − d/c) for each DOM i, and the per-DOM mean residual
• N.B.: no fitting involved, but some tracks are more vertical than others
Tonio Hauschildt/UDel
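The per-DOM residual is just the coincidence time difference minus the light travel time over the depth separation. A sketch with an assumed sign convention (names and convention are ours; for a relativistic vertical track the residual should come out near zero):

```python
C_VAC = 0.2998  # speed of light in vacuum, m/ns

def timing_residual(t_icetop_ns, t_inice_ns, d_surface_m, d_inice_m):
    """residual = (T_inice - T_icetop) - (D_inice - D_surface)/c.
    Sign convention assumed here for illustration."""
    return (t_inice_ns - t_icetop_ns) - (d_inice_m - d_surface_m) / C_VAC
```

Non-vertical tracks lengthen the true path relative to the depth difference, which is why the slide notes that some tracks are more vertical than others even though no fitting is involved.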
Down-Going Muon: Data vs. MC
Method
• MC: CORSIKA + AMASIM, bulk ice model (82 seconds livetime)
• data: 8-string muon data taken with testDAQ (15 minutes)
• reconstruct tracks with dipole fit
Preliminary conclusion: agreement is not so bad
[Plots: number of DOMs hit and number of strings hit, data vs. MC, normalized by livetime; zenith angle, normalized by area]
Hagar Landsman/UW
First Long-Term Study: Intra-String Timing & Geometry Stability
Method
• look at the time difference between the earliest hits on DOM n and DOM n+1, for muon data with more than 15 hits on the same string
• plot time difference vs. real time; look for drifts, steps, etc.
Result: string 21 is stable to within ~2 ns during 2005; looking at timing residuals also, and will append the existing report accordingly
Carsten Rott/Penn State
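A toy version of the drift/step check is to compare the mean inter-DOM time difference over two halves of the monitoring period; the 2 ns tolerance here mirrors the stability quoted above, and the function name is ours:

```python
import statistics

def is_stable(dt_series_ns, tolerance_ns=2.0):
    """True if the mean earliest-hit time difference between DOM n and
    n+1 agrees between the first and second halves of the period."""
    half = len(dt_series_ns) // 2
    drift = statistics.fmean(dt_series_ns[half:]) - statistics.fmean(dt_series_ns[:half])
    return abs(drift) <= tolerance_ns
```

A real analysis would of course scan the full time series for steps rather than split it in two, but the quantity being monitored is the same.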
Detailed Verification results
• Intra-string
• Inter-string
Plans for (post-deployment) Calibrations
Low-level calibrations
• Timing: verify RAPCal, intra- and inter-string
• Charge: gain calibration; linearity, saturation
• Geometry
  • Stage 1: string-by-string non-optical data, ~days after deployment
  • Stage 2: inter-string flasher data, ~weeks after deployment
  • Stage 3: muon tomography, ~months after deployment
High-level calibrations
• Directionality (tracks): IceTop coincidences; Moon shadow
• Energy/vertex (cascades): standard candle lasers; flashers
• Ice properties: scattering, absorption, dust; dust-layer tilt; hole ice
Slides from Kurt Woschnagg/UC-B
Calibration examples
• Stage-1 geometry: string depth from pressure sensors; vertical string profile from drill data
• Ice properties: dust layers
Calibration examples
• Stage-2 geometry (strings 38-39): L = surface string separation from surveys; D = distance from timing
[Plot: distance from timing (m) vs. depth from intra-string geometry (m), DOMs 13-17 on strings 38 and 39, z = 0 at surface]
Monitoring Scheme
• Three sources of data for monitoring
  • DOM Monitoring Stream (HV, temperature, scaler rate, ...)
  • Physics Data (events)
  • DAQ Configuration
[Diagram: DAQ → EventDispatch → Processing 'n Filtering / EventDaemon → ROOT file; DAQ → MonitorDispatch → Monitor Daemon → ROOT files → satellite → web interfaces at South Pole and in the North; inputs from the DAQ XML configuration, winterover report, physicist report, and experiment control, backed by databases]
Slides from Ignacio Taboada/UC-B
Monitoring Schedule
• Shifts (in the north) will begin by April 1, distributed among IceCube institutions.
• Updates to the web interface and system 3 times per year: March/June/September
• Updates to the system at South Pole during the austral summer
• Planned upgrades during 2006:
  • automatic trend analysis for detection of failures (the detector is not yet stable enough to allow this analysis now)
  • increase in the number of variables that are monitored
  • long-term study of the detector (different web interface)
Link to monitoring page
• We use the XML DAQ configuration to display detector status
• Each “light” represents a DOM, on the InIce strings and IceTop tanks
  • green is good: all (good) DOMs configured
  • red: known bad DOMs
• The page also shows the run number and the triggers enabled in this run
• Everything in the web interface is “clickable”: we produce detector-, subdetector-, string- and DOM-wise plots
  • click to get string plots or station plots
  • click on a square to get plots for that specific DOM (e.g. how often does a DOM report a signal?)
• Some DOMs, even though configured in the DAQ, are not taking data
• Subdetector multiplicity in this run is shown
• We use the Monitoring and Physics streams to study relevant quantities
END
Commissioning Results: High Level — Preliminary Criteria and Summary
Intra-String Flasher Analysis
• Number of DOMs:
  • only took data for the top 48 DOMs: 331 of the 346 look good
  • top 48 DOMs on all strings, minus ~100 with quad issues that were resolved after the data were collected
  • we need to do a bit of makeup here to complete the dataset
• Criteria:
  • mean within 5 ns of the string average, and
  • RMS less than 2 ns
Intra-String Muon Analysis
• Number of DOMs:
  • 500 of the 516 DOMs passed outright
  • another 9 DOMs passed with slightly poorer performance
  • another ~20 DOMs have insufficient statistics
• Criteria:
  • timing-residual peak within 3 ns of 0 ⇒ good timing
  • residual peak between 3 and 6 ns of 0 ⇒ okay
  • otherwise ⇒ bad
• Detailed summaries: intra-string and inter-string
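The muon-analysis timing criterion above maps directly onto a small classifier. A sketch, with the thresholds taken from the slide and the function name ours:

```python
def classify_dom(residual_peak_ns):
    """'good' if the timing-residual peak is within 3 ns of 0,
    'okay' between 3 and 6 ns, otherwise 'bad'."""
    r = abs(residual_peak_ns)
    if r < 3.0:
        return "good"
    if r < 6.0:
        return "okay"
    return "bad"
```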
Intra-string Timing & Geometry (via Muons)
Method
• use ~6 live-hours of realDAQ/PnF data
• fit muon tracks to the data minus DOM n
• compute the time residual between the track and the earliest hit time for DOM n
• fit the resulting distribution to find the peak
Results
• almost all DOMs (~530) look good; bottom DOMs have limited statistics
• string 29: mean = −1.5 ns, RMS = 1.82 ns; string 21: mean = −1.4 ns, RMS = 1.9 ns
[Plots: timing residuals (ns) for string 29; distribution of the per-DOM peaks]
Dima Chirkin/LBNL
Inter-string Timing & Geometry (via Flashers)
Method
• flash all horizontal LEDs on a DOM on one string
• look at the earliest hit time on DOMs on the adjacent string, at ±5 DOM positions relative to the depth of the flasher
Verification results (geometry file icecube_20060130_9strings.geo):
DOM #   L (m)          D (m)
29-15   109.3 ± 0.2    −0.2 ± 1.1
29-45   109.1 ± 0.3    −0.3 ± 0.7
38-15   115.7 ± 0.5    10.9 ± 1.3
38-45   114.3 ± 0.3    10.6 ± 0.8
[Plot: distance from timing (m) vs. depth from intra-string geometry (m), DOMs 13-17 on strings 38 and 39, z = 0]
Chihwa Song/UW
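Geometrically, the flasher arrival time at a receiver a horizontal distance L away and a vertical offset dz deep scales with the straight-line path at speed c/n, so the minimum arrival time over receiver depth inverts to the string separation. A hedged sketch (straight-line propagation and a round group index are simplifying assumptions; scattering, which the slides elsewhere note matters, is ignored):

```python
import math

C_VAC = 0.2998  # speed of light in vacuum, m/ns
N_GROUP = 1.36  # assumed group refractive index of deep ice

def flasher_arrival_time(L_m, dz_m):
    """Straight-line light travel time (ns) from flasher to a receiver
    at horizontal distance L and vertical offset dz."""
    return math.hypot(L_m, dz_m) * N_GROUP / C_VAC

def distance_from_timing(t_min_ns):
    """Invert the minimum arrival time (receiver level with the flasher)
    to the horizontal string separation (m)."""
    return t_min_ns * C_VAC / N_GROUP
```

Scanning receivers above and below the flasher depth traces out the hyperbola whose minimum gives both L and the depth offset D.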
PMT Gain
Method
• look at flasher data for a DOM distant from the flasher (no noise, thanks to the LC trigger)
• isolate single photoelectrons using cuts on the waveform
• compare the SPE charge-peak position with the FAT expectation
Status: looked at strings 21/29/39 so far; will look at all strings by the Collaboration Meeting
[Plots: sample SPE charge (pC), DOM 39-29 and string 39]
Tom McCauley/LBNL
DOM Phi Orientation Study (via Flashers)
Method
• flash single LEDs on a DOM on one string
• look at the earliest hit time on DOMs on the adjacent string for each flashed LED
• find the minimum (parabola fit of the mean) as a function of LED pointing angle
• check the precision against the angle between 3 strings from the geometry file
Results using 3 flashed DOMs on string 29 (geometry file icecube_20060130_complete.geo):
• string 29 at (371.6, −92.2); string 21 at (443.6, −194.2); string 39 at (411.8, 13.0)
• expected opening angle: 123.8°
• bad fit to the leading edge of the hit-time distribution for one of the LEDs
[Plots: mean and sigma of earliest hit (ns) vs. phi (deg), fit to parabola]
Chihwa Song/UW
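The "parabola fit of the mean" step can be illustrated with the exact three-point parabola vertex; this is a minimal sketch (a real fit would use more LED angles and per-point uncertainties):

```python
def parabola_min(phis, times):
    """Phi at the vertex of the parabola through three
    (phi, mean earliest-hit time) points, via the Lagrange form."""
    (x0, x1, x2), (y0, y1, y2) = phis, times
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    return -b / (2.0 * a)
```

The fitted minimum gives the LED pointing direction toward the adjacent string, which is then compared against the inter-string angles computed from the surveyed string positions.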
High-Level Commissioning Criteria
• Timing
  • mean time within 5 ns of expectation
  • RMS less than 2 ns
  • see the Timing Page for detailed info
  • 331 of 346 DOMs pass
• Geometrical position agrees with expectation within X meters
• DOM occupancy vs. depth agrees with expectation to within X%
• Gain agrees with expectation to within X%
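The flasher timing criterion is a simple two-condition cut; a sketch of the per-DOM pass test (the X placeholders for geometry, occupancy, and gain are deliberately not filled in here, since the slide leaves them open):

```python
def passes_flasher_timing(mean_offset_ns, rms_ns):
    """High-level flasher criterion: mean within 5 ns of expectation
    and RMS below 2 ns."""
    return abs(mean_offset_ns) <= 5.0 and rms_ns < 2.0
```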
Plans for Commissioning and Verification
• Repeat the low-level commissioning suite of tests next season; some streamlining will be useful.
• Continue analyzing the 1 TB of high-level commissioning data taken at Pole this season and hand-carried north
  • about half of what we planned on getting
  • hold regular weekly conference calls to discuss the analysis of these data
  • held a working meeting 13-16 March to discuss results, with ~20 active participants
• Currently taking an additional 1 TB of data at Pole, remotely
  • will analyze these data on computers at Pole and transfer just small ROOT files north (this is our long-term model for verification)
• Our experience with the usefulness of the various tests and the types of data taken will be used to streamline verification this year, and high-level commissioning next season
Plans for Commissioning and Verification, continued
• Anticipate much faster turnaround on high-level commissioning results next season
  • working towards a few days after DOMs are deemed okay by low-level commissioning
• Work with Monitoring to display commissioning and verification results in a user-friendly way
  • see the Intra-String or Inter-String Timing/Geometry pages
• Figure out a way to archive results for easy future access
• Determine CPU and disk-space requirements for commissioning and verification at Pole for next and future seasons
• Get more non-US involvement; at present it is less than 1 FTE.
Doug Cowen/High-Level Analysis Preparation/SAC Meeting