Some Grid Science
Roy Williams, Paul Messina
California Institute of Technology
Grids and the Virtual Observatory; Grids and LIGO
Virtual Sky: Image Federation
http://virtualsky.org/, from Caltech CACR, Caltech Astronomy, and Microsoft Research
Virtual Sky has 140,000,000 tiles (140 GByte)
Change scale; change theme: optical (DPOSS), X-ray (ROSAT)
[Figure: the Coma cluster shown in the optical and X-ray themes]
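Serving 140 million tiles implies some hierarchical tile index keyed by position, zoom level, and theme. The sketch below illustrates the idea only; the quadtree-style layout, tile naming, and theme prefixes are assumptions, not Virtual Sky's actual scheme.

```python
# A minimal sketch of hierarchical sky-tile addressing, assuming a simple
# quadtree-like layout in (RA, Dec); Virtual Sky's real scheme may differ.

def tile_key(ra_deg: float, dec_deg: float, level: int, theme: str) -> str:
    """Map a sky position to the tile containing it at a given zoom level."""
    n = 2 ** level                      # n x n tiles cover the sky at this level
    x = int((ra_deg % 360.0) / 360.0 * n)
    y = int((dec_deg + 90.0) / 180.0 * n)
    y = min(y, n - 1)                   # clamp the Dec = +90 deg edge case
    return f"{theme}/{level}/{x}_{y}.jpg"

# Changing scale = changing 'level'; changing theme = swapping the prefix
# (e.g. 'dposs' for optical, 'rosat' for X-ray) while keeping the same tile.
print(tile_key(194.95, 27.98, 8, "dposs"))   # Coma cluster region, optical
print(tile_key(194.95, 27.98, 8, "rosat"))   # same tile, X-ray theme
```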
VO and Grid Computing
Supercomputer center support for data pipelines (not just MPI): the VirtualSky.org ingestion pipeline
• Raw data: today ~3 TB, held on the HPSS system; staging 250 GB from HPSS takes about 12 hr
• HP Superdome (1 TB): copy raw data and filter (background subtraction), write VS1 (science product), resample, build and write VS2; the computation also takes about 12 hr
• VS2 (web product, 140 GB) is served from a Win2000 database
[Figure: pipeline diagram with CPU-utilization plot]
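The stages named on the slide can be read as a simple sequential pipeline. The sketch below strings them together; every function here is a stub standing in for the real HPSS staging, image processing, and database load.

```python
# Illustrative sketch of the VirtualSky ingestion stages named on the slide.
# The stage functions are stubs; real implementations would call HPSS, the
# image-processing code, and the Win2000 tile database (not shown).

def copy_from_hpss(tile):    return dict(tile, staged=True)     # ~12 hr for 250 GB
def subtract_background(d):  return dict(d, filtered=True)      # "filter"
def write_vs1(d):            return dict(d, product="VS1")      # science product
def resample(d):             return dict(d, resampled=True)
def build_vs2(d):            return dict(d, product="VS2")      # 140 GB web product

def run_ingestion(raw_tiles):
    """Copy raw data and filter, write VS1, resample, build and write VS2."""
    vs1, vs2 = [], []
    for tile in raw_tiles:
        data = subtract_background(copy_from_hpss(tile))
        vs1.append(write_vs1(data))
        vs2.append(build_vs2(resample(data)))
    return vs1, vs2

print(run_ingestion([{"name": "coma_field"}]))
```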
VO and Database Join
A crossmatch needs both catalogs in the same sort order. Who does the sorting, or can we impose a standard?
• Infrared catalog (indexed by RA): RA, Dec, mag J, mag H, mag K; SQL select order by RA?
• Radio catalog (indexed by glon): glon, glat, flux I, flux Q, flux U; SQL select order by glon?
• Or does the crossmatch engine sort? Sorting 10^9 objects is expensive.
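The sorting question matters because a sort-merge crossmatch can stream two catalogs in a single pass only if both are ordered on the same key; otherwise someone must re-sort ~10^9 rows first. The sketch below matches on RA alone and ignores Dec and RA wraparound, a deliberate simplification.

```python
# Minimal sort-merge crossmatch sketch: both catalogs must already be sorted
# by the SAME key (here RA in degrees). Matching on RA alone and ignoring
# wraparound at 0/360 deg keeps the example short.

def crossmatch(ir_sorted, radio_sorted, tol_deg=1.0 / 3600.0):
    """Yield (ir_row, radio_row) pairs whose RA differs by less than tol_deg."""
    j = 0
    for ir in ir_sorted:
        # advance the radio pointer past rows that can no longer match
        while j < len(radio_sorted) and radio_sorted[j]["ra"] < ir["ra"] - tol_deg:
            j += 1
        k = j
        while k < len(radio_sorted) and radio_sorted[k]["ra"] <= ir["ra"] + tol_deg:
            yield ir, radio_sorted[k]
            k += 1

ir = [{"ra": 10.0000, "mag_j": 14.2}, {"ra": 10.0003, "mag_j": 15.1}]
radio = [{"ra": 10.0001, "flux_i": 0.8}]
print(list(crossmatch(ir, radio)))
```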
VO Interoperability
• Directory: white/yellow pages
• VO schema repository
  • How to publish, how to define content
  • Action: document schemas for archive content, web service capability, table/image/spectrum
• How to publish algorithms
• Plug-and-play data services
  • SOAP, UDDI, WSDL, Jini?
• Semantic web
  • "Question and answer in the language of the client"
  • Topic maps
VO and Interoperability
What is the standard interaction with a catalog service?
A client application (e.g., OASIS) asks the service for the available catalogs, then for their attributes, then makes a query and displays the result.
The protocol should be dynamic: new catalogs can be added and existing applications still work.
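One way to picture that three-step interaction (list catalogs, list attributes, query) is the toy interface below; the class and method names are hypothetical stand-ins, not an existing VO standard.

```python
# Hypothetical catalog-service interface illustrating the three-step
# interaction on the slide; the names here are assumptions, not a standard.

class CatalogService:
    def __init__(self, catalogs):
        self._catalogs = catalogs                      # name -> list of row dicts

    def list_catalogs(self):
        return list(self._catalogs)                    # step 1: available catalogs

    def list_attributes(self, catalog):
        rows = self._catalogs[catalog]
        return sorted(rows[0]) if rows else []         # step 2: their attributes

    def query(self, catalog, where):
        return [r for r in self._catalogs[catalog] if where(r)]   # step 3: query

# A client (e.g. OASIS) discovers catalogs at run time, so adding a new
# catalog on the service side requires no change to the client.
svc = CatalogService({"2MASS": [{"ra": 10.0, "dec": 41.2, "mag_k": 9.7}]})
for name in svc.list_catalogs():
    print(name, svc.list_attributes(name))
    print(svc.query(name, lambda r: r["mag_k"] < 12))
```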
VO and International • Of course there are many astronomy data collections in the UK, Europe, Japan, etc. • We intend to collaborate/coordinate with international efforts
LIGO: Laser Interferometer Gravitational-wave Observatory
Listening to collisions of black holes and neutron stars
Grid LIGO Architecture
Objective: add security and wide-area data services. LDAS is the LIGO Data Analysis System; GriPhyN supplies the grid components.
[Architecture diagram; its elements include:]
• Clients (e.g. web, script, agent) send a virtual data request to the Request Manager (Globus RPC), which passes a text request through the Gatekeeper (GRAM) to LDAS
• LDAS: local disk, science algorithms, software collaboratory, parallel computing, Condor jobs
• Data movement via GridFTP: HPSS, local disk, other LDAS installations
• Catalogs: Replica Catalog and Replica Management, Transformation Catalog, Virtual Data Catalog
Grid LIGO Logic
• VD request goes to the Request Manager
  • XML, key-value, script? Action: build a document schema for this
• Is the result already in the data caches?
  • If yes, get it from the caches and apply simple transformations
  • Else build an LDAS script:
    • Move the relevant data to LDAS local disk
    • Queue the job to LDAS, get a jobID
    • Pass updates, then a result pointer, back to the user
    • Perhaps cache the result
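The cache-or-compute decision can be summarized as below; the helper functions (cache lookup, LDAS script building, data staging, job queuing) are hypothetical stand-ins rather than real LDAS or Globus APIs.

```python
# Hedged sketch of the Request Manager decision logic described above; the
# helpers are illustrative stubs, not real LDAS/Globus calls.

def handle_vd_request(request, cache, ldas):
    """Serve a virtual-data request from cache if possible, else run on LDAS."""
    cached = cache.get(request["key"])
    if cached is not None:
        # already in the data caches: a simple transformation is enough
        return {"status": "done", "result": transform(cached, request)}

    script = build_ldas_script(request)        # else build an LDAS script
    stage_data_to_ldas_disk(request, ldas)     # move relevant data to local disk
    job_id = ldas.queue(script)                # queue job, get jobID
    return {"status": "queued", "jobID": job_id}   # updates / result pointer follow

def transform(data, request):
    return data                                # placeholder "simple transformation"

def build_ldas_script(request):
    return f"# LDAS script for {request['key']}"

def stage_data_to_ldas_disk(request, ldas):
    pass                                       # would use GridFTP in practice

class FakeLDAS:
    def queue(self, script):
        return "job-001"

print(handle_vd_request({"key": "segment-42"}, cache={}, ldas=FakeLDAS()))
```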
International GW Grid
[Map of the international gravitational-wave detector network; GEO: 50 kbyte/sec]
International GW Grid
• Coincidence in LIGO-Virgo data
• Not astrophysics data yet; try seismic and electromagnetic data for now
• Currently using rsync, ~12 kbyte/sec each way
[Diagram: streams from LHO, LLO, and Virgo (seismic, electromagnetic) flow to Caltech, merging 3 streams to 1; signals are correlated in LDAS to find coincidence events]
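The coincidence step amounts to asking whether triggers from different sites agree in time within some window. The sketch below uses a 50 ms window and bare trigger times purely for illustration; the real correlation is done in LDAS.

```python
# Minimal time-coincidence sketch for trigger streams from two sites.
# The 50 ms window and the bare trigger-time format are assumptions chosen
# for illustration only.

def find_coincidences(events_a, events_b, window_s=0.050):
    """Return pairs of trigger times (one per site) that agree within window_s."""
    events_a = sorted(events_a)
    events_b = sorted(events_b)
    pairs, j = [], 0
    for t_a in events_a:
        while j < len(events_b) and events_b[j] < t_a - window_s:
            j += 1                      # drop B triggers too early to ever match
        k = j
        while k < len(events_b) and events_b[k] <= t_a + window_s:
            pairs.append((t_a, events_b[k]))
            k += 1
    return pairs

lho = [100.000, 250.120, 300.000]       # trigger times (s) at LHO
virgo = [100.031, 251.000]              # trigger times (s) at Virgo
print(find_coincidences(lho, virgo))    # -> [(100.0, 100.031)]
```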