
TeraGrid Arch Meeting RP Update: ORNL



Presentation Transcript


  1. TeraGrid Arch Meeting RP Update: ORNL
  January 15, 2008 (MLK + 80y)
  John W. Cobb

  2. Outline
  • SNS Status
  • Neutron Science Portal
  • Data Storage Experience
  • Wider TeraGrid integration with Neutron Science Portal
  • NSTG Cluster Operations
  • Move the data: support the experiment: connect to CI

  3. SNS Status
  • Ongoing instrument run times (SNS and HFIR)
  • New instruments in commissioning, joining the user program (Beamline: Name)
    • 2: Backscattering Spectrometer (BASIS)
    • 3*: Spallation Neutrons and Pressure Diffractometer (SNAP)
    • 4A: Magnetism Reflectometer (MR)
    • 4B: Liquids Reflectometer (LR)
    • 5*: Cold Neutron Chopper Spectrometer (CNCS)
    • 18*: Wide Angular-Range Chopper Spectrometer (ARCS)
  • Instruments in commissioning (shutters opened)
    • 6: Extended Q-Range Small-Angle Neutron Scattering Diffractometer (EQ-SANS)
    • 11A: Powder Diffractometer (POWGEN)
    • 17: Fine-Resolution Fermi Chopper Spectrometer (SEQUOIA)
  • More in the pipeline (10 more at last count) – “shovel ready”

  4. Neutron Science Portal
  • http://neutrons.ornl.gov/portal/ (SNS with NSTG)
  • Holiday success: over the break, only 3 files failed to automatically move from data acquisition to the portal archive. “I used the portal and it really works.”
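The automatic data movement described above can be sketched as a simple migration loop with retries. This is a hypothetical illustration, not the portal's actual implementation; the function name, retry count, and use of `shutil.copy2` as the mover are assumptions.

```python
# Hypothetical sketch of automated data movement from a data-acquisition
# drop directory to a portal archive directory. Transient failures are
# retried; files that still fail are reported (cf. "only 3 files failed").
import shutil
from pathlib import Path

def migrate_files(src_dir, dst_dir, mover=shutil.copy2, retries=3):
    """Try to move every file in src_dir to dst_dir.

    Returns the names of files that failed after all retries.
    """
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    failed = []
    for path in sorted(Path(src_dir).iterdir()):
        if not path.is_file():
            continue
        for _attempt in range(retries):
            try:
                mover(str(path), str(dst / path.name))
                break  # success: stop retrying this file
            except OSError:
                continue  # transient error: retry
        else:
            failed.append(path.name)  # exhausted all retries
    return failed
```

A production version would additionally verify checksums and log each failure for operator follow-up.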

  5. Data Storage Experience

  6. Wider TeraGrid integration with NS Portal
  • DOE SBIR with Tech-X Corp. (Mark Green): thick-client portal and “virtual file system” for remote replication of SNS data (distributed-storage and disconnected use cases)
  • Examining SNS data replication on archive storage targets; SULI semester student David Speirs is also working here
  • Jimmy Neutron community account continues in use
  • TG distributed jobs
    • Simulation tab
    • Generalized fitting service (DAKOTA)
    • Distributed data reduction

  7. NSTG Cluster Operations
  • GridFTP:
    • We are moving to unstriped GridFTP because of “adverse interactions” between GridFTP, PVFS, and local hardware – not completely specified
    • Working to implement, as planned, an enhancement path to support 10 Gb/s transfers, especially SNS<>NSTG<>TG at large
    • Will remain unstriped until then
  • ESG: new local ESG node in the TeraGrid enclave at ORNL – undergoing installation now
  • Planning to mount wide-area Lustre file systems
  • General NSTG cluster usage: “canary in the coal mine”
    • Interesting large-job-count experience in December
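The striped-versus-unstriped distinction above can be made concrete by how a `globus-url-copy` invocation is assembled. The flags shown (`-p` for parallel TCP streams, `-tcp-bs` for TCP buffer size, `-stripe` to enable striped transfers) are standard `globus-url-copy` options; the helper function and its defaults are illustrative assumptions, not NSTG's actual transfer script.

```python
# Sketch: assembling a globus-url-copy command line. Dropping -stripe
# gives the unstriped mode the slide describes; parallel streams can
# still be used for throughput on a single data node.
def gridftp_cmd(src_url, dst_url, parallel=4, tcp_bs="16M", striped=False):
    cmd = ["globus-url-copy", "-p", str(parallel), "-tcp-bs", tcp_bs]
    if striped:
        cmd.append("-stripe")  # striped transfer across multiple data nodes
    cmd += [src_url, dst_url]
    return cmd
```

For example, `gridftp_cmd("gsiftp://source/file", "gsiftp://dest/file")` builds an unstriped transfer with 4 parallel streams; passing `striped=True` re-enables striping once the PVFS interaction is resolved.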

  8. StageSub (Data Butler) Update
  • See the related file under this agenda on the TG wiki
  • Idea: for submitted jobs, separate data movement (stage-in/out) from execution
    • Dedicated (privileged) job queue for data-transfer jobs to/from scratch
  • Utility (users):
    • Don’t waste allocation during job execution waiting for data movement, and/or
    • Don’t risk a data purge before the data gets stored and/or the job gets scheduled
  • Utility (centers):
    • Increased job throughput
    • Increased “effective” scratch storage
    • Ability to manage data-storage bandwidth
  • How:
    • Batch-script directives
    • Parser to submit munged scripts to compute and data queues
    • “Watcher” to coordinate progress between compute and data queues
    • Multiple file-movement tools, including standard center tools as well as archive and wide-area tools, including P2P tools (FreeLoader)
  • Current deployment (and future deployment)
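The parser step above can be sketched as a function that splits a "munged" batch script into its stage-in, compute, and stage-out pieces, so the data parts can be submitted to the dedicated data queue. The `#STAGE` directive syntax here is invented for illustration; the real StageSub directives may differ.

```python
# Hypothetical sketch of the StageSub parser: separate data-movement
# directives from the compute script so each can go to its own queue.
def split_stage_directives(script_text):
    """Split a batch script into stage-in, compute, and stage-out parts."""
    stage_in, stage_out, compute = [], [], []
    for line in script_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("#STAGE in "):
            stage_in.append(stripped[len("#STAGE in "):])
        elif stripped.startswith("#STAGE out "):
            stage_out.append(stripped[len("#STAGE out "):])
        else:
            compute.append(line)  # everything else stays in the compute job
    return {"stage_in": stage_in,
            "compute": "\n".join(compute),
            "stage_out": stage_out}
```

The "watcher" would then submit the stage-in list to the data queue, hold the compute job until stage-in completes, and submit stage-out after the compute job finishes, so allocation is never burned waiting on transfers.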
