
Research Infrastructure



Presentation Transcript


  1. Research Infrastructure. Simon Hood, Research Infrastructure Coordinator, its-ri-team@manchester.ac.uk

  2. Campus CIR Ecosystem (architecture diagram). Access is split between on-campus and off-campus routes, controlled by router ACLs (130.88.99.0/27) and firewalls. Key components:
  • RDS (Isilon storage): home dirs and shared areas, “mounted” across the ecosystem.
  • Compute clusters: CSF and RQ (Redqueen), each with a job queue and backend compute nodes; iCSF; and Zreck, with backend ET nodes (FPGA, GPU, Phi).
  • RVMS: backend LVS nodes (VMs, GPUs) hosting research VMs on the campus-only 10.99.0.0/16 network; MIB? MHS? Materials (EPS)?
  • Access routes: command line via SSH and SSHFS (see the connection sketch below), and the Research Virtual Desktop Service via NX and X2GO.
  • Network: 20 Gb/s links, including to Michael Smith (FLS).
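The slide above gives the access architecture only in diagram form. As a minimal sketch of how an off-campus user might reach a cluster login node through the campus SSH gateway, here is a Python example using paramiko; the hostnames and username are hypothetical placeholders rather than the real service names, and it assumes SSH keys or an ssh-agent are already configured.

    import paramiko

    # Hypothetical hostnames and username -- placeholders, not the real
    # service names; assumes SSH keys or an ssh-agent are already set up.
    GATEWAY = "ssh-gateway.example.manchester.ac.uk"   # campus SSH gateway
    CSF_LOGIN = "csf-login.example.manchester.ac.uk"   # CSF login node
    USER = "mabcxyz1"

    # Off campus, connect to the gateway first: the router ACLs / firewall
    # expose only the gateway, not the cluster login nodes, to the outside.
    gw = paramiko.SSHClient()
    gw.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    gw.connect(GATEWAY, username=USER)

    # Open a channel from the gateway to the CSF login node (manual ProxyJump).
    channel = gw.get_transport().open_channel(
        "direct-tcpip", (CSF_LOGIN, 22), ("127.0.0.1", 0))

    # Connect to CSF through that channel and run a command there.
    csf = paramiko.SSHClient()
    csf.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    csf.connect(CSF_LOGIN, username=USER, sock=channel)

    _, stdout, _ = csf.exec_command("qstat -u $USER")  # e.g. check the job queue
    print(stdout.read().decode())

    csf.close()
    gw.close()

From on campus the gateway hop is unnecessary and the login node can be reached directly; the same applies to the command-line SSH and SSHFS routes shown in the diagram.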

  3. Ecosystem Workflow (diagram linking RVDS, RVMS, RDS, CSF and iCSF via SSH and SSHFS; a hedged sketch of steps 1–3 follows below):
  • 1. Input preparation. E.g. upload data to RDS and set job parameters in the application GUI, in the office on campus.
  • 2. Submit compute job. E.g. a long-running, parallel, high-memory heat/stress analysis, submitted from home.
  • 3. Check on the compute job. E.g. while away at a conference in Barcelona; submit other jobs.
  • 4. Analyse results. E.g. in the application GUI on a laptop in the hotel and back in the office.
  • 5. Publish results. E.g. a front-end web server running on RVMS accessing the Isilon share.
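As an illustration of steps 1–3 of this workflow, here is a minimal sketch of uploading input data and submitting a batch job over SSH using Python and paramiko. The hostname, username, paths and job-script name are hypothetical, and it assumes the CSF's SGE-style qsub/qstat commands and working SSH keys.

    import paramiko

    # Hypothetical login node, username and paths -- adjust to your own setup;
    # assumes SSH keys or an ssh-agent are already configured.
    CSF_LOGIN = "csf-login.example.manchester.ac.uk"
    USER = "mabcxyz1"

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(CSF_LOGIN, username=USER)

    # Step 1: input preparation -- upload the data (e.g. to an RDS-backed home dir).
    sftp = client.open_sftp()
    sftp.put("heat_stress_input.dat", "scratch/heat_stress_input.dat")
    sftp.close()

    # Step 2: submit the long-running parallel job to the batch system
    # (qsub takes a job script; heat_stress.sge is a hypothetical example).
    _, stdout, _ = client.exec_command("cd scratch && qsub heat_stress.sge")
    print(stdout.read().decode())   # scheduler reports the new job ID

    # Step 3: check on the job later, from anywhere with SSH access.
    _, stdout, _ = client.exec_command("qstat -u " + USER)
    print(stdout.read().decode())

    client.close()

Steps 4 and 5 then reuse the same storage: results land on RDS, where they can be analysed through the application GUI or served from a web front end on RVMS.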

  4. CIR Stats
  • CSF: £1.3m academic contribution since Dec 2010; 5000 CPU cores; £175k more lined up (Jan 2014?); awaiting outcome of some big bids… Kilburn???
  • Storage – Isilon: 500 PB per year; current: 120 TB for each faculty – going fast!
  • Network: July: £1.5m on Cisco kit; 80 Gb core, 10 Gb to buildings.
  • People: Pen, George, Simon.

  5. Recent and Current Work
  • Redqueen: summer: RGF-funded refresh of 50% of the cluster; integration with Isilon (RDN).
  • CSF (mostly batch compute): summer: £300k procurement; RDN: moving all home dirs to Isilon (keeping local Lustre-based scratch).
  • Gateways: SSH and SSHFS/SFTP; Research Virtual Desktop Service: NX, X2GO (see the SSHFS sketch after this list).
  • New clusters: Incline/iCSF: interactive compute; Zreck: GPGPUs, Xeon Phi, FPGA, …
  • RDN (20 Gb): CSF, Redqueen, Incline/iCSF, Zreck, Michael Smith (FLS).
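For the SSHFS/SFTP gateway mentioned above, the following is a rough sketch of mounting an RDS share as a local directory by driving the sshfs command from Python. The gateway hostname and share path are hypothetical, and it assumes sshfs (FUSE) is installed and SSH keys are configured.

    import subprocess
    from pathlib import Path

    # Hypothetical gateway name and export path -- the real SSHFS gateway and
    # RDS/Isilon share path will differ.
    USER = "mabcxyz1"
    REMOTE = f"{USER}@sshfs-gateway.example.manchester.ac.uk:/rds/project-share"
    MOUNTPOINT = Path.home() / "rds"

    MOUNTPOINT.mkdir(exist_ok=True)

    # Mount the share: files on RDS then appear as ordinary local files.
    subprocess.run(["sshfs", REMOTE, str(MOUNTPOINT)], check=True)

    # ... read and write data under ~/rds as if it were local ...

    # Unmount when finished (Linux; on macOS use "umount" instead).
    subprocess.run(["fusermount", "-u", str(MOUNTPOINT)], check=True)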

  6. Thank you! Email questions to me: Simon.Hood@manchester.ac.uk
