Research Infrastructure
Simon Hood, Research Infrastructure Coordinator
its-ri-team@manchester.ac.uk
Campus CIR Ecosystem
[Architecture diagram: on-campus and off-campus access routes into the CIR ecosystem]
• Access control: router ACLs (130.88.99.0/27) and firewalls; backend network is campus-only (10.99.0.0/16)
• Storage: RDS (Isilon) — home dirs and shared areas, "mounted" by the clusters over 20 Gb/s links
• Clusters: CSF and Redqueen (RQ) — job queues with backend compute nodes; iCSF — backend LVS nodes (VMs, GPUs); Zreck — backend ET nodes (FPGA, GPU, Phi)
• Access: command line via SSH and SSHFS; Research Virtual Desktop Service via NX and X2GO
• RVMS: research VMs — Michael Smith (FLS), MIB?, MHS?, Materials (EPS)?
Ecosystem Workflow
[Workflow diagram: RVDS, RVMS, CSF, iCSF and RDS, connected via SSH and SSHFS]
1. Input preparation. EG: upload data to RDS, set job parameters in application GUI, in office on campus.
2. Submit compute job. EG: long-running parallel high-memory heat/stress analysis from home.
3. Check on compute job. EG: while away at a conference in Barcelona. Submit other jobs.
4. Analyse results. EG: in application GUI on laptop in hotel & back in office.
5. Publish results. EG: front-end web server running on RVMS accessing Isilon share.
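As an illustration of step 2, a job on a batch cluster such as the CSF would typically be described by a scheduler job script and submitted from a login node. This is a hedged sketch only: the parallel environment name (`smp.pe`), the memory resource flag, and the `heatsim` binary are assumptions for illustration, not details taken from the slides.

```shell
#!/bin/bash
# heatsim.sge -- illustrative SGE job script (names/flags are assumptions)
#$ -cwd            # run the job from the submission directory
#$ -pe smp.pe 16   # request a 16-core shared-memory parallel slot
#$ -l mem512       # request a high-memory node
./heatsim --input params.dat --output results.dat
```

Such a script would be submitted with `qsub heatsim.sge` and checked later (e.g. from a conference, over SSH) with `qstat`, matching steps 2 and 3 above.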
CIR Stats
• CSF
  • £1.3m academic contribution since Dec 2010
  • 5000 CPU cores
  • £175k more lined up (Jan 2014?)
  • Awaiting outcome of some big bids… Kilburn???
• Storage – Isilon
  • 500 PB per year
  • Current: 120 TB for each faculty – going fast!
• Network
  • July: £1.5m on Cisco kit
  • 80 Gb core, 10 Gb to buildings
• People
  • Pen, George, Simon
Recent and Current Work
• Redqueen
  • Summer: RGF-funded refresh of 50% of the cluster
  • Integration with Isilon (RDN)
• CSF (mostly batch compute)
  • Summer: £300k procurement
  • RDN: moving all home dirs to Isilon (keeping local Lustre-based scratch)
• Gateways
  • SSH and SSHFS/SFTP
  • Research Virtual Desktop Service: NX, X2GO
• New clusters
  • Incline/iCSF: interactive compute
  • Zreck: GPGPUs, Xeon Phi, FPGA, …
• RDN (20 Gb)
  • CSF, Redqueen, Incline/iCSF, Zreck
  • Michael Smith (FLS)
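The SSHFS gateway mentioned above lets researchers see RDS storage as a local directory on their own machine. The following is an illustrative transcript only; the gateway hostname and share path are assumptions, not values from the slides.

```shell
# Illustrative only: hostname and remote path are placeholders.
mkdir -p ~/rds
sshfs username@gateway.example.manchester.ac.uk:/shared/project ~/rds
ls ~/rds             # RDS shared area now browsable as local files
fusermount -u ~/rds  # unmount when finished
```

The same gateway also serves plain SSH and SFTP sessions, so the choice between mounting and copying is left to the user.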
Thank you! Email questions to me: Simon.Hood@manchester.ac.uk