White Rose Grid Infrastructure: Overview
Chris Cartledge, Deputy Director, Corporate Information and Computing Services, The University of Sheffield
C.Cartledge@sheffield.ac.uk, +44 114 222 3008
Contents
- History
- Web site
- Current computation capabilities
- Planned machines
- Usage
- YHMAN
- Grid capabilities
- Contacts
- Training
- FEC, futures
White Rose Grid History
- 2001: SRIF opportunity, joint procurement; Leeds led (Peter Dew, Joanna Schmidt)
- 3 clusters of Sun SPARC systems running Solaris:
  - Leeds, Maxima: one 6800 (20 processors), 4 * V880 (8 processors each)
  - Sheffield, Titania: 10 (later 11) * V880 (8 processors each)
  - York, Pascali: one 6800 (20 processors); Fimbrata: one V880
- 1 Linux cluster: Leeds, Snowdon: 292 CPUs, 2.2 and 2.4 GHz Intel Xeon, Myrinet interconnect
White Rose Grid History (continued)
- Joint working to enable use across sites, but heterogeneous: a range of systems
  - each system primarily meets local needs
  - up to 25% of capacity for users from the other sites
- Key services in common:
  - Sun Grid Engine to control work in the clusters
  - Globus to link the clusters
  - registration
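To illustrate how Sun Grid Engine controls work on clusters like these, a minimal batch job script might look like the sketch below. The job name, parallel environment name, and resource values are hypothetical examples, not taken from any actual White Rose Grid configuration:

```sh
#!/bin/sh
# Hypothetical SGE batch script: scheduler directives begin with "#$".
#$ -N wrg_example          # job name (example only)
#$ -cwd                    # run from the submission directory
#$ -l h_rt=01:00:00        # request one hour of wall-clock time
#$ -pe mpi 8               # example parallel environment with 8 slots

echo "Running on $(hostname) with $NSLOTS slots"
```

Such a script would be submitted with `qsub` and monitored with `qstat`; the available parallel environments, queues, and resource limits depend on each site's own SGE configuration.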
WRG Web Site
- There is a shared web site: http://www.wrgrid.org.uk/
- Linked to/from local sites
- Covers other related projects and resources:
  - e-Science Centre of Excellence
  - Leeds: SAN and specialist graphics equipment
  - Sheffield: ppGrid node
  - York: UKLight work
Current Facilities: Leeds
- Everest: supplied by Sun/Streamline
- Dual-core Opterons: power and space efficient
- 404 CPU cores, 920GB memory
- 64-bit Linux (SuSE 9.3) OS
- Low-latency Myrinet interconnect
- 7 * 8-way nodes (4 chips with 2 cores each), 32GB
- 64 * 4-way nodes (2 chips with 2 cores each), 8GB
Leeds (continued)
- SGE, Globus/GSI
- Intel, GNU, PGI compilers
- Shared-memory and Myrinet MPI
- NAG, FFTW, BLAS, LAPACK, etc. libraries
- 32- and 64-bit software versions
Maxima transition
- Maintenance to June 2006, expensive
- Need to move all home directories to the SAN
- Users can still use it, but "at risk"
Snowdon transition
- Maintenance until June 2007
- Home directories already on the SAN
- Users encouraged to move
Sheffield
- Iceberg: supplied by Sun Microsystems/Streamline
- 160 * 2.4GHz AMD Opteron (PC technology) processors
- 64-bit Scientific Linux (Red Hat based)
- 20 * 4-way nodes, 16GB, fast Myrinet for parallel/large jobs
- 40 * 2-way nodes, 4GB, for high throughput
- GNU and Portland Group compilers, NAG
- Sun Grid Engine (6), MPI, OpenMP, Globus
- Abaqus, Ansys, Fluent, Maple, Matlab
Also at Sheffield
- GridPP (Particle Physics Grid) node
- 160 * 2.4GHz AMD Opteron: 80 * 2-way nodes, 4GB
- 32-bit Scientific Linux, ppGrid stack
- 2nd most productive node; very successful!
Popular! (Sheffield)
- Lots of users: 827 (White Rose: 37)
- Utilisation high:
  - since installation: 40%
  - last 3 months: 80% (White Rose: 26%)
York
- £205k from SRIF 3:
  - £100k computing systems
  - £50k storage system
  - remainder: ancillary equipment, contingency
- Shortlist agreed(?) - for June
- Compute: possibly 80-100 cores, Opteron
- Storage: possibly 10TB
Other Resources: YHMAN
- Leased fibre, 2Gb/s performance
- Wide-area MetroLAN
- UKLight
- Archiving
- Disaster recovery
Grid Resources
- Queuing: Sun Grid Engine (6)
- Globus Toolkit 2.4 is installed and working
  - issue with GSI-SSH on the 64-bit OS (ancient Globus Toolkit)
- Globus 4 being evaluated
- Storage Resource Broker being worked on
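For readers unfamiliar with the Globus side, a typical Globus Toolkit 2.x session combines a GSI proxy credential with remote job submission and GSI-SSH login. A hedged sketch, where the host name is a placeholder rather than an actual White Rose Grid machine:

```sh
# Create a short-lived GSI proxy from your grid certificate
grid-proxy-init

# Run a simple command on a remote cluster via GRAM
# (grid.example.ac.uk is a placeholder host, not a real WRG node)
globus-job-run grid.example.ac.uk /bin/hostname

# Log in interactively over GSI-SSH
gsissh grid.example.ac.uk
```

These commands require a valid grid certificate and a configured Globus installation, so they are illustrative only; the GSI-SSH step is the one affected by the 64-bit issue noted above.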
Training
- Available across the White Rose universities
- Sheffield: RTP - 4 units, 5 credits each:
  - High Performance and Grid Computing
  - Programming and Application Development for Computational Grids
  - Techniques for High Performance Computing including Distributed Computing
  - Grid Computing and Application Development
Contacts
- Leeds: Joanna Schmidt, j.g.schmidt@leeds.ac.uk, +44 (0)113 34 35375
- Sheffield: Michael Griffiths or Peter Tillotson, m.griffiths@sheffield.ac.uk, p.tillotson@sheffield.ac.uk, +44 (0)114 2221126, +44 (0)114 2223039
- York: Aaron Turner, aaron@cs.york.ac.uk, +44 (0)190 4567708
Futures
- FEC will have an impact:
  - can we maintain 25% use from other sites?
  - how can we fund continuing Grid work?
- Different funding models are a challenge:
  - Leeds: departmental shares
  - Sheffield: unmetered service
  - York: based in Computer Science
- Relationship opportunities: NGS, WUN, the region, suppliers?
Achievements
- White Rose Grid: not hardware, services
- People(!): familiar with working on the Grid
- Experience of working as a virtual organisation
- Intellectual property in training
- Success:
  - research
  - engaging with industry
  - solving users' problems