Cyberinfrastructure and the Transformation of Science, Education and Engineering
Tony Hey, Director of UK e-Science Core Programme, Tony.Hey@epsrc.ac.uk
J. C. R. Licklider’s Vision
“Lick had this concept of the intergalactic network which he believed was everybody could use computers anywhere and get at data anywhere in the world. He didn’t envision the number of computers we have today by any means, but he had the same concept – all of the stuff linked together throughout the world, that you can use a remote computer, get data from a remote computer, or use lots of computers in your job. The vision was really Lick’s originally.”
– Larry Roberts, Principal Architect of the ARPANET
A Definition of e-Science
‘e-Science is about global collaboration in key areas of science, and the next generation of infrastructure that will enable it.’
John Taylor, Director General of Research Councils, Office of Science and Technology
• The purpose of the e-Science initiative is to allow scientists to do research that is faster, different and better
The e-Science Paradigm • The Integrative Biology Project involves the University of Oxford (and others) in the UK and the University of Auckland in New Zealand • Models of electrical behaviour of heart cells developed by Denis Noble’s team in Oxford • Mechanical models of beating heart developed in Auckland • Need to be able to build a ‘Virtual Organisation’ allowing routine access for researchers to specific resources in the UK and New Zealand
e-Infrastructure/Cyberinfrastructure for Research
[Diagram: a common fabric of generic services linking Group A and Group B, each with their own private resources]
Grids in Education • Education is a classic distributed organization • New multi-disciplinary curricula require distributed experts interacting with mentors and students • Education requires rich integration of data sources with people and computing • Grids will ‘democratize’ resources, enabling universal and ubiquitous access • Learning Management systems such as WebCT, Blackboard, Placeware, WebEx and Groove all have natural Grid implementations
[Diagram: overlapping, heterogeneous, dynamic ‘Grid islands’: an Enterprise Grid (resources R1, R2), a Campus Grid, a Compute Grid, an Information Grid and a Training Grid, linking a teacher and students through dynamic, light-weight peer-to-peer collaboration]
Grid Middleware Infrastructure • A global e-Science infrastructure must support the genuine needs of users and applications • To support ‘routine’ collaboration between institutions in different countries we need a set of robust middleware services and agreed ‘policies’ • Require global Authentication, Authorization and Accounting (AAA) services • Need robust and secure middleware services supported on top of network services
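The three AAA steps above can be sketched in miniature. This is an illustrative sketch only, not any real Grid middleware API; the users, resources and policy table are all hypothetical, and a real deployment would verify X.509 certificate chains rather than look up strings.

```python
# Illustrative AAA sketch: authenticate a credential, authorize it against a
# policy, and record the decision for accounting. All names are hypothetical.
from datetime import datetime, timezone

# Authentication: map a presented credential to an identity.
KNOWN_USERS = {"cert:CN=Alice,O=Oxford": "alice"}

# Authorization: per-resource access policy.
POLICY = {"nz-heart-model": {"alice"}}

# Accounting: an audit trail of every access decision.
ACCOUNTING_LOG = []

def aaa_check(credential, resource):
    user = KNOWN_USERS.get(credential)                               # authenticate
    allowed = user is not None and user in POLICY.get(resource, set())  # authorize
    ACCOUNTING_LOG.append({                                          # account
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user, "resource": resource, "allowed": allowed,
    })
    return allowed

print(aaa_check("cert:CN=Alice,O=Oxford", "nz-heart-model"))  # True
print(aaa_check("cert:CN=Mallory,O=Evil", "nz-heart-model"))  # False
```

The point of the sketch is the separation of concerns: a Virtual Organisation of the kind described in the Integrative Biology example needs all three steps agreed between institutions before ‘routine’ cross-border access becomes possible.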
Global Terabit Research Network The Grid software and resources run on top of high performance global networks
UK e-Science Funding
First Phase: 2001–2004
• Application Projects: £74M, all areas of science and engineering
• Core Programme: £15M OST + £20M DTI, collaborative industrial projects
Second Phase: 2003–2006
• Application Projects: £96M, all areas of science and engineering
• Core Programme: £16M OST + DTI Technology Fund
GridPP: Presentation to PPARC Grid Steering Committee, 26 July 2001 (Steve Lloyd, Tony Doyle, John Gordon)
Powering the Virtual Universe http://www.astrogrid.ac.uk (Edinburgh, Belfast, Cambridge, Leicester, London, Manchester, RAL)
Multi-wavelength images showing the jet in M87, from top to bottom: Chandra X-ray, HST optical, Gemini mid-IR, VLA radio. AstroGrid will provide advanced, Grid-based federation and data-mining tools to facilitate better and faster scientific output.
Picture credits: “NASA / Chandra X-ray Observatory / Herman Marshall (MIT)”, “NASA / HST / Eric Perlman (UMBC)”, “Gemini Observatory / OSCIR”, “VLA / NSF / Eric Perlman (UMBC) / Fang Zhou, Biretta (STScI) / F Owen (NRA)”
Comb-e-Chem Project
[Diagram: Grid middleware linking video, simulation, properties analysis, a structures database, a diffractometer, an X-ray e-Lab and a properties e-Lab]
DAME Project
[Diagram: in-flight engine data passes over a global network (e.g. SITA) to a ground station, and on to the airline, the DS&S Engine Health Center, the maintenance centre and a data centre, with alerts delivered by Internet, e-mail and pager]
Computational science • Molecular dynamics • Mesoscale modelling • High throughput experiments • High performance visualization • Computational steering • Terascale parallel computing
myGrid Project • Imminent ‘deluge’ of data • Highly heterogeneous • Highly complex and inter-related • Convergence of data and literature archives
Discovery Net Project: Nucleotide Annotation Workflows
Download sequence from a Reference Server; execute a distributed annotation workflow over KEGG, InterPro, SMART, SWISS-PROT, EMBL, NCBI, TIGR, SNP and GO; save to a Distributed Annotation Server; Interactive Editor & Visualisation
• Previously: 1800 clicks, 500 Web accesses, 200 copy/paste operations – 3 weeks of work
• Now: 1 workflow, with execution in a few seconds
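The gain above comes from composing the manual steps into one executable pipeline. The sketch below illustrates that idea only; the step functions are hypothetical stand-ins, not the Discovery Net API, and the returned data is fabricated placeholder content.

```python
# Hypothetical sketch of an annotation workflow in the spirit of Discovery Net:
# download -> annotate against several databases -> save. Stand-in functions only.

def download_sequence(accession):
    # Stand-in for fetching a nucleotide sequence from a reference server.
    return {"accession": accession, "sequence": "ATGGCC..."}

def annotate(record, databases):
    # Stand-in for the distributed annotation step across many databases.
    record["annotations"] = {db: f"hits from {db}" for db in databases}
    return record

def save_to_das(record):
    # Stand-in for writing results to a Distributed Annotation Server.
    return f"saved {record['accession']} with {len(record['annotations'])} annotation sets"

def annotation_workflow(accession):
    # One composed workflow replaces weeks of clicking and copy/paste.
    databases = ["KEGG", "InterPro", "SMART", "SWISS-PROT", "EMBL"]
    return save_to_das(annotate(download_sequence(accession), databases))

print(annotation_workflow("AB123456"))
```

Once the steps are captured as a workflow, re-running the whole annotation for a new sequence is a single call rather than three weeks of manual effort.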
BioSim Grid
Spans molecular, cellular and organism scales (Sansom et al. (2000) Trends Biochem. Sci. 25:368; Noble (2002) Nature Rev. Mol. Cell. Biol. 3:460)
• A non-trivial e-Science challenge
• NASA IPG as a possible paradigm
• Models need to be integrated rigorously if they are to deliver accurate, and hence biomedically useful, results
GENIE: Delivering e-Science to the environmental scientist http://www.genie.ac.uk/
MIAS Devices Project • Easy plug-and-play of sensors • Wireless connection using 802.11 • Positioning information from GPS • Mobile medical technologies on a distributed Grid [Diagram labels: sensor bus, GPS aerial]
eDiamond: Applications of SMF
• Teleradiology and QC
• VirtualMammo: training and differential diagnosis
• “Find one like it”?
• Advanced CAD: SMF-CAD workstation
• Epidemiology: SMF-computed breast density
e-Science Core Program (CP) Overall Rationale: Four major functions of CP • Assist development of essential, well-engineered, generic, Grid middleware usable by both e-scientists and industry • Provide necessary infrastructure support for UK e-Science Research Council projects • Collaborate with the international e-Science and Grid communities • Work with UK industry to develop industrial-strength Grid middleware
Support for e-Science Projects • Grid Support Centre in operation • supported Grid middleware & users • see www.grid-support.ac.uk • National e-Science Institute • Research Seminars • Training Programme • See www.nesc.ac.uk • e-Science Certificate Authority • Issue digital certificates for projects • Goal is ‘single sign-on'
UK e-Science Grid
[Map: centres at Edinburgh, Glasgow, DL, Newcastle, Belfast, Manchester, Cambridge, Oxford, Hinxton, RAL, Cardiff, London, Southampton]
Access Grid – Group Conferencing
• Multi-site group-to-group conferencing system
• Continuous audio and video contact with all participants
• Globally deployed
• All UK e-Science Centres have AG rooms
• Widely used for technical and management meetings
e-Science Centres of Excellence • Birmingham/Warwick – Modelling • Bristol – Media • UCL – Networking • Leeds, York, Sheffield - White Rose Grid • Lancaster – Social Science • Leicester – Astronomy • Reading - Environment
UK e-Science Grid: Second Phase OGSA Grid
[Map: Edinburgh, Glasgow, DL, Newcastle, Belfast, Manchester, Cambridge, Oxford, RL, Hinxton, Cardiff, London, Soton]
Networking Research Projects
• GRS: GRID resource management
• FutureGRID: P2P architecture
• GridMcast: multicast-enabled data distribution
• MB-NG: QoS features
• GRIDprobe: passive backbone monitoring at 10 Gbps
(spanning the GRID, service and network infrastructure layers)
UKLight • Infrastructure to support the UK part of an international facility for network R&D, the Global Lambda Integrated Facility • e.g. international Gigabit Ethernet ‘channels’ • An example of the kind of infrastructure which might be incorporated into SuperJANET5 • Implemented with: • Telco wavelength services • Multiplexed channels (SDH) to carry Gigabit Ethernet
UKLight – showing connections to selected international peer facilities
[Diagram: UKLight London linked to StarLight Chicago, NetherLight Amsterdam, CzechLight, GEANT, CERN, Abilene and CA*net over 2.5–10 Gb/s links; existing and proposed connections join UK researchers and local research equipment to the international points of access via the extended JANET Development Network]
The UK Grid Experience: Phase 1 • UK Programme on Grids for e-Science • £75M for e-Science Applications • UK Grid Core Programme for Industry • £35M for collaborative industrial R&D • Over 80 UK companies participating • Over £30M industrial contributions • Engineering, Pharmaceutical, Petrochemical • IT companies, Commerce, Media
e-Science CP – Next Steps Deploy ‘production National Grid Service’ based on four dedicated JISC nodes plus the two UK Supercomputer Facilities • Develop operational policies, security, … • Gain experience with genuine user community Develop ‘OGSA’ based e-Science Grid • Based on two OGSA Grid projects and e-Science Centres • Work with EGEE project
Research Prototype Middleware to Production Quality • Research projects are not funded to do the regression testing, configuration and QA required to produce production quality middleware • Common rule of thumb (Brooks) is that it requires at least 10 times more effort to take ‘proof of concept’ research software to production quality • Key issue for UK e-Science projects is to ensure that there is some documented, maintainable, robust grid middleware by the end of the 5 year £250M initiative
Open Grid Services Architecture • Development of Web Services • OGSA and WSRF will provide naming / authorization / security / privacy / … • Projects looking at higher-level services: workflow, transactions, data mining, knowledge discovery … • Exploit synergy: the commercial Internet with Grid services
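The core idea OGSA/WSRF adds to plain Web Services is pairing stateless service operations with named, stateful resources. The toy below illustrates that pattern only; it is not WSRF code, and the class and resource names are invented for illustration.

```python
# Minimal sketch of the WS-Resource idea behind OGSA/WSRF: a service whose
# operations are stateless, with state held per named resource. The key
# returned by create_resource plays the role of an endpoint reference.

class CounterService:
    """A 'service' whose operations act on named stateful resources."""
    def __init__(self):
        self._resources = {}        # resource key -> per-resource state

    def create_resource(self, key):
        self._resources[key] = 0    # a new WS-Resource with initial state
        return key                  # hand back a reference to address it

    def add(self, key, n):
        # Each call names the resource it targets, so the service itself
        # stays stateless between invocations.
        self._resources[key] += n
        return self._resources[key]

svc = CounterService()
job = svc.create_resource("job-42")
svc.add(job, 3)
print(svc.add(job, 4))  # 7
```

In a real Grid the resource might be a running job or a data transfer rather than a counter, but the addressing pattern is the same.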
The UK Open Middleware Infrastructure Institute (OMII) • Repository for UK-developed open-source ‘e-Science/Cyberinfrastructure’ middleware • Documentation, specification, QA and standards • Fund work to bring ‘research project’ software up to ‘production strength’ • Fund middleware projects for identified ‘gaps’ • Work with US NSF, EU projects and others • Supported by major IT companies • Southampton selected as the OMII site
Digital Curation Centre (DCC) • In the next 5 years e-Science projects will produce more scientific data than has been collected in the whole of human history • In 20 years we can guarantee that the operating system, the spreadsheet program and the hardware used to store the data will no longer exist • Research curation technologies and best practice • Need to liaise closely with individual research communities, data archives and libraries • Edinburgh, with Glasgow, CLRC and UKOLN, selected as the site of the DCC
The UK ‘Dual Support’ System UK Government provides two streams of public funding for university research: • Funding provided to the universities for research infrastructure – salaries of permanent academic staff, premises, libraries & central computing costs • Funding from the Research Councils for specific projects – in response to proposals submitted & approved through peer review • This ‘Well Founded Laboratory’ concept needs to be extended to support ‘virtual laboratories’
Initial Research Infrastructure Portfolio • Prototype ‘National Grid Service’ based on dedicated Compute and Data Clusters • Semantic Grid development projects • AccessGrid Support Service • e-Social Science Training material • Intelligent Text Mining Service for Biosciences • Digital Curation Centre
New Research Infrastructure Funding • £3M for Security Development Projects • Combine Shibboleth with PERMIS Authorization Services • Joint project with NSF Internet2 NMI project on Security Services for ‘Virtual Organizations’ • £2M+ for Identified JCSR Topics • Data Handling, Visualization, Knowledge Management, Collaborative Tools, Human Factors • New ‘Grids for Education’ program? • £3M for ‘Collaborative e-Research Environments’ • Data Analysis and Visualization Centres?
New Research Infrastructure Funding 2 • £3.4M for ‘National Middleware Services’ • Deployment of a National Authentication Framework based on Shibboleth • Provide support for the Digital Library and e-Science communities • Goal is to develop a global AAA infrastructure; Australia and Switzerland are exploring this route • Work with the US NMI and EU EGEE projects • Work with Internet2, TERENA and the NRENs
What is Shibboleth? An architecture developed by the Internet2 middleware community • NOT an authentication scheme (relies on home site infrastructure to do this) • NOT an authorisation scheme (leaves this to the resource owner) • BUT an open, standards-based protocol for securely transferring attributes between home site and resource site • Also provided as an open-source reference software implementation
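The division of labour above can be shown with a toy: the home site authenticates and releases attributes, and the resource site decides from attributes alone, never seeing the user's password. This is a conceptual sketch of the Shibboleth model, not its protocol; the sites, users, attribute names and password handling are all hypothetical (a real deployment exchanges signed SAML assertions).

```python
# Conceptual sketch of Shibboleth's split of responsibilities.
# Home site: authenticates its own users and releases attributes.
# Resource site: authorizes based only on the released attributes.

HOME_SITE = {
    "alice": {"password": "s3cret",
              "attributes": {"affiliation": "staff@oxford.ac.uk"}},
}

def home_site_assert(user, password):
    """Authenticate locally; on success, release an attribute set."""
    account = HOME_SITE.get(user)
    if account and account["password"] == password:
        return account["attributes"]
    return None  # authentication failed: nothing is released

def resource_site_decide(attributes, required_affiliation):
    """The resource owner's policy, applied to attributes alone."""
    return bool(attributes) and attributes.get("affiliation") == required_affiliation

attrs = home_site_assert("alice", "s3cret")
print(resource_site_decide(attrs, "staff@oxford.ac.uk"))  # True
```

Note what the resource site never handles: the password stays with the home site, which is exactly why Shibboleth is neither an authentication nor an authorization scheme itself but a protocol for moving attributes between the two.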
UK e-Science Timeframes
[Timeline chart, 2001–2007: SR2000, SR2002 and SR2004 funding streams each spanning roughly three years; SJ5/AAA Service; LHC/LCG]
e-Science Infrastructure beyond 2006 • Persistent UK e-Science Research Grid • Grid Operations Centre • Open Middleware Infrastructure Institute • National e-Science Institute • Digital Curation Centre • AccessGrid Support Service • e-Science/Grid Legal Service • International Standards Activity
Cyberinfrastructure and Universities ‘e-Science will change the dynamic of the way science is undertaken.’ John Taylor, 2001 • Need to break down the barriers between the Victorian ‘bastions’ of science – biology, chemistry, physics, … • Problems with tenure and publications • Develop ‘permeable’ structures that promote rather than hinder multidisciplinary collaboration • Need to engage university IT service departments – Computing, Library, …
A Definition of the Grid ‘[The Grid] intends to make access to computing power, scientific data repositories and experimental facilities as easy as the Web makes access to information.’ Tony Blair, 2002