Cyberinfrastructure in the College of Letters and Sciences
Louise H. Kellogg, Professor & Chair, Geology Department
• The college contains three divisions, each with its own cyberinfrastructure requirements:
• Mathematical and Physical Sciences
• Humanities, Arts, and Cultural Studies
• Social Sciences
Cyberinfrastructure for Education: Requirements and Current Limitations
• A strong commitment is needed to provide space and resources for cyberinfrastructure for education.
• Example: a computer lab for undergraduate students in Statistics:
• Would enable the use of specialized statistical software packages in statistics classes
• Is supported by the Academic Senate program review (TPPRC)
• Would put the program in a stronger position for requesting extramural funds
• Is currently limited by available space and resources
Cyberinfrastructure for Education: The Research/Education Interface
• Research infrastructure should provide the means to bring undergraduates into cutting-edge research and creative activity.
• A discussion of research computing on campus should consider the requirements for supporting undergraduate involvement in research computation.
• Most students who take on such work do so through an advisor's existing computer resources.
• Would undergraduate participation be better served through remote access?
• Should funds for instructional computing be increased to permit programs to expand their research capabilities?
Cyberinfrastructure for Research: Requirements
• At least 30 faculty in L&S use high-performance computing as a primary component of their research and teaching. These faculty need:
• Adequate space, with associated power and cooling, for the high-performance computing hardware being acquired by our faculty using extramural and university funds;
• High-speed networking to allow rapid transfer of data both within and beyond campus; and
• Technical support for the above and for specialized research software.
Cyberinfrastructure for Research: Examples
• High-performance computing for computational materials physics and chemistry in the ANGSTROM group
• High-speed, high-bandwidth networking and high-performance computation for the LHC project in Physics
• Visualization of large datasets in the KeckCAVES
The ANGSTROM group (http://angstrom.ucdavis.edu/) is located in the Chemistry Department at UCD and headed by Prof. Giulia Galli. Its research focuses on the development and use of quantum simulation tools to understand and predict the properties and behavior of materials (solids, liquids, and nanostructures) at the microscopic scale. Access to, and management of, robust and stable campus cyberinfrastructure are critical needs for the group.
ANGSTROM Highlights:
• Proposal "Water in confined states" selected to receive a 2007 DOE INCITE award
• "Quantum Simulations of Materials and Nanostructures": SciDAC grant awarded to a UCD-led team in September 2006
• "Materials by design: applications to thermoelectrics": DARPA-PROM grant awarded to a UCD-led team in January 2006
• Agreement signed with IBM/Watson for use of the BG/L supercomputer by ANGSTROM members
• "First principles simulations of dielectric properties in nano-silicon": INTEL grant awarded in February 2006
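To give a concrete, if deliberately tiny, flavor of what "quantum simulation" involves, the sketch below solves the 1D time-independent Schrödinger equation for a harmonic well by diagonalizing a finite-difference Hamiltonian. This is a toy illustration only, not the ANGSTROM group's actual first-principles codes; the grid size, potential, and units are assumptions chosen for readability. Production simulations solve analogous eigenvalue problems for thousands of interacting electrons, which is why they require high-performance computing.

```python
# Toy quantum simulation: 1D time-independent Schrodinger equation,
# solved by diagonalizing a finite-difference Hamiltonian (atomic units).
# Illustrative only -- not the ANGSTROM group's production methods.
import numpy as np

n, L = 400, 1.0                       # grid points and box length (assumed)
dx = L / (n + 1)
x = np.linspace(dx, L - dx, n)

# Kinetic energy -(1/2) d^2/dx^2 via a second-order finite-difference stencil.
T = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / (2.0 * dx**2)
# Harmonic well V(x) = (1/2) k (x - L/2)^2 with k = 1e4, i.e. omega = 100.
V = np.diag(0.5 * 1.0e4 * (x - L / 2) ** 2)

E, psi = np.linalg.eigh(T + V)        # eigenvalues and eigenstates
print("Lowest three energies:", E[:3])
# Analytic levels are (n + 1/2) * omega = 50, 150, 250 (up to small
# discretization and box-boundary errors), confirming the solver.
```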
LHC Computing (slides from Prof. Mani Tripathi)
• Data volume: tens of petabytes by 2008; an exabyte roughly 5-7 years later.
• Tier 0 (the experiment at CERN): petabytes of disk and a tape robot, feeding out data at ~1 GByte/s.
• Tier 1: Fermilab, connected at ~10 Gbps.
• Tier 2: CalTech/UCSD, connected at ~10 Gbps.
• Tier 3 center at UCD: ~100 nodes with quad-core processors, ~100 TB of storage, and a ~10 Gbps link to ESnet/Internet2; housed in the UCD Data Center.
• Institute workstations and physics data caches connect at 1 to 10 Gbps.
• ~10 faculty in high energy and heavy ion physics are involved.
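To put these link speeds in perspective, the back-of-envelope sketch below estimates how long bulk transfers would take at the volumes quoted above. It assumes ideal sustained throughput with no protocol or disk overhead, so real transfers would be slower; the numbers are purely illustrative.

```python
# Back-of-envelope bulk transfer times (illustrative; assumes ideal,
# sustained throughput with no protocol, disk, or contention overhead).

def transfer_time_days(data_terabytes: float, link_gbps: float) -> float:
    """Days needed to move data_terabytes over a link_gbps link."""
    data_bits = data_terabytes * 1e12 * 8      # TB -> bits (decimal units)
    seconds = data_bits / (link_gbps * 1e9)    # bits / (bits per second)
    return seconds / 86_400                    # seconds -> days

# The ~100 TB Tier 3 store quoted above, over two candidate link speeds:
for gbps in (1, 10):
    days = transfer_time_days(100, gbps)
    print(f"100 TB over {gbps:>2} Gbps: {days:4.1f} days")
# Prints ~9.3 days at 1 Gbps versus ~0.9 days at 10 Gbps, which is why
# the ~10 Gbps link to ESnet/Internet2 is a requirement, not a luxury.
```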
National Network and Connection to CERN in Geneva
[Network map: the ESnet SDN Core (30-50 Gbps) and ESnet IP Core (≥10 Gbps) connect BNL, FNAL, and metro rings across the US, with the US-LHCNet network (US-CERN) linking to CERN and international routes to Europe, Japan, Asia-Pacific, and Australia.]
• Note: the Bay Area Metro Ring does not include UCD.
Visualization in the KeckCAVES
• An interdisciplinary collaboration between physical scientists and computer scientists
• The project engages graduate students, faculty, and researchers from multiple colleges
• Requires space and, most importantly, technical support
• Requires flexibility: the technical needs evolve as the project does
Cyberinfrastructure for Research: Lessons from These Groups
• The need for high-performance computing will continue to increase. Adequate space, with associated power and cooling, is one of the basic requirements.
• Observational and model datasets will continue to grow. High-speed networking will allow rapid transfer of data both within and beyond campus.
• Technical support (the human factor) is crucial to the success of these programs.