An Introduction to RENCI for Potential Collaborators (particularly at Duke)
Rob Fowler (rjf@renci.org, 445-9670)
Presentation to Duke CTMS, October 23, 2009
Renaissance Computing Institute
• Founded by UNC Chapel Hill, Duke, and NC State with state support
• Leading-edge technologies and multi-campus expertise applied to state issues: HPC, networking, data, visualization
• Technical expertise in key areas
• Statewide facilities and expertise to foster engagement
• Triangle sites at Duke, NC State, UNC, and the Europa Center
• Regional engagement sites at ECU, UNC Asheville, UNC Charlotte, and CSI
RENCI Mission statement: "RENCI, a multi-institutional organization, brings together multidisciplinary experts and advanced technological capabilities to address pressing research issues and to find solutions to complex problems that affect the quality of life in North Carolina, our nation and the world."
RENCI is…
• Multi-institutional, spanning campuses and the state to build collaborations that address practical and research problems
• Flexible, reacting to specific state needs
• Leveraging national projects and core competencies to help NC, its research environment, and its economic vitality
RENCI Thrusts and Technologies (diagram): thrust domains of Environment and Response and of Health Delivery and Biomedicine are addressed through collaboration and engagement that combine RENCI, university, and private-sector expertise, built on core technological expertise in Visualization and Analysis, Data Management, High-Performance Computing, and Distributed Computing and Networking.
RENCI's Statewide Reach
• RENCI at UNC Asheville: RENCI-produced visual model of the 2004 floods used to plan emergency response and future development; partnering with community groups in efforts to build regional climate services and a media-arts business
• RENCI at UNC Charlotte: built an interactive software tool to analyze regional growth patterns and their impacts on infrastructure, traffic, education, open spaces, and quality of life; the tool will expand to cover all of NC with funding from the Z. Smith Reynolds Foundation
• RENCI at East Carolina
• RENCI at the Coastal Studies Institute, Manteo
Resources (people)
• 17 PhD staff, 11 master's-level staff (ABDs)
• Technical staff: networking, HPC, visualization, cyberinfrastructure development
• Domain scientists (mostly non-staff UNC faculty): genomics, coastal modeling, data management, cloud computing, atmospheric science
• Strong ties (joint appointments, space sharing) to UNC's Data Intensive Cyber Environments (DICE) group
• DHS Center of Excellence for Environmental Hazards (space sharing)
Resources (stuff)
• Computing:
  • Blue Ridge: 8 TFLOPS Core i7 cluster, with an additional 8 TFLOPS of GPGPU capacity (planned capability to at least double in size)
  • Ocracoke: 3 to 11 TFLOPS IBM Blue Gene/L
  • Kitty Hawk: 1.2 TFLOPS "Woodcrest" cluster
  • 300 TB data storage system, RDBMS, virtualized server farm, experimental (non-production) systems
• Visualization: high-resolution rear-projection walls, 360-degree Social Computing Room, 20-ft dome, 4K tele-immersion, multi-touch wall (Duke), multi-touch tables (Charlotte, NC State)
• Remote collaboration facilities: UNC-CH, Duke, NC State, ECU, UNC-C, UNC-A, CSI
Duke/RENCI Collaborative Projects
• Evidence-Based Decision Support (K. Gersing, C. Bizon, …)
• Decision Support … Infants … Illness (Brandon, Docherty, X. Wu)
• Web-based Visual Analytics (Yang, Johnson, Bizon, Wu)
• Visual Analytics for Quark-Gluon Plasma (Bass, Wu)
• Optimizing a Dense Granular Flow Code (Behringer, Bizon)
• Protein Design on OSG (Donald, Bizon)
• Multi-Touch Interactive Network (Chase, Wu, Heerman)
• HCI, Croquet, Collaborative Network OS (Lombardi, Chase, …)
• GENI and BEN (Chase, Baldin, …)
• NSF Track 2-D proposal (Chase, Fowler, Dreher, Vouk, …)
• NSF CRI proposal (Chase, Fowler, Baldin, Freeh)
Evidence-Based Medical Decision Support
• Aid in selecting the best treatments for patients in clinical care
• Less time spent with patients, more mid-level practitioners, and information overload create the need for decision support
• Novel visual analytics (sketched below) that combine:
  • data-driven evidence from historical electronic medical records
  • expert-provided rules
  • clinical guidelines that define community standards of care
(Diagram: historical responses to treatments, guideline-based context, filters/comparative rules, projected outcomes)
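A minimal sketch of the idea, assuming a tabular extract of de-identified historical treatment records; the field names, the similarity filter, and the guideline rule are hypothetical illustrations of the approach, not the RENCI/Duke implementation.

```python
# Illustrative sketch only: rank candidate treatments for a patient by the
# observed response rate among similar historical cases, after applying an
# expert/guideline rule. Field names and the rule are hypothetical.
from collections import defaultdict

# Hypothetical de-identified historical records.
HISTORY = [
    {"age": 34, "diagnosis": "MDD", "treatment": "sertraline",  "responded": True},
    {"age": 41, "diagnosis": "MDD", "treatment": "bupropion",   "responded": False},
    {"age": 29, "diagnosis": "MDD", "treatment": "sertraline",  "responded": True},
    {"age": 55, "diagnosis": "MDD", "treatment": "venlafaxine", "responded": True},
]

def guideline_allows(patient, treatment):
    """Hypothetical guideline rule: exclude venlafaxine for hypertensive patients."""
    return not (treatment == "venlafaxine" and patient.get("hypertension"))

def similar(record, patient):
    """Crude similarity filter: same diagnosis and within 15 years of age."""
    return (record["diagnosis"] == patient["diagnosis"]
            and abs(record["age"] - patient["age"]) <= 15)

def rank_treatments(patient, history=HISTORY):
    """Return candidate treatments ordered by response rate in similar cases."""
    counts = defaultdict(lambda: [0, 0])          # treatment -> [responders, total]
    for rec in history:
        if similar(rec, patient) and guideline_allows(patient, rec["treatment"]):
            counts[rec["treatment"]][0] += rec["responded"]
            counts[rec["treatment"]][1] += 1
    rates = {t: r / n for t, (r, n) in counts.items() if n > 0}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    patient = {"age": 38, "diagnosis": "MDD", "hypertension": True}
    for treatment, rate in rank_treatments(patient):
        print(f"{treatment}: {rate:.0%} response in similar historical cases")
```

The visual-analytics layer described on the slide would sit on top of a ranking like this, letting clinicians adjust the filters and comparative rules interactively rather than in code.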
Distributed Computing and Networking Projects
• VCL development and application (Dreher & NCSU)
  • Education and Research Cloud; BEN/GENI and HPC uses
  • Secure cloud technology for sensitive (e.g., medical) use
• NSF TeraGrid
  • NC Bioportal project, RENCI Science Portal, user support
• Open Science Grid (DOE and NSF)
  • User support activities
• BEN: Breakable Experimental Network
  • Dark fiber ring joining RENCI, UNC-CH, NCSU, and Duke, controlled by researchers
• GENI Island in the Triangle (Chase and Baldine)
  • Leveraging BEN
  • CRI proposal: attach dedicated heterogeneous compute facilities to the BEN POPs
RENCI and VCL
• VCL = "Virtual Computing Laboratory" (IBM and NCSU)
  • Originally built to replace NCSU campus labs with a centralized facility, for more efficient administration and cycle scavenging
  • Dynamic allocation, with reservations and on-demand use (see the sketch below)
  • Improved hardware resource utilization and improved staff utilization
  • Single-seat virtual desktops with a custom operating system (OS) and application stack
  • Ad hoc clusters for scientific computing
• RENCI activities related to VCL
  • Allocation and provisioning for experimental distributed-computing research (GENI/BEN uses ORCA and VCL)
  • Reconfigurable HPC cluster at RENCI
  • Secure shared resources for health care research and delivery
  • Extending VCL clouds across multiple institutions
• RENCI contact: Patrick Dreher, dreher@renci.org
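To make the usage model concrete, here is a toy sketch of reservation-plus-on-demand allocation of image/machine pairs, the pattern the slide describes. It is purely illustrative; the class names, machine names, and scheduling policy are assumptions, not VCL code or its API.

```python
# Toy sketch of the VCL usage model: a user reserves an image (OS +
# application stack) for a time window, and the scheduler assigns a free
# machine or rejects the request. Illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class Reservation:
    user: str
    image: str                      # e.g. "matlab-win", "hpc-node-linux" (hypothetical)
    start: datetime
    end: datetime
    machine: Optional[str] = None

@dataclass
class Scheduler:
    machines: List[str]
    reservations: List[Reservation] = field(default_factory=list)

    def _free_machine(self, start, end):
        """Return a machine with no overlapping reservation in [start, end)."""
        for m in self.machines:
            busy = any(r.machine == m and r.start < end and start < r.end
                       for r in self.reservations)
            if not busy:
                return m
        return None

    def request(self, user, image, start, hours):
        """Reserve a machine, either in advance or starting 'now' (on-demand)."""
        end = start + timedelta(hours=hours)
        machine = self._free_machine(start, end)
        if machine is None:
            return None               # no capacity in that window
        res = Reservation(user, image, start, end, machine)
        self.reservations.append(res)
        return res

if __name__ == "__main__":
    sched = Scheduler(machines=["blade01", "blade02"])
    now = datetime.now()
    print(sched.request("student1", "matlab-win", now, hours=2))                      # on-demand
    print(sched.request("researcher", "hpc-node-linux", now + timedelta(days=1), hours=8))  # advance reservation
```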
The Open Science Grid
• A framework for large-scale distributed resource sharing that addresses the technology, policy, and social requirements of sharing
• A consortium of software, service, and resource providers and researchers from universities, national laboratories, and computing centers across the U.S., who together build and operate the OSG project
• Funded by the NSF and DOE, which provide staff for managing various aspects of the OSG
• Brings petascale computing and storage resources into a uniform grid computing environment
• Integrates computing and storage resources from over 80 sites in the U.S. and beyond
OSG Engagement Program
• Mission
  • Help new user communities from diverse scientific domains adapt their research computing to leverage OSG
  • Facilitate university campus CI deployment and interconnect it with the national organizations
  • Drive new requirements and important feedback to infrastructure developers and providers
• Methodology: Embedded Immersive Engagement for Cyberinfrastructure (www.eie4ci.org)
• National program coordinated at RENCI by McGee
The RENCI Science Portal is …
• used by the RENCI Engagement team as a toolbox to assist researchers with large-scale computational science problems
• a computational science platform accessible via a web browser, secure web services, and a few Java applications developed for specific usage models
http://www.teragrid.org/tg09/files/tg09_submission_75.pdf
The RENCI Science Portal is …
• actively seeking engagements with scientists whose work shows evidence of broad, community-wide impact
• backed by very large computational capacity as:
  • a TeraGrid Science Gateway
  • a gateway to the Open Science Grid (OSG)
  • an NIH machine at the UNC-CH CS Department (BASS)
  • a gateway to additional resources accessible to RENCI
• currently geared toward large-scale High-Throughput Computing (HTC); see the sketch below
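For concreteness, here is a minimal sketch of the kind of high-throughput job bundle a gateway like this submits to OSG-style resources on a user's behalf, written as a small Python script that generates an HTCondor-style submit description. The executable name ("score_dock"), argument layout, and job count are illustrative assumptions, not the portal's actual workflow.

```python
# Illustrative sketch: generate and submit an HTCondor-style parameter sweep,
# the high-throughput pattern a science gateway typically manages for users.
import pathlib
import subprocess

N_JOBS = 100  # hypothetical sweep size

SUBMIT_TEMPLATE = """\
universe   = vanilla
executable = score_dock
arguments  = --input input_$(Process).dat --output result_$(Process).out
transfer_input_files = input_$(Process).dat
should_transfer_files = YES
when_to_transfer_output = ON_EXIT
output = logs/job_$(Process).out
error  = logs/job_$(Process).err
log    = logs/sweep.log
request_memory = 1024
queue {n_jobs}
"""

def write_submit_file(path="sweep.sub", n_jobs=N_JOBS):
    """Write the submit description and make sure the log directory exists."""
    pathlib.Path("logs").mkdir(exist_ok=True)
    pathlib.Path(path).write_text(SUBMIT_TEMPLATE.format(n_jobs=n_jobs))
    return path

if __name__ == "__main__":
    sub = write_submit_file()
    # Hand the bundle to the local HTCondor scheduler (requires condor_submit
    # on PATH; on a gateway this step is performed by the portal, not the user).
    subprocess.run(["condor_submit", sub], check=True)
```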
HPC Group Projects (current projects are launch points for new work)
• DOE SciDAC USQCD (LQCD Consortium): performance consulting on "bleeding edge" systems
• DOE SciDAC Performance Engineering Research Institute
  • Scalable performance measurement and analysis
  • Multi-core, multi-thread measurement and analysis
  • Engagement with major users of leadership-class systems
• NSF/NCSA/IBM Track 1 system ("Blue Waters")
  • Scalability and load balance (see the sketch below)
  • Engage with future users of the system
• DoD Advanced Computing Systems Program
  • OS structures for heterogeneous many-core systems
  • "Resource-Centric Reflection"
• NSF CPS: Reliable, Robust, Rapidly-deployable Situational Awareness and Response (R3SAR)
  • High-productivity programming language and environment for systems composed of sensor platforms, large computational and data resources, and "responders" with hand-held devices
• Computational science projects led by Jeff Tilson
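As a concrete illustration of the load-balance analysis mentioned above, here is a tiny sketch of a common imbalance metric computed from per-rank timings; the metric choice and the sample numbers are assumptions for illustration, not the group's tooling.

```python
# Illustrative sketch: a common load-imbalance metric over per-rank timings,
# imbalance = max(t) / mean(t). Values near 1.0 indicate well-balanced work;
# the excess approximates time lost waiting at synchronization points.
from statistics import mean

def load_imbalance(times_per_rank):
    """Return (imbalance factor, estimated fraction of node-hours lost to waiting)."""
    avg = mean(times_per_rank)
    worst = max(times_per_rank)
    imbalance = worst / avg
    wasted_fraction = 1.0 - avg / worst   # idle time of ranks waiting at a barrier
    return imbalance, wasted_fraction

if __name__ == "__main__":
    # Hypothetical per-rank compute times (seconds) for one iteration on 8 ranks.
    times = [9.8, 10.1, 10.0, 12.7, 9.9, 10.2, 10.0, 10.1]
    imb, waste = load_imbalance(times)
    print(f"imbalance = {imb:.2f}, ~{waste:.0%} of node-hours lost to waiting")
```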