
Research Technologies


Presentation Transcript


  1. Systems and Services Overview: Research Technologies
     Presenters: Kristy Kallback-Rose, Marlon Pierce, George Turner
  2. Divisions within UITS
     - Learning Technologies: technology in teaching and learning
     - Communication and Support: communication and support services for students, faculty, and staff; licensing agreements
     - Enterprise Software: vendor products, proprietary systems, community source projects
     - Networks: advanced network engineering services, international initiatives
     - Enterprise Infrastructure: management of infrastructure for university services and software applications
     - Research Technologies: systems and services in support of leading-edge research
  3. Units within Research Technologies (RT)
     - Advanced Visualization Laboratory: consulting and support for scientific visualization, virtual reality, high-end computer graphics, and visual telecollaboration
     - Biomedical Applications: consultation on access and management of biomedical data
     - Computational Biology: application support, technical support, and software development for biological computing, particularly in genomics, cell biology, and molecular biology
     - Core Services: services for IU researchers, faculty, staff, and students, and for Research Technologies units
     - Digital Library Program: services for digital library development
     - Science Gateway: consulting and software for scientific communities that want to develop Web-based mechanisms for accessing and managing scientific applications
     - High Performance Applications: programming support for IU's parallel supercomputers, Big Red and Quarry
     - High Performance Systems: central research computational and database resources
     - Open Science Grid: enables large-scale high-throughput computing for science
     - Research Storage: storage infrastructure to support teaching, research, and administrative computing
     - Stat/Math: support for statistical and mathematical software
  4. Working with Research Technologies
     - We are here to support research computing
     - Staff at IUPUI and IUB, but support for all 8 campuses
     - Opportunities to interact:
       - In person: http://pti.iu.edu/calendar, the RT Fair, visits to departments and labs by request, and workshops
       - Electronically: email (addresses listed at the end of this presentation)
  5. Today’s focus…
     - High Performance Systems
     - High Performance Applications
     - Research Storage
     - Data Capacitor
     - Core Services
     - Science Gateway
     - Visualization
  6. High Performance Systems - Big Red
     - Main compute resource for large compute tasks (parallel & serial)
     - 1024 JS21 blades, each with:
       - Dual PowerPC 970MP dual-core CPUs (4 cores/server)
       - 8 GB RAM
       - Gigabit Ethernet
       - PCI-X Myrinet 2000 switch fabric (high-bandwidth, low-latency MPI)
       - 70 GB local scratch
       - SUSE Linux Enterprise Server 9
     - NFS home directory (10 GB quota): /N/u/$USER/BigRed
     - GPFS volume (346 TB): /N/gpfs/$USER
     - Data Capacitor (339 TB): /N/dc/scratch/$USER, /N/dc/project/$PROJECT
     - Data Capacitor WAN (339 TB): /N/dcwan/scratch/$USER, /N/dcwan/project/$PROJECT
  7. High Performance Systems - Quarry
     - General access computing
     - 140 IBM HS21 blade servers (b/q nodes):
       - Dual 2.0 GHz Intel Xeon 5335 (Clovertown) quad-core processors
       - 8 GB (default) or 16 GB (himem) RAM
       - Gigabit Ethernet
       - RHEL 4
       - 36 or 73 GB local scratch
     - 230 IBM dx340 servers (p/pg nodes):
       - Dual 2.3 GHz Intel Xeon E5410 (Harpertown) quad-core processors
       - 16 GB RAM
       - Gigabit Ethernet
       - RHEL 5
       - 94 GB local scratch
     - NFS home directory (10 GB quota): /N/u/$USER/Quarry
     - GPFS volume (346 TB): /N/gpfs/$USER
     - Data Capacitor (339 TB): /N/dc/scratch/$USER, /N/dc/project/$PROJECT
     - Data Capacitor WAN (339 TB): /N/dcwan/scratch/$USER, /N/dcwan/project/$PROJECT
     - Research File System: /afs/iu.edu/….
  8. Using Big Red and Quarry
     - Available to faculty, staff, and graduate students
       - Big Red: undergraduates can be sponsored by faculty or staff
     - Queueing and resource management:
       - Big Red: LoadLeveler, Moab
       - Quarry: TORQUE/PBS, Moab (a sample job script follows this list)
     - Scheduling strategies: fairshare, backfill
     - Interactive processes on the head node: 20-minute limit
       - WildWest nodes: ssh from the head nodes; see the MOTD
     - Reservations and priority increases are available for time-critical projects
     - “Condo” computing possibilities
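     A minimal sketch of a TORQUE/PBS batch script of the kind Quarry accepts.
     The job name, resource requests, and executable (my_sim) are illustrative
     placeholders, not values from this presentation; check the Knowledge Base
     for actual queue limits.

        #!/bin/bash
        # Minimal TORQUE/PBS job script for Quarry (all values are placeholders).
        #PBS -N my_sim            # job name
        #PBS -l nodes=1:ppn=8     # one node, all 8 cores of an HS21 blade
        #PBS -l walltime=04:00:00 # 4-hour wall-clock limit
        #PBS -j oe                # merge stdout and stderr into one file

        # TORQUE starts jobs in $HOME; change to the submission directory.
        cd $PBS_O_WORKDIR

        # Write large intermediate output to Data Capacitor scratch, not NFS home.
        SCRATCH=/N/dc/scratch/$USER/my_sim_run
        mkdir -p $SCRATCH

        ./my_sim --output $SCRATCH

     Submit with "qsub my_sim.pbs" and monitor with "qstat -u $USER"; Moab then
     schedules the job using the fairshare and backfill strategies listed above.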
  9. High Performance Systems - Mason
     - Large memory jobs only
     - 18 HP DL580 G7 servers: 2 head nodes, 16 compute nodes
       - Quad 1.87 GHz Intel Xeon L7555 (Beckton/Nehalem-EX) eight-core CPUs (32 cores/server)
       - 512 GB RAM
       - 10 Gigabit Ethernet
       - 1/2 to 1 TB local scratch (TBD)
       - RHEL 6
     - TORQUE + Moab
     - NFS home directory (10 GB quota): /N/u/$USER/Mason
     - Data Capacitor (339 TB): /N/dc/scratch/$USER, /N/dc/project/$PROJECT
     - Data Capacitor WAN (339 TB): /N/dcwan/scratch/$USER, /N/dcwan/project/$PROJECT
     - Account request mechanism TBD
     - Available spring 2011
  10. High Performance Systems - Research Database Complex (old)
     - Dedicated to research-related databases and data-intensive applications that use databases
     - Two IBM p575 servers:
       - Eight 1.9 GHz POWER5 CPUs per server
       - 64 GB RAM per server
       - Two 146 GB local disks
       - AIX 5.3
     - Oracle, MySQL
     - NFS home directory (10 GB quota): /N/u/$USER/BigRed
     - GPFS volume (~300 TB): /N/gpfs/$USER
     - To be retired in spring 2011
  11. High Performance Systems - Research Database Complex (new)
     - Dedicated to research-related databases and data-intensive applications that use databases
     - Four (4) HP DL180 G6 servers:
       - Dual 2.4 GHz Intel Xeon E5620 (Gulftown/Westmere-EP) quad-core CPUs (8 cores/server)
       - 64 GB RAM
       - 218 GB local disk
       - Gigabit Ethernet networking
       - 48 TB SAN-attached storage
       - RHEL 5 Linux
     - NFS home directory (10 GB quota): /N/u/$USER/RDC
     - Oracle, MySQL
     - Web applications front end
     - Available spring 2011
  12. High Performance Systems - Research Database Complex (Web)
     - Apache & Tomcat servers
     - Single Dell 2950 server:
       - 1.6 GHz Intel Xeon CPU
       - 8 GB RAM
       - Two 146 GB local disks
       - RHEL 5
     - NFS home directory (10 GB quota): /N/u/$USER/RDC
     - Web front end requested when the RDC account is requested
  13. High Performance Systems - TeraGrid
     - NSF-funded HPC systems and support infrastructure
     - 11 resource providers
     - Compute, visualization, and data resources
     - http://www.teragrid.org/
     - See HPA (hpahelp@iu.edu) for assistance
  14. High Performance Systems - TeraGrid
     - IU AMD (Quarry - gateways only)
     - IU e1350 (Big Red)
     - LONI Intel 64 Linux Cluster (Queen Bee)
     - NCAR Blue Gene/L (Frost)
     - NCSA Altix UV (Ember)
     - NCSA Intel 64 Linux Cluster (Abe)
     - NCSA PowerEdge 1950 with NVIDIA Tesla S1070 (Lincoln)
     - NICS SGI/NVIDIA Visualization and Data Analysis System (Nautilus)
     - NICS XT4 (Athena)
     - NICS XT5 (Kraken)
     - ORNL IA-32 Cluster (NSTG Cluster)
     - PSC Altix 4700 (Pople)
     - PSC Altix UV (Blacklight)
     - Purdue 1950 Cluster (Steele)
     - Purdue Cloud (Wispy)
     - Purdue Condor Pool (Condor Pool)
     - Purdue TeraDRE (TeraDRE)
     - SDSC (Trestles)
     - SDSC (Dash)
     - TACC Dell/NVIDIA Visualization & Data Analysis Cluster (Longhorn)
     - TACC PowerEdge Westmere Linux Cluster (Lonestar)
     - TACC Sun Constellation Linux Cluster (Ranger)
     - TACC Visualization Cluster (Spur)
     - TeraGrid Clusters (AQS)
  15. Getting Started
     - Getting started on Big Red: http://kb.iu.edu/data/avjx.html
     - At IU, what is Big Red? http://kb.iu.edu/data/aueo.html
     - Getting started on Quarry: http://kb.iu.edu/data/avkx.html
     - At IU, what is Quarry? http://kb.iu.edu/data/avju.html
     - Getting started on IU's Research Database Complex: http://www.kb.iu.edu/data/awmv.html
     - Email hps-admin@iu.edu with any problems or concerns regarding Big Red, Quarry, or the RDC
  16. High Performance Applications
     - Support users of IU's HPC and TeraGrid systems; help IU researchers make efficient use of IU and TeraGrid compute resources
     - Migrate applications to HPC systems: installation and configuration, optimization for the target architecture
     - Performance analysis: profiling and tracing of serial and parallel codes
     - Extended consulting:
       - Support for the national community through TeraGrid
       - In-depth collaborations with scientists at IU
       - Consulting for grant proposals
  17. Research Storage
     - Scholarly Data Archive (aka Massive Data Storage System)
     - Research File System
  18. Scholarly Data Archive (SDA)
     - Massive data archive for Indiana University
     - Currently can hold over 5,700 terabytes (TB)
     - Operating since 1999
     - Primarily tape storage, fronted by over 220 TB of disk cache
     - HIPAA aligned
  19. SDA Details
     - Data is safe:
       - By default, two copies of data; IUB and IUPUI each get a copy
       - Checksum storing and server-side validation available
     - SDA best uses:
       - Files of at least 1 MB (a single file can be up to 10 TB)
       - Archive files: rarely updated, kept a long time
       - Files that are read often (frequently accessed files tend to stay on disk cache)
     - SDA poor uses:
       - Small files: aggregate them with a tool like WinZip or tar (see the sketch below)
       - Files that will change frequently: do not edit files in place
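     A minimal sketch of the aggregation recommended above, assuming a
     hypothetical results/ directory and archive name:

        # Bundle many small files into one compressed archive so the SDA
        # stores a single large object instead of thousands of tiny ones.
        tar -czf results-2011.tar.gz results/

        # Verify the archive is readable before removing the originals.
        tar -tzf results-2011.tar.gz > /dev/null && echo "archive OK"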
  20. SDA Access Methods & Quotas
     - Fast access (hundreds of MB/s):
       - hsi and htar command-line tools (see the sketch below)
       - GridFTP clients
       - Kerberized FTP
     - Convenience protocols (10-20 MB/s):
       - Web access via browser: requires authentication (no public access); upload and download
       - sftp
       - Mount to desktop via CIFS (mapped drive)
     - Files and directories can be shared between SDA users with ACLs
     - Default quota is 5 TB; the second copy of data is not counted; additional storage is readily available
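     A hedged sketch of the two fast-path tools named above; hsi and htar are
     the standard HPSS clients, but the file and directory names here are
     placeholders:

        # Copy one large file into the SDA with hsi (local name : SDA name).
        hsi put results-2011.tar.gz : archive/results-2011.tar.gz

        # List what is stored under the archive directory.
        hsi ls -l archive

        # htar aggregates a directory of small files into a single tar
        # archive written directly into the SDA.
        htar -cvf archive/results-2011.tar results/

        # Retrieve the whole archive later.
        htar -xvf archive/results-2011.tar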
  21. RFS Details
     - Disk-based system, 60 TB total
     - Available to faculty, staff, and graduate students; undergraduates can be sponsored by faculty or staff
     - Data is backed up nightly; backups are kept for 2 months
     - Changes from the previous day are kept in the 1day-backup directory
     - HIPAA aligned
  22. Using RFS
     - RFS best uses:
       - Relatively small files (should stay under a few hundred MB in size)
       - Files that are updated frequently (files can be edited in place on RFS)
       - Frequently accessed files
       - Files that need to be shared, especially group project work
     - RFS poor uses:
       - Backups: RFS is intended as working space; the SDA works better for backups
       - Concurrently updated files: multiple people updating the same file at once, e.g. Access databases
       - Relational databases, e.g. MySQL or PostgreSQL
  23. RFS Access Methods & Quotas
     - Mount on the desktop (map a drive) with the OpenAFS client
       - Available for Windows, Mac, and Linux
       - Requires Kerberos tickets, which are automatic with Windows authentication to ADS (a command-line sketch follows this list)
     - Available on Quarry
     - Web access from a browser
     - sftp access
     - Project space can be requested:
       - Default project quota is 50 GB
       - You can define your own groups of RFS users; no shared accounts required, and a user's access can be revoked
     - Default personal quota is 10 GB; additional space is generally available upon request
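     A hedged sketch of command-line access through the OpenAFS client: obtain
     Kerberos credentials, then manage sharing with AFS ACLs. The realm,
     project path, and usernames are placeholders to confirm locally.

        # Get a Kerberos ticket, then an AFS token (realm is a placeholder).
        kinit username@ADS.IU.EDU
        aklog

        # Inspect the access control list on a project directory.
        fs listacl /afs/iu.edu/projects/myproject

        # Grant a colleague read and lookup rights on that directory.
        fs setacl /afs/iu.edu/projects/myproject colleague rl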
  24. Data Capacitor (DC)
     - Short-term storage (~700 TB)
     - DC-WAN, added in spring 2008, increased usable capacity by 360 TB
     - Mountable across long distances; facilitates workflows that require unique, geographically distributed resources
     - Allows users to obtain high read/write speeds for their data, with support for very large files
     - DC project and scratch space available on Big Red and Quarry
     - For projects with storage requirements that cannot be met by other existing systems
       - Default size: 10 TB; default time limit: 30 days
       - Other arrangements can usually be made
  25. Data Capacitor WAN (DC-WAN)
     - Short-term storage (~340 TB)
     - Mountable across long distances; facilitates workflows that require unique, geographically distributed resources
     - Allows users to obtain high read/write speeds for their data, with support for very large files
     - DC-WAN project and scratch space available on Big Red, Quarry, and various TeraGrid sites
     - Scratch space is available now to all Big Red and Quarry users
       - Available at /N/dcwan/scratch/<username>
       - Files in scratch space may be purged after 14 days
     - Project space is available upon request
       - Available at /N/dcwan/projects/<projectname>
       - Default size: 10 TB
       - Files in project space with access times greater than 30 days may be purged (see the sketch below)
       - Arrangements for more space can usually be made
     - Application available at http://pti.iu.edu/dc/allocrequest
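     Because project files untouched for 30 days may be purged, a quick way to
     see what is at risk is to search by access time; a minimal sketch, with
     the project name as a placeholder:

        # List project files last accessed more than 30 days ago,
        # i.e. candidates for purging.
        find /N/dcwan/projects/myproject -type f -atime +30 -ls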
  26. Core Services
     - Provides services to IU researchers, faculty, staff, and students, and to Research Technologies units
     - Some examples:
       - Web application hosting on the Research Database Complex
       - Wiki hosting
       - Subversion source code repository; GitHub software project service (see the sketch below)
       - Open source mirror (OpenOffice, GNU, etc.)
       - Red Hat Enterprise Linux licensing and management
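     A minimal sketch of working against a hosted Subversion repository; the
     repository URL is hypothetical, not the actual service address:

        # Check out a working copy (URL is a placeholder).
        svn checkout https://svn.example.iu.edu/repos/myproject
        cd myproject

        # Commit a change back to the hosted repository.
        svn add analysis.sh
        svn commit -m "Add analysis script"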
  27. Science Gateway Group
     - We assist groups who want to provide Web-based access to IU and TeraGrid computing resources for scientific communities
     - We lead the Open Gateway Computing Environments project:
       - Open source software for science gateways
       - Partners include Purdue, NCSA, UIUC, and UTHSCSA
       - Apache incubators in preparation
     - Software resources:
       - Google/OpenSocial gadgets
       - Scientific workflows and application management on clusters and supercomputers
     - Hardware resources:
       - Gateway hosting environment
       - VMs for both development and production web server hosting
  28. Visualization
     - Areas of expertise:
       - Visualization, both scientific and information
       - Virtual reality: projection-based displays, head-mounted displays, navigation and interaction methodologies
       - Advanced displays: stereoscopic, ultra-resolution, haptic (force feedback)
       - Spatial input/output technologies: 3D scanners, 3D printers, motion tracking systems
       - Advanced graphics: modeling, animation, rendering
       - Stereoscopic video and animation
       - Visual telecollaboration
     - Available equipment and facilities:
       - Multi-screen stereoscopic displays
       - Ultra-resolution displays
       - Portable single-screen stereoscopic displays
       - Acquisition devices: 3D scanners, stereo cameras, motion tracking systems
       - Output devices: 3D printer, haptic (force feedback) devices
  29. Help & Additional Information
     - IU Knowledge Base: http://kb.iu.edu/
     - “Indiana University’s Advanced CyberInfrastructure: The Least You Need To Know” document: http://pti.iu.edu/cyberinfrastructure.pdf
     - Advanced IT Core (w/IU School of Medicine): http://uits.iu.edu/page/avoh, barnettw@indiana.edu, ashankar@indiana.edu
     - Advanced Visualization Lab (AVL): https://pti.iu.edu/rtv, vishelp@indiana.edu
     - Core Services: https://pti.iu.edu/cs, rtadmin@rtinfo.indiana.edu
     - Data Capacitor: http://pti.iu.edu/dc, dc-team-l@indiana.edu
     - High Performance Applications (HPA): https://pti.iu.edu/hpa, hpahelp@iu.edu
     - High Performance Systems (HPS): https://pti.iu.edu/hps, hps-admin@iu.edu
     - Research Storage: https://pti.iu.edu/storage, store-admin@iu.edu
     - Research Technologies Division: http://pti.iu.edu/rt, researchtechnologies@iu.edu
     - Science Gateways: https://pti.iu.edu/sgg, ogce-discuss@googlegroups.com
     - StatMath Center: http://www.indiana.edu/~statmath/, statmath@iu.edu
     - IUB phone 812-855-4724, IUPUI phone 317-278-4740