An Introduction to Grid Computing Research at Notre Dame Prof. Douglas Thain University of Notre Dame http://www.cse.nd.edu/~dthain
What is Grid Computing? • Grid computing is the idea that we can attack problems of enormous scale by harnessing lots of machines to work on one problem. • When people refer to The Grid, they are imagining a future where computers all over the globe are connected in one colossal system open for use. • Today, we have a variety of large, useful grids, but we don’t yet have The Grid.
Campus Scale Grids at Notre Dame • ND BOB: Bunch of Boxes • A “closet grid” of conventional PCs. • 212 CPUs in Stepan Hall • http://bob.nd.edu • ND Center for Research Computing • A “cluster grid” of dedicated rackmount computers downtown. • 900 CPUs in Union Station. • http://crc.nd.edu • ND Condor Pool • A “workstation grid” of classroom and desktop machines used when idle. • 405 CPUs in Fitzpatrick/Nieuwland • http://www.nd.edu/~condor
Volunteer Grids • Simple Idea: • Most computers are idle 90% of the day. • Can we harness their unused capacity for real work? • Examples: • Pioneered by Condor in 1987 at the University of Wisconsin. • Popularized by SETI@Home in 1999 at Berkeley. • Over 300,000 active participants today. • Its successor is the more general BOINC. • Folding@Home • About 200,000 CPUs today. • Makes use of GPU cards: about 100x faster than CPU! • Xgrid: deployed with every Macintosh today. • Challenge: The user must be flexible!
National Computing Grids • NSF Teragrid • Open to any NSF research. • 21,972 CPUs / 220 TB / 6 sites • Open Science Grid • Open to any university. • 21,156 CPUs / 83 TB / 61 sites • Condor Worldwide: • Anyone can install a pool. • 96,352 CPUs / 1608 sites • PlanetLab • Open to CS research sites. • 753 CPUs / 363 sites
Who Needs Grid Computing? • Anyone with unlimited computing needs! • High Energy Physics: • Simulating the detector of a particle accelerator before turning it on allows one to understand the output. • Biochemistry: • Simulate complex molecules under different forces to understand how they fold/mate/react. • Biometrics: • Given a large database of human images, evaluate matching algorithms by comparing all to all. • Climatology: • Given a starting global climate, simulate how the climate develops under varying assumptions or events.
What are the Challenges? • Why don’t we have The Grid yet? • Technical Challenges: • Enforcing the wishes of all the owners. • Automatically negotiating expectations. • Limiting what resources a user can consume. • Performance and scalability. • Debugging and troubleshooting. • Managing access to data! • Making it easy to use!
Condor Matchmaking [Figure: jobs flow from the user's computing environment to the Condor Match Maker, which assigns them to machines according to each owner's advertised policy. Miscellaneous CSE workstations: "I will only run jobs between midnight and 8 AM." Fitzpatrick workstation cluster: "I will only run jobs when there is no-one working at the keyboard." One research cluster (CCL/CVRL): "I prefer to run a job submitted by a CCL student."]
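The matchmaking idea above can be sketched in a few lines. This is a toy illustration, not Condor's real ClassAd language: each machine advertises a policy function over job attributes, and the matchmaker pairs each job with the first free machine whose policy accepts it. All names and policies here are hypothetical, loosely mirroring the figure.

```python
def match(jobs, machines):
    """Pair each job with the first idle machine whose policy accepts it."""
    assignments = {}
    for job in jobs:
        for m in machines:
            if m["policy"](job) and m["name"] not in assignments.values():
                assignments[job["id"]] = m["name"]
                break
    return assignments

machines = [
    # Hypothetical owner policies: a time-of-day rule and an idle-keyboard rule.
    {"name": "cse-ws", "policy": lambda j: 0 <= j["hour"] < 8},
    {"name": "fitz-ws", "policy": lambda j: j["keyboard_idle"]},
]
jobs = [{"id": 1, "hour": 2, "keyboard_idle": False},
        {"id": 2, "hour": 14, "keyboard_idle": True}]

print(match(jobs, machines))  # → {1: 'cse-ws', 2: 'fitz-ws'}
```

The key design point survives even in this sketch: the owner's policy, not the user's request, decides where a job may run.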
[Figures: CPU History and Storage History of the Notre Dame Condor pool.]
Flocking Between Universities • Wisconsin: 1200 CPUs • Purdue A: 541 CPUs • Purdue B: 1016 CPUs • Notre Dame: 300 CPUs • http://www.cse.nd.edu/~ccl/operations/condor/
Scalable I/O for Biometrics • Computer Vision Research Lab in CSE • Goal: Develop robust algorithms for identifying humans from (non-ideal) images. • Technique: Collect lots of images. Think up clever new matching function. Compare them. • How do you test a matching function? • For a set S of images, • Compute F(Si,Sj) for all Si and Sj in S. • Compare the result matrix to known functions. Credit: Patrick Flynn at Notre Dame CSE
Computing Similarities [Figure: the function F applied to every pair of images, filling in a similarity matrix.]
A Big Data Problem • Data Size: 10k images of 1MB = 10 GB • Total I/O: 10k * 10k * 2 MB *1/2 = 100 TB • Would like to repeat many times! • In order to execute such a workload, we must be careful to partition both the I/O and the CPU needs, taking advantage of distributed capacity.
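The sizing above is easy to reproduce as back-of-envelope arithmetic. Each of the 10k × 10k comparisons reads two 1 MB images, and the factor of 1/2 exploits the symmetry F(Si,Sj) = F(Sj,Si):

```python
# Back-of-envelope sizing from the slide (decimal units: 1 TB = 10^6 MB).
n_images = 10_000
image_mb = 1

data_gb = n_images * image_mb / 1000            # total input data: 10 GB
pairs = n_images * n_images                     # all-to-all comparisons
io_tb = pairs * 2 * image_mb / 2 / 1_000_000    # 2 MB read per pair, halved by symmetry

print(data_gb)  # → 10.0 (GB)
print(io_tb)    # → 100.0 (TB)
```

Note the ratio: the input set is only 10 GB, but naive execution re-reads it into 100 TB of I/O, which is why partitioning matters.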
Conventional Solution: Move 200 TB at Runtime! [Figure: jobs on many CPUs each pull their input over the network from a central set of disks, shipping the data set again and again.]
A More Scalable Solution • 1. Break the array into MB-size chunks. • 2. Replicate the data to many disks. • 3. Jobs find a nearby data copy, and make full use of it before discarding. • Result: Biometric users can accomplish in three days what used to take one month!
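The three steps above can be sketched as follows. This is a minimal illustration of chunking, replication, and locality-aware lookup; the function and variable names are invented for the sketch, not drawn from any Notre Dame system.

```python
import random

def chunk(data, chunk_size):
    """Step 1: break the array into fixed-size chunks."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def replicate(chunks, disks, copies=3):
    """Step 2: place each chunk on several randomly chosen disks.

    Returns a placement map: chunk id -> list of disks holding a replica.
    """
    return {i: random.sample(disks, copies) for i in range(len(chunks))}

def locate(chunk_id, placement, my_disk):
    """Step 3: a job uses its local copy if one exists, else any replica."""
    replicas = placement[chunk_id]
    return my_disk if my_disk in replicas else replicas[0]
```

With the data spread this way, most reads are local, and the 200 TB of runtime network traffic in the conventional picture collapses to one initial replication pass.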
The All-Pairs Abstraction • All-Pairs: • For a set S and a function F: • Compute F(Si,Sj) for all Si and Sj in S. • The end user provides: • Set S: A bunch of files. • Function F: A self-contained program. • Applies to lots of different problems: • Comparing proteins for interactions. • Searching documents for similarities. • Any kind of optimization problems.
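The abstraction above has a very small serial reference implementation: given a set S and a function F, produce the full matrix of F(Si,Sj). A real All-Pairs engine partitions this doubly nested loop across many machines; the toy similarity function below is just for illustration.

```python
def all_pairs(S, F):
    """Compute the matrix M[i][j] = F(S[i], S[j]) for all pairs in S."""
    return [[F(a, b) for b in S] for a in S]

# Example with a toy similarity function (absolute difference):
S = [1, 4, 9]
M = all_pairs(S, lambda a, b: abs(a - b))
print(M)  # → [[0, 3, 8], [3, 0, 5], [8, 5, 0]]
```

Because the user supplies only S (a bunch of files) and F (a self-contained program), the system is free to choose the partitioning, placement, and retry strategy.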
An All-Pairs Facility at Notre Dame • 1 – User uploads S and F into the system via the All-Pairs Web Portal. • 2 – Backend decides where to run, how to partition, and when to retry failures. • 3 – Result matrix is returned to the user. [Figure: the web portal dispatching F over S across 100s-1000s of machines, each with its own CPU and disk.]
Research Opportunities • Openings for undergraduate students. • Research for class credit during the year. • Research for a paycheck during the summer. • Must enjoy programming and making things work. • Some Project Ideas: • Build an easy-to-use web front-end for using a grid computing system to process biometric data. • Find a way to get data from your workstation to 500 other machines as fast as possible. • Build and manage a filesystem that ties together 500 disks at once to create one gigantic 20TB system.
For more information... • To learn more about Condor@ND • http://www.nd.edu/~condor • Prof. Douglas Thain • dthain@nd.edu • http://www.cse.nd.edu/~dthain • 382 Fitzpatrick Hall