Performance Engineering Laboratory
Computer Engineering Department
King Fahd University of Petroleum & Minerals (KFUPM), Dhahran
Agenda
• Performance Engineering Laboratory (PEL)
• Independent performance evaluation
• Services
• Track record
  • Web server performance comparisons
  • Streaming media server performance evaluation
  • Network traffic analysis
• Opportunities for working with us
Performance Engineering Laboratory (PEL)
• A facility established in the Computer Engineering Department at KFUPM
• Services
  • Independent evaluation of client products
    • Product specs evaluation
    • Performance comparison with competitors' products
    • Feedback to developers
    • Technical marketing input
  • Network traffic analysis
• Target products and services
  • IT products
    • Web servers, server accelerators, and streaming media servers
  • Network infrastructure products
    • Proxy caches, LDAP servers, and layer-4 switches
  • High-end systems
    • Parallel, SMP, and DSM systems
  • ISPs and carriers
Why Evaluate Performance
• Performance is central to computer systems
  • New hardware is typically faster than the existing hardware
  • New software is supposed to be "better" than the existing software
• Competition demands efficient products
  • Marketing a product that is slower than its competitors is hard
  • Highly efficient products can cut costs for customers
• Performance is central to any R&D effort
  • Need to compare similar architectures, algorithms, systems, etc.
  • Determine the efficacy of new designs
  • Understand the differences between different systems
• Meaningful comparison of a product with competitors' products
  • The sales team can't provide it
  • The sales team can effectively use it
Market Opportunities
• Continued demand for client-server and IT products
  • Large number of competing products
• Performance is increasingly becoming the distinguishing factor among similar products
• Users are becoming increasingly aware of performance issues and demand efficient products
• Time constraints often rule out detailed performance analysis
  • Especially true for small to medium-sized companies with small product development teams
• Performance evaluation is often left to QA teams
  • The goal of QA is a reliable, properly functioning product rather than an efficient product
• Third-party performance engineers can greatly help
Independent Evaluation
• Concerns with independent performance evaluation
  • QA can do it
  • The sales team can do it, and even compare performance with competitors
  • Confidentiality may be compromised
• Our vision
  • Anyone can generate measurements, but few can "read" them to understand the story they tell
  • Independent evaluation is significantly more credible to the end user or IT manager who decides to buy the product
  • The sales team's evaluation result is always predictable: "our" product is orders of magnitude better than "their" product
    • Predictable → no information!
  • Independent performance evaluation teams work closely with the developers in a professional manner
Services
• Primary services
  • Stand-alone product performance, QoS, and reliability analysis
  • Product specs evaluation
  • Comparative analysis
  • Network traffic and workload characterization
  • Modeling-based analysis
  • Application parallelization
  • Application profiling and/or system monitoring
• Other services
  • Setup of in-house testing infrastructure
  • Customized tools
  • Automated performance regression testing tools (a sketch follows below)
  • Training
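To make the "automated performance regression testing" service concrete, here is a minimal sketch of such a check: time a workload, compare the median against a stored baseline, and fail the run if it is slower than a tolerance. The workload function, the baseline file name, and the 10% tolerance are hypothetical choices for illustration, not PEL's actual tooling.

```python
# Minimal performance regression check (illustrative only).
import json
import time
import statistics

BASELINE_FILE = "latency_baseline.json"   # hypothetical stored baseline
TOLERANCE = 0.10                          # fail if >10% slower than baseline
RUNS = 20

def workload():
    # Stand-in for the operation under test (e.g., one client transaction).
    sum(i * i for i in range(100_000))

samples = []
for _ in range(RUNS):
    start = time.perf_counter()
    workload()
    samples.append(time.perf_counter() - start)

median_s = statistics.median(samples)

try:
    with open(BASELINE_FILE) as f:
        baseline_s = json.load(f)["median_s"]
except FileNotFoundError:
    baseline_s = None

if baseline_s is None:
    with open(BASELINE_FILE, "w") as f:
        json.dump({"median_s": median_s}, f)   # first run: record the baseline
    print(f"baseline recorded: {median_s * 1e3:.2f} ms")
elif median_s <= baseline_s * (1 + TOLERANCE):
    print(f"OK: {median_s * 1e3:.2f} ms vs baseline {baseline_s * 1e3:.2f} ms")
else:
    raise SystemExit(f"REGRESSION: {median_s * 1e3:.2f} ms vs baseline "
                     f"{baseline_s * 1e3:.2f} ms")
```

In practice such a check runs after every build so that performance regressions are caught alongside functional bugs, which is exactly the gap left when evaluation is handled only by QA.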
Resources • We have two types of resources • Skilled human resources • A client-server based testing environment 8
PEL Testbed
[Figure: PEL infrastructure being used for proxy performance evaluation]
Track Record
• Performance evaluation projects
  • Comparison of Apache and Microsoft IIS web servers
  • Comparison of Darwin (QuickTime) Streaming Server and Microsoft Windows Media Server
• Traffic analysis projects
  • A campus web traffic analysis
• Parallel application performance evaluation
  • Evaluation of automatically parallelized CFD code for high-end DSMs
  • A trace-driven and measurement-based memory performance evaluation for parallel applications on DSMs
• Design and evaluation of monitoring systems
Web Server Performance Comparisons
• The same server host runs Apache under Linux and IIS under MS Windows 2000 Server (a minimal load-generation sketch follows below)
• Server: Pentium 650 MHz, 256 MB RAM
• Clients: Pentium 166 MHz, 64-128 MB RAM
• LAN: 100 Mbps, layer-2 switch
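The slides do not name the load-generation tool used on the client machines, so the following is only a minimal sketch of the kind of measurement involved: several concurrent clients repeatedly fetch a document and report aggregate throughput and mean latency. The target URL, client count, and run duration are hypothetical parameters.

```python
# Minimal web server load test sketch (illustrative, not the PEL tool).
import time
import threading
import urllib.request

URL = "http://server.example.com/test-10k.html"  # hypothetical test document
CLIENTS = 16          # concurrent client threads
DURATION = 30.0       # seconds per run

results = []          # (bytes_received, latency_seconds) tuples
lock = threading.Lock()

def client_loop(stop_at):
    while time.time() < stop_at:
        start = time.time()
        with urllib.request.urlopen(URL) as resp:
            body = resp.read()
        latency = time.time() - start
        with lock:
            results.append((len(body), latency))

stop_at = time.time() + DURATION
threads = [threading.Thread(target=client_loop, args=(stop_at,)) for _ in range(CLIENTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

total_bytes = sum(b for b, _ in results)
mean_latency = sum(l for _, l in results) / len(results)
print(f"requests/s  : {len(results) / DURATION:.1f}")
print(f"throughput  : {total_bytes * 8 / DURATION / 1e6:.1f} Mb/s")
print(f"mean latency: {mean_latency * 1000:.1f} ms")
```

Sweeping the file size and the number of client threads over repeated runs produces the throughput and latency curves discussed on the next slides.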
Web Server Performance Comparisons
• Apache shows higher throughput with larger file sizes
• IIS shows higher throughput for average (~10 KB) file sizes
Web Server Performance Comparisons
• IIS offers lower latency at high load and small file sizes
• Apache shows lower latency with large file sizes only
• Apache is network-throughput limited here (~90 Mb/s max with a 100 Mb/s switch; a back-of-the-envelope check follows below)
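A quick calculation shows why the network, not the server, becomes the bottleneck in this setup. Assuming ~10 KB documents and the ~90 Mb/s of usable bandwidth reported above (both taken from the slides; the arithmetic itself is the only addition):

```python
# Network-imposed ceiling on request rate, independent of server software.
usable_bandwidth_bps = 90e6          # ~90 Mb/s observed ceiling
doc_size_bits = 10 * 1024 * 8        # ~10 KB document

max_requests_per_s = usable_bandwidth_bps / doc_size_bits
print(f"network-limited ceiling: ~{max_requests_per_s:.0f} requests/s")
# -> roughly 1100 requests/s before the 100 Mb/s LAN, not the server, saturates
```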
Web Server Performance Comparisons
• So, which web server is better?
  • Apache can show better throughput, but our results are limited by the available network bandwidth (100 Mb/s per port)
  • IIS shows high throughput and low latency for the average WWW document size (~10 KB) under high transaction loads
• Other conclusions
  • It is usually an exaggeration to say that one product is better than the other
    • One product is usually "better" only under specific workload conditions
  • This information is useful for developers to tune their code
  • Performance evaluation by sales departments won't tell this
  • Also, don't underestimate Microsoft products on Windows platforms…
Comparison of Streaming Media Servers
• The server machine runs Darwin Streaming Server under Linux
• The same server machine runs Windows Media Server under Win2K Server
• Server: Pentium 166 MHz, 128 MB RAM
• Clients: Pentium 166 MHz, 48-64 MB RAM
• Switch: 100 Mbps, layer-2 switch
Comparison of Streaming Media Servers
• Peak throughput
  • Indicated by 100% CPU usage (a monitoring sketch follows below)
  • Windows Media Server delivers significantly larger throughput at higher load than Darwin Streaming Server
• Memory performance
  • WMS shows high cache and page fault rates at high loads but still delivers better throughput
  • Better exploitation of the latency-hiding opportunities offered by the processor through the OS, compiler, and application
  • Don't expect more from a freely available media server!
  • Darwin is available in the public domain, with source code, from www.apple.com/quicktime
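Identifying the peak-throughput point (100% CPU) and the page-fault behaviour requires system monitoring alongside the load test. The sketch below reads Linux /proc counters on a modern kernel to report CPU utilisation and page-fault rates over a sampling interval; it is an illustration of the idea, not the monitoring tooling actually used in these experiments.

```python
# Sample CPU utilisation and page-fault rates from Linux /proc interfaces.
import time

def read_cpu_times():
    with open("/proc/stat") as f:
        fields = f.readline().split()[1:]     # first line: aggregate "cpu" counters
    values = list(map(int, fields))
    idle = values[3] + (values[4] if len(values) > 4 else 0)   # idle + iowait
    return idle, sum(values)

def read_page_faults():
    faults = {}
    with open("/proc/vmstat") as f:
        for line in f:
            key, value = line.split()
            if key in ("pgfault", "pgmajfault"):
                faults[key] = int(value)
    return faults

INTERVAL = 5.0                                 # seconds between samples
idle0, total0 = read_cpu_times()
faults0 = read_page_faults()
time.sleep(INTERVAL)
idle1, total1 = read_cpu_times()
faults1 = read_page_faults()

cpu_busy = 100.0 * (1 - (idle1 - idle0) / (total1 - total0))
print(f"CPU utilisation: {cpu_busy:.1f} %")
for key in faults0:
    rate = (faults1[key] - faults0[key]) / INTERVAL
    print(f"{key:12s}: {rate:.0f} faults/s")
```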
Campus WWW Traffic Analysis
• Web site popularity based on one-month-long logs from an MS Proxy 2.0 server (a log-analysis sketch follows below)
• Characteristic heavy-tail distribution of the frequency of visits
[Chart: Top 10 sites]
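The site-popularity analysis boils down to counting visits per host in the proxy access log and examining the rank-frequency curve. The sketch below assumes a simple whitespace-separated log with the requested URL in a hypothetical column; the MS Proxy 2.0 log layout differs, so this is illustrative only.

```python
# Count visits per site from a proxy access log and show the heavy tail.
from collections import Counter
from urllib.parse import urlparse

URL_COLUMN = 4                     # hypothetical column holding the requested URL

visits = Counter()
with open("proxy_access.log") as log:      # hypothetical log file name
    for line in log:
        fields = line.split()
        if len(fields) <= URL_COLUMN:
            continue
        host = urlparse(fields[URL_COLUMN]).hostname
        if host:
            visits[host] += 1

# Top 10 most visited sites (as charted on the slide)
for rank, (site, count) in enumerate(visits.most_common(10), start=1):
    print(f"{rank:2d}. {site:40s} {count}")

# Heavy tail: a few sites attract most visits, followed by a long tail of rare ones
total = sum(visits.values())
top10_share = sum(c for _, c in visits.most_common(10)) / total
print(f"share of visits going to the top 10 sites: {top10_share:.1%}")
```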
Campus WWW Traffic Analysis
• The largest number of documents accessed are images, followed by text documents
• Statistics reflect accesses over one month (Feb. 2002)
Campus WWW Traffic Analysis
• Analysis of an arbitrarily selected 24-hour period of proxy operation
• Low throughput with high latency
Campus WWW Traffic Analysis
• The profile shows bandwidth savings by the proxy (a sketch of the calculation follows below)
• However, the most common case shows the highest latency as well
• Contrary to common perception, bandwidth is not the cause of the long latencies experienced
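The bandwidth-saving and latency figures come from classifying each request as a cache hit or miss and aggregating bytes and response times per class. The sketch below shows the arithmetic on a handful of made-up records; real numbers come from the proxy log fields, not from these sample values.

```python
# Byte-hit ratio (bandwidth saved by the proxy) and latency per request class.
records = [
    # (bytes_transferred, served_from_cache, latency_ms) -- sample values only
    (10_240, True,    45),
    (10_240, False,  820),
    (52_430, False, 1900),
    ( 2_048, True,    30),
]

total_bytes = sum(b for b, _, _ in records)
saved_bytes = sum(b for b, hit, _ in records if hit)
print(f"byte-hit ratio (bandwidth saved): {saved_bytes / total_bytes:.1%}")

for label, flag in (("cache hits", True), ("cache misses", False)):
    latencies = [l for _, hit, l in records if hit is flag]
    if latencies:
        print(f"mean latency, {label:12s}: {sum(latencies) / len(latencies):.0f} ms")
```

Comparing the per-class latencies with the link utilisation is what supports the conclusion that bandwidth is not the dominant cause of the long latencies observed.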
Opportunities to Work With Us
• Short-term contracts
  • Typically suitable for one particular product or service
  • Turnaround time of only a few weeks
• Longer-term contracts
  • Suitable for multiple products and/or services
  • Long-term relationship with one or more product development/deployment teams
• Points of contact
  • Dr. Sadiq M. Sait (sadiq@ccse.kfupm.edu.sa)
  • Dr. Abdul Waheed M.A. Sattar (awaheed@ccse.kfupm.edu.sa)
Thank you