
Performance Benchmarking for SIP Networking Devices

This proposal argues for consistent terminology and metrics when benchmarking SIP networking devices, so that vendor-reported results can be compared accurately and interpreted clearly by operators. It covers terminology and methodology for measuring performance factors such as registration rate, session capacity, IM rate, and presence rate. Collaboration with the SIPPING WG and SPEC is under consideration.


Presentation Transcript


  1. Proposal for BENCHMARKING SIP NETWORKING DEVICES. Drafts: draft-poretsky-sip-bench-term-01.txt, draft-poretsky-sip-bench-meth-00.txt. Co-authors: Scott Poretsky (Reef Point Systems), Vijay Gurbani (Lucent Technologies), Carol Davids (IIT VoIP Lab). 66th IETF Meeting, Montreal.

  2. Motivation. Service providers are now planning VoIP and multimedia network deployments using the IETF-developed Session Initiation Protocol (SIP). The mix of SIP signaling and media functions has produced inconsistencies in vendor-reported performance metrics and has caused confusion in the operational community (Terminology). SIP allows a wide range of configuration and operational conditions that can influence performance benchmark measurements (Methodology).

  3. Relevance to BMWG.
-----Original Message----- From: Romascanu, Dan (Dan) [mailto:dromasca@avaya.com] Sent: Sunday, June 25, 2006 6:00 AM: "I believe that the scope of the 'SIP Performance Metrics' draft is within the scope of what bmwg has been doing for a while, quite successfully, some say. On a more 'philosophical' plane, there is nothing that says that IETF work must strictly deal with defining the bits in the Internet protocols - see http://www.ietf.org/internet-drafts/draft-hoffman-taobis-08.txt. And in any case, measuring how a protocol or a device implementing a protocol behaves can also be considered 'DIRECTLY related to protocol development'."
-----Original Message----- From: nahum@watson.ibm.com [mailto:nahum@watson.ibm.com] Sent: Friday, May 26, 2006 2:51 PM: "SPEC wants to develop and distribute common code for benchmarking, as is done with SPECWeb and SPECJAppServer. That code can and should use the standardized performance definitions agreed to by SIPPING and/or BMWG."

  4. Industry Collaboration. BMWG develops a standard to benchmark SIP networking device performance. SIPPING WG develops a standard to benchmark end-to-end SIP application performance. SPEC to develop industry-available test code for SIP benchmarking in accordance with IETF's BMWG and SIPPING standards.
-----Original Message----- From: Poretsky, Scott Sent: Thursday, June 22, 2006 8:00 PM To: 'Malas, Daryl'; acmorton@att.com; gonzalo.camarillo@ericsson.com; mary.barnes@nortel.com Cc: vkg@lucent.com; Poretsky, Scott Subject: RE: (BMWG/SippingWG) SIP performance metrics: "Yes Daryl. I absolutely agree. The item posted to BMWG focuses on single-DUT benchmarking of SIP performance. Your work in SIPPING is focused on end-to-end application benchmarking. It would be great (and I would even say a requirement) that the terminologies for these two work items remain consistent with each other."

  5. Scope…So Far. Performance benchmark metrics for black-box benchmarking of SIP networking devices. Terminology and methodology documents. Benchmark SIP control and media; media can be VoIP, video, IM, presence, etc. Relationship between the performance of control and media. Various SIP transports: TCP and UDP.
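To make the control-plane traffic concrete, here is a minimal sketch in Python of the kind of SIP REGISTER exchange over UDP that a tester would drive against a DUT. The registrar address, user URI, and header values are hypothetical placeholders, not taken from the drafts.

```python
import socket

# Hypothetical addresses (RFC 5737 documentation range); replace with a real DUT.
REGISTRAR = ("192.0.2.10", 5060)
LOCAL_IP = "192.0.2.1"   # address the DUT should see in Via/Contact
LOCAL_PORT = 5070

# A minimal, syntactically valid REGISTER request (CRLF line endings per RFC 3261).
register = (
    "REGISTER sip:example.com SIP/2.0\r\n"
    f"Via: SIP/2.0/UDP {LOCAL_IP}:{LOCAL_PORT};branch=z9hG4bK-bench-0001\r\n"
    "Max-Forwards: 70\r\n"
    "From: <sip:alice@example.com>;tag=bench49583\r\n"
    "To: <sip:alice@example.com>\r\n"
    f"Call-ID: bench-0001@{LOCAL_IP}\r\n"
    "CSeq: 1 REGISTER\r\n"
    f"Contact: <sip:alice@{LOCAL_IP}:{LOCAL_PORT}>\r\n"
    "Expires: 3600\r\n"
    "Content-Length: 0\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LOCAL_PORT))
sock.settimeout(2.0)
sock.sendto(register.encode("ascii"), REGISTRAR)
try:
    response, _ = sock.recvfrom(4096)
    print(response.decode("ascii", "replace").splitlines()[0])  # e.g. "SIP/2.0 200 OK"
except socket.timeout:
    print("no response from registrar")
```

The same request sent over TCP or TLS exercises the transport variations named in the scope; only the socket setup and the Via transport token change.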

  6. Scope…Being Considered. Transport with TLS? SCTP? Benchmark the impact of NAT? TURN, STUN? Expand the Tester to be an Emulated Agent _and_ an Emulated Server? What is a SUT? Include Relay, Media Gateway, SASF?

  7. Terminology. Table of contents of the terminology draft:
3. Term definitions
   3.1 Test Components
   3.2 Test Setup Parameters
   3.3 Benchmarks
       3.3.1 Registration Rate
       3.3.2 Successful Session Rate
       3.3.3 Successful Session Attempt Rate
       3.3.4 Standing Session Capacity
       3.3.5 Session Completion Rate
       3.3.6 Busy Hour Session Connects (BHSC)
       3.3.7 Busy Hour Session Attempts (BHSA)
       3.3.8 Session Setup Delay
       3.3.9 Session Teardown Delay
       3.3.10 Standing Sessions
       3.3.11 IM Rate
       3.3.12 Presence Rate
Issue 1: Terms = SIP Session and Associated Media, SIP Server
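As a rough illustration of how two of the listed benchmarks might be computed from tester observations, here is a hedged Python sketch. The record fields and the rate and delay formulas are plausible readings of the terms, not the draft's normative definitions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Attempt:
    sent: float                  # seconds: request (REGISTER or INVITE) sent
    completed: Optional[float]   # seconds: final 2xx received, None on failure

def registration_rate(attempts: list[Attempt], window: float) -> float:
    """Successful registrations per second over a measurement window."""
    successes = sum(1 for a in attempts if a.completed is not None)
    return successes / window

def mean_session_setup_delay(attempts: list[Attempt]) -> float:
    """Mean time from request sent to final 2xx, successful attempts only."""
    delays = [a.completed - a.sent for a in attempts if a.completed is not None]
    return sum(delays) / len(delays)

# Example with made-up numbers: 3 attempts in a 1-second window, one failure.
data = [Attempt(0.00, 0.12), Attempt(0.33, 0.47), Attempt(0.66, None)]
print(registration_rate(data, window=1.0))   # 2.0 registrations/s
print(mean_session_setup_delay(data))        # 0.13 s
```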

  8. Methodology. Test cases: Registration Rate > Session Rate > Session Rate with Loop Detection Enabled > Session Rate with Forking > Session Rate with Forking and Loop Detection > Session Rate with Media > IM Rate > Presence Rate > Session Attempt Rate > Session Capacity > Session Attempt Rate with Media > Session Capacity with Media. Open issues: ISSUE 1: Is a SUT required for benchmarking the associated media? Media does not pass through a Proxy Server but would pass through a Firewall/NAT. ISSUE 2: Add test cases to benchmark forwarding performance specific to NAT'ing with SIP.
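One plausible shape for a Session Attempt Rate test case is a step search: offer attempts at increasing rates and report the highest rate at which the DUT still meets a success threshold. The sketch below assumes a hypothetical send_invite() hook supplied by the tester; neither the hook nor the search procedure is prescribed by the methodology draft.

```python
import time

def send_invite() -> bool:
    """Hypothetical tester hook: send one INVITE toward the DUT and return
    True if the session attempt succeeds. Stubbed out here."""
    raise NotImplementedError("wire this to a real SIP tester")

def max_attempt_rate(rates_per_sec, attempts_per_step=100, threshold=0.999):
    """Step through offered rates; return the highest rate whose success
    ratio stays at or above the threshold, or None if the first step fails."""
    best = None
    for rate in rates_per_sec:
        interval = 1.0 / rate
        ok = 0
        for _ in range(attempts_per_step):
            start = time.monotonic()
            if send_invite():
                ok += 1
            # pace the offered load to the target rate
            time.sleep(max(0.0, interval - (time.monotonic() - start)))
        if ok / attempts_per_step < threshold:
            break
        best = rate
    return best

# Example: search 50..500 attempts/s in steps of 50.
# print(max_attempt_rate(range(50, 501, 50)))
```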

  9. Next Steps. Work item discussion: Is this work item of interest to the BMWG? Input for the scope of work? Thoughts on collaboration with the SIPPING WG and SPEC? Document discussion: Feedback on the selected terminology? Ideas for additional methodology test cases? Any other comments on the documents? Author actions: Define the scope of work; resolve the identified methodology issues; submit a methodology -00 consistent with the BMWG terminology and the work in SIPPING.
