
BMWG WG



Presentation Transcript


  1. BMWG WG
     • IETF 89
     • Friday Mar 07, 2014
     • Carol Davids <davids@iit.edu>
     • Vijay K. Gurbani <vkg@bell-labs.com>
     • Scott Poretsky <sporetsky@allot.com>
     • SIP Benchmarking (-09)

  2. Status of the work
     • Jan 16 2013: Terminology and Methodology documents sent to IESG.
     • Jan 24 2013: IESG review (Robert Sparks) suggested the documents need more work.
     • Feb 18 2014: -09 versions released for Terminology and Methodology, addressing the IESG review.

  3. Diffs

  4. Diffs
     • Edits for clarity and readability.
     • Content changes driven by implementing the tests over a number of years.
     • Major content changes:
       • Goal remains to benchmark SIP devices other than UAs.
       • The benchmarks are the maximum arrival rates for INVITEs and REGISTERs that the DUT can sustain with no errors over a long period of time.
       • Deleted separate consideration of a SUT, since it reduces to the testing of a system which can itself be considered the DUT.
       • Reduced the number of distinct architectures to two: one in which the DUT handles media and one in which it does not.
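The benchmark defined above (the maximum arrival rate the DUT sustains with no errors) is typically found by searching over trial rates. A minimal sketch of one such search, assuming a hypothetical `run_trial(rate)` hook that drives the DUT at a given attempt rate for the trial duration and reports whether any session failed; the bounds and step here are illustrative, not values from the drafts:

```python
def max_error_free_rate(run_trial, low=1, high=10000, precision=1):
    """Binary search for the highest attempt rate (sessions/sec)
    at which `run_trial(rate)` reports zero errors.

    `run_trial` is a hypothetical test-harness hook: it offers
    traffic to the DUT at `rate` and returns True iff every
    session (or registration) attempt succeeded.
    """
    best = 0
    while high - low > precision:
        mid = (low + high) // 2
        if run_trial(mid):
            best = mid
            low = mid    # no errors: probe a higher rate
        else:
            high = mid   # errors observed: back off
    return best


# Example with a simulated DUT that starts failing above 4200/s:
result = max_error_free_rate(lambda rate: rate <= 4200)
```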

  5. Diffs
     • Major content changes:
       • Tests related to loop detection and forking were removed, since these are mainly conformance tests, not performance tests (though they will slow performance).
       • Reduced the benchmarks from the original seven to two:
         • Session Establishment Rate
         • Registration Rate
       • Removed benchmarks related to IM rate (too many variables to control).

  6. Diffs
     • Major content changes:
       • IM (MESSAGE) was the example of a non-INVITE benchmarking transaction; REGISTER is used instead.
       • Expanded the test reporting template to include TLS ciphersuites, IPsec profiles, and codec type.

  7. Next steps
     • Chair guidance needed on next steps for the drafts.
