SIP End-to-End Performance Metrics
70th IETF Conference, PMOL
Daryl Malas
Problem Statement
• With widespread implementation of SIP, the following problems have surfaced:
  • No standard method for measuring SIP performance
  • Enterprise and SIP Service Provider confusion on “how” to measure
  • Enterprise and SIP Service Provider confusion on “where” to measure
  • Current reliance on PSTN (and other) metrics
• The draft defines the “how” and the “where” for common metrics applicable to ALL SIP applications.
Relevant Standards Development
• IETF
  • PMOL WG to develop a standard to benchmark end-to-end SIP application performance
  • BMWG developing a standard to benchmark SIP networking device performance
• Other
  • SPEC developing industry-available test code for SIP benchmarking in accordance with IETF’s BMWG and PMOL standards
  • ITU-T (Study Group 12) focused on detailed failure/success scenarios and end-to-end benchmark guidance
History
• Introduced in SIPPING WG at 66th IETF conference
  • Confusion around scope relative to BMWG SIP tests
  • Confusion around the draft’s home (e.g. IPPM, SIPPING, or BMWG); the draft did not fit within the charter of these groups
  • Unanimous SIPPING WG consensus of interest and to continue development of the draft
  • Updates and continual progress on the draft requested by the WG and ADs
• Further updates and progress discussed in SIPPING WG at the 67th and 68th IETF conferences
  • Draft still in search of a home at the 68th conference
• Draft introduced in APM BOF at the 69th IETF conference
Draft Discussion
• Consensus on purpose and scope
• Framework
• Concept – LCR (Least Cost Routing) and QBR (Quality-Based Routing)
Example Metric – SER
• Session Establishment Rate (SER)
  • Detects the ability of a terminating UA or downstream proxy to successfully establish new sessions (a calculation sketch follows)
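To make the definition concrete, here is a minimal Python sketch of one plausible way to compute SER from observed final response codes: 200 OK answers over attempts, excluding 3XX redirects. The exact counting rules are in the draft; the function name and the input list are illustrative only.

    # Hedged sketch: SER as answered INVITEs over attempts, excluding
    # 3XX redirects. The precise counting rules live in the draft; the
    # function name and the input list here are illustrative only.

    def session_establishment_rate(final_responses):
        """SER (%) from a list of final response codes, one per INVITE."""
        answered = sum(1 for code in final_responses if code == 200)
        redirected = sum(1 for code in final_responses if 300 <= code < 400)
        attempts = len(final_responses) - redirected
        return 100.0 * answered / attempts if attempts else 0.0

    # 6 answered, 1 redirect, 3 failures -> SER = 6 / 9 = 66.7%
    print(session_establishment_rate(
        [200, 200, 486, 200, 302, 200, 503, 200, 404, 200]))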
“Real World” Application
• Enterprise with multiple upstream SIP Service Providers (SSPs)
  • SER via SSP A – 68%
  • SER via SSP B – 87%
• What can the Enterprise do with this data? (one option is sketched below)
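One possibility, tying back to the QBR concept above, is to route new sessions via the provider with the better measured SER. A hypothetical Python sketch; the provider names, the acceptance floor, and the function are assumptions for illustration, not part of the draft:

    # Hedged sketch of quality-based routing (QBR) on measured SER:
    # prefer the SSP with the highest SER, and return None if every
    # provider falls below an acceptable floor. Values are invented.

    def pick_upstream(ser_by_ssp, floor=75.0):
        best = max(ser_by_ssp, key=ser_by_ssp.get)
        return best if ser_by_ssp[best] >= floor else None

    print(pick_upstream({"SSP A": 68.0, "SSP B": 87.0}))  # -> SSP B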
Currently Defined Metrics
• Registration Request Delay (RRD)
  • Detects failures or impairments causing delays in REGISTER requests
• Session Request Delay (SRD)
  • Detects failures or impairments causing delays in new session requests
• Session Disconnect Delay (SDD)
  • Detects failures or impairments causing delays in ending a session
• Session Duration Time (SDT)
  • Detects problems (e.g. poor audio quality) causing short sessions
• Average Hops per Request (AHR)
  • Detects inefficient routing and failure conditions based on the number of elements traversed
(A sketch of the delay calculations follows.)
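Each delay metric above reduces to the difference between a request timestamp and the timestamp of the response that ends the interval; the draft specifies which response that is for each metric. A minimal sketch, with illustrative field names:

    # Hedged sketch: RRD, SRD, and SDD are all response time minus
    # request time; only the request type (REGISTER, INVITE, BYE) and
    # the terminating response differ. Field names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Transaction:
        request_sent: float       # when REGISTER / INVITE / BYE left the UA
        response_received: float  # when the relevant response arrived

    def delay_seconds(txn):
        return txn.response_received - txn.request_sent

    reg = Transaction(request_sent=10.000, response_received=10.180)
    print(f"RRD = {delay_seconds(reg) * 1000:.0f} ms")  # RRD = 180 ms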
Currently Defined Metrics (cont.)
• Session Establishment Efficiency Rate (SEER)
  • Complements SER, but excludes potential effects of the terminating UAS
• Session Defects (SD)
  • Detects consistent failures in dialog processing
• Ineffective Session Attempts (ISA)
  • Detects proxy or UA failures or congestion conditions causing session setup requests to fail
• Session Disconnect Failures (SDF)
  • Detects when an active session is terminated due to a failure condition
• Session Completion Rate (SCR)
  • Detects failures caused by downstream proxies not responding to new session setups
• Session Success Rate (SSR)
  • Ratio of successful sessions compared with sessions failing due to ISA or SDF (see the sketch below)
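The ratio metrics in this list reduce to counts over totals. A hedged sketch using the slide’s own SSR description (successes versus ISA and SDF failures); the counter values are invented for illustration, and the draft defines the precise conditions that increment each counter:

    # Hedged sketch: ISA and SSR as simple percentages over attempts.
    # Counter values below are invented; the triggering conditions for
    # each counter come from the draft.

    def percentage(count, total):
        return 100.0 * count / total if total else 0.0

    attempts = 1000           # total new-session setup requests
    ineffective = 40          # ISA: setups released by failure/overload
    disconnect_failures = 15  # SDF: active sessions ended by a failure

    successes = attempts - ineffective - disconnect_failures
    print(f"ISA = {percentage(ineffective, attempts):.1f}%")  # 4.0%
    print(f"SSR = {percentage(successes, attempts):.1f}%")    # 94.5%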
Next Steps
• Agree on the set of metrics
  • Too many?
  • Too few?
  • Different metrics?
• Concerns or questions?
• Working Group item?