1
SIP Performance Benchmarking
draft-poretsky-sip-bench-term-04.txt
draft-poretsky-bmwg-sip-bench-meth-02.txt
BMWG, IETF-71, Philadelphia, March 2008
Carol Davids, IIT
Vijay Gurbani, Alcatel-Lucent
Scott Poretsky, NextPoint
2
Motivation
• Problem Statement:
– Service Providers are now deploying VoIP and multimedia using the IETF-developed Session Initiation Protocol (SIP).
– The industry lacks common terminology for SIP performance benchmarks.
– SIP allows a wide range of configuration and operational conditions that can influence performance benchmark measurements.
• Goals:
– Service Providers can use the benchmarks to compare the performance of RFC 3261 network devices.
– Vendors and others can use the benchmarks to ensure performance claims are based on common terminology and methodology.
– Benchmark metrics can be applied to make device deployment decisions for IETF SIP.
3
Scope

[Diagram: Tester (Emulated Agents) <--SIP Signaling--> SUT, where the SUT comprises a SIP Server (DUT) and a SIP ALG/NAT]

• The Terminology document defines performance benchmark metrics for black-box measurements of SIP networking devices.
• The Methodology document describes how to measure the metrics for a DUT or SUT.
• The DUT MUST be an RFC 3261 compliant device and MAY include a SIP-aware Firewall/NAT and other functionality.
• The SUT MAY be an RFC 3261 compliant device with a separate external SIP Firewall and/or NAT.
• Benchmarks cover:
– Control signaling in the presence of media, not the media itself
– SIP transport (TCP, UDP, and TLS over these)
– INVITE and non-INVITE scenarios
4
Industry Collaboration
• BMWG to develop a standard to benchmark the SIP performance of a single device.
• The SIPPING and BMWG Chairs met in Montreal to discuss this SIP Performance Benchmarking work item when it was first opened in BMWG.
• The PMOL WG is developing a standard to benchmark end-to-end SIP application performance.
• SPEC to develop industry-available test code for SIP benchmarking in accordance with IETF's BMWG and SIPPING standards.
5
Supported in SIP and SIPPING

Subject: Work on SIP performance issues
Date: Tue, 18 Dec 2007 08:30:31 +0200
From: Gonzalo Camarillo <[email protected]>
To: Al Morton <[email protected]>, "Romascanu, Dan (Dan)" <[email protected]>
CC: [email protected], Jon Peterson <[email protected]>, Cullen Jennings <[email protected]>, Mary Barnes <[email protected]>

Hi,

I understand that BMWG is in the process of deciding whether or not to work on SIP performance and benchmarking issues. The following drafts were brought to my attention:

http://tools.ietf.org/html/draft-poretsky-sip-bench-term-03
http://tools.ietf.org/html/draft-poretsky-bmwg-sip-bench-meth-02

In July 2006 (IETF 66 in Montreal), as you can see in the MoMs below, the SIPPING WG showed support to work on SIP performance and measurements issues. This was seen as an interesting and useful topic by the SIP community.

http://www3.ietf.org/proceedings/06jul/minutes/sipping.txt

Cheers,
Gonzalo
SIPPING WG co-chair
From: Dean Willis <[email protected]>
Date: December 13, 2007 2:24:10 PM CST
To: [email protected], Dan Romascanu <[email protected]>
Cc: "Vijay K. Gurbani" <[email protected]>
Subject: Benchmarking for SIP in BMWG

Vijay Gurbani just brought it to my attention that there are some drafts related to the benchmarking of SIP systems being discussed in the BMWG working group. In general, I'm very supportive of this concept. We've tried to do it in SIP on several occasions in the past and just don't have the right mindset.
I think that the development of vocabulary and practice around the benchmarking of SIP will be very helpful in the operator sector, as sizing and performance planning with COTS systems are currently very difficult to estimate "on paper" and currently require building up labs just to get baselines.
Further, I think that the SIP "overload" work (http://www.ietf.org/internet-drafts/draft-hilt-sipping-overload-03) and SIMPLE's "presence scaling" analysis (draft-houri-sipping-presence-scaling-requirements-01) need to have a benchmarking framework in place so that we can really talk about the issues.
Dean
6
Benchmarks
• Maximum Session Establishment Rate
• Maximum Registration Rate
• Maximum IM Rate
• Session Capacity
• Session Attempt Performance
• Session Setup Delay
• Session Disconnect Delay
• Standing Sessions
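As a rough illustration of how a tester might compute one of the metrics above, the sketch below derives per-call Session Setup Delay from INVITE and 200 OK timestamps recorded by the emulated agents. The log format and field names are hypothetical, not taken from the drafts; they only show the shape of the calculation.

```python
# Hedged sketch: Session Setup Delay from per-call timestamps.
# Assumes the tester logs, for each session attempt, when the INVITE
# was sent and when the 200 OK arrived (field names are illustrative).

def session_setup_delays(events):
    """Return per-session setup delay: 200 OK time minus INVITE time."""
    return {
        call_id: times["ok_200"] - times["invite"]
        for call_id, times in events.items()
        if "ok_200" in times  # skip attempts that never completed
    }

# Example tester log (times in seconds; values are made up)
log = {
    "call-1": {"invite": 0.000, "ok_200": 0.045},
    "call-2": {"invite": 0.010, "ok_200": 0.070},
    "call-3": {"invite": 0.020},  # failed attempt, no 200 OK
}

delays = session_setup_delays(log)
avg = sum(delays.values()) / len(delays)
```

Failed attempts are excluded here; a real methodology would also count them separately, since they feed the session-attempt-performance metric.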
7
BMWG Input
• Dec 18, 2007: Jim McQuaid
• Questions re convergence of the capacity metric
• Description of the back-off method to be clarified in the next version
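The drafts do not spell out the back-off procedure here, so the following is only a plausible sketch of one such search: offer sessions at a trial rate, back off when attempts fail, and narrow in on the highest failure-free rate. The simulated DUT and its capacity (120 sessions/s) are purely illustrative stand-ins for a real device under test.

```python
# Hedged sketch of a binary-search-style back-off for the Maximum
# Session Establishment Rate. dut_accepts() simulates a DUT with a
# fixed, hypothetical capacity; a real tester would drive actual
# SIP signaling and check for failed session attempts.

def dut_accepts(offered_rate, capacity=120):
    """Simulated DUT: all attempts succeed only while the offered
    rate (sessions/s) does not exceed the device's capacity."""
    return offered_rate <= capacity

def max_session_rate(lo=0, hi=1000):
    """Search [lo, hi] for the highest offered rate at which every
    session attempt succeeds."""
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if dut_accepts(mid):
            lo = mid        # no failures: try a higher rate
        else:
            hi = mid - 1    # failures seen: back off below this rate
    return lo
```

A linear step-down from a high starting rate would also qualify as a back-off method; the binary variant simply converges in fewer trials, which matters when each trial is a multi-minute benchmark run.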
8
Next Steps
• Incorporate comments from meeting and mailing list.
• Consider for BMWG work item?