
Page 1:
75th IETF, Stockholm, Sweden, July 26-31, 2009
BMWG SIP Benchmarking
BMWG, Monday July 27, 2009
Scott Poretsky <[email protected]>
Carol Davids <[email protected]>
Vijay K. Gurbani <[email protected]>

Page 2:
Required Document Updates
1. Add REGISTRATION and IM specific terms as needed
2. Remove Packet Loss from the IM and Registration Rate test cases
3. Add option for AUTH Challenge in Meth and update the Session Attempt Failure definition in Term to allow 401 and 407 when AUTH is used
4. Session Attempt Failure definition should use the Establishment Threshold Time term
5. Add number of endpoints for forking option and establish an upper bound on the response time
6. Add codec as a media parameter, since this could impact setup rate

Page 3:
Required Document Reviews
1. Review "Reporting Format" to ensure it is complete
2. Assess alignment with PMOL end-to-end SIP benchmarking work
3. Ensure SIP Overload is sufficiently covered in the methodology, while providing overload mechanism-agnostic method(s) to observe and measure the overload condition (as requested by SIPPING)

Page 4:
Reporting Format (Current)

Test Setup
  SIP Transport Protocol = ____________________
  Session Attempt Rate = ______________________
  IS Media Attempt Rate = _____________________
  Total Sessions Attempted = __________________
  Media Streams Per Session = _________________
  Associated Media Protocol = _________________
  Media Packet Size = _________________________
  Media Offered Load = ________________________
  Media Session Hold Time = ___________________
  Establishment Threshold Time = ______________
  Loop Detecting Option = _____________________
  Forking Option = ____________________________

Benchmarks for IS
  Session Capacity = __________________________
  Session Overload Capacity = _________________
  Session Establishment Rate = ________________
  Session Establishment Performance = _________
  Session Attempt Delay = _____________________
  Session Disconnect Delay = __________________

Benchmarks for NS
  IM Rate = ___________________________________
  Registration Rate = _________________________
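For concreteness, the report form above could be captured as a single data record per test run. The following Python sketch is illustrative only; the field names, types, and unit comments are our rendering of the blanks on the slide, not definitions from the draft.

from dataclasses import dataclass

@dataclass
class SipBenchmarkReport:
    # Test Setup (parameters of test)
    sip_transport_protocol: str           # e.g. "UDP", "TCP", "TLS"
    session_attempt_rate: float           # session attempts per second
    is_media_attempt_rate: float
    total_sessions_attempted: int
    media_streams_per_session: int
    associated_media_protocol: str        # e.g. "RTP"
    media_packet_size: int                # bytes
    media_offered_load: float
    media_session_hold_time: float        # seconds
    establishment_threshold_time: float   # seconds
    loop_detecting_option: bool
    forking_option: bool
    # Benchmarks for IS (INVITE-initiated sessions)
    session_capacity: int
    session_overload_capacity: int
    session_establishment_rate: float
    session_establishment_performance: float  # percent
    session_attempt_delay: float              # seconds
    session_disconnect_delay: float           # seconds
    # Benchmarks for NS (non-session)
    im_rate: float
    registration_rate: float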

Page 5:
Alignment with PMOL

• We set out to determine whether the measurements in the BMWG metrics draft can be used to predict measurements made using the metrics defined in the PMOL end-to-end performance draft.
• We conclude that the answer is "No."
• BMWG takes individual core elements and stresses them to the point of failure. Its metrics describe the session attempt rates that cause failure of core elements.
• PMOL metrics describe the delays between signaling messages, the duration of sessions, and the ratio of failed session attempts to total session attempts.
• The next two pages contain examples of the divergence of the two sets of metrics.

Page 6:
Examples

1. Ineffective Session Attempts
• The appearance of an "Ineffective Session Attempt" (ISA), as defined in PMOL, triggers the end of a test cycle in BMWG.
• PMOL counts ISAs, since PMOL is interested in the end-to-end experience. BMWG stops the test, since it has tested to failure (see the sketch after this slide).

2. Session Attempts
• PMOL does not define a session attempt rate. It does define session attempt delays. This is because PMOL is interested in the end-to-end performance of a session.
• The BMWG draft defines a session attempt rate as a parameter of test. This is because a metric of interest is the rate at which the first "Ineffective Session Attempt" occurs.
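The following Python sketch illustrates the test-to-failure behavior described above. It is not taken from either draft: run_cycle is a hypothetical harness hook, assumed to offer sessions at the given attempt rate for one test cycle and return the number of ISAs observed.

def find_first_failure_rate(run_cycle, rates):
    # BMWG-style sweep: step the session attempt rate upward and stop
    # at the first cycle that produces an Ineffective Session Attempt.
    # That rate is the metric of interest.
    for rate in rates:
        if run_cycle(rate) > 0:  # run_cycle returns the ISA count
            return rate          # rate at which the first ISA occurred
    return None                  # DUT/SUT never failed within the sweep

# Usage, sweeping 10..500 session attempts/sec in steps of 10:
# failure_rate = find_first_failure_rate(run_cycle, range(10, 510, 10))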

Page 7:
Alignment with PMOL

Session Establishment Ratio vs. Session Attempt Rate
• The PMOL metric SER is the percentage of session attempts that are not redirected and that are answered.
• The BMWG Session Attempt Rate is a parameter of the test. It is the number of sessions that the Emulated Agent attempts to establish with the DUT/SUT over a specified time interval.

The next pages show the comparison of PMOL metrics with BMWG parameters of test, as well as the comparison of PMOL metrics with BMWG metrics.

Page 8:
PMOL Metrics vs. BMWG Parameters of Test

PMOL metrics
• Registration request delay
• Session request delay
• Session disconnect delay
• Session duration time
• Hops per request
• Session establishment ratio
• Session defects ratio
• Ineffective session attempts
• Session disconnect failures
• Session completion ratio

BMWG parameters
• Session attempt rate
• IS media attempt rate
• Establishment threshold time
• Media packet size
• Media offered load
• Media session hold time
• Loop detection and forking options

Page 9:
PMOL Metrics vs. BMWG Parameters of Test

PMOL
• Session establishment ratio (SER) – a metric
  – SER = (# of INVITE requests w/ associated 200 OK) / ((total # of INVITE requests) − (# of INVITE requests w/ 3xx response)) × 100
  – (The percentage of session attempts that are not redirected and that are answered; see the sketch after this slide.)
• Ineffective session attempts (ISA) – a metric
  – Ineffective session attempts occur when a proxy or agent internally releases a setup request with a failed or overloaded condition.

BMWG
• Session Attempt Rate – a parameter of test
  – The number of sessions that the Emulated Agent attempts to establish with the DUT/SUT over a specified time interval.
• IS Media Attempt Rate – a parameter of test
  – Configuration on the Emulated Agent for the number of INVITE-initiated sessions with Associated Media to be established at the DUT per continuous one-second time interval.
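To make the SER formula concrete, here is a minimal Python sketch; the function name and the worked numbers are illustrative, but the arithmetic follows the formula on this slide.

def session_establishment_ratio(ok_200, total_invites, redirected_3xx):
    # SER = (# of INVITEs w/ associated 200 OK)
    #       / ((total # of INVITEs) - (# of INVITEs w/ 3xx response)) * 100
    non_redirected = total_invites - redirected_3xx
    return 100.0 * ok_200 / non_redirected

# Assumed counts: 1000 INVITEs, 20 redirected (3xx), 950 answered (200 OK)
# SER = 950 / (1000 - 20) * 100 ≈ 96.94
print(round(session_establishment_ratio(950, 1000, 20), 2))  # 96.94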

Page 10:
PMOL metrics vs. BMWG metrics

PMOL metrics
• Registration request delay
• Session request delay
• Session disconnect delay
• Session duration time
• Hops per request
• Session establishment ratio
• Session defects ratio
• Ineffective session attempts
• Session disconnect failures
• Session completion ratio

BMWG metrics
• Registration rate
• Session establishment rate
• Session capacity
• Session overload capacity
• Session establishment performance
• Session attempt delay
• IM rate

Page 11:
PMOL metrics vs. BMWG metrics

PMOL
• Session establishment ratio (SER) – a metric
  – SER = (# of INVITE requests w/ associated 200 OK) / ((total # of INVITE requests) − (# of INVITE requests w/ 3xx response)) × 100

BMWG
• Session Establishment Rate – a metric
  – The maximum average rate at which the DUT/SUT can successfully establish sessions.
• Session Establishment Performance – a metric
  – The percent of Session Attempts that become Established Sessions over the duration of a benchmarking test (see the sketch after this slide).
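Continuing the assumed counts from the SER sketch, this illustrative snippet contrasts BMWG's Session Establishment Performance with PMOL's SER. The divergence comes from the denominators: SEP counts all attempts, while SER excludes redirected (3xx) attempts.

def session_establishment_performance(established, attempts):
    # Percent of Session Attempts that become Established Sessions
    # over the duration of a benchmarking test.
    return 100.0 * established / attempts

# Same assumed traffic as before: 1000 attempts, 20 redirected, 950
# established.  SEP = 95.0 %, while SER ≈ 96.94 % -- the metrics diverge.
print(session_establishment_performance(950, 1000))  # 95.0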

Page 12:
PMOL metrics vs. BMWG metrics

PMOL
• Session request delay (SRD) – a metric
  – Time of Status Indicative Response minus time of INVITE

BMWG
• Session Attempt Delay – a metric
  – The average time, measured at the Emulated Agent, for a Session Attempt to result in an Established Session (see the sketch below).
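A final illustrative sketch contrasting the two delay metrics; the timestamps and durations below are assumed values, not measurements from either draft.

def session_request_delay(t_invite, t_status_response):
    # PMOL SRD: time of Status Indicative Response minus time of INVITE.
    return t_status_response - t_invite

def session_attempt_delay(attempt_durations):
    # BMWG: average time, measured at the Emulated Agent, for a Session
    # Attempt to result in an Established Session.
    return sum(attempt_durations) / len(attempt_durations)

# SRD is computed per message pair; Session Attempt Delay averages over
# whole attempts -- another point on which the metrics diverge.
print(session_request_delay(0.000, 0.180))        # 0.18 s
print(session_attempt_delay([0.42, 0.55, 0.47]))  # ≈ 0.48 s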