Network Working Group
   INTERNET-DRAFT
   Expires: April 2007
                                                   Scott Poretsky
                                                   Reef Point Systems

                                                   Vijay Gurbani
                                                   Lucent Technologies

                                                   Carol Davids
                                                   Illinois Institute
                                                   of Technology

                                                   October 2006

                        Terminology for Benchmarking
                           SIP Networking Devices

                 <draft-poretsky-sip-bench-term-02.txt>

Intellectual Property Rights (IPR) statement:
By submitting this Internet-Draft, each author represents that any
applicable patent or other IPR claims of which he or she is aware
have been or will be disclosed, and any of which he or she becomes
aware will be disclosed, in accordance with Section 6 of BCP 79.

Status of this Memo

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF), its areas, and its working groups.  Note that
   other groups may also distribute working documents as
   Internet-Drafts.

   Internet-Drafts are draft documents valid for a maximum of six months
   and may be updated, replaced, or obsoleted by other documents at any
   time.  It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."

   The list of current Internet-Drafts can be accessed at
   http://www.ietf.org/ietf/1id-abstracts.txt.

   The list of Internet-Draft Shadow Directories can be accessed at
   http://www.ietf.org/shadow.html.

Copyright Notice
   Copyright (C) The Internet Society (2006).

Poretsky, Gurbani, Davids                                   [Page 1]


INTERNET-DRAFT           Benchmarking Terminology for     October 2006
                            SIP Networking Devices

ABSTRACT
This document provides a terminology for benchmarking SIP
performance in networking devices.  Terms are included for test
components, test setup parameters, and performance benchmark metrics
for black-box benchmarking of SIP networking devices.  The
Performance Benchmark Metrics are obtained for the SIP Control Plane
and Media Plane.  The terms are intended for use in a companion
Methodology document for complete performance characterization of a
device under a variety of network conditions, making it possible to
compare the performance of different devices.  It is critical to provide
Test Setup Parameters and a Methodology document for SIP performance
benchmarking because SIP allows a wide range of configuration and
operational conditions that can influence performance benchmark
measurements.  It is necessary to have terminology and methodology
standards to ensure that reported benchmarks have consistent
definition and were obtained following the same procedures.
Benchmarks can be applied to compare performance of a variety
of SIP networking devices.

Poretsky, Gurbani, Davids                                   [Page 2]


INTERNET-DRAFT           Benchmarking Terminology for     October 2006
                            SIP Networking Devices

Table of Contents

     1. Introduction .................................................4
     2. Existing definitions .........................................4
     3. Term definitions..............................................5
        3.1 Test Components...........................................5
           3.1.1 SIP Control Plane....................................5
           3.1.2 SIP Media Plane......................................5
            3.1.3 Emulated Agent.......................................6
           3.1.4 Session Server.......................................6
           3.1.5 SIP-Aware Stateful Firewall..........................6
           3.1.6 Invite-Initiated Control Session.....................7
           3.1.7 Non-Invite Initiated Control Session.................8
           3.1.8 Registration.........................................8
           3.1.9 Associated Media Stream..............................8
           3.1.10 Associated Media Session............................9
        3.2 Test Setup Parameters.....................................9
           3.2.1 SIP Transport Protocol...............................9
           3.2.2 Intended Session Duration............................10
           3.2.3 Measured Session Duration............................10
           3.2.4 Session Attempt Rate.................................11
           3.2.4.1 Media Session Attempt Rate.........................11
           3.2.4.2 NIICS Attempt Rate.................................12
           3.2.5 Media Streams per Media Session......................12
           3.2.6 Media Packet Size....................................13
           3.2.7 Media Offered Load, per Media Stream.................13
           3.2.8 Media Offered Load, Aggregate........................13
           3.2.9 Media Session Hold Time..............................14
        3.3 Benchmarks................................................14
           3.3.1 Registration Rate....................................14
           3.3.2 Session Rate.........................................14
           3.3.3 Session Capacity.....................................15
           3.3.4 Session Establishment Performance....................15
           3.3.5 Session Setup Delay..................................16
           3.3.6 Session Teardown Delay...............................16
           3.3.7 Standing Sessions....................................17
           3.3.8 IM Rate..............................................18
           3.3.9 Presence Rate........................................18
     4. IANA Considerations...........................................19
     5. Security Considerations.......................................19
     6. Acknowledgements..............................................19
     7. References....................................................20
     8. Authors' Addresses.........................................21
     9. Full Copyright Statement......................................22


Poretsky, Gurbani, Davids                                     [Page 3]


INTERNET-DRAFT           Benchmarking Terminology for       October 2006
                            SIP Networking Devices

1. Introduction
Service Providers are now planning Voice Over IP (VoIP) and Multimedia
network deployments using the IETF developed Session Initiation Protocol
(SIP) [Ro02].  VoIP has led to development of new networking devices
including SIP Server, Session Border Controller, and SIP-Aware Stateful
Firewall.  The mix of voice and IP functions in this variety of devices
has produced inconsistencies in vendor reported performance metrics and
has caused confusion in the service provider community. SIP allows a
wide range of configuration and operational conditions that can
influence performance benchmark measurements.  It is important to be
able to correlate signaling plane measurements with media plane
measurements to determine overall system performance.

When defining SIP performance benchmarks it is critical to also provide
definitions for Test Setup Parameters and a corresponding Methodology
document for SIP performance benchmarking.  This enables benchmarks to
be understood, fairly compared, and repeatable.  This document provides
the benchmarking terms for performance benchmarking the SIP control
and media planes.  Terms are included for Test Components, Test Setup
Parameters, and Benchmarks.  All benchmarks are black-box measurements
of the SIP Control and Media Planes.  It is intended that these terms
be used in a companion Methodology document.

SIP is used to create a growing number of very different applications
and features.  The set of benchmarking terms provided in this document
is intended to apply to each of them.  SIP is frequently used to
create streams of media.  The control plane and the media plane are
treated as orthogonal in this document.  In order to characterize the
performance of one or another application or feature it may be
necessary to logically associate several of the benchmarking metrics
provided here.  Benchmarks can be obtained and compared for different
types of Devices Under Test (DUTs), such as a SIP Proxy Server, SBC,
P-CSCF, Proxy Server paired with a Firewall/NAT device, or P-CSCF
paired with a Firewall/NAT device.  Media benchmarks can also be
obtained when testing Systems Under Test (SUTs).

2.  Existing definitions
   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
   document are to be interpreted as described in BCP 14, RFC 2119.
   RFC 2119 defines the use of these key words to help make the
   intent of standards track documents as clear as possible.  While this
   document uses these keywords, this document is not a standards track
   document.  The term Throughput is defined in RFC 2544 [Ba99].

   Many SIP terms used in this document are defined in [Ro02].

Poretsky, Gurbani, Davids                                     [Page 4]


INTERNET-DRAFT           Benchmarking Terminology for       October 2006
                            SIP Networking Devices

3. Term Definitions

   3.1 Test Components
       3.1.1 SIP Control Plane

           Definition:
           The logical plane in which SIP Signaling messages are
           exchanged between SIP Agents.

           Discussion:
           SIP signaling messages are used to establish SIP signaling
           sessions in several ways: directly between two User Agents;
           between a User Agent and a Proxy Server; or between a
           series of Proxy Servers.

           Measurement Units:
           N/A

           Issues:
           None

           See Also:
           SIP Media Plane
           Emulated Agents

        3.1.2 SIP Media Plane

           Definition:
            The logical plane in which the media streams established by
            the SIP Signaling messages, also known as the "payload" or
            "bearer channel", are exchanged.

           Discussion:
            The Media Plane is analogous to the Data Plane.
           Packets for the SIP Control Plane and the SIP Media Plane
           traverse different paths, which can produce variation
           in performance.  For this reason it is necessary to
           benchmark performance of the SIP Control Plane and the
           SIP Media Plane.

           Measurement Units:
           N/A

           Issues:
           None

           See Also:
           SIP Control Plane
           Emulated Agents

Poretsky, Gurbani, Davids                                    [Page 5]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

        3.1.3 Emulated Agent

           Definition:
            Device in test topology that initiates/responds to SIP
           signaling as a session endpoint and sources/receives
           associated media for established connections.

           Discussion:
           The Emulated Agent (EA) is a function of the Tester.
           The Tester MAY be configured to be multiple EAs.

           Measurement Units:
           N/A

           Issues:
           None

           See Also:
           SIP Media Plane
           SIP Control Plane

        3.1.4 Session Server

           Definition:
           Device in test topology that acts as proxy between Emulated
           Agents.  This device is either a DUT or component of a SUT.

           Discussion:
            The Session Server MUST be an RFC 3261 [Ro02] compliant
            device.  It MAY be a Proxy Server, Session Border Controller
            (SBC), or another type of device that is RFC 3261 compliant.

           Measurement Units:  N/A

           Issues:  None

           See Also:
           SIP Control Plane

        3.1.5 SIP-Aware Stateful Firewall

           Definition:
            Device in test topology that provides SIP DoS protection
            for the Emulated Agents and Session Server.

           Discussion:
           The SIP-Aware Stateful Firewall MAY be an internal
           component or function of the Session Server.  The
           SIP-Aware Stateful Firewall MAY be a standalone device
            that MUST be paired with a Session Server to be benchmarked
           as a SUT. SIP-Aware Stateful Firewalls MAY include
           Network Address Translation (NAT) functionality.
            Additional functionality might not be supported.

Poretsky, Gurbani, Davids                                    [Page 6]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

           Measurement Units:
           N/A

           Issues:
           None

           See Also:


        3.1.6 Invite-Initiated Control Session (IICS)

           Definition:
           A SIP signaling exchange that includes an initial SIP INVITE
           message exchanged between Emulated Agent and DUT/SUT.

           Discussion:

           An Invite-Initiated Control Session (IICS) MAY have
           associated media.  The inclusion of media is test case
            dependent.  An IICS may be in one of several different
            states: Attempting, Established, or Disconnecting.  These
            states are distinguished as follows:

            Attempting - the state after the INVITE is sent by or
            received at the Tester and before the ACK for that INVITE
            is received at or sent by the Tester.  This definition
            includes possible re-INVITEs as well as redirects.  It also
            includes all INVITEs that are rejected for lack of
            authentication information.

            Established - the state after the 200 OK for the initiating
            INVITE is sent by or received at the Tester.  This definition
            includes possible re-INVITEs as well as redirects.  It also
            includes all INVITEs that are rejected for lack of
            authentication information.

           Disconnecting - the state after a BYE is sent by or received
           at the Tester.
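
            The following is a minimal, non-normative Python sketch of
            how a Tester might track these IICS states.  The state names
            follow the definitions above; the function and event names
            are illustrative only and are not part of this terminology.

               from enum import Enum

               class IICSState(Enum):
                   ATTEMPTING = 1     # INVITE sent/received, before ACK
                   ESTABLISHED = 2    # 200 OK for initiating INVITE
                   DISCONNECTING = 3  # BYE sent/received

               def next_state(state, event):
                   # event is one of "INVITE", "200_OK", "BYE"
                   # (illustrative event names).
                   if event == "INVITE":
                       return IICSState.ATTEMPTING
                   if event == "200_OK":
                       return IICSState.ESTABLISHED
                   if event == "BYE":
                       return IICSState.DISCONNECTING
                   return state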

           Measurement Units:
           N/A

           Issues:
           None

           See Also:


Poretsky, Gurbani, Davids                                    [Page 7]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

       3.1.7 Non-INVITE-initiated Control Session (NIICS)

           Definition:
           A SIP signaling exchange that does not include an initial SIP
           INVITE message exchanged between Emulated Agent and DUT/SUT.

           Discussion:
            A Non-INVITE-initiated Control Session (NIICS) does not have
            associated media.

           Measurement Units:
           N/A

           Issues:
           None

           See Also:

       3.1.8 Registration

           Definition:
           A NIICS whose initial SIP message is a REGISTER request.

           Discussion:
            Registrations represent a portion of SIP network traffic.
            As such, they represent a significant part of the work of
            some elements of the DUT/SUT.  A Registration attempt
            MAY be successful or unsuccessful.  A successful
            registration is determined by receipt of a 200 OK
            response.  An unsuccessful registration is one that
            does not receive a 200 OK response.

           Measurement Units:
           N/A

           Issues:
           None

           See Also:

       3.1.9 Associated Media Stream

           Definition:
           A media stream that is associated with an IICS.

Poretsky, Gurbani, Davids                                    [Page 8]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

           Discussion:
           Any media protocol MAY be used.  If RTP is used, the
           Associated Media Stream of the IICS is identified as
           the concatenation of
           (1) the IP address and transport port identified by the
               originator in its SDP.
           (2) the IP address and transport port identified by the
               destination in its SDP.
           (3) the SSRC in the RTP sent by the originator.
           (4) the SSRC in the RTP sent by the destination.
           (5) the RTCP ports identified by the originator and
               destination.
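
            As a non-normative illustration, the identifier described
            above could be assembled as in the following Python sketch.
            The function and parameter names are illustrative only and
            are not part of this terminology.

               def media_stream_id(orig_addr, orig_port, dest_addr,
                                   dest_port, orig_ssrc, dest_ssrc,
                                   orig_rtcp_port, dest_rtcp_port):
                   # Concatenate elements (1)-(5) listed above into one
                   # identifier for the Associated Media Stream.
                   return "|".join(str(x) for x in (
                       orig_addr, orig_port,   # (1) originator SDP
                       dest_addr, dest_port,   # (2) destination SDP
                       orig_ssrc,              # (3) originator SSRC
                       dest_ssrc,              # (4) destination SSRC
                       orig_rtcp_port,
                       dest_rtcp_port))        # (5) RTCP ports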

           Measurement Units:
            N/A

           Issues:
           None

       3.1.10 Associated Media Session

           Definition:
           The collection of associated media streams created by
           an IICS and identified by a SIP Call-ID.

           Discussion:
           A session, as defined by SDP, can comprise one or more media
           streams.

           Measurement Units:
            N/A

           Issues:
           None


    3.2 Test Setup Parameters

       3.2.1 SIP Transport Protocol

           Definition:
           The protocol used for transport of the SIP Control Plane
           messages.

           Discussion:
           Performance benchmarks may vary for the same SIP networking
           device depending upon whether TCP, UDP, TLS, SCTP, or another
           transport layer protocol is used.  For this reason it MAY
           be necessary to measure the SIP Performance Benchmarks using
           these various transport protocols.  Performance Benchmarks
           MUST report the SIP Transport Protocol used to obtain the
           benchmark results.

Poretsky, Gurbani, Davids                                    [Page 9]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

           Measurement Units:
            TCP, UDP, TLS, or SCTP

           Issues:
           None

           See Also:

        3.2.2 Intended Session Duration

           Definition:
            Configuration on the Emulated Agent for the time from IICS
            establishment to BYE.  This is the duration of the
           Established State.

           Discussion:
           The Intended Session Duration is configured on the Emulated
           Agent. This value is used for all sessions.  When
           benchmarking Session Capacity the effective value of the
           Session Duration is infinite.

           Measurement Units:
           seconds

           Issues:
           None

           See Also:
           Session Attempt Rate

        3.2.3 Measured Session Duration

           Definition:
            Average time, measured at the DUT/SUT, from session
            establishment to BYE.

           Discussion:
            The value of the Measured Session Duration may not equal
            the Intended Session Duration.  This parameter requires
            that the session duration be measured for every session
            throughout the test duration.

           Measurement Units:
           seconds

           Issues:
           None

           See Also:
           Intended Session Duration
           Session Attempt Rate

Poretsky, Gurbani, Davids                                    [Page 10]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

        3.2.4 Session Attempt Rate

           Definition:
            Configuration on the Emulated Agent for the number of
            sessions to be attempted at the DUT per continuous
            one-second interval.

           Discussion:
           The Session Attempt Rate can cause variation in
           performance benchmark measurements.  Since this is
           the number of sessions configured on the Tester, some
           sessions may not be successfully established on the
           DUT.  Sessions may be IICS or NIICS.

            For a fixed value of Session Attempt Rate, more stress
            may be incurred on the DUT/SUT when it processes
            session setups and teardowns concurrently during each
            one-second interval.
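
            For illustration only, an Emulated Agent might pace attempts
            at the configured Session Attempt Rate as in the following
            Python sketch; send_attempt() is a placeholder for the
            Tester's signaling action and is not defined here.

               import time

               def offer_sessions(attempt_rate, duration_s, send_attempt):
                   # Offer attempt_rate session attempts during each
                   # continuous one-second interval.
                   for _ in range(duration_s):
                       start = time.monotonic()
                       for _ in range(attempt_rate):
                           send_attempt()   # IICS or NIICS attempt
                       # Sleep out the remainder of the interval.
                       elapsed = time.monotonic() - start
                       if elapsed < 1.0:
                           time.sleep(1.0 - elapsed)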

           Measurement Units:
           session attempts per second (saps)

           Issues:
           None

           See Also:
           Measured Session Duration
           Session Attempt Rate

        3.2.4.1 Media Session Attempt Rate

           Definition:
            Configuration on the Emulated Agent for the number of
            media sessions to be attempted at the DUT per
            continuous one-second interval.

           Discussion:
           The Media Session Attempt Rate can cause variation in
           performance benchmark measurements.  Since this is
           the number of sessions configured on the Tester, some
           sessions may not be successfully established on the
            DUT.  Media Sessions MUST be associated with IICSes.

           Measurement Units:
           session attempts per second (saps)

           Issues:
           None

           See Also:
            Measured Session Duration
           Session Attempt Rate

Poretsky, Gurbani, Davids                                    [Page 11]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

        3.2.4.2 NIICS Attempt Rate

           Definition:
            Configuration on the Emulated Agent for the number of
            NIICSes to be attempted at the DUT per continuous
            one-second interval.

           Discussion:
           The NIICS attempts include registrations, Instant Messages,
           and Presence-related messages.

           Measurement Units:
           session attempts per second (saps)

           Issues:
           None

           See Also:
           Measured Session Duration
           Session Attempt Rate

        3.2.5 Media Streams per Media Session

           Definition:
           Configuration on the Emulated Agent for a fixed number
           of media streams offered for each session.

           Discussion:
           For a single benchmark test, all sessions use the
           same number of Media Streams per Session.  Presence
           of media streams and the number of media streams per
           session can cause variation in performance benchmark
            measurements.  The RECOMMENDED values for Media
            Streams per Session are 0, 1, 2, 3, and 4, but higher
            values MAY be used.

           Measurement Units:
           media streams per session (msps)

           Issues:
           None

           See Also:

Poretsky, Gurbani, Davids                                    [Page 12]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

        3.2.6 Media Packet Size

           Definition:
           Configuration on the Emulated Agent for a fixed size of
           packets used for media streams.

           Discussion:
           For a single benchmark test, all sessions use the
           same size packet for media streams.  The size of packets can
           cause variation in performance benchmark measurements.

           Measurement Units:
           bytes

           Issues:
           None

           See Also:

        3.2.7 Media Offered Load, per Media Stream

           Definition:
           The constant amount of media traffic offered by the
           Emulated Agent to the DUT/SUT for each media stream.

           Discussion:
           For a single benchmark test, all sessions use the
           same Media Offered Load, per Media Stream.

           Measurement Units:
            packets per second (pps)

           Issues:
           None

           See Also:

        3.2.8 Media Offered Load, Aggregate

           Definition:
           The total amount of media traffic offered by the
           Emulated Agent to the DUT/SUT.

           Discussion:
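            For a single benchmark test, the aggregate load follows from
            the other Test Setup Parameters.  A minimal sketch of that
            relationship, assuming a constant per-stream load and a known
            number of concurrently established sessions (the names below
            are illustrative), is:

               def aggregate_offered_load(per_stream_pps,
                                          streams_per_session,
                                          established_sessions):
                   # Constant per-stream load times the number of
                   # concurrently active media streams.
                   return (per_stream_pps * streams_per_session
                           * established_sessions)

            For example, 50 pps per stream, 2 Media Streams per Session,
            and 100 established sessions would give an aggregate offered
            load of 10,000 pps (hypothetical values).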

           Measurement Units:
            packets per second (pps)

           Issues:
           None

           See Also:

Poretsky, Gurbani, Davids                                    [Page 13]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

        3.2.9 Media Session Hold Time

          Definition:
           The amount of time during which media flows from the
           Tester to the DUT for a successful IICS.

           Discussion:

           Measurement Units:
           seconds

           Issues:
           None

           See Also:

    3.3 Benchmarks

        3.3.1 Registration Rate

           Definition:
            Maximum number of registrations successfully completed
            by the DUT/SUT per continuous one-second interval.

           Discussion:
            This benchmark is obtained with zero failures, in which
            100% of the registrations attempted by the Emulated Agent
            are successfully completed by the DUT/SUT.  The maximum
            value is obtained by testing to failure.
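
            A non-normative sketch of one possible test-to-failure
            procedure (a simple linear search in Python) is shown below;
            run_trial() is a hypothetical helper that attempts a given
            rate and reports whether 100% of registrations completed.

               def find_registration_rate(run_trial, start=100, step=100):
                   # run_trial(rate) attempts registrations at 'rate'
                   # rps and returns True only if 100% succeed.
                   rate, best = start, 0
                   while run_trial(rate):
                       best = rate
                       rate += step
                   return best   # registrations per second (rps)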

           Measurement Units:
           registrations per second (rps)

           Issues:
           None

           See Also:

        3.3.2 Session Rate

           Definition:
            Maximum number of Control Sessions successfully established
            per continuous one-second interval with the
            sessions remaining active.

           Discussion:
            This benchmark is obtained with zero failures, in which
            100% of the sessions attempted by the Emulated Agent are
            successfully established.  The maximum value is obtained
            by testing to failure.  Sessions may be IICS or NIICS.

           Measurement Units:
           sessions per second (sps)

Poretsky, Gurbani, Davids                                    [Page 14]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

           Issues:
           None

           See Also:
           Invite-Initiated Control Session (IICS)
            Non-INVITE-initiated Control Session (NIICS)
           Session Attempt Rate

        3.3.3 Session Capacity
           Definition:
           The maximum number of SIP sessions that the DUT/SUT
           can simultaneously have established.

           Discussion:
           The Session Duration MUST be infinite so that sessions
           remain established for the duration of the test to
           obtain the Session Capacity benchmark.  The Session
           Capacity must be reported with the Session Rate used to
           reach the maximum.  Since Session Rate is a zero-loss
           measurement, there must be zero failures to achieve the
           Session Capacity.  Sessions may be IICS or NIICS.

           Measurement Units:
           sessions

           Issues: None

           See Also:
           Session Attempt Rate

        3.3.4 Session Establishment Performance
           Definition:
           The percentage of sessions that successfully establish
           for the duration of a benchmarking test.

           Discussion:
           Session Establishment Performance is a benchmark to indicate
           session establishment success for the duration of a test.
            The duration for measuring this benchmark is to be specified
            in the Methodology.  The Session Duration may be configured
            so that sessions terminate during the test duration.
            Session Establishment Performance MAY be reported either as
            the percentage of Session Attempts that were successful or
            as the percentage of Session Attempts that failed.
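
            For illustration, the benchmark reduces to a simple ratio
            over the test duration; the counter names in this Python
            sketch are illustrative only.

               def establishment_performance(established, attempted):
                   # Percentage of Session Attempts successfully
                   # established.
                   return 100.0 * established / attempted

               # e.g. 4950 established of 5000 attempts -> 99.0 percent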

           Measurement Units:
           Percentage, %

           Issues: None

           See Also:
           Session Rate
            Session Attempt Rate

Poretsky, Gurbani, Davids                                    [Page 15]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices


        3.3.5 Session Setup Delay

           Definition:
           The average time for a session to establish.

           Discussion:
            The measurement starts when the Emulated Agent signals
            the first INVITE.  Session Setup Delay MUST be measured
            for every established session to calculate the average.
            Session Setup Delay MUST be measured at the
            Successful Setup Attempt Rate.
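
            A minimal measurement sketch follows (Python).  The
            end-of-setup event is assumed here to be the session
            becoming Established; that assumption, and all names used
            below, are illustrative only.

               def average_setup_delay(setup_times):
                   # setup_times: (t_invite_sent, t_established) pairs
                   # in seconds, one per established session.
                   delays_ms = [(t_est - t_inv) * 1000.0
                                for t_inv, t_est in setup_times]
                   return sum(delays_ms) / len(delays_ms)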

           Measurement Units:
           msec

           Issues:
           None

           See Also:
           Successful Setup Attempt Rate


        3.3.6 Session Teardown Delay

           Definition:
           The average time for a session to teardown.

           Discussion:
           Time is from the Emulated Agent to signal the BYE.
           Session Teardown Delay MUST be measured for every
           established session to calculate the average.
           Session Setup Delay MUST be measured with the rate
           of teardowns configured to the value of the
           Successful Setup Attempt Rate.

           Measurement Units:
           msec

           Issues:
           None

           See Also:
           Successful Setup Attempt Rate

Poretsky, Gurbani, Davids                                    [Page 16]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

        3.3.7 Standing Sessions

           Definition:
           Measurement of the number of Active Control Sessions
           concurrently established on the DUT/SUT at an instant.

           Discussion:
           The number of Standing Sessions is influenced by
           the Session Duration and the Session Rate (or
           Session Attempt Rate).  Benchmarks MUST be reported
           with the maximum and average Standing Sessions for
           the DUT/SUT.  In order to determine the maximum and
           average Standing Sessions on the DUT/SUT for the
           duration of the test it is necessary to make
           periodic measurements of the number of Standing
           Sessions on the DUT/SUT.  The recommended value
           for the measurement period is 1 second.
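
            A sampling sketch, assuming a count_established() probe
            supplied by the Tester and the recommended 1-second
            measurement period, is shown below (Python; all names are
            illustrative).

               import time

               def standing_sessions(count_established, duration_s,
                                     period_s=1.0):
                   # Sample the number of concurrently established
                   # sessions once per period; report max and average.
                   samples = []
                   end = time.monotonic() + duration_s
                   while time.monotonic() < end:
                       samples.append(count_established())
                       time.sleep(period_s)
                   return max(samples), sum(samples) / len(samples)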

           Measurement Units:
           sessions

           Issues:
           None

           See Also:
           Active Control Sessions
           Session Duration
           Session Rate
           Session Attempt Rate

Poretsky, Gurbani, Davids                                    [Page 17]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

         3.3.8 IM Rate

            Definition:
            Maximum number of IM messages completed successfully
            by the DUT/SUT.

            Discussion:
            For a UAS, the definition of success is
            the receipt of an IM request and the subsequent sending
            of a final response.  For a UAC, the definition of success
            is the sending of an IM request and the receipt of
            a final response to it.  For a proxy, the definition of
            success is as follows:
              a) the number of IM requests it receives from the upstream
              client MUST be equal to the number of IM requests it
              sent to the downstream server; and
              b) the number of IM responses it receives from the
              downstream server MUST be equal to the number of IM
              requests sent to the downstream server; and
              c) the number of IM responses it sends to the upstream
              client MUST be equal to the number of IM requests it
              received from the upstream client.
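
            The three proxy conditions above can be checked
            mechanically; the following Python sketch, with
            illustrative counter names, is one way to do so.

               def proxy_im_success(req_up, req_down, resp_down, resp_up):
                   # (a) requests received from the upstream client
                   #     equal requests sent to the downstream server,
                   # (b) responses received from the downstream server
                   #     equal requests sent to the downstream server,
                   # (c) responses sent to the upstream client equal
                   #     requests received from the upstream client.
                   return (req_up == req_down and
                           resp_down == req_down and
                           resp_up == req_up)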

            Measurement Units:
            IM Messages Per Second

            Issues:
            None.

            See Also:

         3.3.9 Presence Rate

            Definition:
            Maximum number of presence notifications sent out by
            the DUT/SUT acting as a Presence Agent [Ro04].

            Discussion:
            The intent of this benchmark is to assess the
            throughput of a Presence Agent (PA, see [Ro04]).  The
            PA will accept subscriptions from watchers, and when the
            target of the subscription is registered with the PA (which
            is acting as a registrar), a notification is generated to
            the watcher.  This benchmark will use the presence event
            package as documented in [Ro04].  The Presence Rate will
            be less than or equal to the Registration Rate.

            Measurement Units:
            Presence Notifications Per Second

            See Also:
            Registration Rate.

Poretsky, Gurbani, Davids                                    [Page 18]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices
4. IANA Considerations

   This document requires no IANA considerations.

5. Security Considerations

   Documents of this type do not directly affect the security of
   Internet or corporate networks as long as benchmarking is not
   performed on devices or systems connected to production
   networks.  Security threats and how to counter them in SIP
   and the media layer are discussed in RFC 3261, RFC 3550, and
   RFC 3711, as well as in various other documents.  This document
   attempts to formalize a set of common terminology for
   benchmarking SIP networks.


6. Acknowledgements

    The authors would like to thank Keith Drage and Daryl Malas
    for their contributions to this document.


Poretsky, Gurbani, Davids                                    [Page 19]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices

7. References
7.1 Normative References


    [Ba91] Bradner, S. "Benchmarking Terminology for Network
           Interconnection Devices", IETF RFC 1242, July 1991.

    [Ba99] Bradner, S. and McQuaid, J., "Benchmarking
           Methodology for Network Interconnect Devices",
           IETF RFC 2544, March 1999.

    [Ga05] Garcia-Martin, M., "Input 3rd-Generation Partnership
           Project (3GPP) Release 5 Requirements on the Session
           Initiation Protocol (SIP)", IETF RFC 4083, May 2005.

    [Li06] Lingle, K., Mule, J., Maeng, J., Walker, D.,
           "Management Information Base for the Session Initiation
           Protocol (SIP)", draft-ietf-sip-mib-10.txt, work in
           progress, March 2006.

    [Ma98] Mandeville, R., "Benchmarking Terminology for LAN
           Switching Devices", IETF RFC 2285, February 1998.

    [Ma06] Malas, D. "SIP Performance Metrics",
           draft-malas-performance-metrics-01.txt, work in progress,
            October 2006.

    [Po06] Poretsky, S., Gurbani, V., and Davids, C., "SIP
           Performance Benchmarking Methodology",
           draft-poretsky-bmwg-sip-meth-00, work in progress,
            October 2006.

    [Ro02] Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston,
           A., Peterson, J., Sparks, R., Handley, M. and E. Schooler,
           "SIP: Session Initiation Protocol", IETF RFC 3261,
           June 2002.

    [Ro04] Rosenberg, J., "A Presence Event Package for the
           Session Initiation Protocol (SIP)," IETF RFC 3856, August
           2004.

    [Sp06] Sparks, R., et al, "Session Initiation Protocol (SIP)
           Torture Test Messages", IETF RFC 4475, October 2006.

7.2 Informative References
    None

Poretsky, Gurbani, Davids                                    [Page 20]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices


8. Authors' Addresses

      Scott Poretsky
      Reef Point Systems
      8 New England Executive Park
      Burlington, MA 01803
      USA
      Phone: + 1 508 439 9008
      EMail: sporetsky@reefpoint.com


      Vijay Gurbani
      Lucent Technologies
      2000 Lucent Lane
      Room 6G-440
      Naperville, IL 60566
      USA
      Phone: + 1 630 224 0216
      Email: vkg@lucent.com


      Carol Davids
      Illinois Institute of Technology
      Rice Campus
      201 East Loop Road
      Wheaton, IL 60187
      USA
      Phone: + 1 630 682 6000
      Email: davids@iit.edu

Poretsky, Gurbani, Davids                                    [Page 21]


INTERNET-DRAFT           Benchmarking Terminology for      October 2006
                            SIP Networking Devices


Full Copyright Statement

   Copyright (C) The Internet Society (2006).

   This document is subject to the rights, licenses and restrictions
   contained in BCP 78, and except as set forth therein, the authors
   retain all their rights.

   This document and the information contained herein are provided on an
   "AS IS" basis and THE CONTRIBUTOR, THE ORGANIZATION HE/SHE REPRESENTS
   OR IS SPONSORED BY (IF ANY), THE INTERNET SOCIETY AND THE INTERNET
   ENGINEERING TASK FORCE DISCLAIM ALL WARRANTIES, EXPRESS OR IMPLIED,
   INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE
   INFORMATION HEREIN WILL NOT INFRINGE ANY RIGHTS OR ANY IMPLIED
   WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

Intellectual Property
   The IETF takes no position regarding the validity or scope of any
   Intellectual Property Rights or other rights that might be claimed to
   pertain to the implementation or use of the technology described in
   this document or the extent to which any license under such rights
   might or might not be available; nor does it represent that it has
   made any independent effort to identify any such rights.  Information
   on the procedures with respect to rights in RFC documents can be
   found in BCP 78 and BCP 79.

   Copies of IPR disclosures made to the IETF Secretariat and any
   assurances of licenses to be made available, or the result of an
   attempt made to obtain a general license or permission for the use of
   such proprietary rights by implementers or users of this
   specification can be obtained from the IETF on-line IPR repository at
   http://www.ietf.org/ipr.

   The IETF invites any interested party to bring to its attention any
   copyrights, patents or patent applications, or other proprietary
   rights that may cover technology that may be required to implement
   this standard.  Please address the information to the IETF at ietf-
   ipr@ietf.org.

Acknowledgement
   Funding for the RFC Editor function is currently provided by the
   Internet Society.

Poretsky, Gurbani, Davids                                    [Page 22]