Internet Engineering Task Force (IETF)                         C. Davids
Request for Comments: 7502              Illinois Institute of Technology
Category: Informational                                       V. Gurbani
ISSN: 2070-1721                        Bell Laboratories, Alcatel-Lucent
                                                             S. Poretsky
                                                    Allot Communications
                                                              April 2015

Methodology for Benchmarking Session Initiation Protocol (SIP) Devices:
                  Basic Session Setup and Registration

Abstract

   This document provides a methodology for benchmarking the Session
   Initiation Protocol (SIP) performance of devices.  Terminology
   related to benchmarking SIP devices is described in the companion
   terminology document (RFC 7501).  Using these two documents,
   benchmarks can be obtained and compared for different types of
   devices such as SIP Proxy Servers, Registrars, and Session Border
   Controllers.  The term "performance" in this context means the
   capacity of the Device Under Test (DUT) to process SIP messages.
   Media streams are used only to study how they impact the signaling
   behavior.  The intent of the two documents is to provide a normalized
   set of tests that will enable an objective comparison of the capacity
   of SIP devices.  Test setup parameters and a methodology are
   necessary because SIP allows a wide range of configurations and
   operational conditions that can influence performance benchmark
   measurements.
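
   As an informal illustration only (not part of this specification),
   the following Python sketch shows one way a tester might offer a
   fixed attempted-sessions-per-second load toward a DUT and count how
   many attempts succeed.  The send_invite() helper, the DUT address,
   and the rates shown are hypothetical placeholders; a real tester
   would perform complete SIP INVITE transactions as described in the
   test cases of this document.

      import time

      def send_invite(dut_address):
          # Hypothetical placeholder: a real tester would send a SIP
          # INVITE to the DUT and wait for a 2xx final response.
          return True

      def offer_load(dut_address, rate_sps, duration_s):
          # Attempt 'rate_sps' sessions per second for 'duration_s'
          # seconds and report attempted vs. successful sessions.
          interval = 1.0 / rate_sps
          attempted = succeeded = 0
          end = time.monotonic() + duration_s
          while time.monotonic() < end:
              start = time.monotonic()
              attempted += 1
              if send_invite(dut_address):
                  succeeded += 1
              # Pace the next attempt so the offered load stays
              # close to the target rate.
              sleep_for = interval - (time.monotonic() - start)
              if sleep_for > 0:
                  time.sleep(sleep_for)
          return attempted, succeeded

      if __name__ == "__main__":
          a, s = offer_load("198.51.100.10", rate_sps=10, duration_s=5)
          print("attempted=%d succeeded=%d" % (a, s))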

Status of This Memo

   This document is not an Internet Standards Track specification; it is
   published for informational purposes.

   This document is a product of the Internet Engineering Task Force
   (IETF).  It represents the consensus of the IETF community.  It has
   received public review and has been approved for publication by the
   Internet Engineering Steering Group (IESG).  Not all documents
   approved by the IESG are a candidate for any level of Internet
   Standard; see Section 2 of RFC 5741.

   Information about the current status of this document, any errata,
   and how to provide feedback on it may be obtained at
   http://www.rfc-editor.org/info/rfc7502.

Copyright Notice

   Copyright (c) 2015 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with respect
   to this document.  Code Components extracted from this document must
   include Simplified BSD License text as described in Section 4.e of
   the Trust Legal Provisions and are provided without warranty as
   described in the Simplified BSD License.

Table of Contents

   1.  Introduction
   2.  Terminology
   3.  Benchmarking Topologies
   4.  Test Setup Parameters
     4.1.  Selection of SIP Transport Protocol
     4.2.  Connection-Oriented Transport Management
     4.3.  Signaling Server
     4.4.  Associated Media
     4.5.  Selection of Associated Media Protocol
     4.6.  Number of Associated Media Streams per SIP Session
     4.7.  Codec Type
     4.8.  Session Duration
     4.9.  Attempted Sessions per Second (sps)
     4.10. Benchmarking Algorithm
   5.  Reporting Format
     5.1.  Test Setup Report
     5.2.  Device Benchmarks for Session Setup
     5.3.  Device Benchmarks for Registrations
   6.  Test Cases
     6.1.  Baseline Session Establishment Rate of the Testbed
     6.2.  Session Establishment Rate without Media