IETF L. Chen
Internet-Draft Zhongguancun Laboratory
Intended status: Standards Track D. Li
Expires: 9 January 2025 Tsinghua University
L. Liu
L. Qin
Zhongguancun Laboratory
8 July 2024
Benchmarking Methodology for Source Address Validation
draft-chen-bmwg-savnet-sav-benchmarking-00
Abstract
This document defines methodologies for benchmarking the performance
of source address validation (SAV) mechanisms.  SAV mechanisms
generate SAV rules to prevent source address spoofing and have been
implemented with various designs to perform SAV in their
corresponding scenarios.  This document takes the approach of
considering a SAV device to be a black box, defining the methodology
in a manner that is agnostic to the mechanisms.  This document
provides a method for measuring the performance of existing and new
SAV implementations.
Status of This Memo
This Internet-Draft is submitted in full conformance with the
provisions of BCP 78 and BCP 79.
Internet-Drafts are working documents of the Internet Engineering
Task Force (IETF). Note that other groups may also distribute
working documents as Internet-Drafts. The list of current Internet-
Drafts is at https://datatracker.ietf.org/drafts/current/.
Internet-Drafts are draft documents valid for a maximum of six months
and may be updated, replaced, or obsoleted by other documents at any
time. It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as "work in progress."
This Internet-Draft will expire on 9 January 2025.
Copyright Notice
Copyright (c) 2024 IETF Trust and the persons identified as the
document authors. All rights reserved.
This document is subject to BCP 78 and the IETF Trust's Legal
Provisions Relating to IETF Documents (https://trustee.ietf.org/
license-info) in effect on the date of publication of this document.
Please review these documents carefully, as they describe your rights
and restrictions with respect to this document. Code Components
extracted from this document must include Revised BSD License text as
described in Section 4.e of the Trust Legal Provisions and are
provided without warranty as described in the Revised BSD License.
Table of Contents
1.  Introduction
    1.1.  Goal and Scope
    1.2.  Requirements Language
2.  Terminology
3.  Test Methodology
    3.1.  Test Setup
    3.2.  Network Topology and Device Configuration
4.  SAV Performance Indicators
    4.1.  Proportion of Improper Blocks
    4.2.  Proportion of Improper Permits
    4.3.  Protocol Convergence Time
    4.4.  Control Plane Processing Throughput
    4.5.  Data Plane Forwarding Rate
5.  Benchmarking Tests
    5.1.  Intra-domain SAV
        5.1.1.  SAV Accuracy
        5.1.2.  Protocol Convergence Performance
        5.1.3.  Control Plane Performance
        5.1.4.  Data Plane Forwarding Performance
    5.2.  Inter-domain SAV
        5.2.1.  SAV Accuracy
        5.2.2.  Protocol Convergence Performance
        5.2.3.  Control Plane Performance
        5.2.4.  Data Plane Forwarding Performance
6.  Reporting Format
7.  IANA Considerations
8.  Security Considerations
9.  References
    9.1.  Normative References
    9.2.  Informative References
Authors' Addresses
1. Introduction
Source address validation (SAV) is important for preventing source
address spoofing.  Operators are advised to deploy different SAV
mechanisms [RFC3704] [RFC8704] based on their network environments.
However, existing intra-domain and inter-domain SAV mechanisms have
problems with operational overhead and accuracy in various scenarios
[intra-domain-ps] [inter-domain-ps].  Intra-domain and inter-domain
SAVNET architectures [intra-domain-arch] [inter-domain-arch] have
been proposed to guide the design of new intra-domain and inter-
domain SAV mechanisms that solve these problems.  The benchmarking
methodology defined in this document will help operators get a more
accurate idea of SAV performance when their deployed devices enable
SAV, and will also help vendors test the performance of their
devices' SAV implementations.
This document provides generic methodologies for benchmarking SAV
mechanism performance.  To achieve the desired functionality, a SAV
device may support many SAV mechanisms.  This document considers a
SAV device to be a black box, regardless of its design and
implementation.  The tests defined in this document can be used to
benchmark a SAV device for SAV accuracy, convergence performance, and
control plane and data plane forwarding performance.  These tests can
be performed on a hardware router, a bare metal server, a virtual
machine (VM) instance, or a container instance that runs as a SAV
device.  This document is intended for people who want to measure a
SAV device's performance as well as compare the performance of
various SAV devices.
1.1. Goal and Scope
The benchmarking methodology outlined in this draft focuses on two
objectives:
*  Assessing which SAV mechanisms perform best over a set of
   well-defined scenarios.
*  Measuring the contribution of sub-systems to the overall SAV
   system's performance (also known as "micro-benchmarking").
The benchmark aims to compare the SAV performance of individual
devices, e.g., hardware or software routers.  It will showcase the
performance of various SAV mechanisms for a given device and network
scenario, with the objective of helping operators deploy the
appropriate SAV mechanism in their network scenarios.
1.2. Requirements Language
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
"SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and
"OPTIONAL" in this document are to be interpreted as described in
BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all
capitals, as shown here.
2. Terminology
Improper Block: The validation result in which packets with
legitimate source addresses are improperly blocked due to inaccurate
SAV rules.
Improper Permit: The validation result in which packets with spoofed
source addresses are improperly permitted due to inaccurate SAV
rules.
SAV Control Plane: The SAV control plane consists of the processes
for gathering and communicating SAV-related information.
SAV Data Plane: The SAV data plane stores the SAV rules within a
specific data structure and validates each incoming packet to
determine whether to permit or discard it.
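As a non-normative illustration, the following Python sketch shows
the per-packet decision such a data plane makes; the rule table,
interface name, and prefix are hypothetical examples, not part of any
particular SAV mechanism:

   import ipaddress

   # Hypothetical SAV rule table: interface -> acceptable source prefixes.
   SAV_RULES = {"eth0": [ipaddress.ip_network("10.0.0.0/15")]}

   def validate(interface, src_ip):
       # Return True (permit) if src_ip matches a SAV rule on the interface.
       addr = ipaddress.ip_address(src_ip)
       return any(addr in prefix for prefix in SAV_RULES.get(interface, []))

   validate("eth0", "10.1.2.3")  # True: inside 10.0.0.0/15 -> permitted
   validate("eth0", "10.2.0.1")  # False: outside 10.0.0.0/15 -> blocked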
Host-facing Router: An intra-domain router of an AS which is
connected to a host network (i.e., a layer-2 network).
Customer-facing Router: An intra-domain router of an AS which is
connected to an intra-domain customer network running a routing
protocol (i.e., a layer-3 network).
3. Test Methodology
3.1. Test Setup
The test setup in general is compliant with [RFC2544]. The Device
Under Test (DUT) is connected to a Tester and other network devices
to construct the network topology introduced in Section 5. The
Tester is a traffic generator that generates network traffic with
various source and destination addresses in order to emulate spoofing
or legitimate traffic.  Choosing various proportions of spoofing and
legitimate traffic is OPTIONAL, but the Tester needs to generate
traffic at line speed to test the data plane forwarding performance.
+~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +--------------+ |
| | | |
+-->| | DUT | |---+
| | | | | |
| | +--------------+ | |
| +~~~~~~~~~~~~~~~~~~~~~~~~~~+ |
| |
| +--------------+ |
| | | |
+---------| Tester |<--------+
| |
+--------------+
Figure 1: Test Setup.
Figure 1 shows the test setup for the DUT.  In the test network
environment, the DUT can be connected to other devices to construct
various test scenarios. The Tester can be connected to the DUT
directly or by other devices. The connection type between them is
determined according to the benchmarking tests in Section 5.
Besides, the Tester can generate spoofing traffic or legitimate
traffic to test the SAV accuracy of the DUT in the corresponding
scenarios, and it can also generate traffic at line speed to test
the data plane forwarding performance of the DUT.  In addition, the
DUT needs to support logging to record all test results.
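For illustration only, a software Tester could emulate the two
traffic classes with a packet-crafting library such as Scapy; the
prefixes, destination address, and packet counts below are
assumptions for the sketch, not requirements of this document:

   from scapy.all import IP, UDP, RandIP, send

   LEGIT_PREFIX = "10.0.0.0/15"  # example prefix owned by the emulated network
   SPOOF_PREFIX = "10.2.0.0/15"  # example prefix it does not own

   def send_flows(count, spoof=False):
       # Send UDP packets toward the DUT with random source addresses
       # drawn from the legitimate or the spoofed prefix.
       src = RandIP(SPOOF_PREFIX if spoof else LEGIT_PREFIX)
       send(IP(src=src, dst="192.0.2.1") / UDP(dport=53), count=count)

   send_flows(1000)              # legitimate traffic
   send_flows(1000, spoof=True)  # spoofing traffic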
3.2. Network Topology and Device Configuration
The location where the DUT resides in the network topology affects
the accuracy of SAV mechanisms. Therefore, the benchmark MUST put
the DUT into different locations in the network to test it.
The devices in the network topology can have various routing
configurations, and the generated SAV rules depend on those
configurations.  The device configurations used need to be specified
as well.
In addition, it is necessary to indicate the device role, such as
host-facing router, customer-facing router, and AS border router in
the intra-domain network, and the business relationship between ASes
in the inter-domain network.
The network traffic generated by the Tester must specify the traffic
rate, the proportion of spoofing and legitimate traffic, and the
distribution of source addresses when testing the data plane
forwarding performance, as all of these may affect the test results.
4. SAV Performance Indicators
This section lists key performance indicators (KPIs) of SAV for
overall benchmarking tests. All KPIs MUST be measured in the
benchmarking scenarios described in Section 5.  Also, the KPIs MUST be
measured from the result output of the DUT.
4.1. Proportion of Improper Blocks
The proportion of all legitimate traffic that is improperly blocked
by the DUT.  This reflects the SAV accuracy of the DUT.
4.2. Proportion of Improper Permits
The proportion of all spoofing traffic that is improperly permitted
by the DUT.  This reflects the SAV accuracy of the DUT.
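Both proportions can be derived from four counters collected at the
Tester or from the DUT's logs.  A minimal sketch in Python (the
counter names are illustrative):

   def improper_block_proportion(legit_sent, legit_forwarded):
       # Fraction of all legitimate packets improperly blocked by the DUT.
       return (legit_sent - legit_forwarded) / legit_sent

   def improper_permit_proportion(spoof_sent, spoof_forwarded):
       # Fraction of all spoofed packets improperly permitted by the DUT.
       return spoof_forwarded / spoof_sent

   improper_block_proportion(10000, 9990)  # 0.001 -> 0.1% improper blocks
   improper_permit_proportion(10000, 20)   # 0.002 -> 0.2% improper permits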
4.3. Protocol Convergence Time
The protocol convergence time represents the period during which the
SAV control plane protocol converges to update the SAV rules when
routing changes happen; it is the time elapsed from the beginning of
the routing change to the completion of the SAV rule update.  This
KPI indicates the convergence performance of the SAV protocol.
4.4. Control Plane Processing Throughput
The control plane processing throughput measures the throughput for
processing the protocol messages that communicate SAV-related
information, and it indicates the SAV control plane performance of
the DUT.
4.5. Data Plane Forwarding Rate
The data plane forwarding rate measures the throughput of the SAV
data plane when processing data plane traffic, and it indicates the
SAV data plane performance of the DUT.
5. Benchmarking Tests
5.1. Intra-domain SAV
5.1.1. SAV Accuracy
5.1.1.1. Objective
Measure the accuracy of the DUT when processing legitimate traffic
and spoofing traffic across various intra-domain network scenarios,
including SAV for a customer or host network, SAV for an Internet-
facing network, and SAV for an aggregation-router-facing network.
Accuracy is defined as the proportion of all legitimate traffic that
is improperly blocked by the DUT and the proportion of all spoofing
traffic that is improperly permitted by the DUT.
5.1.1.2. Test Scenarios
5.1.1.2.1. SAV for Customer or Host Network
*Test Case 1*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +~~~~~~~~~~+ |
| | Router 1 | |
| FIB on DUT +~~~~~~~~~~+ |
| Dest Next_hop /\ | |
| 10.0.0.0/15 Network 1 | | |
| | \/ |
| +----------+ |
| | DUT | |
| +----------+ |
| /\ | |
|Outbound traffic with | | Inbound traffic with |
|source IP addresses | | destination IP addresses |
|of 10.0.0.0/15 | | of 10.0.0.0/15 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| \/
+--------------------+
| Tester (Network 1) |
| (10.0.0.0/15) |
+--------------------+
Figure 2: SAV for customer or host network in intra-domain
symmetric routing scenario.
Figure 2 shows the case of SAV for customer or host network in intra-
domain symmetric routing scenario, and the DUT performs SAV as a
customer/host-facing router and connects to Router 1 to access the
Internet. Network 1 is a customer/host network within the AS,
connects to the DUT, and its own prefix is 10.0.0.0/15. The Tester
can emulate Network 1 to advertise its prefix in the control plane
and generate spoofing and legitimate traffic in the data plane.  In
this case, the Tester is configured to make the inbound traffic
destined for 10.0.0.0/15 come from the DUT.  The DUT learns the route
to prefix 10.0.0.0/15 from the Tester, while the Tester can send
outbound traffic with source addresses in prefix 10.0.0.0/15 to the
DUT, which emulates a symmetric routing scenario between the Tester
and the DUT.  The IP addresses in this test case are examples; users
can use other IP addresses, and this holds true for the other test
cases as well.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer or host network in intra-domain
symmetric routing scenario, a testbed can be built as shown in
Figure 2 to construct the test network environment. The Tester
is connected to the DUT and performs the functions of Network 1.
2. Then, the devices including the DUT and Router 1 are configured
to form the symmetric routing scenario.
3. Finally, the Tester generates traffic using 10.0.0.0/15 as source
addresses (legitimate traffic) and traffic using 10.2.0.0/15 as
source addresses (spoofing traffic) to the DUT, respectively.
The ratio of spoofing traffic to legitimate traffic can vary,
such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from Network 1 for this test case.
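The ratio sweep in step 3 of the procedure above can be automated
with a small harness.  A sketch, reusing the hypothetical
send_flows() helper from Section 3.1 and keeping the total packet
count constant per run:

   TOTAL = 10000
   for spoof_share in range(1, 10):        # ratios 1:9, 2:8, ..., 9:1
       n_spoof = TOTAL * spoof_share // 10
       send_flows(TOTAL - n_spoof)         # legitimate packets
       send_flows(n_spoof, spoof=True)     # spoofed packets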
*Test Case 2*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment AS |
| +~~~~~~~~~~+ |
| | Router 2 | |
| FIB on DUT +~~~~~~~~~~+ FIB on Router 1 |
| Dest Next_hop /\ \ Dest Next_hop |
| 10.1.0.0/16 Network 1 / \ 10.0.0.0/16 Network 1 |
| 10.0.0.0/16 Router 2 / \/ 10.1.0.0/16 Router 2 |
| +----------+ +~~~~~~~~~~+ |
| | DUT | | Router 1 | |
| +----------+ +~~~~~~~~~~+ |
| /\ / |
|Outbound traffic with \ / Inbound traffic with |
|source IP addresses \ / destination IP addresses |
|of 10.0.0.0/16 \ / of 10.0.0.0/16 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
\ \/
+--------------------+
| Tester (Network 1) |
| (10.0.0.0/15) |
+--------------------+
Figure 3: SAV for customer or host network in intra-domain
asymmetric routing scenario.
Figure 3 shows the case of SAV for customer or host network in intra-
domain asymmetric routing scenario, and the DUT performs SAV as a
customer/host-facing router.  Network 1 is a customer/host network
within the AS, connects to the DUT and Router 1, respectively, and
its own prefix is 10.0.0.0/15.  The Tester can emulate Network 1 and
perform its control plane and data plane functions.  In this case,
the Tester is configured to make the inbound traffic destined for
10.1.0.0/16 come only from the DUT and the inbound traffic destined
for 10.0.0.0/16 come only from Router 1.  The DUT only learns the
route to prefix 10.1.0.0/16 from the Tester, while Router 1 only
learns the route to prefix 10.0.0.0/16 from Network 1.  Then, the
DUT and Router 1 advertise their learned prefixes to Router 2.
Besides, the DUT learns the route to 10.0.0.0/16 from Router 2, and
Router 1 learns the route to 10.1.0.0/16 from Router 2.  The Tester
can send outbound traffic with source addresses in prefix 10.0.0.0/16
to the DUT, which emulates an asymmetric routing scenario between
the Tester and the DUT.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer or host network in intra-domain
asymmetric routing scenario, a testbed can be built as shown in
Figure 3 to construct the test network environment. The Tester
is connected to the DUT and Router 1 and performs the functions of
Network 1.
2. Then, the devices including the DUT, Router 1, and Router 2, are
configured to form the asymmetric routing scenario.
3. Finally, the Tester generates traffic using 10.1.0.0/16 as source
addresses (spoofing traffic) and traffic using 10.0.0.0/16 as
source addresses (legitimate traffic) to the DUT, respectively.
The ratio of spoofing traffic to legitimate traffic can vary,
such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from Network 1 for this test case.
5.1.1.2.2. SAV for Internet-facing Network
*Test Case 1*:
+---------------------+
| Tester (Internet) |
+---------------------+
/\ | Inbound traffic with source
| | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment | | |
| | \/ |
| +----------+ |
| | DUT | SAV facing Internet |
| FIB on DUT +----------+ |
| Dest Next_hop /\ | |
| 10.0.0.0/15 Network 1 | | |
| | \/ |
| +~~~~~~~~~~+ |
| | Router 1 | |
| +~~~~~~~~~~+ |
| /\ | |
|Outbound traffic with | | Inbound traffic with |
|source IP addresses | | destination IP addresses |
|of 10.0.0.0/15 | | of 10.0.0.0/15 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| \/
+--------------------+
| Network 1 |
| (10.0.0.0/15) |
+--------------------+
Figure 4: SAV for Internet-facing network in intra-domain
symmetric routing scenario.
Figure 4 shows the test case of SAV for Internet-facing network in
intra-domain symmetric routing scenario.  In this test case, the
network topology is the same as in Figure 2; the difference is the
location of the DUT in the network topology, where the DUT is
connected to Router 1 and the Internet, and the Tester is used to
emulate the Internet.  The DUT performs Internet-facing SAV instead
of customer/host-network-facing SAV.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for Internet-facing network in intra-domain
symmetric routing scenario, a testbed can be built as shown in
Figure 4 to construct the test network environment. The Tester
is connected to the DUT and performs the functions of the
Internet.
2. Then, the devices including the DUT and Router 1 are configured
to form the symmetric routing scenario.
3. Finally, the Tester can send traffic using 10.0.0.0/15 as source
addresses (spoofing traffic) and traffic using 10.2.0.0/15 as
source addresses (legitimate traffic) to the DUT, respectively.
The ratio of spoofing traffic to legitimate traffic can vary,
such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from the Internet for this test case.
*Test Case 2*:
+---------------------+
| Tester (Internet) |
+---------------------+
/\ | Inbound traffic with source
| | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment | | |
| | \/ |
| +----------+ |
| | DUT | |
| FIB on Router 1 +----------+ FIB on Router 2 |
| Dest Next_hop /\ \ Dest Next_hop |
| 10.1.0.0/16 Network 1 / \ 10.0.0.0/16 Network 1 |
| 10.0.0.0/16 DUT / \/ 10.1.0.0/16 DUT |
| +~~~~~~~~~~+ +~~~~~~~~~~+ |
| | Router 1 | | Router 2 | |
| +~~~~~~~~~~+ +~~~~~~~~~~+ |
| /\ / |
|Outbound traffic with \ / Inbound traffic with |
|source IP addresses \ / destination IP addresses |
|of 10.0.0.0/16 \ / of 10.0.0.0/16 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
\ \/
+--------------------+
| Network 1 |
| (10.0.0.0/15) |
+--------------------+
Figure 5: SAV for Internet-facing network in intra-domain
asymmetric routing scenario.
Figure 5 shows the test case of SAV for Internet-facing network in
intra-domain asymmetric routing scenario. In this test case, the
network topology is the same as in Figure 3, and the difference is the
location of the DUT in the network topology, where the DUT is
connected to Router 1 and Router 2 within the same AS, as well as the
Internet. The Tester is used to emulate the Internet. The DUT
performs Internet-facing SAV instead of customer/host-network-facing
SAV.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for Internet-facing network in intra-domain
asymmetric routing scenario, a testbed can be built as shown in
Figure 5 to construct the test network environment. The Tester
is connected to the DUT and performs the functions of the
Internet.
2. Then, the devices including the DUT, Router 1, and Router 2 are
configured to form the asymmetric routing scenario.
3. Finally, the Tester can send traffic using 10.0.0.0/15 as source
addresses (spoofing traffic) and traffic using 10.2.0.0/15 as
source addresses (legitimate traffic) to the DUT, respectively.
The ratio of spoofing traffic to legitimate traffic can vary,
such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from the Internet for this test case.
5.1.1.2.3. SAV for Aggregation-router-facing Network
*Test Case 1*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +----------+ |
| | DUT | SAV facing Router 1 |
| FIB on DUT +----------+ |
| Dest Next_hop /\ | |
| 10.0.0.0/15 Network 1 | | |
| | \/ |
| +~~~~~~~~~~+ |
| | Router 1 | |
| +~~~~~~~~~~+ |
| /\ | |
|Outbound traffic with | | Inbound traffic with |
|source IP addresses | | destination IP addresses |
|of 10.0.0.0/15 | | of 10.0.0.0/15 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| \/
+--------------------+
| Tester (Network 1) |
| (10.0.0.0/15) |
+--------------------+
Figure 6: SAV for aggregation-router-facing network in intra-
domain symmetric routing scenario.
Figure 6 shows the test case of SAV for aggregation-router-facing
network in intra-domain symmetric routing scenario. The test network
environment of Figure 6 is the same as in Figure 4.  The Tester is
connected to Router 1 to emulate the functions of Network 1 to test
the SAV accuracy of the DUT facing the direction of Router 1.
*Procedure*:
1.  First, in order to test whether the DUT can generate accurate SAV
    rules for SAV for aggregation-router-facing network in intra-
    domain symmetric routing scenario, a testbed can be built as shown in
Figure 6 to construct the test network environment. The Tester
is connected to Router 1 and performs the functions of Network 1.
2. Then, the devices including the DUT and Router 1 are configured
to form the symmetric routing scenario.
3.  Finally, the Tester can send traffic using 10.0.0.0/15 as source
    addresses (legitimate traffic) and traffic using 10.2.0.0/15 as
    source addresses (spoofing traffic) to Router 1, respectively.
The ratio of spoofing traffic to legitimate traffic can vary,
such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from the direction of Router 1 for this test
case.
*Test Case 2*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +----------+ |
| | DUT | SAV facing Router 1 and 2|
| FIB on Router 1 +----------+ FIB on Router 2 |
| Dest Next_hop /\ \ Dest Next_hop |
| 10.1.0.0/16 Network 1 / \ 10.0.0.0/16 Network 1 |
| 10.0.0.0/16 DUT / \/ 10.1.0.0/16 DUT |
| +~~~~~~~~~~+ +~~~~~~~~~~+ |
| | Router 1 | | Router 2 | |
| +~~~~~~~~~~+ +~~~~~~~~~~+ |
| /\ / |
|Outbound traffic with \ / Inbound traffic with |
|source IP addresses \ / destination IP addresses |
|of 10.0.0.0/16 \ / of 10.0.0.0/16 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
\ \/
+--------------------+
| Tester (Network 1) |
| (10.0.0.0/15) |
+--------------------+
Figure 7: SAV for aggregation-router-facing network in intra-
domain asymmetric routing scenario.
Figure 7 shows the test case of SAV for aggregation-router-facing
network in intra-domain asymmetric routing scenario. The test
network environment of Figure 7 is the same as in Figure 5.  The
Tester is connected to Router 1 and Router 2 to emulate the functions
of Network 1 to test the SAV accuracy of the DUT facing the direction
of Router 1 and Router 2.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for aggregation-router-facing network in intra-
domain asymmetric routing scenario, a testbed can be built as
shown in Figure 7 to construct the test network environment. The
Tester is connected to Router 1 and Router 2 and performs the
functions of Network 1.
2. Then, the devices including the DUT, Router 1, and Router 2 are
configured to form the asymmetric routing scenario.
3. Finally, the Tester generates traffic using 10.1.0.0/16 as source
addresses (spoofing traffic) and traffic using 10.0.0.0/16 as
source addresses (legitimate traffic) to Router 1, respectively.
The ratio of spoofing traffic to legitimate traffic can vary,
such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from the direction of Router 1 and Router 2
for this test case.
5.1.2. Protocol Convergence Performance
5.1.2.1. Objective
Measure the protocol convergence performance of the DUT when route
changes happen due to network failures or operator configurations,
defined as the protocol convergence time representing the time
elapsed from the begining of routing change to the completion of SAV
rule update.
5.1.2.2. Test Scenario
+-------------+ +-----------+
| Tester |<-------->| DUT |
+-------------+ +-----------+
Figure 8: Test setup for protocol convergence performance
measurement.
*Test Case*:
Figure 8 shows the test setup for protocol convergence performance
measurement.  The protocol convergence process of the DUT to update
SAV rules launches when route changes happen.  Route changes are the
cause of SAV rule updates and may result from network failures or
operator configurations.  Therefore, in Figure 8, the Tester is
directly connected to the DUT and emulates route changes to launch
the convergence process of the DUT by advertising or withdrawing
prefixes.
*Procedure*:
1. First, in order to test the protocol convergence time of the DUT,
a testbed can be built as shown in Figure 8 to construct the test
network environment. The Tester is directly connected to the
DUT.
2.  Then, the Tester proactively withdraws a certain percentage of
    the overall prefixes supported by the DUT, such as 10%, 20%,
    ..., 100%.
3. Finally, the protocol convergence time is calculated according to
the logs of the DUT about the beginning and completion of the
protocol convergence.
*Measurements*: The logs of the DUT record the beginning time of the
protocol convergence process and its completion time, and the
protocol convergence time is calculated by subtracting the beginning
time from the completion time.
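A sketch of deriving this KPI from two timestamped DUT log lines; the
log format shown is hypothetical and will differ across
implementations:

   from datetime import datetime

   # Hypothetical DUT log lines marking convergence start and end.
   begin_line = "2024-07-08T12:00:00.120 SAV convergence started"
   end_line = "2024-07-08T12:00:02.480 SAV rule update completed"

   def ts(line):
       # Parse the leading ISO 8601 timestamp of a log line.
       return datetime.fromisoformat(line.split(" ", 1)[0])

   convergence_time = (ts(end_line) - ts(begin_line)).total_seconds()
   print(f"protocol convergence time: {convergence_time:.3f} s")  # 2.360 s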
5.1.3. Control Plane Performance
*Test Case*:
The test of the control plane performance uses the same test setup
shown in Figure 8. The control plane performance measures the
control plane throughput to process the protocol messages.
Therefore, the Tester can vary the rate for sending protocol
messages, such as from 10% to 100% of the overall link capacity
between the Tester and the DUT. Then, the DUT records the size of
the processed total protocol messages and processing time.
*Procedure*:
1. First, in order to test the control plane processing throughput
of the DUT, a testbed can be built as shown in Figure 8 to
construct the test network environment. The Tester is directly
connected to the DUT.
2.  Then, the Tester proactively sends the protocol messages to the
    DUT at a certain percentage of the overall link capacity between
    the Tester and the DUT, such as 10%, 20%, ..., 100%.
3. Finally, the control plane processing throughput is calculated
according to the logs of the DUT about the overall size of the
protocol messages and the overall processing time.
*Measurements*: The logs of the DUT record the overall size of the
protocol messages and the overall processing time, and the control
plane processing throughput is calculated by dividing the overall
size of the protocol messages by the overall processing time.
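For example, a sketch of this division, with illustrative numbers
only:

   def control_plane_throughput(total_message_bytes, processing_seconds):
       # Throughput in bits per second, from the DUT's own log counters.
       return total_message_bytes * 8 / processing_seconds

   # e.g., 50 MB of protocol messages processed in 4 s -> 100 Mbit/s
   control_plane_throughput(50_000_000, 4.0)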
5.1.4. Data Plane Forwarding Performance
*Test Case*:
The test of the data plane forwarding performance uses the same test
setup shown in Figure 8.  The Tester needs to send traffic, which
includes spoofing and legitimate traffic, at the rate of the overall
link capacity between the Tester and the DUT, and the DUT builds a
SAV table that occupies the entire allocated storage space.  The
ratio of spoofing traffic to legitimate traffic can vary, such as
from 1:9 to 9:1.  The DUT records the overall size of the forwarded
packets and the overall forwarding time.
*Procedure*:
1. First, in order to test the data plane forwarding rate of the
DUT, a testbed can be built as shown in Figure 8 to construct the
test network environment. The Tester is directly connected to
the DUT.
2. Then, the Tester proactively sends the data plane traffic
including spoofing and legitimate traffic to the DUT at the rate
of the overall link capacity between the Tester and the DUT. The
ratio of spoofing traffic to legitimate traffic can vary, such as
from 1:9 to 9:1.
3. Finally, the data plane forwarding rate is calculated according
to the logs of the DUT about the overall size of the forwarded
traffic and the overall forwarding time.
*Measurements*: The logs of the DUT record the overall size of the
forwarded traffic and the overall forwarding time, and the data plane
forwarding rate is calculated by dividing the overall size of the
forwarded traffic by the overall forwarding time.
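A sketch of the corresponding calculation, with illustrative numbers;
reporting the rate as a fraction of the offered load is an optional
convenience, not a requirement of this document:

   def forwarding_rate(forwarded_bytes, forwarding_seconds, offered_bps):
       # Forwarding rate in bit/s and as a fraction of the offered load.
       rate = forwarded_bytes * 8 / forwarding_seconds
       return rate, rate / offered_bps

   # e.g., 9.2 GB forwarded in 8 s against a 10 Gbit/s offered load
   forwarding_rate(9_200_000_000, 8.0, 10_000_000_000)  # (9.2e9 bit/s, 0.92)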
5.2. Inter-domain SAV
5.2.1. SAV Accuracy
5.2.1.1. Objective
Measure the accuracy of the DUT when processing legitimate traffic
and spoofing traffic across various inter-domain network scenarios,
including SAV for customer-facing ASes and SAV for provider/peer-
facing ASes.  Accuracy is defined as the proportion of all legitimate
traffic that is improperly blocked by the DUT and the proportion of
all spoofing traffic that is improperly permitted by the DUT.
5.2.1.2. Test Scenario
5.2.1.2.1. SAV for Customer-facing ASes
*Test case 1*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +~~~~~~~~~~~~~~~~+ |
| | AS 3(P3) | |
| +~+/\~~~~~~+/\+~~+ |
| / \ |
| / \ |
| / \ |
| / (C2P) \ |
| +------------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+----+/\++ \ |
| / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+ | \ \ |
|| AS 2(P2) | | P1[AS 1] \ \ |
|+~~~~~~~~~~+/\+~~+ | P6[AS 1] \ \ |
| \ | \ \ |
| P6[AS 1] \ | \ \ |
| P1[AS 1] \ | \ \ |
| (C2P) \ | (C2P/P2P) (C2P) \ (C2P) \ |
| +~~~~~~~~~~~~~~~~+ +~~~~~~~~~~~~~~~~+ |
| | AS 1(P1, P6) | | AS 5(P5) | |
| +~~~~~~~~~~~~~~~~+ +~~~~~~~~~~~~~~~~+ |
| /\ | |
| | | |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| \/
+----------------+
| Tester |
+----------------+
Figure 9: SAV for customer-facing ASes in inter-domain symmetric
routing scenario.
Figure 9 presents a test case of SAV for customer-facing ASes in an
inter-domain symmetric routing scenario.  In this test case, AS 1,
AS 2, AS 3, the DUT, and AS 5 construct the test network environment,
and the DUT performs SAV as an AS.  AS 1 is a customer of AS 2 and
the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3,
and AS 5 is a customer of both AS 3 and the DUT.  AS 1 advertises
prefixes P1 and P6 to AS 2 and the DUT, respectively, and then AS 2
further propagates the routes for prefixes P1 and P6 to the DUT.
Consequently, the DUT can learn the routes for prefixes P1 and P6
from AS 1 and AS 2.  In this test case, the legitimate path for the
traffic with source addresses in P1 and destination addresses in P4
is AS 1->AS 2->DUT, and the Tester is connected to AS 1 and the SAV
for customer-facing ASes of the DUT is tested.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer-facing ASes in inter-domain symmetric
routing scenario, a testbed can be built as shown in Figure 9 to
construct the test network environment. The Tester is connected
to AS 1 and generates the test traffic to the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the symmetric routing scenario.
3. Finally, the Tester sends the traffic using P1 as source
addresses and P4 as destination addresses (legitimate traffic) to
the DUT via AS 2 and traffic using P5 as source addresses and P4
as destination addresses (spoofing traffic) to the DUT via AS 2,
respectively. The ratio of spoofing traffic to legitimate
traffic can vary, such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from the direction of AS 2 for this test case.
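The expected result can be turned into an automated check if the
Tester can read per-class forwarded/dropped counters from the DUT;
the counter names below are hypothetical:

   def check_sav_accuracy(counters):
       # Pass only if no spoofed packet was permitted and no
       # legitimate packet was blocked in this test case.
       assert counters["spoof_forwarded"] == 0, "improper permits observed"
       assert counters["legit_dropped"] == 0, "improper blocks observed"

   check_sav_accuracy({"spoof_forwarded": 0, "legit_dropped": 0})  # passes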
*Test case 2*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +~~~~~~~~~~~~~~~~+ |
| | AS 3(P3) | |
| +~+/\~~~~~~+/\+~~+ |
| / \ |
| / \ |
| / \ |
| / (C2P) \ |
| +------------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+----+/\++ \ |
| / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+ | \ \ |
|| AS 2(P2) | | P1[AS 1] \ \ |
|+~~~~~~~~~~+/\+~~+ | P6[AS 1] \ \ |
| P6[AS 1] \ | NO_EXPORT \ \ |
| P1[AS 1] \ | \ \ |
| NO_EXPORT \ | \ \ |
| (C2P) \ | (C2P) (C2P) \ (C2P) \ |
| +~~~~~~~~~~~~~~~~+ +~~~~~~~~~~~~~~~~+ |
| | AS 1(P1, P6) | | AS 5(P5) | |
| +~~~~~~~~~~~~~~~~+ +~~~~~~~~~~~~~~~~+ |
| /\ | |
| | | |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| \/
+----------------+
| Tester |
+----------------+
Figure 10: SAV for customer-facing ASes in inter-domain
asymmetric routing scenario caused by NO_EXPORT.
Figure 10 presents a test case of SAV for customer-facing ASes in
inter-domain asymmetric routing scenario caused by NO_EXPORT
configuration. In this test case, AS 1, AS 2, AS 3, the DUT, and AS
5 construct the test network environment, and the DUT performs SAV
as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer
of the DUT, which is a customer of AS 3, and AS 5 is a customer of
both AS 3 and the DUT.  AS 1 advertises prefix P1 to AS 2 and adds
the NO_EXPORT community attribute to the BGP advertisement sent to AS
2, preventing AS 2 from further propagating the route for prefix P1
to the DUT. Similarly, AS 1 adds the NO_EXPORT community attribute
to the BGP advertisement sent to the DUT, resulting in the DUT not
propagating the route for prefix P6 to AS 3. Consequently, the DUT
only learns the route for prefix P1 from AS 1 in this scenario. In
this test case, the legitimate path for the traffic with source
addresses in P1 and destination addresses in P4 is AS 1->AS 2->DUT,
and the Tester is connected to AS 1, and the SAV for customer-
facing ASes of the DUT is tested.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer-facing ASes in inter-domain asymmetric
routing scenario caused by NO_EXPORT, a testbed can be built as
shown in Figure 10 to construct the test network environment.
The Tester is connected to AS 1 and generates the test traffic to
the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the asymmetric routing scenario.
3. Finally, the Tester sends the traffic using P1 as source
addresses and P4 as destination addresses (legitimate traffic) to
the DUT via AS 2 and traffic using P5 as source addresses and P4
as destination addresses (spoofing traffic) to the DUT via AS 2,
respectively. The ratio of spoofing traffic to legitimate
traffic can vary, such as from 1:9 to 9:1.
*Expected Results*: The DUT can block the spoofing traffic and permit
the legitimate traffic from the direction of AS 2 for this test case.
*Test case 3*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +----------------+ |
| Anycast Server+-+ AS 3(P3) | |
| +-+/\----+/\+----+ |
| / \ |
| P3[AS 3] / \ P3[AS 3] |
| / \ |
| / (C2P) \ |
| +----------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+--+/\++ \ |
| P6[AS 1, AS 2] / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
| +----------------+ | \ \ |
|User+-+ AS 2(P2) | | P1[AS 1] \ \ |
| +----------+/\+--+ | P6[AS 1] \ \ |
| P6[AS 1] \ | NO_EXPORT \ \ |
| P1[AS 1] \ | \ \ |
| NO_EXPORT \ | \ \ |
| \ (C2P) | (C2P) (C2P) \ (C2P) \ |
| +----------------+ +----------------+ |
| |AS 1(P1, P3, P6)| | AS 5(P5) | |
| +----------------+ +----------------+ |
| /\ | |
| | | |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| \/
+----------------+
| Tester |
| (Edge Server) |
+----------------+
Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
Figure 11: SAV for customer-facing ASes in the scenario of direct
server return (DSR).
Figure 11 presents a test case of SAV for customer-facing ASes in the
scenario of direct server return (DSR). In this test case, AS 1, AS
2, AS 3, the DUT, and AS 5 construct the test network environment,
and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and
the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3,
and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2
send requests to the anycast destination IP, the forwarding path is
AS 2->DUT->AS 3. The anycast servers in AS 3 receive the requests
and tunnel them to the edge servers in AS 1. Finally, the edge
servers send the content to the users with source addresses in prefix
P3. The reverse forwarding path is AS 1->DUT->AS 2. The Tester
sends the traffic with source addresses in P3 and destination
addresses in P2 along the path AS 1->DUT->AS 2.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer-facing ASes in the scenario of DSR, a
testbed can be built as shown in Figure 11 to construct the test
network environment. The Tester is connected to AS 1 and
generates the test traffic to the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the scenario of DSR.
3. Finally, the Tester sends the traffic using P3 as source
addresses and P2 as destination addresses (legitimate traffic) to
AS 2 via the DUT.
*Expected Results*: The DUT can permit the legitimate traffic with
source addresses in P3 from the direction of AS 1 for this test case.
*Test case 4*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +----------------+ |
| | AS 3(P3) | |
| +--+/\+--+/\+----+ |
| / \ |
| / \ |
| / \ |
| / (C2P) \ |
| +----------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+--+/\++ \ |
| P6[AS 1, AS 2] / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
+----------+ | +----------------+ | \ \ |
| Tester |-|->| | | \ \ |
|(Attacker)| | | AS 2(P2) | | \ \ |
| (P1') |<|--| | | P1[AS 1] \ \ |
+----------+ | +---------+/\+---+ | P6[AS 1] \ \ |
| P6[AS 1] \ | NO_EXPORT \ \ |
| P1[AS 1] \ | \ \ |
| NO_EXPORT \ | \ \ |
| \ (C2P) | (C2P) (C2P) \ (C2P) \ |
| +----------------+ +----------------+ |
| Victim+-+ AS 1(P1, P6) | Server+-+ AS 5(P5) | |
| +----------------+ +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is prefix P1 spoofed by the attacker, which is inside AS 2 or
connected to AS 2 through other ASes.
Figure 12: SAV for customer-facing ASes in the scenario of
reflection attacks.
Figure 12 depicts the test case of SAV for customer-facing ASes in
the scenario of reflection attacks. In this test case, the
reflection attack by source address spoofing takes place within DUT's
customer cone, where the attacker spoofs the victim's IP address (P1)
and sends requests to servers' IP address (P5) that are designed to
respond to such requests. The Tester performs the source address
spoofing function as an attacker. The arrows in Figure 12 illustrate
the commercial relationships between ASes. AS 3 serves as the
provider for the DUT and AS 5, while the DUT acts as the provider for
AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer-facing ASes in the scenario of
reflection attacks, a testbed can be built as shown in Figure 12
to construct the test network environment. The Tester is
connected to AS 2 and generates the test traffic to the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the scenario of reflection attacks.
3. Finally, the Tester sends the traffic using P1 as source
addresses and P5 as destination addresses (spoofing traffic) to
AS 5 via the DUT.
*Expected Results*: The DUT can block the spoofing traffic with
source addresses in P1 from the direction of AS 2 for this test case.
*Test case 5*:
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |
| +----------------+ |
| | AS 3(P3) | |
| +--+/\+--+/\+----+ |
| / \ |
| / \ |
| / \ |
| / (C2P) \ |
| +----------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+--+/\++ \ |
| P6[AS 1, AS 2] / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
+----------+ | +----------------+ | \ \ |
| Tester |-|->| | | \ \ |
|(Attacker)| | | AS 2(P2) | | \ \ |
| (P5') |<|--| | | P1[AS 1] \ \ |
+----------+ | +---------+/\+---+ | P6[AS 1] \ \ |
| P6[AS 1] \ | NO_EXPORT \ \ |
| P1[AS 1] \ | \ \ |
| NO_EXPORT \ | \ \ |
| \ (C2P) | (C2P) (C2P) \ (C2P) \ |
| +----------------+ +----------------+ |
| Victim+-+ AS 1(P1, P6) | | AS 5(P5) | |
| +----------------+ +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' is prefix P5 spoofed by the attacker, which is inside AS 2 or
connected to AS 2 through other ASes.
Figure 13: SAV for customer-facing ASes in the scenario of direct
attacks.
Figure 13 presents the test case of SAV for customer-facing ASes in
the scenario of direct attacks. In this test case, the direct attack
by source address spoofing takes place within the DUT's customer
cone, where the attacker spoofs a source address (P5) and directly
targets the victim's IP address (P1), overwhelming its network
resources. The Tester performs the source address spoofing function
as an attacker. The arrows in Figure 13 illustrate the commercial
relationships between ASes. AS 3 serves as the provider for the DUT
and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS
5. Additionally, AS 2 is the provider for AS 1.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for customer-facing ASes in the scenario of direct
attacks, a testbed can be built as shown in Figure 13 to
construct the test network environment. The Tester is connected
to AS 2 and generates the test traffic to the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the scenario of direct attacks.
3. Finally, the Tester sends the traffic using P5 as source
addresses and P1 as destination addresses (spoofing traffic) to
AS 1 via the DUT.
*Expected Results*: The DUT can block the spoofing traffic with
source addresses in P5 from the direction of AS 2 for this test case.
5.2.1.2.2. SAV for Provider/Peer-facing ASes
*Test case 1*:
+----------------+
| Tester |
| (Attacker) |
| (P1') |
+----------------+
| /\
| |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment \/ | |
| +----------------+ |
| | | |
| | AS 3(P3) | |
| | | |
| +-+/\----+/\+----+ |
| / \ |
| / \ |
| / \ |
| / (C2P/P2P) \ |
| +----------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+--+/\++ \ |
| P6[AS 1, AS 2] / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
| +----------------+ | \ \ |
|Server+-+ AS 2(P2) | | P1[AS 1] \ \ |
| +----------+/\+--+ | P6[AS 1] \ \ |
| P6[AS 1] \ | NO_EXPORT \ \ |
| P1[AS 1] \ | \ \ |
| NO_EXPORT \ | \ \ |
| \ (C2P) | (C2P) (C2P) \ (C2P) \ |
| +----------------+ +----------------+ |
| Victim+-+ AS 1(P1, P6) | | AS 5(P5) | |
| +----------------+ +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is prefix P1 spoofed by the attacker, which is inside AS 3 or
connected to AS 3 through other ASes.
Figure 14: SAV for provider-facing ASes in the scenario of
reflection attacks.
Figure 14 depicts the test case of SAV for provider-facing ASes in
the scenario of reflection attacks. In this test case, the attacker
spoofs the victim's IP address (P1) and sends requests to servers' IP
addresses (P2) that respond to such requests.  The Tester performs the
source address spoofing function as an attacker. The servers then
send overwhelming responses back to the victim, exhausting its
network resources. The arrows in Figure 14 represent the commercial
relationships between ASes. AS 3 acts as the provider or lateral
peer of the DUT and the provider for AS 5, while the DUT serves as
the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the
provider for AS 1.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for provider-facing ASes in the scenario of
reflection attacks, a testbed can be built as shown in Figure 14
to construct the test network environment. The Tester is
connected to AS 3 and generates the test traffic to the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the scenario of reflection attacks.
3. Finally, the Tester sends the traffic using P1 as source
addresses and P2 as destination addresses (spoofing traffic) to
AS 2 via AS 3 and the DUT.
*Expected Results*: The DUT can block the spoofing traffic with
source addresses in P1 from the direction of AS 3 for this test case.
*Test case 2*:
+----------------+
| Tester |
| (Attacker) |
| (P2') |
+----------------+
| /\
| |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment \/ | |
| +----------------+ |
| | AS 3(P3) | |
| +-+/\----+/\+----+ |
| / \ |
| / \ |
| / \ |
| / (C2P/P2P) \ |
| +----------------+ \ |
| | DUT(P4) | \ |
| ++/\+--+/\+--+/\++ \ |
| P6[AS 1, AS 2] / | \ \ |
| P2[AS 2] / | \ \ |
| / | \ \ |
| / (C2P) | \ P5[AS 5] \ P5[AS 5] |
|+----------------+ | \ \ |
|| AS 2(P2) | | P1[AS 1] \ \ |
|+----------+/\+--+ | P6[AS 1] \ \ |
| P6[AS 1] \ | NO_EXPORT \ \ |
| P1[AS 1] \ | \ \ |
| NO_EXPORT \ | \ \ |
| \ (C2P) | (C2P) (C2P) \ (C2P) \ |
| +----------------+ +----------------+ |
| Victim+-+ AS 1(P1, P6) | | AS 5(P5) | |
| +----------------+ +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' is prefix P2 spoofed by the attacker, which is inside AS 3 or
connected to AS 3 through other ASes.
Figure 15: SAV for provider-facing ASes in the scenario of direct
attacks.
Figure 15 shows a test case of SAV for provider-facing ASes in the
scenario of direct attacks.  In this test case, the attacker spoofs
another source address (P2) and directly targets the victim's IP
address (P1), overwhelming its network resources. The arrows in
Figure 15 represent the commercial relationships between ASes. AS 3
acts as the provider or lateral peer of the DUT and the provider for
AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5.
Additionally, AS 2 is the provider for AS 1.
*Procedure*:
1. First, in order to test whether the DUT can generate accurate SAV
rules for SAV for provider-facing ASes in the scenario of direct
attacks, a testbed can be built as shown in Figure 15 to
construct the test network environment. The Tester is connected
to AS 3 and generates the test traffic to the DUT.
2. Then, the ASes including AS 1, AS 2, AS 3, the DUT, and AS 5, are
configured to form the scenario of direct attacks.
3. Finally, the Tester sends the traffic using P2 as source
addresses and P1 as destination addresses (spoofing traffic) to
AS 1 via AS 3 and the DUT.
*Expected Results*: The DUT can block the spoofing traffic with
source addresses in P2 from the direction of AS 3 for this test case.
5.2.2. Protocol Convergence Performance
The test setup, procedure, and measurements are the same as those
described in Section 5.1.2.
5.2.3. Control Plane Performance
The test setup, procedure, and measurements are the same as those
described in Section 5.1.3.
5.2.4. Data Plane Forwarding Performance
The test setup, procedure, and measurements are the same as those
described in Section 5.1.4.
6. Reporting Format
Each test has a reporting format that contains some global and
identical reporting components as well as some components that are
specific to individual tests.  The following parameters for test
configuration and SAV mechanism settings MUST be reflected in the
test report.
Test Configuration Parameters:
1. Test device hardware and software versions
2. Device CPU load
3. Network topology
4. Test traffic attributes
5. System configuration (e.g., physical or virtual machine, CPU,
memory, caches, operating system, interface capacity)
6. Device configuration (e.g., symmetric routing, NO_EXPORT)
7. SAV mechanism
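As a non-normative example, the parameters above could be captured in
a machine-readable report such as the following Python/JSON sketch;
all values are illustrative:

   import json

   report = {
       "dut": {"hardware": "example-router", "software": "v1.0",
               "cpu_load": 0.35},
       "topology": "intra-domain symmetric (Figure 2)",
       "traffic": {"rate_pps": 100000, "spoof_to_legit": "1:9",
                   "source_distribution": "uniform over 10.0.0.0/15"},
       "system": {"type": "physical", "memory_gb": 32,
                  "interface": "10GbE"},
       "device_config": ["symmetric routing"],
       "sav_mechanism": "strict uRPF",
       "results": {"improper_block": 0.0, "improper_permit": 0.0},
   }
   print(json.dumps(report, indent=2))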
7. IANA Considerations
This document has no IANA actions.
8. Security Considerations
The benchmarking tests described in this document are limited to the
performance characterization of SAV devices in a lab environment with
isolated networks.
The benchmarking network topology will be an independent test setup
and MUST NOT be connected to devices that may forward the test
traffic into a production network.
9. References
9.1. Normative References
[RFC3704] Baker, F. and P. Savola, "Ingress Filtering for Multihomed
Networks", BCP 84, RFC 3704, DOI 10.17487/RFC3704, March
2004, <https://www.rfc-editor.org/rfc/rfc3704>.
[RFC8704] Sriram, K., Montgomery, D., and J. Haas, "Enhanced
Feasible-Path Unicast Reverse Path Forwarding", BCP 84,
RFC 8704, DOI 10.17487/RFC8704, February 2020,
<https://www.rfc-editor.org/rfc/rfc8704>.
[RFC2544] Bradner, S. and J. McQuaid, "Benchmarking Methodology for
Network Interconnect Devices", RFC 2544,
DOI 10.17487/RFC2544, March 1999,
<https://www.rfc-editor.org/rfc/rfc2544>.
[RFC2119] Bradner, S., "Key words for use in RFCs to Indicate
Requirement Levels", BCP 14, RFC 2119,
DOI 10.17487/RFC2119, March 1997,
<https://www.rfc-editor.org/rfc/rfc2119>.
[RFC8174] Leiba, B., "Ambiguity of Uppercase vs Lowercase in RFC
2119 Key Words", BCP 14, RFC 8174, DOI 10.17487/RFC8174,
May 2017, <https://www.rfc-editor.org/rfc/rfc8174>.
9.2. Informative References
[intra-domain-ps]
"Source Address Validation in Intra-domain Networks Gap
Analysis, Problem Statement, and Requirements", 2024,
<https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-
domain-problem-statement/>.
[inter-domain-ps]
"Source Address Validation in Inter-domain Networks Gap
Analysis, Problem Statement, and Requirements", 2024,
<https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-
domain-problem-statement/>.
[intra-domain-arch]
"Intra-domain Source Address Validation (SAVNET)
Architecture", 2024, <https://datatracker.ietf.org/doc/
draft-ietf-savnet-intra-domain-architecture/>.
[inter-domain-arch]
"Inter-domain Source Address Validation (SAVNET)
Architecture", 2024, <https://datatracker.ietf.org/doc/
draft-wu-savnet-inter-domain-architecture/>.
Authors' Addresses
Li Chen
Zhongguancun Laboratory
Beijing
China
Email: lichen@zgclab.edu.cn
Dan Li
Tsinghua University
Beijing
China
Email: tolidan@tsinghua.edu.cn
Libin Liu
Zhongguancun Laboratory
Beijing
China
Email: liulb@zgclab.edu.cn
Lancheng Qin
Zhongguancun Laboratory
Beijing
China
Email: qinlc@zgclab.edu.cn