17. DPDK Test Suite
The DPDK Test Suite, abbreviated DTS, is a Python test framework with test suites implementing functional and performance tests used to test DPDK.
17.1. DTS Terminology
- DTS node
A generic description of any host/server DTS connects to.
- DTS runtime environment
An environment containing Python with packages needed to run DTS.
- DTS runtime environment node
A node where at least one DTS runtime environment is present. This is the node where we run DTS and from which DTS connects to other nodes.
- System under test
An SUT is the combination of DPDK and the hardware we’re testing in conjunction with DPDK (NICs, crypto and other devices).
- System under test node
A node where at least one SUT is present.
- Traffic generator
A TG is either software or hardware capable of sending packets.
- Traffic generator node
A node where at least one TG is present. In case of hardware traffic generators, the TG and the node are literally the same.
In most cases, interchangeably referring to a runtime environment, SUT, TG or the node they’re running on (e.g. using SUT and SUT node interchangeably) doesn’t cause confusion. There could theoretically be more than one of these running on the same node and in that case it’s useful to have stricter definitions. An example would be two different traffic generators (such as TRex and Scapy) running on the same node. A different example would be a node containing both a DTS runtime environment and a traffic generator, in which case it’s both a DTS runtime environment node and a TG node.
17.2. DTS Environment
DTS is written entirely in Python using a variety of dependencies. DTS uses Poetry as its Python dependency management tool. The Python build/development and runtime environments are the same, so the terms DTS development environment, DTS runtime environment and just plain DTS environment are used interchangeably.
17.2.1. Setting up DTS environment
Python Version
The Python version required by DTS is specified in dts/pyproject.toml in the [tool.poetry.dependencies] section:
[tool.poetry.dependencies]
python = "^3.10"
The Python dependency manager DTS uses, Poetry, doesn’t install Python, so you may need to satisfy this requirement by other means if your Python is not up-to-date. A tool such as Pyenv is a good way to get Python, though not the only one.
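If you’re unsure whether your interpreter satisfies this constraint, a quick optional check (a minimal sketch, not part of DTS itself) is:
import sys

# Poetry's "^3.10" constraint means >=3.10,<4.0; this only checks the lower bound.
assert sys.version_info >= (3, 10), f"DTS needs Python 3.10+, found {sys.version.split()[0]}"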
Poetry
The typical style of Python dependency management, pip with requirements.txt, has a few issues. The advantages of Poetry include specifying what Python version is required and forcing you to specify versions, enforced by a lockfile, both of which help prevent broken dependencies. Another benefit is the use of pyproject.toml, which has become the standard config file for Python projects, improving project organization. To install Poetry, visit their doc pages. The recommended Poetry version is at least 1.8.2.
Getting a Poetry shell
Once you have Poetry along with the proper Python version all set up, it’s just a matter of installing dependencies via Poetry and using the virtual environment Poetry provides:
poetry install
poetry shell
SSH Connection
DTS uses the Fabric Python library for SSH connections between the DTS environment and the other hosts. The authentication method used is pubkey authentication. Fabric tries to use a passed key/certificate, then any key it can find through an SSH agent, then any “id_rsa”, “id_dsa” or “id_ecdsa” key discoverable in
~/.ssh/
(with any matching OpenSSH-style certificates). DTS doesn’t pass any keys, so Fabric tries to use the other two methods.
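As a quick way to confirm that pubkey authentication works from the DTS environment, you can open a connection with Fabric directly. This is just an illustrative sketch using the hostname and user from the example config in section 17.10, not something DTS requires you to run:
from fabric import Connection

# Connect the same way DTS does: no explicit key is passed, so Fabric falls back
# to the SSH agent and the default keys in ~/.ssh/.
with Connection(host="sut1.change.me.localhost", user="dtsuser") as conn:
    result = conn.run("hostname", hide=True)  # succeeds only if pubkey auth works
    print(result.stdout.strip())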
17.3. Setting up System Under Test
There are a few areas that need to be set up on a System Under Test:
DPDK dependencies
DPDK will be built and run on the SUT. Consult the Getting Started guides for the list of dependencies for each distribution.
Hardware dependencies
Any hardware DPDK uses needs a proper driver and most OS distributions provide those, but the version may not be satisfactory. It’s up to each user to install the driver they’re interested in testing. The hardware may also need firmware upgrades, which are likewise left to user discretion.
Hugepages
There are two ways to configure hugepages:
DTS configuration
You may specify the optional hugepage configuration in the DTS config file. If you do, DTS will take care of configuring hugepages, overwriting your current SUT hugepage configuration. Hugepage configuration via DTS only allows allocation of 2MB hugepages, because allocating larger hugepage sizes at runtime is not recommended due to their contiguous memory requirements and the risk of accidental over-allocation. Thus, if you require hugepage sizes other than 2MB, this configuration must be done outside of the DTS framework.
System under test configuration
It’s possible to use the hugepage configuration already present on the SUT. If you wish to do so, don’t specify the hugepage configuration in the DTS config file.
User with administrator privileges
DTS needs administrator privileges to run DPDK applications (such as testpmd) on the SUT. The SUT user must be able to run commands in privileged mode without being asked for a password. On most Linux distributions, it’s a matter of setting up passwordless sudo:
Run sudo visudo and check that it contains:
%sudo ALL=(ALL:ALL) NOPASSWD:ALL
Add the SUT user to the sudo group with:
sudo usermod -aG sudo <sut_user>
17.4. Setting up Traffic Generator Node
These need to be set up on a Traffic Generator Node:
Traffic generator dependencies
The traffic generator running on the traffic generator node must be installed beforehand. For the Scapy traffic generator, only a few Python libraries need to be installed (a quick sanity check is sketched at the end of this section):
sudo apt install python3-pip
sudo pip install --upgrade pip
sudo pip install scapy==2.5.0
Hardware dependencies
The traffic generators, like DPDK, need a proper driver and firmware. The Scapy traffic generator doesn’t have strict requirements - the drivers that come with most OS distributions will be satisfactory.
User with administrator privileges
Similarly to the System Under Test, traffic generators need administrator privileges to be able to use the devices. Refer to the System Under Test section for details.
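The following is a small Scapy sanity check you may run on the TG node (as root) to confirm the installation. The interface name and destination address are placeholders, not anything DTS mandates:
from scapy.all import Ether, IP, UDP, sendp

# Craft a simple UDP packet and send it out of one of the TG's test ports.
pkt = Ether() / IP(dst="192.0.2.1") / UDP(dport=9) / b"ping from the TG node"
sendp(pkt, iface="eth1", count=1)  # sending raw frames requires administrator privileges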
17.5. Running DTS
DTS needs to know which nodes to connect to and what hardware to use on those nodes. Once that’s configured, either a DPDK source code tarball or a source tree directory must be supplied, whether located on your DTS host machine or on the SUT node. DTS can accept a pre-compiled build placed in a subdirectory of that tree, or it will compile DPDK on the SUT node and then run the tests with the newly built binaries.
17.5.1. Configuring DTS
DTS configuration is split into nodes and test runs,
and must respect the model definitions
as documented in the DTS API docs under the config
page.
The root of the configuration is represented by the Configuration
model.
By default, DTS will try to use the dts/conf.yaml
config file,
which is a template that illustrates what can be configured in DTS.
The user must have administrator privileges which don’t require password authentication.
17.5.2. DTS Execution
DTS is run with main.py
located in the dts
directory after entering Poetry shell:
(dts-py3.10) $ ./main.py --help
usage: main.py [-h] [--config-file FILE_PATH] [--output-dir DIR_PATH] [-t SECONDS] [-v] [--dpdk-tree DIR_PATH | --tarball FILE_PATH] [--remote-source]
[--precompiled-build-dir DIR_NAME] [--compile-timeout SECONDS] [--test-suite TEST_SUITE [TEST_CASES ...]] [--re-run N_TIMES]
[--random-seed NUMBER]
Run DPDK test suites. All options may be specified with the environment variables provided in brackets. Command line arguments have higher priority.
options:
-h, --help show this help message and exit
--config-file FILE_PATH
[DTS_CFG_FILE] The configuration file that describes the test cases, SUTs and DPDK build configs. (default: conf.yaml)
--output-dir DIR_PATH, --output DIR_PATH
[DTS_OUTPUT_DIR] Output directory where DTS logs and results are saved. (default: output)
-t SECONDS, --timeout SECONDS
[DTS_TIMEOUT] The default timeout for all DTS operations except for compiling DPDK. (default: 15)
-v, --verbose [DTS_VERBOSE] Specify to enable verbose output, logging all messages to the console. (default: False)
--compile-timeout SECONDS
[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK. (default: 1200)
--test-suite TEST_SUITE [TEST_CASES ...]
[DTS_TEST_SUITES] A list containing a test suite with test cases. The first parameter is the test suite name, and the rest are
test case names, which are optional. May be specified multiple times. To specify multiple test suites in the environment
variable, join the lists with a comma. Examples: --test-suite suite case case --test-suite suite case ... |
DTS_TEST_SUITES='suite case case, suite case, ...' | --test-suite suite --test-suite suite case ... | DTS_TEST_SUITES='suite,
suite case, ...' (default: [])
--re-run N_TIMES, --re_run N_TIMES
[DTS_RERUN] Re-run each test case the specified number of times if a test failure occurs. (default: 0)
--random-seed NUMBER [DTS_RANDOM_SEED] The seed to use with the pseudo-random generator. If not specified, the configuration value is used instead.
If that's also not specified, a random seed is generated. (default: None)
DPDK Build Options:
Arguments in this group (and subgroup) will be applied to a DPDKLocation when the DPDK tree, tarball or revision will be provided, other arguments
like remote source and build dir are optional. A DPDKLocation from settings are used instead of from config if construct successful.
--dpdk-tree DIR_PATH [DTS_DPDK_TREE] The path to the DPDK source tree directory to test. Cannot be used in conjunction with --tarball. (default:
None)
--tarball FILE_PATH, --snapshot FILE_PATH
[DTS_DPDK_TARBALL] The path to the DPDK source tarball to test. DPDK must be contained in a folder with the same name as the
tarball file. Cannot be used in conjunction with --dpdk-tree. (default: None)
--remote-source [DTS_REMOTE_SOURCE] Set this option if either the DPDK source tree or tarball to be used are located on the SUT node. Can only
be used with --dpdk-tree or --tarball. (default: False)
--precompiled-build-dir DIR_NAME
[DTS_PRECOMPILED_BUILD_DIR] Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with --dpdk-tree or --tarball.
(default: None)
The brackets contain the names of environment variables that set the same thing. The minimum DTS needs is a config file and the location of a pre-built DPDK or DPDK sources, which can be specified in said config file, on the command line or via environment variables.
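For illustration, a minimal run that relies on the default conf.yaml, points DTS at a DPDK source tarball and executes only the hello_world suite could look like this (the tarball path is a placeholder):
(dts-py3.10) $ ./main.py --tarball dpdk-tarball.tar.xz --test-suite hello_world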
17.5.3. DTS Results
Results are stored in the output directory by default, which can be changed with the --output-dir command line argument.
The results contain basic statistics of passed/failed test cases and the DPDK version.
17.6. Contributing to DTS
There are two areas of contribution: The DTS framework and DTS test suites.
The framework contains the logic needed to run test cases, such as connecting to nodes, running DPDK applications and collecting results.
The test cases call APIs from the framework to test their scenarios. Adding test cases may require adding code to the framework as well.
17.6.1. Framework Coding Guidelines
When adding code to the DTS framework, pay attention to the rest of the code and try not to divert much from it. The DTS developer tools will issue warnings when some of the basics are not met. You should also build the API documentation to address any issues found during the build.
The API documentation, which is a helpful reference when developing, may be accessed
in the code directly or generated with the API docs build steps.
When adding new files or modifying the directory structure,
the corresponding changes must be made to DTS API doc sources in doc/api/dts.
Speaking of which, the code must be properly documented with docstrings. The style must conform to the Google style. See an example of the style here. For cases which are not covered by the Google style, refer to PEP 257. There are some cases which are not covered by the two style guides, where we deviate or where some additional clarification is helpful:
- The __init__() methods of classes are documented separately from the docstring of the class itself.
- The docstrings of implemented abstract methods should refer to the superclass’s definition if there’s no deviation.
- Instance variables/attributes should be documented in the docstring of the class in the Attributes: section.
- The dataclass.dataclass decorator changes how the attributes are processed. The dataclass attributes which result in instance variables/attributes should also be recorded in the Attributes: section.
- Class variables/attributes and Pydantic model fields, on the other hand, should be documented with #: above the type annotated line. The description may be omitted if the meaning is obvious.
- The Enum and TypedDict also process the attributes in particular ways and should be documented with #: as well. This is mainly so that the autogenerated documentation contains the assigned value.
- When referencing a parameter of a function or a method in their docstring, don’t use any articles and put the parameter into single backticks. This mimics the style of Python’s documentation.
- When specifying a value, use double backticks:
def foo(greet: bool) -> None:
    """Demonstration of single and double backticks.

    `greet` controls whether ``Hello World`` is printed.

    Args:
        greet: Whether to print the ``Hello World`` message.
    """
    if greet:
        print(f"Hello World")
- The docstring maximum line length is the same as the code maximum line length.
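To illustrate the #: and Attributes: conventions above, here is a brief sketch. The class and field names are made up for demonstration and are not taken from the DTS framework:
from dataclasses import dataclass
from enum import Enum, auto


class ExampleTrafficGeneratorType(Enum):
    """Illustrative enum; each member gets a #: comment so its value shows up in the docs."""

    #: A software traffic generator based on Scapy.
    SCAPY = auto()


@dataclass
class ExamplePortConfig:
    """Illustrative dataclass; its fields become instance attributes.

    Attributes:
        pci: The PCI address of the port.
        peer_pci: The PCI address of the port connected to this one.
    """

    pci: str
    peer_pci: str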
17.7. How To Write a Test Suite
All test suites inherit from TestSuite, defined in dts/framework/test_suite.py.
There are four types of methods that comprise a test suite:
Test cases
Test cases are methods that start with a particular prefix. Functional test cases start with test_, e.g. test_hello_world_single_core. Performance test cases start with test_perf_, e.g. test_perf_nic_single_core. A test suite may have any number of functional and/or performance test cases. However, these test cases must test the same feature, following the rule of one feature = one test suite. Test cases for one feature don’t need to be grouped in just one test suite, though. If the feature requires many testing scenarios to cover, the test cases would be better off spread over multiple test suites so that each test suite doesn’t take too long to execute.
Setup and Teardown methods
There are setup and teardown methods for the whole test suite and each individual test case. Methods set_up_suite and tear_down_suite will be executed before any and after all test cases have been executed, respectively. Methods set_up_test_case and tear_down_test_case will be executed before and after each test case, respectively. These methods don’t need to be implemented if there’s no need for them in a test suite. In that case, nothing will happen when they are executed.
Configuration, traffic and other logic
The TestSuite class contains a variety of methods for anything that a test suite setup, a teardown, or a test case may need to do. The test suites also frequently use a DPDK app, such as testpmd, in interactive mode and use the interactive shell instances directly. These are the two main ways to call the framework logic in test suites. If there’s any functionality or logic missing from the framework, it should be implemented so that the test suites can use one of these two ways.
Test case verification
Test case verification should be done with the verify method, which records the result. The method should be called at the end of each test case.
Other methods
Of course, all test suite code should adhere to coding standards. Only the above methods will be treated specially and any other methods may be defined (which should be mostly private methods needed by each particular test suite). Any specific features (such as NIC configuration) required by a test suite should be implemented in the SutNode class (and the underlying classes that SutNode uses) and used by the test suite via the sut_node field.
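Putting these conventions together, a bare-bones functional test suite might be shaped roughly like the following sketch. The class name, docstrings and the condition passed to verify are placeholders; only the method names and the test_ prefix follow the rules described above, and the import path assumes the dts/framework/test_suite.py location mentioned earlier:
from framework.test_suite import TestSuite


class TestExampleFeature(TestSuite):
    """Hypothetical suite covering a single feature."""

    def set_up_suite(self) -> None:
        """Runs once, before any test case in the suite."""

    def set_up_test_case(self) -> None:
        """Runs before each test case."""

    def test_example_feature_basic(self) -> None:
        """Functional test case; the name must start with ``test_``."""
        feature_works = True  # placeholder for the actual test logic
        self.verify(feature_works, "The example feature did not behave as expected.")

    def tear_down_test_case(self) -> None:
        """Runs after each test case."""

    def tear_down_suite(self) -> None:
        """Runs once, after all test cases have finished."""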
17.8. DTS Developer Tools
There are four tools used in DTS to help with code checking, style and formatting:
- isort
Alphabetically sorts Python imports within blocks.
- black
Does most of the actual formatting (whitespaces, comments, line length etc.) and works similarly to clang-format.
- pylama
Runs a collection of Python linters and aggregates output. It will run these tools over the repository:
linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
- mypy
Enables static typing for Python, exploiting the type hints in the source code.
These tools are all used in devtools/dts-check-format.sh, the DTS code check and format script. Refer to the script for usage: devtools/dts-check-format.sh -h.
17.9. Building DTS API docs
The documentation is built using the standard DPDK build system. See Compiling the DPDK Target from Source for more details on compiling DPDK with meson.
The doc build dependencies may be installed with Poetry:
poetry install --only docs
poetry install --with docs # an alternative that will also install DTS dependencies
poetry shell
After executing the meson command, build the documentation with:
ninja -C build doc
The output is generated in build/doc/api/dts/html.
Note
Make sure to fix any Sphinx warnings when adding or updating docstrings.
17.10. Configuration Example
The following example (which can be found in dts/conf.yaml) sets up two nodes:
- SUT1, which is already set up with the DPDK build requirements and anything else required for execution;
- TG1, which already has Scapy installed on the system.
Both nodes have two network ports which are physically connected to each other.
Note
This example assumes that you have set up SSH keys on both the system under test and traffic generator nodes.
test_runs:
  # define one test run environment
  - dpdk_build:
      dpdk_location:
        # dpdk_tree: Commented out because `tarball` is defined.
        tarball: dpdk-tarball.tar.xz
        # Either `dpdk_tree` or `tarball` can be defined, but not both.
        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
                      # is located on the SUT node, instead of the execution host.
      # precompiled_build_dir: Commented out because `build_options` is defined.
      build_options:
        arch: x86_64
        os: linux
        cpu: native
        # the combination of the following two makes CC="ccache gcc"
        compiler: gcc
        compiler_wrapper: ccache # Optional.
      # If `precompiled_build_dir` is defined, DPDK has been pre-built and the build directory is
      # in a subdirectory of DPDK tree root directory. Otherwise, will be using the `build_options`
      # to build the DPDK from source. Either `precompiled_build_dir` or `build_options` can be
      # defined, but not both.
    perf: false # disable performance testing
    func: true # enable functional testing
    skip_smoke_tests: false # optional
    test_suites: # the following test suites will be run in their entirety
      - hello_world
    # The machine running the DPDK test executable
    system_under_test_node:
      node_name: "SUT 1"
      vdevs: # optional; if removed, vdevs won't be used in the test run
        - "crypto_openssl"
    # Traffic generator node to use for this test run
    traffic_generator_node: "TG 1"
nodes:
  # Define a system under test node, having two network ports physically
  # connected to the corresponding ports in TG 1 (the peer node)
  - name: "SUT 1"
    hostname: sut1.change.me.localhost
    user: dtsuser
    arch: x86_64
    os: linux
    lcores: "" # use all the available logical cores
    use_first_core: false # tells DPDK to use any physical core
    memory_channels: 4 # tells DPDK to use 4 memory channels
    hugepages_2mb: # optional; if removed, will use system hugepage configuration
      number_of: 256
      force_first_numa: false
    ports:
      # sets up the physical link between "SUT 1"@0000:00:08.0 and "TG 1"@0000:00:08.0
      - pci: "0000:00:08.0"
        os_driver_for_dpdk: vfio-pci # OS driver that DPDK will use
        os_driver: i40e # OS driver to bind when the tests are not running
        peer_node: "TG 1"
        peer_pci: "0000:00:08.0"
      # sets up the physical link between "SUT 1"@0000:00:08.1 and "TG 1"@0000:00:08.1
      - pci: "0000:00:08.1"
        os_driver_for_dpdk: vfio-pci
        os_driver: i40e
        peer_node: "TG 1"
        peer_pci: "0000:00:08.1"
  # Define a Scapy traffic generator node, having two network ports
  # physically connected to the corresponding ports in SUT 1 (the peer node).
  - name: "TG 1"
    hostname: tg1.change.me.localhost
    user: dtsuser
    arch: x86_64
    os: linux
    ports:
      # sets up the physical link between "TG 1"@0000:00:08.0 and "SUT 1"@0000:00:08.0
      - pci: "0000:00:08.0"
        os_driver_for_dpdk: rdma
        os_driver: rdma
        peer_node: "SUT 1"
        peer_pci: "0000:00:08.0"
      # sets up the physical link between "TG 1"@0000:00:08.1 and "SUT 1"@0000:00:08.1
      - pci: "0000:00:08.1"
        os_driver_for_dpdk: rdma
        os_driver: rdma
        peer_node: "SUT 1"
        peer_pci: "0000:00:08.1"
    hugepages_2mb: # optional; if removed, will use system hugepage configuration
      number_of: 256
      force_first_numa: false
    traffic_generator:
      type: SCAPY