Feature #4244


performance testing configs repo

Added by Peter Manev 9 months ago. Updated 7 months ago.

Status:
New
Priority:
Normal
Assignee:
Target version:
-
Effort:
Difficulty:
Label:

Description

Master ticket.
Open to suggestions.

We should come up with a way to share configs and setups for performance testing, for example configs/pcaps for testing with pktgen, trex etc.
Whatever can be public, of course.
I am thinking of a repo similar to suricata-verify, and at the same time a repo that anyone can pull (or contribute to) to run a test and share results.
This could also relate to specific performance corner cases and could be included in regular QA, per-release processes etc.

Actions #1

Updated by Peter Manev 9 months ago

  • Assignee set to Peter Manev
Something like:
  • folder structure per tool
  • for each tool, per test case (let's say 40G ISP traffic mix, 10G corporate traffic mix, SMB/NFS/KRB5 mix, etc.), we could have:
      • a readme explaining the test setup if needed: "how" to run, exact command lines, version of the tool used, link to the redmine ticket
      • a folder with a config
      • a folder with example or needed pcaps
      • a script (optional?) if needed, to facilitate testing
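
The layout above could be sketched in code; a minimal scaffolding helper, assuming hypothetical tool and test-case names (none of these paths are a fixed convention from this ticket):

```python
from pathlib import Path

# Example per-tool / per-test-case layout; every name below is illustrative.
LAYOUT = [
    "trex/40g-isp-traffic-mix/README.md",   # setup, exact command lines, tool version, ticket link
    "trex/40g-isp-traffic-mix/config/",     # config(s) for this test
    "trex/40g-isp-traffic-mix/pcaps/",      # example or needed pcaps
    "trex/40g-isp-traffic-mix/run.sh",      # optional helper script
    "pktgen/10g-corporate-traffic-mix/README.md",
]

def scaffold(root: Path) -> None:
    """Create the example tree under `root` (entries ending in '/' are directories)."""
    for entry in LAYOUT:
        path = root / entry
        if entry.endswith("/"):
            path.mkdir(parents=True, exist_ok=True)
        else:
            path.parent.mkdir(parents=True, exist_ok=True)
            path.touch()
```

Calling `scaffold(Path("perf-testing"))` would create the skeleton in a `perf-testing` checkout.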
Actions #2

Updated by Peter Manev 9 months ago

Also add a custom.yaml if needed.
One thing that should be made more detailed/explained is that these test cases are not "copy paste" production configs.
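
Suricata's configuration supports pulling in an extra file via `include`, so a test-specific custom.yaml could be layered on top of a base config; a minimal sketch (the file name is an example, not a convention from this ticket):

```yaml
# In the test's suricata.yaml, pull in the test-specific overrides.
include: custom.yaml
```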

Actions #3

Updated by Peter Manev 9 months ago

We might also need specific compile options per test: debug, ASAN etc.
One thing to be very careful about with those settings is that they need to be well controlled, as they are quite heavy on performance and we might end up chasing our tails. These will be specific cases where the config is not maxing out the testing setup but rather puts enough pressure to expose a certain wanted or unwanted condition (for example, rs_functions appearing in the top 20 in perf top).

Actions #4

Updated by Peter Manev 9 months ago

It could be interesting to also add the expected/recommended time duration of each test/config.
It would also make sense, I guess, if all that info could be kept in a structured format so that it can be picked up by a QA tool and used for a test in an automated fashion (yaml/python config etc.).
All the updates above were actually done with "trex in mind", so this could differ for another tool (e.g. pktgen).
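
The structured format mentioned above could be as simple as a small descriptor a QA tool loads; a Python sketch, with all field names and values hypothetical rather than a fixed schema:

```python
from dataclasses import dataclass, field

# Illustrative machine-readable test descriptor; nothing here is a
# settled format, just an example of what a QA tool could consume.
@dataclass
class PerfTest:
    name: str
    tool: str                 # e.g. "trex" or "pktgen"
    tool_version: str
    command: str              # exact command line to run
    duration_seconds: int     # expected/recommended run time
    pcaps: dict = field(default_factory=dict)  # filename -> sha256
    ticket: str = ""          # link to the redmine ticket

example = PerfTest(
    name="40g-isp-traffic-mix",
    tool="trex",
    tool_version="v2.92",     # example version string
    command="./t-rex-64 -f example-profile.yaml -d 300",  # hypothetical command line
    duration_seconds=300,
)
```

The same fields would serialize naturally to YAML if a config file is preferred over Python.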

Actions #5

Updated by Peter Manev 8 months ago

SHA256 sums of any pcaps provided should be included in the doc/config of each test as well.
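
Computing the sum to record in the doc/config could be done with the Python standard library; a small sketch that reads in chunks so large captures don't need to fit in memory:

```python
import hashlib
from pathlib import Path

def pcap_sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA256 of a pcap, hashing it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```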

Actions #6

Updated by Peter Manev 8 months ago

Pre-run checks need to be done too: is everything that is supposed to be up actually up, do checksums match, do versions match, etc.
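
Those pre-run checks could be bundled into one gate that refuses to start a run on any mismatch; a sketch, assuming the expected values come from the test's config (the version strings below are placeholders):

```python
import hashlib
from pathlib import Path

def pre_run_checks(pcaps: dict, expected_version: str,
                   actual_version: str) -> list:
    """Return a list of failure messages; an empty list means the run may start.

    `pcaps` maps Path -> expected hex sha256, as recorded in the test's doc/config.
    """
    failures = []
    if actual_version != expected_version:
        failures.append(f"version mismatch: {actual_version} != {expected_version}")
    for path, expected_sha in pcaps.items():
        if not path.is_file():
            failures.append(f"missing pcap: {path}")
            continue
        actual_sha = hashlib.sha256(path.read_bytes()).hexdigest()
        if actual_sha != expected_sha:
            failures.append(f"checksum mismatch: {path}")
    return failures
```

A QA wrapper would call this once before launching the traffic generator and abort (with the failure list) instead of producing results from a mis-set-up run.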
