<p>Open Information Security Foundation<br />
Suricata - Feature #4244: performance testing configs repo<br />
<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244">https://redmine.openinfosecfoundation.org/issues/4244</a></p>
<p><em>Update by Peter Manev (petermanev@gmail.com), 2020-12-31T09:54:07Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=18990">journal 18990</a>)</em></p>
<ul><li><strong>Assignee</strong> set to <i>Peter Manev</i></li></ul>
<p>Something like:</p>
<ul>
<li>folder structure per tool</li>
<li>for each tool we could have</li>
</ul>
<blockquote>
<ul>
<li>per test case (let's say a 40G ISP traffic mix, a 10G corporate traffic mix, an SMB/NFS/KRB5 mix, etc.)</li>
<li>a readme explaining the test setup if needed: how to run it, the exact command lines, the version of the tool used, and a link to the Redmine ticket</li>
<li>a folder with the config</li>
<li>a folder with example or otherwise needed pcaps</li>
<li>an optional script to facilitate testing, if needed</li>
</ul>
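One hypothetical way this per-tool/per-test layout could look on disk (all names here are illustrative, not an agreed convention):

```
trex/
  40g-isp-traffic-mix/
    README.md        # setup, exact command lines, tool version, Redmine link
    config/          # suricata.yaml / custom.yaml used for this test
    pcaps/           # example or required pcaps (with recorded sha256 sums)
    run.sh           # optional helper script to facilitate the test
  10g-corporate-traffic-mix/
    ...
```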
</blockquote>
<hr />
<p><em>Update by Peter Manev (petermanev@gmail.com), 2021-01-01T10:34:22Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=18994">journal 18994</a>)</em></p>
<p>Also add a custom.yaml if needed.<br />One thing that could be explained in more detail is that these test cases are not "copy paste" production configs.</p>
<hr />
<p><em>Update by Peter Manev (petermanev@gmail.com), 2021-01-01T14:52:25Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=18996">journal 18996</a>)</em></p>
<p>We might also need specific compile options per test: debug, ASan, etc.<br />One thing to be very careful about with those settings is that they need to be well controlled, as they are quite heavy on performance and we might end up chasing our tails. Those will be specific cases where the config is not maxing out the testing setup but rather putting enough pressure on it to expose a certain wanted or unwanted condition (for example, rs_functions appearing in the top 20 of perf top).</p>
<hr />
<p><em>Update by Peter Manev (petermanev@gmail.com), 2021-01-02T10:17:21Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=18997">journal 18997</a>)</em></p>
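As a sketch of the per-test compile options idea from the previous update, a test profile could pin its build flags like this (the exact flag set is an assumption for illustration; `--enable-debug` is a real Suricata configure option, the rest is generic autotools usage):

```shell
# Illustrative ASan/debug build profile for a pressure test.
# These specific flags are an assumption, not an agreed recipe.
ASAN_CFLAGS="-fsanitize=address -fno-omit-frame-pointer -O1"

# Compose the configure invocation the test's readme would record verbatim.
CONFIGURE_CMD="./configure --enable-debug CFLAGS=\"$ASAN_CFLAGS\""
echo "$CONFIGURE_CMD"
```

Recording the composed command in the test's readme keeps such heavyweight builds clearly separated from throughput-oriented runs.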
<p>It could be interesting to also add an expected/recommended time duration for each test/config.<br />It would also make sense, I guess, if all that info were kept in a structured format so that it can be picked up by a QA tool and used for a test in an automated fashion (YAML/Python config, etc.).<br />All the updates above were actually written with trex in mind, so this could differ for another tool (e.g. pktgen).</p>
<hr />
<p><em>Update by Peter Manev (petermanev@gmail.com), 2021-01-17T17:56:45Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=19067">journal 19067</a>)</em></p>
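The structured-format idea from the previous update might take a shape like the following per-test descriptor (every field name here is hypothetical, just to show what a QA tool could consume):

```yaml
# Hypothetical per-test descriptor; all field names are illustrative.
test: 40g-isp-traffic-mix
tool: trex
tool_version: "x.y"            # pinned version of the traffic generator
suricata_config: config/custom.yaml
pcaps:
  - file: pcaps/isp-mix.pcap
    sha256: "<recorded sum>"   # see the checksum note below in the thread
expected_duration_minutes: 30
redmine: https://redmine.openinfosecfoundation.org/issues/4244
```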
<p>sha256 sums of any pcaps provided should be included in the doc/config of each test as well.</p>
<hr />
<p><em>Update by Peter Manev (petermanev@gmail.com), 2021-01-24T08:42:15Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=19087">journal 19087</a>)</em></p>
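The pcap checksum idea from the previous update, as a minimal sketch using GNU coreutils (the pcap here is a stand-in file just to make the commands concrete):

```shell
# Stand-in pcap so the commands below are runnable as-is; a real test
# would point at the pcaps shipped with the test case.
printf 'not-a-real-pcap' > example.pcap

# Record the sha256 sums next to the test's doc/config...
sha256sum example.pcap > checksums.sha256

# ...and verify them before a run; prints "example.pcap: OK" on success.
sha256sum -c checksums.sha256
```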
<p>Pre-run checks need to be done too: is everything that is supposed to be up actually up, do the checksums match, do the versions match, etc.</p>
<hr />
<p><em>Update by Corey Thomas, 2021-02-24T19:06:09Z (<a class="external" href="https://redmine.openinfosecfoundation.org/issues/4244?journal_id=19414">journal 19414</a>)</em></p>
<p><a class="external" href="https://gitlab.oisf.net/dev/suricata-ci-trex">https://gitlab.oisf.net/dev/suricata-ci-trex</a></p>
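The pre-run checks mentioned earlier in the thread could be gated with a small helper like this (a hedged sketch; the labels and expected values are placeholders, and a real harness would probe the live system and read expected values from the test's descriptor):

```shell
# Generic pre-run gate: compare an expected value against an observed one
# and fail loudly on mismatch, so a run never starts from a bad baseline.
check() {
  label=$1; expected=$2; actual=$3
  if [ "$expected" = "$actual" ]; then
    echo "OK   $label"
  else
    echo "FAIL $label: expected '$expected', got '$actual'" >&2
    return 1
  fi
}

# Example usage with dummy values (placeholders, not real probes).
check "tool version" "x.y" "x.y"
check "capture interface state" "up" "up"
```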