Victor Julien wrote in #note-5:
There was quite a bit of interest at SuriCon 2021. I think the first step should be to define a JSON schema / definition document as an RST doc in a PR to the Suricata GitHub repo.
Wrt implementation, I could imagine we start experimenting in suricata-update or a different tool outside of Suricata proper, and have that "compile" the JSON into the existing rule format at first.
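To make the "compile" step concrete, here is a minimal Rust sketch of turning a structured rule back into today's text format. The `Rule` struct and its fields are purely hypothetical placeholders, not a proposed schema:

```rust
// Hypothetical structured representation of a rule (field names are
// illustrative only, not a schema proposal).
struct Rule {
    action: String,
    proto: String,
    msg: String,
    sid: u32,
}

// "Compile" the structured form into the existing rule text format.
// Addresses/ports are hard-coded to keep the sketch small.
fn to_classic(r: &Rule) -> String {
    format!(
        "{} {} any any -> any any (msg:\"{}\"; sid:{};)",
        r.action, r.proto, r.msg, r.sid
    )
}

fn main() {
    let r = Rule {
        action: "alert".into(),
        proto: "tcp".into(),
        msg: "test".into(),
        sid: 1,
    };
    println!("{}", to_classic(&r));
}
```

A tool like this could sit in front of Suricata unchanged, so the engine never needs to know the JSON form exists at first.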
The two-pass parser being experimented with in https://redmine.openinfosecfoundation.org/issues/3317 could be extended. Its tokenizer/lexer could be made much more fine-grained, to the point of breaking down each field of a byte_jump (and other keywords) into its own struct element, essentially creating an AST as the result of the first pass.
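A rough sketch of what one such fine-grained AST node could look like, using byte_jump as the example. This is not Suricata's actual internal representation; the struct layout is an assumption, and only a couple of byte_jump's modifiers are handled:

```rust
// Hypothetical AST node for the byte_jump keyword (sketch only).
#[derive(Debug, PartialEq)]
struct ByteJump {
    nbytes: u32,
    offset: i32,
    relative: bool,
    multiplier: Option<u32>,
}

// Second-pass parse of a byte_jump option string into the struct above.
// Other modifiers (endianness, string, align, ...) are omitted here.
fn parse_byte_jump(input: &str) -> Option<ByteJump> {
    let mut parts = input.split(',').map(str::trim);
    let nbytes: u32 = parts.next()?.parse().ok()?;
    let offset: i32 = parts.next()?.parse().ok()?;
    let mut bj = ByteJump { nbytes, offset, relative: false, multiplier: None };
    for tok in parts {
        if tok == "relative" {
            bj.relative = true;
        } else if let Some(v) = tok.strip_prefix("multiplier ") {
            bj.multiplier = v.trim().parse().ok();
        }
    }
    Some(bj)
}

fn main() {
    let bj = parse_byte_jump("4,12,relative,multiplier 2").unwrap();
    println!("{:?}", bj);
}
```

Once every keyword's fields live in structs like this, the AST is the natural interchange point between the text format and any JSON form.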
As a result, rules could be dumped to JSON en masse (with serde) as a conversion tool, and since the data structures would then all exist, JSON rules could be consumed with serde as well.
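To illustrate why the dump direction is cheap once the structs exist: serde's `#[derive(Serialize)]` would generate the JSON writer automatically. The hand-rolled writer below only shows the shape of the output; the struct and its field names are hypothetical, not a proposed schema:

```rust
// Hypothetical structured rule (illustrative field names only).
struct Rule {
    action: String,
    proto: String,
    msg: String,
    sid: u32,
}

// Hand-rolled JSON dump; with serde this would be a one-line derive.
// No string escaping is done -- this is a sketch, not production code.
fn to_json(r: &Rule) -> String {
    format!(
        "{{\"action\":\"{}\",\"proto\":\"{}\",\"msg\":\"{}\",\"sid\":{}}}",
        r.action, r.proto, r.msg, r.sid
    )
}

fn main() {
    let r = Rule {
        action: "alert".into(),
        proto: "tcp".into(),
        msg: "test".into(),
        sid: 1,
    };
    println!("{}", to_json(&r));
}
```

The consume direction is symmetric: `#[derive(Deserialize)]` on the same structs would read JSON rules back in.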
My hesitation with experimenting with this in suricata-update is that I think it's actually easier to work with this sort of thing in Rust, and the end result would likely be in Rust anyway.