Hi Henry!
In the Wazuh alert generation process, collected data/logs arrive from the agents and those events land in the archives.json file (if archiving is enabled); the events are then processed by decoders and rules, firing alerts if they match.
The alert generation flow looks like:
Collecting data/logs -> archives.json -> decoding/rules -> alerts.json
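In case archiving is not enabled yet, it can be switched on in the manager's ossec.conf. A minimal sketch (the comment paths reflect a default installation; restart the wazuh-manager service after editing):

```xml
<!-- /var/ossec/etc/ossec.conf on the manager -->
<ossec_config>
  <global>
    <!-- write every received event, decoded or not, to
         /var/ossec/logs/archives/archives.json -->
    <logall_json>yes</logall_json>
  </global>
</ossec_config>
```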
The idea is to analyze some events/logs in archives.json and create decoders and rules for them. You can find the custom rules and decoders documentation here. Wazuh also includes a tool to test each log:
/var/ossec/bin/wazuh-logtest
This is an example log, where you can see how the decoder and the rules match and capture the different fields:
```
Type one log per line
Dec 25 20:45:02 MyHost example[12345]: User 'admin' logged from '192.168.1.100'
**Phase 1: Completed pre-decoding.
full event: 'Dec 25 20:45:02 MyHost example[12345]: User 'admin' logged from '192.168.1.100''
timestamp: 'Dec 25 20:45:02'
hostname: 'MyHost'
program_name: 'example'
**Phase 2: Completed decoding.
name: 'example'
dstuser: 'admin'
srcip: '192.168.1.100'
**Phase 3: Completed filtering (rules).
id: '100010'
level: '0'
description: 'User logged'
groups: '['local', 'syslog', 'sshd']'
firedtimes: '1'
mail: 'False'
```
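To give you an idea, here is a sketch of a custom decoder and rule pair that could produce the output above. The file names are the default locations for local customizations; the regex, decoder names, and rule id 100010 are illustrative assumptions, not something already on your manager:

```xml
<!-- /var/ossec/etc/decoders/local_decoder.xml (illustrative sketch) -->
<!-- Parent decoder: matched by the pre-decoded program_name "example" -->
<decoder name="example">
  <program_name>example</program_name>
</decoder>

<!-- Child decoder: captures the user and source IP from the message body -->
<decoder name="example-user-login">
  <parent>example</parent>
  <regex>User '(\S+)' logged from '(\S+)'</regex>
  <order>dstuser, srcip</order>
</decoder>
```

```xml
<!-- /var/ossec/etc/rules/local_rules.xml (illustrative sketch) -->
<group name="local,syslog,">
  <!-- Level 0 means the event is matched but no alert is fired;
       raise the level to generate alerts in alerts.json -->
  <rule id="100010" level="0">
    <decoded_as>example</decoded_as>
    <description>User logged</description>
  </rule>
</group>
```

After saving both files and restarting the manager, running the same line through /var/ossec/bin/wazuh-logtest should show the decoded fields and the matched rule.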
Also, if you want, I can help you create some rules; just share some example logs (from the archives.json file).
Be aware that enabling the archives.json file can increase disk consumption.
Let me know if that works.
Regards.