Hi,
The decoders and rules are only needed on the wazuh-manager. Note that those files are used for event analysis, and the component in charge of that is the wazuh-manager; the wazuh-agents are only in charge of collecting events and sending them to the wazuh-manager.
Regarding where to create those decoders and rules, as you say, you have to use the /var/ossec/etc/rules/<rules_file_name>.xml path for the rules and /var/ossec/etc/decoders/<decoders_file_name>.xml for the decoders. This is done so that the added files are not deleted during a wazuh-manager upgrade. The default wazuh-manager decoders and rules are located in a different path, /var/ossec/ruleset/, and they can be modified during an upgrade.
As for the names of these files, in principle they do not matter: all the files contained in the /var/ossec/etc/decoders/ and /var/ossec/etc/rules/ directories will be processed (this is specified in the configuration of the ossec.conf file).
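For reference, that section of the wazuh-manager ossec.conf looks roughly like the following (trimmed, the exact content may vary between versions):
<ruleset>
  <!-- Default decoders and rules, replaced on upgrade -->
  <decoder_dir>ruleset/decoders</decoder_dir>
  <rule_dir>ruleset/rules</rule_dir>
  <!-- User-defined decoders and rules, kept across upgrades -->
  <decoder_dir>etc/decoders</decoder_dir>
  <rule_dir>etc/rules</rule_dir>
</ruleset>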
My recommendation is that if you are going to create a single decoder and/or rule, do it in the existing local_rules.xml or local_decoder.xml files, but if you are going to create a complete group, as in your case, create a new file, such as jenkins_rules.xml, and add all the custom ones there. Of course, these files must be in XML format and follow the decoder and/or rules syntax.
I hope you find this information helpful.
Best regards.
Hi,
There are several things to keep in mind from the moment you monitor a file until the alerts appear on the wazuh-dashboard.
The wazuh-dashboard shows all the security alerts generated by the wazuh-manager. A security alert is an event that has been received by the wazuh-manager, decoded, and matched by a level 3 or higher rule. It is important to understand that not all events generate alerts, so not all of them appear in the wazuh-dashboard; whether or not an alert is generated depends on the ruleset. To summarize, only alerts appear in the wazuh-dashboard, and these alerts are generated according to the rules and decoders configured for the received events.
Now, you have to check where in this process the configuration or the understanding is missing. Let's do a walkthrough from the lowest level to the highest. To do so, I would ask the following questions, in this order:
1. Is the wazuh-agent monitoring the log file?
2. Is the wazuh-agent sending the events to the wazuh-manager?
3. Is the wazuh-manager receiving the events from the wazuh-agent?
4. Is the wazuh-manager generating an alert for the desired use case from the received event?
5. Is the alert being sent and indexed correctly in the wazuh-indexer?
You will have to find out where the problem is. Here are some recommendations and tips to check each one of them, although I think what is probably failing in your case is the creation of the decoders and rules needed to generate alerts in the desired cases (point 4).
1. Is the wazuh-agent monitoring the log file?
To monitor the log file, it is necessary to configure a <localfile> block. In this case, I see that you have done it through the centralized configuration (agent.conf), which is distributed from the wazuh-manager.
<localfile>
<location>/var/log/jenkins/jenkins.log</location>
<log_format>syslog</log_format>
</localfile>
The previous block is correct; now you have to check that the wazuh-agent has received it correctly. To do this, go to the agent.conf on the wazuh-agent side and check that it contains this configuration.
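On a Linux wazuh-agent, the centralized configuration received from the wazuh-manager is stored by default in /var/ossec/etc/shared/agent.conf, so you can print it with:
cat /var/ossec/etc/shared/agent.conf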
Also, you can check the ossec.log of the wazuh-agent to see if there is a log like the following (UNIX log syntax):
2022/09/26 08:09:10 wazuh-logcollector: INFO: (1950): Analyzing file: '/var/log/jenkins/jenkins.log'.
If yes, it means that it is monitoring correctly.
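A quick way to look for that entry on the wazuh-agent (assuming the default log path on Linux) is:
grep "Analyzing file" /var/ossec/logs/ossec.log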
2. Is the wazuh-agent sending the events to the wazuh-manager?
Check that the wazuh-agent is registered and correctly connected to the wazuh-manager. To do this, you can run the following command on the wazuh-manager and check that the agent appears in the list of registered agents and is in an Active state.
/var/ossec/bin/agent_control -l
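The output looks similar to the following (the agent names and IPs here are only illustrative):
Wazuh agent_control. List of available agents:
   ID: 000, Name: wazuh-manager (server), IP: 127.0.0.1, Active/Local
   ID: 001, Name: jenkins-host, IP: any, Active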
Also, you can check whether the ossec.log of the wazuh-agent contains a log indicating that it is connected to the wazuh-manager. For example:
2022/10/18 10:27:39 wazuh-agentd: INFO: (4102): Connected to the server (172.16.1.50:1514/tcp).
3. Is the wazuh-manager receiving the events from the wazuh-agent?
To check the events received by the wazuh-manager, you can enable the logging of all events and see whether the events monitored by the wazuh-agent appear in the archives file. To do this, edit the /var/ossec/etc/ossec.conf file of the wazuh-manager and activate the following option inside the <global> section:
<logall_json>yes</logall_json>
Then restart to apply the changes
systemctl restart wazuh-manager
Now, generate events in the Jenkins log (on the wazuh-agent side) and check whether these events are recorded in the /var/ossec/logs/archives/archives.json file on the wazuh-manager side. In this file, one line is generated for each event received.
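For example, assuming the default paths and that the monitored log location contains the string jenkins, you can watch the incoming events with:
tail -f /var/ossec/logs/archives/archives.json | grep -i jenkins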
If they appear, we can affirm that the wazuh-manager is correctly receiving the wazuh-agent events. If they do not, we have to review the previous steps.
Note: Remember to set <logall_json> back to no and restart the wazuh-manager to avoid unnecessary disk usage once you finish debugging.
4. Is the wazuh-manager generating an alert for the desired use case from the received event?
Once we know that the wazuh-manager has received the event, we must check whether the alert is being generated. To do this, just check whether, after the event is received, an alert appears in the /var/ossec/logs/alerts/alerts.json file. One line is generated for each alert.
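As before, you can watch this file while generating Jenkins events (assuming the default path and that your events or alert descriptions contain the string jenkins):
tail -f /var/ossec/logs/alerts/alerts.json | grep -i jenkins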
Check if the related alerts are being generated. If not, then you will probably have to create new decoders and rules for the cases you want.
5. Is the alert being sent and indexed correctly in the wazuh-indexer?
Once the alert is generated and stored in the /var/ossec/logs/alerts/alerts.json file, the Filebeat component is in charge of sending it to the wazuh-indexer, where it is stored and queried to be shown in the wazuh-dashboard.
First, you should check in the wazuh-indexer alerts indices whether any of the desired alerts have been indexed. If not, you should check that Filebeat is correctly configured and connected to the wazuh-indexer, and that the status of the services is correct.
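As a quick sketch of those checks (the indexer address and the credentials below are placeholders, use the ones from your own deployment):
filebeat test output
curl -k -u <user>:<password> "https://<wazuh-indexer-ip>:9200/_cat/indices/wazuh-alerts-*"
The first command verifies that Filebeat can reach its configured output, and the second lists the wazuh-alerts indices stored in the wazuh-indexer.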
An important thing is that all the components are reachable through the network and that no firewall is blocking the communication ports used by each of them.
This point 5 requires a more specific debugging that I will discuss later, if this is what is failing you.
Try everything I have indicated and let me know the results of each step, so we can identify where the problem is and I can tell you how to proceed.
Regards.
Hi,
Note that points (3) and (4) are not the same.
(3) checks that the wazuh-manager is receiving the events, and (4) checks that these events are generating an alert. The condition for an event to generate an alert is that the event is decoded and matched by a level 3 or higher rule (ruleset).
What I am asking you to check in this case is: the wazuh-manager has received the event (you have verified it in the /var/ossec/logs/archives/archives.json file), so now check whether it has generated the alert corresponding to this event. For that, you have to do the same check but in this other file, /var/ossec/logs/alerts/alerts.json (notice that it is not the same file: archives.json is for events and this one is for alerts).
If the alerts are being stored in this file, then we continue the debugging with the following steps; if they are not, we must review the syntax of the event and the decoder and rule that you are using to generate that possible alert. To move forward, in case the alert is not being generated, share the raw log of the event (you can find it in the full_log field of the event in the /var/ossec/logs/archives/archives.json file), as well as the decoders and rules you are using to generate the alert. Finally, describe the condition that the event log has to fulfill for an alert to be generated.
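For example, if jq is available on the wazuh-manager, a quick way to extract the raw logs of the received events is the following (the default archives path and the jenkins filter are just an example, adjust them to your case):
jq -r '.full_log' /var/ossec/logs/archives/archives.json | grep -i jenkins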
Hi,
The way to test whether a log matches a decoder and a rule is to use the /var/ossec/bin/wazuh-logtest tool.
# /var/ossec/bin/wazuh-logtest
Starting wazuh-logtest v4.3.8
Type one log per line
2022-10-18 13:22:32.463+0000 [id=326] INFO hudson.model.AsyncPeriodicWork#lambda$doRun$1: Finished Periodic background build discarder. 1 ms
**Phase 1: Completed pre-decoding.
full event: '2022-10-18 13:22:32.463+0000 [id=326] INFO hudson.model.AsyncPeriodicWork#lambda$doRun$1: Finished Periodic background build discarder. 1 ms'
**Phase 2: Completed decoding.
No decoder matched.
In this case, notice that the event log 2022-10-18 13:22:32.463+0000 [id=326] INFO hudson.model.AsyncPeriodicWork#lambda$doRun$1: Finished Periodic background build discarder. 1 ms is not decoded or matched by any rule, so it will never generate an alert.
If we look at the decoders that Wazuh includes by default for Jenkins (see https://github.com/wazuh/wazuh/blob/v4.3.9/ruleset/decoders/0415-jenkins_decoders.xml), we observe that the format is different, both in the date and in the order of the fields (see the example comments in that file).
For that reason, it is necessary to add custom decoders and rules for the cases you want. I will give you an example of how to create them for the event log mentioned above.
First, I look for a pattern in the log that allows me to identify that the log belongs to Jenkins. As I see it, I can use the date format and the word hudson.
With that, I create the following decoder and add it, for example, to the /var/ossec/etc/decoders/local_decoder.xml file:
<decoder name="custom_jenkins">
<prematch type="pcre2">\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d+\+\d+ \[id=\d+\]\s+\w+\s+hudson.\.*</prematch>
<regex>id=(\d+)]\s+(\w+)\s+(hudson.\.*): (\.*)</regex>
<order>_id, level, hudson, log_description</order>
</decoder>
Next, I add the following generic rule to /var/ossec/etc/rules/local_rules.xml, which will generate a level 3 alert when a log is decoded with our "custom_jenkins" decoder:
<group name="custom_jenkins,">
<rule id="100051" level="3">
<decoded_as>custom_jenkins</decoded_as>
<description>Security event from Jenkins log: $(log_description)</description>
</rule>
</group>
I check with the /var/ossec/bin/wazuh-logtest tool that the log is now decoded and matched correctly.
# /var/ossec/bin/wazuh-logtest
Starting wazuh-logtest v4.3.8
Type one log per line
2022-10-18 13:22:32.463+0000 [id=326] INFO hudson.model.AsyncPeriodicWork#lambda$doRun$1: Finished Periodic background build discarder. 1 ms
**Phase 1: Completed pre-decoding.
full event: '2022-10-18 13:22:32.463+0000 [id=326] INFO hudson.model.AsyncPeriodicWork#lambda$doRun$1: Finished Periodic background build discarder. 1 ms'
**Phase 2: Completed decoding.
name: 'custom_jenkins'
_id: '326'
hudson: 'hudson.model.AsyncPeriodicWork#lambda$doRun$1'
level: 'INFO'
log_description: 'Finished Periodic background build discarder. 1 ms'
**Phase 3: Completed filtering (rules).
id: '100051'
level: '3'
description: 'Security event from Jenkins log: Finished Periodic background build discarder. 1 ms'
groups: '['custom_jenkins']'
firedtimes: '1'
mail: 'False'
**Alert to be generated.
As you can see, this log is already decoded and would generate a level 3 alert. You can edit the decoder and rule contents as needed.
Finally, it is necessary to restart the wazuh-manager to apply the changes to the decoders and rules in the analysis engine.
systemctl restart wazuh-manager
And that would be all; from then on, alerts of this type would be generated in the alerts.json file and the flow would continue forward until they are displayed on the dashboard.
I am going to recommend some references for rules and decoders:
• Creating decoders and rules from scratch: https://wazuh.com/blog/creating-decoders-and-rules-from-scratch/
• Sibling decoders: flexible extraction of information: https://wazuh.com/blog/sibling-decoders-flexible-extraction-of-information/
• Custom rules and decoders: https://documentation.wazuh.com/current/user-manual/ruleset/custom.html
• Testing decoders and rules: https://documentation.wazuh.com/current/user-manual/ruleset/testing.html
Try the above and let us know the results.
Regards.