Error with Cloudflare logs from SQS


Juan Daniel Vargas Ramírez

May 9, 2024, 3:50:44 AM5/9/24
to Wazuh | Mailing List
Hi,

I've been trying to integrate with Cloudflare using an Amazon S3 bucket and the SQS queue service, and up to this point everything is fine. The issue arises when I try to get Wazuh to retrieve the messages from SQS: it throws an error indicating that they don't have the expected format and will be skipped, essentially ignoring everything I'm sending from Cloudflare.

I've been reviewing the "aws-s3.py" script in the "wodles" directory, and at some point it reads an attribute `message['route']`, which apparently should contain the attributes `bucket_path` and `log_path`, which my logs do not contain. I'm not sure if I'm missing a step somewhere in my flow, or if it could be due to something else. I would truly appreciate your help, as I've been reviewing this for several days and haven't found a solution to the problem.

Thank you, I look forward to your response.

Have a great day!
[Attachment: Error wazuh.png]

Gerardo David Caceres Fleitas

May 9, 2024, 6:44:00 AM5/9/24
to Wazuh | Mailing List
Hello Juan,

I hope you are doing great today.

Please note that we recommend creating one post per query for better organization. This allows us to track the posts better. I noticed the same post in various communities; one was already answered.

https://discord.com/channels/1049711339578331186/1049711340316541004/1237858830881394719
https://github.com/wazuh/wazuh/discussions/23355

Regarding your question, I have no prior experience with Cloudflare, but the error seems related to the log format. You can check whether Cloudflare can send logs in JSON or syslog format; otherwise, you may need to edit the Wazuh script.

Another approach for integrating Cloudflare could be writing a script that uses Cloudflare API calls to pull the logs you are interested in alerting on and write the results to a file, ideally in JSON format. This script would need to run on an existing agent, or you can create a virtual machine with a new agent just for this purpose. The script could be built from this documentation: https://developers.cloudflare.com/logs/logpull/
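Such a collector could be sketched as below. This is only a minimal sketch, not a definitive implementation: the zone ID, API token, output path, and the 5-minute window are placeholder assumptions, and Logpull's exact constraints (for example, the required delay on the `end` timestamp) should be checked against the Cloudflare documentation.

```python
"""Minimal sketch of a Cloudflare Logpull collector (placeholder values)."""
import time
import urllib.request

API_BASE = "https://api.cloudflare.com/client/v4"
OUT_FILE = "/var/log/cloudflare/cloudflare_log.json"  # file the agent will read


def build_logpull_url(zone_id: str, start: int, end: int) -> str:
    """Build the Logpull 'logs/received' URL for a unix-timestamp window."""
    return f"{API_BASE}/zones/{zone_id}/logs/received?start={start}&end={end}"


def pull_logs(zone_id: str, api_token: str, window_secs: int = 300) -> None:
    """Fetch roughly the last `window_secs` of logs and append the
    NDJSON response lines to OUT_FILE."""
    end = int(time.time()) - 60  # Logpull requires `end` to lag real time
    start = end - window_secs
    req = urllib.request.Request(
        build_logpull_url(zone_id, start, end),
        headers={"Authorization": f"Bearer {api_token}"},
    )
    with urllib.request.urlopen(req) as resp, open(OUT_FILE, "a") as out:
        out.write(resp.read().decode())


if __name__ == "__main__":
    # Replace with your own zone ID and API token before running.
    pull_logs("YOUR_ZONE_ID", "YOUR_API_TOKEN")
```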

The script can be scheduled with crontab to collect the logs as frequently as you need.
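For example, a crontab entry like the following (the script path /usr/local/bin/cloudflare_pull.py is hypothetical) would run the collector every five minutes:

```
*/5 * * * * /usr/bin/python3 /usr/local/bin/cloudflare_pull.py
```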

Once the logs are being written on the agent, a localfile configuration can be added to the agent's ossec.conf file. The agent will then read the logs from the file, and an additional label (for logs in JSON format) can be specified to help identify these logs as Cloudflare's.

The following is an example of this configuration (assuming the script generates the log file /var/log/cloudflare/cloudflare_log.json):

<localfile>
 <location>/var/log/cloudflare/cloudflare_log.json</location>
 <log_format>json</log_format>
 <label key="@source">cloudflare_waf</label>
</localfile>

https://documentation.wazuh.com/current/user-manual/reference/ossec-conf/localfile.html

With this localfile configuration, the agent will send the logs to the manager, and then you can create decoders/rules according to your needs.
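As a hedged illustration, a custom rule on the manager (for example in /var/ossec/etc/rules/local_rules.xml; the rule ID 100100 is just an example from the custom ID range) could match the label added above:

```xml
<group name="cloudflare,">
  <rule id="100100" level="3">
    <decoded_as>json</decoded_as>
    <field name="@source">cloudflare_waf</field>
    <description>Cloudflare log collected by the Logpull script.</description>
  </rule>
</group>
```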

I hope this helps.

Best regards.