Automate log rotation


DIWAHAR RAHAWID

Sep 8, 2025, 4:43:01 AM
to Wazuh | Mailing List
Hi Team, 

I am using Wazuh with limited storage. I would like to keep data locally on the server for 3 months only, and have indexed logs older than 3 months cleared from the indexer automatically. I have already configured Wazuh to create snapshots on blob storage.

Please let me know if there is any way to automate log rotation.

Regards
Diwahar

ismail....@wazuh.com

Sep 8, 2025, 5:28:29 AM
to Wazuh | Mailing List
Hi,

You can refer to this document for creating a retention policy. Index lifecycle management helps to optimize the Wazuh indexer cluster performance by controlling the lifecycle of an index. You can perform periodic operations, such as index rollovers and deletions. These periodic operations are configured using Index State Management (ISM) policies.
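As a starting point, here is a minimal sketch of such an ISM policy; the policy name wazuh_alerts_retention, the admin credentials, and the 90d threshold are illustrative assumptions to adapt to your environment, and note that the ism_template section only attaches the policy to indices created after the policy exists:

# Sketch: create an ISM policy on the Wazuh indexer (OpenSearch-based) that
# moves wazuh-alerts-* indices to a delete state once they are 90 days old.
curl -k -u admin:<password> -X PUT "https://localhost:9200/_plugins/_ism/policies/wazuh_alerts_retention" \
  -H 'Content-Type: application/json' -d '
{
  "policy": {
    "description": "Delete wazuh-alerts indices older than 90 days",
    "default_state": "hot",
    "states": [
      {
        "name": "hot",
        "actions": [],
        "transitions": [
          { "state_name": "delete", "conditions": { "min_index_age": "90d" } }
        ]
      },
      {
        "name": "delete",
        "actions": [ { "delete": {} } ],
        "transitions": []
      }
    ],
    "ism_template": [
      { "index_patterns": ["wazuh-alerts-*"], "priority": 1 }
    ]
  }
}'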

The Wazuh alert logs are initially stored in the /var/ossec/logs/alerts/alerts.json and /var/ossec/logs/alerts/alerts.log files. These files are automatically rotated and compressed daily at midnight (12:00 AM). The compressed files, in .gz format, along with their respective .sum files, are organized in a date-wise folder structure under /var/ossec/logs/alerts, with subdirectories created for each year and month (e.g., /var/ossec/logs/alerts/2024/Dec/). The original files are deleted only after the compression completes successfully. For this rotation and compression to occur seamlessly, the server must be running at the log rotation time, and adequate system resources (CPU, memory, and disk space) must be available. If these conditions are not met, the compression process may be skipped, leaving the original files uncompressed.

You can refer to this document Event logging - Wazuh server · Wazuh documentation.

If the files remain uncompressed due to any issue, you can manually compress them using the following commands, which produce ossec-alerts-xx.json.gz and ossec-alerts-xx.log.gz. Ensure that sufficient system resources are available before performing the manual compression to avoid potential failures.

Navigate to the directory where the uncompressed files are located and run:

cd /var/ossec/logs/alerts/2024/Dec/
gzip ossec-alerts-30.json
gzip ossec-alerts-30.log
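If many files were left uncompressed, a one-pass sketch like the following compresses every remaining alert file under the dated subdirectories; the -mindepth 3 test is an assumption about the layout described above, there so the live alerts.json and alerts.log directly under /var/ossec/logs/alerts/ stay untouched:

# Compress leftover uncompressed alert files in the year/month subdirectories;
# -mindepth 3 skips the live files at the top of the alerts/ tree.
find /var/ossec/logs/alerts/ -mindepth 3 -type f \( -name "*.json" -o -name "*.log" \) -exec gzip {} \;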


This data is not automatically removed; Wazuh does not clean up these logs because they are considered historical data, which can be re-indexed if required. If you want to clear this historical data, you can use cron jobs like the following:

45 18 * * * find /var/ossec/logs/alerts/ -name "*.gz" -type f -mtime +90 -exec rm -f {} \;
45 18 * * * find /var/ossec/logs/alerts/ -name "*.sum" -type f -mtime +90 -exec rm -f {} \;

These cron jobs run daily at 18:45 and delete all *.gz and *.sum files older than 90 days, clearing historical data beyond that retention window.
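For completeness, one way to install them, assuming they should live in the root user's crontab since /var/ossec is root-owned:

# Open root's crontab, paste the two lines above, then list it to verify:
sudo crontab -e
sudo crontab -l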


I hope it helps. Please let us know if you have any further queries or issues here.

Regards,