Archive & alerts file compression issue


ismailctest C

Nov 23, 2022, 12:04:16 AM
to Wazuh mailing list
Hi Team,
Every day we get approximately 200 GB of archives.json and 80 GB of alerts.json files.
Compression is not happening consistently.

Issues we are facing:
  1. Compression does not complete: we see a small .gz file, but the uncompressed file for the same date is not removed.
  2. On some days compression does not happen at all, and only the uncompressed file is present.
Expected behavior:
All archive and alert files should be compressed every day, and the uncompressed files removed after compression.

Please advise whether any additional configuration is needed, since we receive very large files every day.
Is there a size limit or time limit for compression?

Thanks,



Jose Camargo

Nov 23, 2022, 8:06:18 PM
to Wazuh mailing list
Hi Ismail, thank you for using Wazuh!

Log rotation for indices is applied using an ILM policy. To set that up, you can take a look at this documentation blog, where everything is explained step by step:

Wazuh index management · Wazuh · The Open Source Security Platform


Wazuh stores all the events received from the agents in /var/ossec/logs/archives/archives.(json|log). Events of security relevance are considered alerts and are stored in /var/ossec/logs/alerts/alerts.(json|log). For both alerts and archives, logs are rotated automatically: a directory is created for each year and month, and the daily logs are saved in individual .json and .log files.

As this behavior is hard-coded, you cannot change the internal log rotation strategy or directory. However, since /var/ossec/logs/alerts/<year>/<month>/<day>.json|log files are generated per day, you can schedule a cron job in your environment to move the older files to another directory. For example:

# crontab -e
20 0 * * * find /var/ossec/logs/alerts/ -type f -mtime +2 -exec mv {} /home/backup \;

This crontab entry runs at 00:20 every day and automatically moves any file in the /var/ossec/logs/alerts/ directory that was last modified 2 or more days ago (-mtime +2) to the /home/backup directory. You can change the destination directory and the age threshold, but it is recommended to keep the -mtime value at +2 or higher to avoid moving files that the Wazuh manager may still need.
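Since you want the files compressed rather than just relocated, a variation on the same idea is to have the cron job gzip the old per-day files in place. This is a minimal sketch, not the Wazuh-shipped rotation; the function name, directory, and 2-day threshold are illustrative:

```shell
#!/bin/sh
# Sketch: compress per-day alerts/archives files older than N days.
# gzip removes the original file on success, which also covers the
# "uncompressed file is not removed" symptom.

compress_old_logs() {
    # $1 = directory to scan, $2 = minimum age in days
    dir="$1"; days="$2"
    find "$dir" -type f -mtime +"$days" \
        \( -name '*.json' -o -name '*.log' \) -exec gzip -9 {} \;
}

# Example cron entry (daily at 00:20, like the crontab above):
# 20 0 * * * /usr/local/bin/compress_old_logs.sh
```

Compressing in place keeps the year/month directory layout intact, so already-rotated .gz files can still be found under the usual paths.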

For the syslog files under /var/log/, you can create a similar cron job according to your needs. However, it is always recommended to be aware of static files and avoid moving them, to reduce the risk of breaking things.

I hope this helps. Please let me know if you need anything else.


Regards,
Jose Camargo

ismailctest C

Nov 24, 2022, 6:24:04 AM
to Wazuh mailing list
Hi,
Thanks for your support.

Unfortunately, this does not help.

The alerts.json and archives.json files should be compressed automatically, and the original file should be removed after compression, as used to happen before.
We are now receiving huge logs (more than 200 GB); since then, compression either does not happen regularly or happens only partially, and the original file is not removed.

How can we fix this? Is any tuning needed, or is there a size limit for compression?

Thanks.

Jose Camargo

Nov 24, 2022, 3:12:57 PM
to Wazuh mailing list
Hi Ismail,

To configure log rotation, there are some options in the internal_options.conf file, described here:

https://documentation.wazuh.com/current/user-manual/reference/internal-options.html?highlight=internal%20options#monitord


# Rotate plain and JSON logs daily. (0=no, 1=yes)
monitord.rotate_log=1
# Days to keep old ossec.log files [0..500]
monitord.keep_log_days=31
# Size of internal log files to rotate them (Megabytes) [0..4096]
monitord.size_rotate=1
# Maximum number of rotations per day for internal logs [1..256]
monitord.daily_rotations=1


The last option controls the number of rotated logs compressed and stored per day.
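As a hedged sketch of how this might be tuned for a high-volume manager: Wazuh recommends placing overrides in /var/ossec/etc/local_internal_options.conf rather than editing internal_options.conf directly. The values below are illustrative only, chosen within the ranges shown in the listing above (size up to 4096 MB, up to 256 rotations per day), and should be validated in your environment:

```
# /var/ossec/etc/local_internal_options.conf -- illustrative values, not defaults
# Rotate once a log reaches 1 GB instead of letting a single day's file grow unbounded
monitord.size_rotate=1024
# Permit up to 24 size-based rotations per day on a high-volume manager
monitord.daily_rotations=24
```

After changing these values, the manager needs to be restarted for them to take effect.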

Please let me know if you run into any issues, I'll be glad to help.

Regards,
Jose Camargo
