Hi Team,
We have PostgreSQL in place, and for compliance purposes we require a DAM (Database Activity Monitoring) solution. Due to pricing constraints, we are planning to monitor database-level logs using Wazuh.
During my research, I found an approach for PostgreSQL integration via the Wazuh agent. However, the default rules provided are not sufficient and do not fully match our requirements.
Our key use cases are as follows:
Alert when a database user successfully logs in.
Alert when a database user login fails.
Alert when an existing database user has repeated login failures.
Alert when a non-existing database user attempts to log in.
Could you please guide us on how to implement these specific use cases in Wazuh, including any custom decoders or rules that may be required?
Thank you for your support.
Best regards,
Chandra
Hi Chandra,
Yes, this is a feasible approach in Wazuh and it is commonly implemented as a log-based DAM (Database Activity Monitoring) use case.
At a high level, the recommended approach is:
Enable PostgreSQL authentication logging (successful connections, failed logins, and attempts using non-existing users). Without proper database logging, these use cases cannot be implemented.
Collect PostgreSQL logs with the Wazuh agent, using log file monitoring or syslog/journald.
Create custom decoders (if needed) to extract relevant fields such as database user, database name, source IP, and authentication result.
Create custom rules to generate alerts for:
Successful database logins (audit visibility).
Failed login attempts.
Repeated login failures using frequency and timeframe (brute-force / password spraying detection).
Login attempts using non-existing database users (user enumeration).
Suppress trusted or expected activity (for example health-checks or monitoring probes) using rules with level 0, to avoid unnecessary noise.
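The collection step above can be sketched as an agent-side ossec.conf fragment. This is a minimal sketch, assuming PostgreSQL writes plain-text logs to /var/log/postgresql/postgresql.log (a placeholder path; adjust it to your log_directory/log_filename, or use a journald/syslog input instead):

```xml
<!-- Agent-side ossec.conf fragment: tail the PostgreSQL log file. -->
<!-- The path is an assumption; adjust it to your installation. -->
<localfile>
  <log_format>syslog</log_format>
  <location>/var/log/postgresql/postgresql.log</location>
</localfile>
```

The syslog log format is Wazuh's generic single-line text reader; it does not require the file to be actual syslog output.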
Please note that this approach provides log-based, database-level monitoring: events that PostgreSQL does not write to its logs cannot be detected.
Relevant documentation:
Log collection:
https://documentation.wazuh.com/current/user-manual/capabilities/log-data-collection/index.html
Decoders (overview):
https://documentation.wazuh.com/current/user-manual/ruleset/decoders/index.html
Decoder XML syntax:
https://documentation.wazuh.com/current/user-manual/ruleset/ruleset-xml-syntax/decoders.html
Rules (overview):
https://documentation.wazuh.com/current/user-manual/ruleset/rules/index.html
Rules XML syntax and correlation:
https://documentation.wazuh.com/current/user-manual/ruleset/ruleset-xml-syntax/rules.html
Custom rules and testing:
https://documentation.wazuh.com/current/user-manual/ruleset/testing.html
If you can share a few sample PostgreSQL log lines (successful login, failed login, and non-existing user), the required decoders and rules can be tailored precisely.
Hi Chandra,
Thanks for sharing the consolidated PostgreSQL log file — I’ve reviewed the complete set of logs from all files.
All authentication-related entries in the logs correspond to successful connections, for example:
From these events, Wazuh can reliably detect and alert on:
Successful database logins
Database user and database name
Source IP or local connections
SSL/TLS usage (useful for compliance)
However, after reviewing the full log set, there are no entries related to authentication failures, such as:
Password authentication failed
Login attempts with non-existing users
role does not exist or similar errors
Because these events are not present in the logs, the following DAM use cases cannot be implemented yet:
Alert on failed login attempts
Alert on repeated login failures
Alert on login attempts using non-existing users
To fully cover those scenarios, PostgreSQL must be configured to log authentication failures in addition to successful connections. In practice, this usually requires:
Enabling connection logging:
log_connections = on
log_disconnections = on
Using a log_line_prefix that includes user, database and source IP, for example:
log_line_prefix = '%m %u@%d %r [%p] '
Verifying that authentication failures (FATAL messages such as password authentication failed or role does not exist) are not being filtered out by the logging destination (file, syslog, or CloudWatch in the case of RDS).
Once those authentication failure log entries are available, Wazuh can handle the detection using custom rules and frequency-based correlation.
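As a hedged reference for that correlation, rules along these lines could cover the three failure use cases (rule IDs 100100-100102, levels, and the match strings are illustrative placeholders; anchor them to your own decoder and rule ID range, and note that the brute-force rule requires srcip to be decoded from the log line):

```xml
<group name="postgresql,authentication_failed,">
  <!-- Single failed login (wrong password). -->
  <rule id="100100" level="5">
    <match>password authentication failed for user</match>
    <description>PostgreSQL: failed login attempt.</description>
  </rule>

  <!-- Login attempt with a non-existing role (possible user enumeration). -->
  <!-- The match string is broad and illustrative; tighten it for production. -->
  <rule id="100101" level="7">
    <match>does not exist</match>
    <description>PostgreSQL: login attempt with a non-existing user.</description>
  </rule>

  <!-- Eight failures from the same source within 120 seconds (brute force). -->
  <rule id="100102" level="10" frequency="8" timeframe="120">
    <if_matched_sid>100100</if_matched_sid>
    <same_source_ip />
    <description>PostgreSQL: multiple failed login attempts (possible brute force).</description>
  </rule>
</group>
```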
Hi Javier,
Thank you for the explanation. I understand what you are trying to convey.
Could you please help me by creating some custom rules, as I already have a decoder configured for this use case? Additionally, I would appreciate your guidance on handling authentication failure events using the decoder below.
I am currently using the following decoders:
If I replace connection authorized with connection failed (or the corresponding message observed in the logs for failed authentication attempts), will this decoder work correctly for detecting failed login attempts?
Please let me know if additional modifications or a separate decoder would be required for this scenario.
Thanks,
Chandra
Hi Chandra,
In PostgreSQL, authentication errors are typically logged as FATAL messages, for example:
FATAL: password authentication failed for user "..."
FATAL: role "..." does not exist
Because of this, simply replacing connection authorized with connection failed in the same decoder is usually not reliable. The log level changes (LOG → FATAL) and the message structure may also vary depending on the type of failure.
Your postgres_database decoder is suitable for detecting successful logins, as long as it matches the actual log format. For authentication failures, the recommended approach is to define a separate decoder specifically for FATAL messages, covering wrong passwords and non-existing users, and extracting at least the user and source IP when available.
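A minimal sketch of such a decoder, assuming the failure lines carry the same prefix as your other entries (the decoder name is a placeholder, and the regex only extracts the quoted user name; adapt both to your real log format):

```xml
<!-- Sketch: separate decoder for FATAL authentication errors. -->
<decoder name="postgres_auth_failure">
  <prematch>FATAL:</prematch>
  <regex type="pcre2">FATAL:\s+(?:password authentication failed for user|role)\s+"([^"]+)"</regex>
  <order>user</order>
</decoder>
```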
Once those events are properly decoded, rules like the following can be used as a reference:
Successful login
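As an illustrative sketch, assuming a decoder named postgres_auth_success is in place (the rule ID and level are placeholders):

```xml
<group name="postgresql,auth,">
  <!-- Alert on every authorized database connection (audit visibility). -->
  <rule id="100110" level="3">
    <decoded_as>postgres_auth_success</decoded_as>
    <description>PostgreSQL: successful database login.</description>
  </rule>
</group>
```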
Hi Javier,
I am facing an additional issue while following the Wazuh blog below for monitoring OS-level PostgreSQL logs:
https://wazuh.com/blog/monitoring-postgresql-database-with-wazuh/
PostgreSQL authentication failure events are being detected correctly. However, I am not receiving any authentication success alerts, even though I can see the authentication success logs arriving in the archives.log file.
Could you please guide me on how to process these authentication success events so that they are generated in the alerts.json file?
I am also attaching some sample logs for reference. If possible, could you please help by creating the appropriate decoder (and rule, if required) for these logs?
Thank you for your support.
The logs contain entries like the following:
Hi Chandra,
The reason you can see these events in archives.log but they do not appear in alerts.json is that, while the logs are being collected correctly, no alert is generated for them. In Wazuh, an event is written to alerts.json only when a rule matches it and the rule level is equal to or greater than the configured log_alert_level (3 by default).
In this case, the authentication success logs are not being matched by any decoder/rule that generates an alert, or the rule exists but has a level below the alert threshold.
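For reference, that threshold is set in the <alerts> block of the manager's /var/ossec/etc/ossec.conf:

```xml
<ossec_config>
  <alerts>
    <!-- Minimum rule level written to alerts.json/alerts.log (default: 3). -->
    <log_alert_level>3</log_alert_level>
  </alerts>
</ossec_config>
```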
Based on the actual format of your logs, you can use a specific decoder for connection authorized events, for example:
<decoder name="postgres_auth_success">
<prematch>connection authorized:</prematch>
<regex type="pcre2">^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+\w+:(\[local\]|\d+\.\d+\.\d+\.\d+)(?:\((\d+)\))?:([^@]+)@([^:]+):\[(\d+)\]:LOG:\s+connection authorized:\s+user=([^\s]+)\s+database=([^\s]+).*$</regex>
<order>timestamp, srcip, srcport, session_user, session_db, pid, user, database</order>
</decoder>
And an associated rule with a sufficient level so it appears in alerts.json:
<group name="postgresql,auth,">
<rule id="110001" level="5">
<decoded_as>postgres_auth_success</decoded_as>
<description>PostgreSQL login succeeded: user=$(dstuser) db=$(database) srcip=$(srcip)</description>
</rule>
</group>
For validation:
Run wazuh-logtest on the manager and paste one real connection authorized log line to confirm the decoder and rule match.
Restart the manager after adding the decoder and rule.
Generate a successful login and verify that the alert appears in alerts.json.
In summary, the authentication success logs are being ingested correctly, but a dedicated decoder and rule (with an appropriate level) are required to turn them into alerts.
Hi Chandra,
I re-tested the entire flow end-to-end, making a few adjustments to the decoder, and confirmed that everything works correctly.
Below are the exact steps and configurations I used, so you can follow them to validate the behavior in your environment.
1) Add the decoder
Edit the following file:
/var/ossec/etc/decoders/local_decoder.xml
Add this decoder (keep the rest of the file unchanged and paste this as a new <decoder> block):
<decoder name="postgres_auth_success">
<parent>windows-date-format</parent>
<use_own_name>true</use_own_name>
<prematch>connection authorized:</prematch>
<regex type="pcre2">^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(\w+):(\[local\]|\d+\.\d+\.\d+\.\d+)(?:\((\d+)\))?:(?:([^@]+)@([^:]+):)?\[(\d+)\]:LOG:\s+connection authorized:\s+user=([^\s]+)\s+database=([^\s]+).*$</regex>
<order>timestamp, tz, srcip, srcport, session_user, session_db, pid, user, database</order>
</decoder>
Important notes:
type="pcre2" is required to avoid regex syntax errors.
<use_own_name>true</use_own_name> is required so rules can match this decoder using <decoded_as>postgres_auth_success</decoded_as>.
2) Add the rule
Edit the file:
/var/ossec/etc/rules/local_rules.xml
Add the following rule:
<group name="postgresql,auth,">
<rule id="110001" level="5">
<decoded_as>postgres_auth_success</decoded_as>
<description>PostgreSQL login succeeded: user=$(dstuser) database=$(database) srcip=$(srcip)</description>
</rule>
</group>
Restart the Wazuh manager:
/var/ossec/bin/wazuh-control restart
3) Validate using wazuh-logtest
Run:
/var/ossec/bin/wazuh-logtest
Paste one of your PostgreSQL authentication success logs, for example:
2026-02-02 10:19:05 UTC:172.30.4.44(21237):reward_service@lxme_prod:[21880]:LOG: connection authorized: user=reward_service database=lxme_prod SSL enabled (protocol=TLSv1.2, cipher=ECDHE-RSA-AES256-GCM-SHA384, bits=256)
You should see:
Phase 2 showing:
name: 'postgres_auth_success'
Phase 3 showing:
id: '110001'
level: '5'
**Alert to be generated.
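As an optional offline sanity check, the decoder's PCRE2 pattern can also be exercised with Python's re module, which accepts this particular pattern unchanged; the field names below mirror the decoder's <order> element:

```python
import re

# PCRE2 pattern from the postgres_auth_success decoder; for this
# particular pattern, Python's re syntax is equivalent.
PATTERN = re.compile(
    r'^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(\w+):'
    r'(\[local\]|\d+\.\d+\.\d+\.\d+)(?:\((\d+)\))?:'
    r'(?:([^@]+)@([^:]+):)?\[(\d+)\]:LOG:\s+connection authorized:\s+'
    r'user=([^\s]+)\s+database=([^\s]+).*$'
)

# Field names in the same order as the decoder's <order> element.
FIELDS = ("timestamp", "tz", "srcip", "srcport",
          "session_user", "session_db", "pid", "user", "database")

def decode(line):
    """Return the decoded fields as a dict, or None if the line does not match."""
    m = PATTERN.match(line)
    return dict(zip(FIELDS, m.groups())) if m else None

sample = ('2026-02-02 10:19:05 UTC:172.30.4.44(21237):'
          'reward_service@lxme_prod:[21880]:LOG: connection authorized: '
          'user=reward_service database=lxme_prod SSL enabled (protocol=TLSv1.2)')

fields = decode(sample)
print(fields["srcip"], fields["user"], fields["database"])
# -> 172.30.4.44 reward_service lxme_prod
```

If a real log line fails to match here, it will also fail in the decoder, which makes this a quick way to iterate on the regex before touching the manager.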
4) Verify the alert in alerts.json
Once the log is being collected by Wazuh (i.e., it appears in archives.log), check for the alert with:
tail -n 200 /var/ossec/logs/alerts/alerts.json | egrep '"id":"110001"|postgres_auth_success|PostgreSQL login succeeded'
You should see an alert similar to:
"decoder":{"parent":"windows-date-format","name":"postgres_auth_success"},
"rule":{"id":"110001","level":5,"description":"PostgreSQL login succeeded: user=test_user database=testdb srcip=10.0.0.5"},
"data":{"srcip":"10.0.0.5","srcport":"5555","dstuser":"test_user","database":"testdb"}
Please follow these steps on your side and let me know if you see the same results.