Hello, Nicolae
Composite rules (those defined with a frequency and timeframe) are designed for detecting repetition and patterns within a specific timeframe, not for filtering or deduplication. When a composite rule triggers, it only replaces the alert for the specific event that met the frequency threshold (the last one in the sequence). This means the preceding events still generate alerts based on the base rule, while only the final "triggering" event is promoted to the composite alert.
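To illustrate, a composite rule pair might look roughly like this (the rule IDs, decoder name, and thresholds here are hypothetical, chosen only for the sketch):

```xml
<!-- Base rule: fires on every matching event (hypothetical ID 100100) -->
<rule id="100100" level="5">
  <decoded_as>example-app</decoded_as>
  <description>Example application error.</description>
</rule>

<!-- Composite rule: only the event that crosses the threshold
     (the 5th within 120 seconds) is promoted to this alert -->
<rule id="100101" level="10" frequency="5" timeframe="120">
  <if_matched_sid>100100</if_matched_sid>
  <description>5 application errors within 120 seconds.</description>
</rule>
```

With a sequence of five matching events, the first four still alert at level 5 under the base rule; only the fifth is replaced by the level 10 composite alert.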
If you set a composite rule to a level lower than 3 (the default minimum for alert generation), it would not deduplicate the entire sequence; it would simply silence the final alert in a burst. Because the engine compares only the rule.id, not the actual content or payload of the logs, this approach leads to "half-reporting": you suppress the notification for the triggering event while the preceding events still fire, effectively hiding the conclusion of the sequence without ever verifying whether the logs were true duplicates or distinct events matching the same rule.
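To make the pitfall concrete, the anti-pattern would look roughly like this (rule IDs are hypothetical; the parent rule 100200 is assumed to be a normal alerting rule):

```xml
<!-- Anti-pattern sketch: level 0 suppresses only the alert for the
     final, threshold-crossing event in the burst -->
<rule id="100201" level="0" frequency="5" timeframe="120">
  <if_matched_sid>100200</if_matched_sid>
  <description>Silences only the 5th repeated event.</description>
</rule>
```

Events 1 through 4 still alert under rule 100200, so the sequence is half-reported, and nothing in this construct checks whether the payloads were actually identical.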
The most effective way to resolve this issue is to address the source of the logs to ensure they are not being duplicated before reaching the manager. You should verify that the same events are not being written to multiple locations or collected simultaneously from different log files, as resolving the redundancy at the ingestion point is far more reliable than attempting to filter it through rule correlation.
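As an example of what to look for, a common cause of duplication is the same log stream being collected twice in ossec.conf, once directly and once via a syslog file that also receives the application's entries (the paths below are illustrative, not from your setup):

```xml
<!-- Duplicated collection (illustrative paths): the same application
     log is read directly AND again through the shared syslog file -->
<localfile>
  <log_format>syslog</log_format>
  <location>/var/log/myapp/app.log</location>
</localfile>
<localfile>
  <log_format>syslog</log_format>
  <location>/var/log/messages</location>  <!-- also receives myapp entries -->
</localfile>
```

Removing one of the two collection points, or filtering the application's entries out of the shared file, eliminates the duplicates before they ever reach the analysis engine.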