Custom rules issue with frequency


Eric J

Jan 29, 2026, 9:45:03 AM
to Wazuh | Mailing List
Hello,
I am having an issue with a rule for Synology logs.
I would like to receive an alert when a user reads (or downloads) more than 2,000 files.
With the rules below, I receive an alert as expected:

<!--WinFileService-->
    <rule id="100011" level="3">
        <decoded_as>synology-winfileservice</decoded_as>
        <description>Synology file $(action).</description>
        <mitre>
            <id>T1565</id>
        </mitre>
    </rule>
   
    <rule id="100012" level="12" frequency="2" timeframe="3600">
        <if_matched_sid>100011</if_matched_sid>
        <same_srcuser />
        <same_action />
        <description>Many Synology files sup 100 en 1h $(action) by same user.</description>
        <mitre>
            <id>T1565</id>
        </mitre>
    </rule>

If I replace frequency="2" with frequency="2000" in rule 100012 and test it, I do not receive an alert.
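One way to reproduce the frequency="2000" case locally is to feed wazuh-logtest a large batch of synthetic events. This is only a sketch: the log line is copied from the sample events further down in this message, and the /tmp file name is illustrative.

```shell
# Sketch: generate 2000 synthetic "read" events in the same format as the
# sample Synology logs, then pipe them through wazuh-logtest.
out=/tmp/synology_events.log
: > "$out"
for i in $(seq 1 2000); do
  printf '2026 Jan 27 10:22:15 Nas-archive->192.168.1.1 Jan 27 11:22:13 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/File%04d.xlsx, File/Folder: File, Size: 65.64 KB, User: CH\\admin, IP: 10.1.129.10\n' "$i" >> "$out"
done
wc -l "$out"
# Then feed the batch to the manager's logtest tool:
# /var/ossec/bin/wazuh-logtest < "$out"
```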

I also tried adding another rule, 100013, so that the frequency in rule 100012 can stay low, but 100013 never fires.

<!--WinFileService-->
    <rule id="100011" level="3">
        <decoded_as>synology-winfileservice</decoded_as>
        <description>Synology file $(action).</description>
        <mitre>
            <id>T1565</id>
        </mitre>
    </rule>
   
    <rule id="100012" level="12" frequency="2" timeframe="3600">
        <if_matched_sid>100011</if_matched_sid>
        <same_srcuser />
        <same_action />
        <description>Many Synology files sup 100 en 1h $(action) by same user.</description>
        <mitre>
            <id>T1565</id>
        </mitre>
    </rule>

    <rule id="100013" level="12" frequency="2" timeframe="43200">
        <if_matched_sid>100012</if_matched_sid>
        <same_srcuser />
    <!--    <same_action /> -->
        <description>Many Synology files sup 2000 en 12h $(action) from same srcip.</description>
        <mitre>
            <id>T1565</id>
        </mitre>
    </rule>

When I test:


bash-5.2# /var/ossec/bin/wazuh-logtest <<'EOF'
2026 Jan 27 10:22:15 Nas-archive->192.168.1.1 Jan 27 11:22:13 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileA.xlsx, File/Folder: File, Size: 65.64 KB, User: CH\admin, IP: 10.1.129.10
2026 Jan 27 10:22:16 Nas-archive->192.168.1.1 Jan 27 11:22:14 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileB.xlsx, File/Folder: File, Size: 65.64 KB, User: CH\admin, IP: 10.1.129.10
2026 Jan 27 10:35:01 Nas-archive->192.168.1.1 Jan 27 11:35:01 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileC.xlsx, File/Folder: File, Size: 70.00 KB, User: CH\admin, IP: 10.1.129.10
2026 Jan 27 10:35:02 Nas-archive->192.168.1.1 Jan 27 11:35:02 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileD.xlsx, File/Folder: File, Size: 70.00 KB, User: CH\admin, IP: 10.1.129.10
EOF
Starting wazuh-logtest v4.14.1
Type one log per line


**Phase 1: Completed pre-decoding.
        full event: '2026 Jan 27 10:22:15 Nas-archive->192.168.1.1 Jan 27 11:22:13 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileA.xlsx, File/Folder: File, Size: 65.64 KB, User: CH\admin, IP: 10.1.129.10'
        timestamp: '2026 Jan 27 10:22:15'

**Phase 2: Completed decoding.
        name: 'synology-winfileservice'
        action: 'read'
        data: '/Archive-Sharepoint/archive/FileA.xlsx'
        size_num: '65.64'
        size_unit: 'KB'
        srcip: '10.1.129.10'
        srcuser: 'CH\admin'
        type: 'File'

**Phase 3: Completed filtering (rules).
        id: '100011'
        level: '3'
        description: 'Synology file read.'
        groups: '['remote', 'syslog', 'synology']'
        firedtimes: '1'
        mail: 'False'
        mitre.id: '['T1565']'
        mitre.tactic: '['Impact']'
        mitre.technique: '['Data Manipulation']'
**Alert to be generated.


**Phase 1: Completed pre-decoding.
        full event: '2026 Jan 27 10:22:16 Nas-archive->192.168.1.1 Jan 27 11:22:14 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileB.xlsx, File/Folder: File, Size: 65.64 KB, User: CH\admin, IP: 10.1.129.10'
        timestamp: '2026 Jan 27 10:22:16'

**Phase 2: Completed decoding.
        name: 'synology-winfileservice'
        action: 'read'
        data: '/Archive-Sharepoint/archive/FileB.xlsx'
        size_num: '65.64'
        size_unit: 'KB'
        srcip: '10.1.129.10'
        srcuser: 'CH\admin'
        type: 'File'

**Phase 3: Completed filtering (rules).
        id: '100012'
        level: '12'
        description: 'Many Synology files sup 100 en 1h read by same user.'
        groups: '['remote', 'syslog', 'synology']'
        firedtimes: '1'
        frequency: '2'
        mail: 'True'
        mitre.id: '['T1565']'
        mitre.tactic: '['Impact']'
        mitre.technique: '['Data Manipulation']'
**Alert to be generated.


**Phase 1: Completed pre-decoding.
        full event: '2026 Jan 27 10:35:01 Nas-archive->192.168.1.1 Jan 27 11:35:01 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileC.xlsx, File/Folder: File, Size: 70.00 KB, User: CH\admin, IP: 10.1.129.10'
        timestamp: '2026 Jan 27 10:35:01'

**Phase 2: Completed decoding.
        name: 'synology-winfileservice'
        action: 'read'
        data: '/Archive-Sharepoint/archive/FileC.xlsx'
        size_num: '70.00'
        size_unit: 'KB'
        srcip: '10.1.129.10'
        srcuser: 'CH\admin'
        type: 'File'

**Phase 3: Completed filtering (rules).
        id: '100012'
        level: '12'
        description: 'Many Synology files sup 100 en 1h read by same user.'
        groups: '['remote', 'syslog', 'synology']'
        firedtimes: '2'
        frequency: '2'
        mail: 'True'
        mitre.id: '['T1565']'
        mitre.tactic: '['Impact']'
        mitre.technique: '['Data Manipulation']'
**Alert to be generated.


**Phase 1: Completed pre-decoding.
        full event: '2026 Jan 27 10:35:02 Nas-archive->192.168.1.1 Jan 27 11:35:02 Nas-archive01 WinFileService Event: read, Path: /Archive-Sharepoint/archive/FileD.xlsx, File/Folder: File, Size: 70.00 KB, User: CH\admin, IP: 10.1.129.10'
        timestamp: '2026 Jan 27 10:35:02'

**Phase 2: Completed decoding.
        name: 'synology-winfileservice'
        action: 'read'
        data: '/Archive-Sharepoint/archive/FileD.xlsx'
        size_num: '70.00'
        size_unit: 'KB'
        srcip: '10.1.129.10'
        srcuser: 'CH\admin'
        type: 'File'

**Phase 3: Completed filtering (rules).
        id: '100012'
        level: '12'
        description: 'Many Synology files sup 100 en 1h read by same user.'
        groups: '['remote', 'syslog', 'synology']'
        firedtimes: '3'
        frequency: '2'
        mail: 'True'
        mitre.id: '['T1565']'
        mitre.tactic: '['Impact']'
        mitre.technique: '['Data Manipulation']'
**Alert to be generated.

Do you have any idea why 100013 never fires?
Thank you

Olamilekan Abdullateef Ajani

Jan 29, 2026, 10:29:47 AM
to Wazuh | Mailing List
Hello Eric,

I am looking into this and also testing. I will write back shortly.

Best regards,

Olamilekan Abdullateef Ajani

Jan 29, 2026, 12:32:46 PM
to Wazuh | Mailing List
Hello Eric,

Your rule did not work because the rules are chained on top of each other: the last rule depends on the previous one, which does not fit your use case. The current Wazuh analysis engine has limitations in its correlation functionality that make the third rule difficult to achieve, as it does not support nested correlation.

We are currently developing a new component called "wazuh-engine," which is intended to replace "wazuh-analysisd" and significantly enhance the processing pipeline. It introduces major improvements in usability and flexibility, including the ability to build complex decoding trees and to fully transform events at the decoding stage, whether they are plain text, XML, or JSON.
At the correlation and rules stage, the same event will also be able to match and trigger multiple rules in parallel, which will greatly improve advanced use cases such as multi-level correlation and UEBA scenarios and help overcome the limitations seen with the current rule-chaining model.

I will consult internally to see if there is a workaround for your use case and let you know.

There is also an open GitHub issue tracking this improvement.

Best regards,

Eric J

Jan 29, 2026, 1:47:47 PM
to Wazuh | Mailing List
Hello,

Thank you very much for your prompt reply.
Are there any settings that need to be changed so that a higher frequency works in rule 100012?

Have a good evening.

Olamilekan Abdullateef Ajani

Jan 29, 2026, 3:54:07 PM
to Wazuh | Mailing List
Hello Eric,

I apologize, I missed that part of your query. The frequency itself works; the problem is not your rule logic, but how wazuh-analysisd processes the inflow of events and when your rule gets evaluated.
Wazuh's event analysis (wazuh-analysisd) is stream-based: each new event line is decoded, matched against rules, and correlated if necessary. Composite rules (if_matched_sid + frequency/timeframe) do not "look back" into an arbitrarily large time window, but into a limited cache of previously analyzed events. Historically, this cache in Wazuh/OSSEC holds 8192 events by default, regardless of how large the timeframe is set.

The consequence: at a high event rate, the effective lookback window can shrink drastically. If, for example, a manager analyzes 100 events per second (alerts and non-alerts), 8192 events correspond to only about 82 seconds of lookback. This explains why a one-hour, and in your case frequency-2000, correlation window in composite rules is practically unattainable in high-traffic environments.
You can test this via the logtest engine and see that the rule logic works; the high influx of events is the hindrance.
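The arithmetic above can be checked quickly; 8192 is the historical default cache size, and 100 events/second is an assumed manager-wide rate:

```shell
# Back-of-the-envelope estimate of the effective lookback window:
# cache size divided by event rate gives seconds of history retained.
CACHE_SIZE=8192
RATE=100   # assumed events per second on the manager
echo "effective window: $((CACHE_SIZE / RATE)) seconds"
# prints "effective window: 81 seconds" (i.e. roughly 82 s before rounding)
```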

Regards

Eric J

Jan 30, 2026, 2:20:58 AM
to Wazuh | Mailing List
Hello,

Thank you very much for your reply.

Best Regards
