Wazuh Custom Index


yaswanth ryali

Mar 10, 2025, 1:49:16 AM
to Wazuh | Mailing List
Hi Team, 

  Can anyone explain the process of creating a new customized index for my incoming logs? For example: I am getting Defender alerts into a Cribl datastream and from there into my Wazuh standalone instance.

My alerts are being stored in the archives index, and I used an index pattern to view those logs on my dashboards. I did not write any rules or decoders for this, as I was testing how to create a separate index for my incoming logs. I want my Defender logs to be saved in my own customized index.

NOTE: I am new to Wazuh and am trying to implement it for my organization. We are currently in the testing phase; if everything goes well, we will roll it out further.

Please guide me through this process. My Wazuh deployment uses the Filebeat, OpenSearch, and Kibana components.

  Thanks in advance.

hasitha.u...@wazuh.com

Mar 10, 2025, 2:37:54 AM
to Wazuh | Mailing List
Hi  yaswanth,

You can separate the events into different indexes so that each index contains only the information you are interested in. You can do this by following these steps. The Wazuh alerts or archives are indexed through an ingest pipeline. These pipelines have a processor that sets the index name using the date_index_name processor with the following schema:
a prefix: wazuh-alerts-4.x- for alerts or wazuh-archives-4.x- for archives
a suffix: the date, in the format YYYY.MM.DD

This generates an index name such as wazuh-alerts-4.x-2024.10.20 or wazuh-archives-4.x-2024.10.20. Note that the alerts and archives datasets have different ingest pipelines. For reference, this is how the date_index_name processor is used in the alerts ingest pipeline of the wazuh module for Filebeat: https://github.com/wazuh/wazuh/blob/v4.5.3/extensions/filebeat/7.x/wazuh-module/alerts/ingest/pipeline.json#L83-L91.
To separate some events into another index, add, next to the default date_index_name processor, another processor that conditionally sets the index name for a subset of events. The condition that decides which processor applies is defined with the if property. Documentation: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html#conditionally-run-processor.
If the index name should contain the date, use the date_index_name processor: https://www.elastic.co/guide/en/elasticsearch/reference/7.10/date-index-name-processor.html
The Filebeat archives pipeline sits at /usr/share/filebeat/module/wazuh/archives/ingest/pipeline.json, and you can place something like this in it:

In /usr/share/filebeat/module/wazuh/archives/ingest/pipeline.json, replace this:
  {
      "date_index_name": {
        "field": "timestamp",
        "date_rounding": "d",
        "index_name_prefix": "{{fields.index_prefix}}",
        "index_name_format": "yyyy.MM.dd",
        "ignore_failure": false
      }
    },
With this:
  {
    "date_index_name": {
      "if": "ctx?.<field> == '<value>'",
      "field": "timestamp",
      "date_rounding": "d",
      "index_name_prefix": "{{fields.index_prefix}}<suffix>-",
      "index_name_format": "yyyy.MM.dd",
      "ignore_failure": false
    }
  },
  {
    "date_index_name": {
      "if": "ctx?.<field> != '<value>'",
      "field": "timestamp",
      "date_rounding": "d",
      "index_name_prefix": "{{fields.index_prefix}}",
      "index_name_format": "yyyy.MM.dd",
      "ignore_failure": false
    }
  },
Where:
<field> is the field (or subfield, written as field.subfield) that you need to check in order to match the logs you want to separate.
<value> is the value that the field defined above needs to take.
<suffix> is needed to give the new index a separate name.

This will create a separate index for the Defender archive logs: wazuh-archives-4.x-<suffix>-*. You can check the indexes from Indexer Management -> Dev Tools; there you will see an index with a name like wazuh-archives-4.x-<suffix>-yyyy.MM.dd.
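For example, a quick way to list the matching indexes from Dev Tools (substitute the <suffix> you chose above; the wildcard pattern is just an illustration):

  GET _cat/indices/wazuh-archives-4.x-<suffix>-*?v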
You will need to add one block of this per Operating System. If you need more information about these procedures, you can check the Filebeat documentation: Parse data by using ingest node
You can also refer to https://documentation.wazuh.com/current/user-manual/wazuh-indexer/wazuh-indexer-indices.html

Let me know if you need further assistance on this.

Regards,
Hasitha Upekshitha

yaswanth ryali

Mar 10, 2025, 3:41:12 AM
to Wazuh | Mailing List
Actually, I am getting the logs via syslog to my Wazuh manager. These logs are saved in the archives index by default, and I was able to view them using an index pattern. But my main concern is that all the logs, including dashboard and manager logs, are going into the archives index. So I want to create a separate index for my Defender logs so that I can see just those logs, and it would also be good to have a separate index for them.

You have explained everything clearly, but the main concern is that I am new to Wazuh and was unable to understand some of the steps. It would be great if you could help me create the index by giving step-by-step guidance, please.

Please help me in solving this, @Hasitha Upekshitha.

yaswanth ryali

Mar 10, 2025, 3:41:13 AM
to Wazuh | Mailing List
  Thanks @Hasitha Upekshitha for your help! However, it seems quite complex. Could you please explain it in a simpler way to make it easier to understand?  

hasitha.u...@wazuh.com

Mar 22, 2025, 4:08:17 AM
to Wazuh | Mailing List
Hi  yaswanth,

First, take a backup of your pipeline.json file before making any changes.
cp /usr/share/filebeat/module/wazuh/archives/ingest/pipeline.json /usr/share/filebeat/module/wazuh/archives/ingest/pipeline.json.bkp

Then you need to replace this code 
  {
    "date_index_name": {
      "field": "timestamp",
      "date_rounding": "d",
      "index_name_prefix": "{{fields.index_prefix}}",
      "index_name_format": "yyyy.MM.dd",
      "ignore_failure": false
    }
  },

with this:
  {
    "date_index_name": {
      "if": "ctx?.full_log?.contains('keyword')",
      "field": "timestamp",
      "date_rounding": "d",
      "index_name_prefix": "{{fields.index_prefix}}defender-",
      "index_name_format": "yyyy.MM.dd",
      "ignore_failure": false
    }
  },
  {
    "date_index_name": {
      "if": "!ctx?.full_log?.contains('keyword')",
      "field": "timestamp",
      "date_rounding": "d",
      "index_name_prefix": "{{fields.index_prefix}}",
      "index_name_format": "yyyy.MM.dd",
      "ignore_failure": false
    }
  },

    You need to replace 'keyword' with a string that is unique to your Defender logs and appears in all Microsoft Defender-related log entries.
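    For example, assuming every Defender event carries the string 'Windows Defender' somewhere in its full_log field (check one of your own archived events first, as this exact string is only an assumption), the two conditions would become:

      "if": "ctx?.full_log?.contains('Windows Defender')"
      "if": "!ctx?.full_log?.contains('Windows Defender')"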

    After making the changes, run these commands to reload the pipeline and restart Filebeat:
    filebeat setup --pipelines
    systemctl restart filebeat
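
    If filebeat setup --pipelines fails after the edit, the most common cause is a JSON syntax error introduced while editing; assuming jq is installed on the manager, you can validate the file with:
    jq . /usr/share/filebeat/module/wazuh/archives/ingest/pipeline.json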

    Then it will create a new index for your Defender logs. You can check it under Indexer management -> Indexes.
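
    You can also confirm the new index from the command line by querying the indexer (the host and credentials below are placeholders; adjust them, and the defender- suffix, to your deployment):
    curl -k -u <user>:<password> "https://localhost:9200/_cat/indices/wazuh-archives-4.x-defender-*?v"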


    Let me know if you need further assistance on this.

    Regards,
    Hasitha Upekshitha
