Unable to send custom logs from logstash to Wazuh Indexer [NOOB]

Shobit Mahajan

Jul 3, 2024, 11:41:07 AM
to Wazuh | Mailing List
I am trying to send some logs from one of my Logstash instances to the Wazuh indexer (a new index), and I get an error in Logstash.

This is the log line that I get from Logstash:

```
[2024-07-03T15:32:20,046][ERROR][logstash.outputs.elasticsearch][main][aef3bdf60f7d53715c711eaac732305c273f4323e4d0f5b94f5dade6fccd17b1] Encountered a retryable error. Will Retry with exponential backoff  {:code=>400, :url=>"https://wazuh.foo:9200/_bulk", :body=>"{\"error\":{\"root_cause\":[{\"type\":\"illegal_argument_exception\",\"reason\":\"Action/metadata line [1] contains an unknown parameter [_type]\"}],\"type\":\"illegal_argument_exception\",\"reason\":\"Action/metadata line [1] contains an unknown parameter [_type]\"},\"status\":400}"}
```
As far as I know, I am not using `_type` anywhere.
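(For context: the `_type` the indexer complains about is not an event field at all, but part of the action/metadata line that the Logstash `elasticsearch` output writes into each `_bulk` request, which is why a `mutate { remove_field => ["_type"] }` filter cannot touch it. A sketch of the two shapes of that metadata line, using a placeholder index name that is not from this thread:)

```shell
#!/bin/sh
# Illustration only: the two shapes of a _bulk action/metadata line.
# "my-index" is a placeholder, not a value from the thread.

# Logstash 7.x's elasticsearch output may emit "_type" in the action line
# (it believes it is talking to Elasticsearch 7):
old_action='{"index":{"_index":"my-index","_type":"_doc"}}'

# OpenSearch 2.x, which Wazuh indexer 4.x is built on, rejects that
# parameter and only accepts the typeless form:
new_action='{"index":{"_index":"my-index"}}'

printf '%s\n' "$old_action" "$new_action"
```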

Shobit Mahajan

Jul 3, 2024, 11:44:46 AM
to Wazuh | Mailing List
The Wazuh indexer version I am using is 4.7.1.
The Logstash version is 7.10.2.

Shobit Mahajan

Jul 3, 2024, 11:47:24 AM
to Wazuh | Mailing List
Here is the config file:

```
input {
  file {
    path => "/apps/syslog-ng/messages_*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    add_field => { "log_type" => "firewalllog" }
  }
}

filter {
  if [log_type] == "firewalllog" {
    grok {
      match => {
        "message" => "%{TIMESTAMP_ISO8601:time}\s*%{URIHOST:host}\s*(?:%{NUMBER:id:int}|-).*? .*?,.*?,%{GREEDYDATA:pal_config},.*,.*,%{GREEDYDATA:pal_ig},%{GREEDYDATA:source_ip},,.*?,%{USERNAME:username},.*?,.*? \S+\S+,.*?,\S+,,%{URIHOST:hostname}"
      }
    }
    grok {
      match => {
        "message" => "%{TIMESTAMP_ISO8601:date}.*user.*?%{USERNAME:user}.*address.*%{IP:serverip}.*From:\s*%{IP:ip}.*%{URIHOST:host}"
      }
    }
  }

  mutate {
    remove_field => ["_type"]
  }
}

output {
  stdout {
    codec => rubydebug
  }

  file {
    path => "/tmp/log.log"
    codec => "json_lines"
  }

  elasticsearch {
    hosts => ["https://your-elasticsearch-host:9200"]
    index => "your-index-name-%{+YYYY.MM.dd}"
    user  => "your-username"
    password => "your-password"
    ssl => true
    ssl_certificate_verification => false
  }
}
```
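(A workaround often suggested for this exact 400, though not proposed in this thread, is to replace the `elasticsearch` output with the `logstash-output-opensearch` plugin, which speaks the typeless bulk format OpenSearch expects. A sketch, keeping the placeholder host, index, and credentials from the config above; install the plugin first with `bin/logstash-plugin install logstash-output-opensearch`:)

```
output {
  opensearch {
    hosts    => ["https://your-elasticsearch-host:9200"]
    index    => "your-index-name-%{+YYYY.MM.dd}"
    user     => "your-username"
    password => "your-password"
    ssl      => true
    ssl_certificate_verification => false
  }
}
```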

Abdullah Al Rafi Fahim

Jul 11, 2024, 1:43:12 AM
to Wazuh | Mailing List
Hello Shobit,

Sending logs directly to the Wazuh indexer from an external source falls outside the scope of Wazuh support. You can use the indexer's other OpenSearch capabilities for separate use cases, but this configuration and troubleshooting need to be done on your end.

This seems to be an issue with the log format or the Logstash configuration you are using. Checking online, I found some related discussions that may help you.
You can also use the Wazuh agent here to monitor that file, forward the logs to the Wazuh manager, and index them in the Wazuh indexer accordingly. For that, you can review this documentation: https://documentation.wazuh.com/current/user-manual/capabilities/log-data-collection/monitoring-log-files.html
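(Following the agent route described above, the file could be monitored with a `<localfile>` block in the agent's `ossec.conf`. The path echoes the one from the earlier Logstash config, and the `syslog` log format is an assumption about the file's contents:)

```xml
<ossec_config>
  <localfile>
    <!-- Assumed format: plain syslog-style lines -->
    <log_format>syslog</log_format>
    <!-- Wildcards are supported in <location> -->
    <location>/apps/syslog-ng/messages_*</location>
  </localfile>
</ossec_config>
```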

I hope you can resolve the issue. Please let us know if you need any help regarding Wazuh components.