Kibana Not Receiving Data


Marc Baker

Mar 13, 2017, 4:03:54 PM
to Wazuh mailing list
I have been working for a week to try to install the Wazuh HIDS with the ELK stack integration. Everything is set up, but no data is being received in Kibana. I worked with a rep through a post on the GitHub site and was referred to this group when he was unable to resolve the issue. We were told that data would not be received in Kibana without Docker, so we installed it, but still no data is shown. On another Wazuh site I found a single-node configuration and applied it to Logstash:
 
input {
	file {
		type => "ossec-alerts"
		path => "/var/ossec/logs/alerts/alerts.json"
		codec => "json"
	}
}
filter {
	geoip {
		source => "srcip"
		target => "geoip"
		database => "/etc/logstash/GeoLiteCity.dat"
		add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
		add_field => [ "[geoip][location]", "%{[geoip][latitude]}"  ]
	}
	mutate {
		convert => [ "[geoip][location]", "float"]
		rename => [ "geoip", "GeoLocation" ]
		remove_field => [ "timestamp" ]
	}
}

output {
	elasticsearch {
		hosts => ["localhost:9200"]
		index => "wazuh-alerts-%{+YYYY.MM.dd}"
		document_type => "wazuh"
		template => "/etc/logstash/wazuh-elastic2-template.json"
		template_name => "wazuh"
		template_overwrite => true
	}
}
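
(For reference, a pipeline file like the one above can be syntax-checked before starting the service; the flag name depends on the Logstash version, so both variants are shown here as a sketch, with paths assumed from the default package install:)

```shell
# Syntax-check the pipeline without starting it.
# Logstash 2.x (installed under /opt/logstash):
/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/01-wazuh.conf

# Logstash 5.x (installed under /usr/share/logstash):
/usr/share/logstash/bin/logstash -t -f /etc/logstash/conf.d/01-wazuh.conf --path.settings /etc/logstash
```

A clean run prints a configuration-OK message; a parse error points at the offending line in the config file.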
 
This did not resolve the issue. All services (Logstash, Elasticsearch, and Kibana) are running, and alerts are being logged in alerts.json, but still no data is available in Kibana. We tried going to localhost:9200 but only received an error, and as a result added a firewall rule allowing access to that port; still no resolution. Any assistance on this issue would be greatly appreciated.

Thank you,
 
Marc Baker

Santiago Bassett

Mar 13, 2017, 7:45:01 PM
to Marc Baker, Wazuh mailing list
Hi Marc,

In order to help here I will need some more info. Is Wazuh running on the same server as the Elastic Stack? (I read you are using a single-node configuration, so I would assume it is.)

Since you already have alerts logged in the alerts.json file, at least we know that everything works well up to that point, meaning that the issue is probably related to how Logstash reads that file and sends it to Elasticsearch.

In order to know if Logstash is actually reading the file, please run:

lsof /var/ossec/logs/alerts/alerts.json

You should get something like this:

root@vpc-ossec-manager:~# lsof /var/ossec/logs/alerts/alerts.json 
COMMAND     PID  USER   FD   TYPE DEVICE SIZE/OFF   NODE NAME
filebeat  18432  root    3r   REG  202,1 95656073 400441 /var/ossec/logs/alerts/alerts.json
ossec-ana 27013 ossec    9w   REG  202,1 95656073 400441 /var/ossec/logs/alerts/alerts.json

In my case, Filebeat is the process reading the file, as I use it as a forwarder that feeds into Logstash. In your case, Logstash looks to be configured to read that file directly, so you should see a Logstash process reading the file instead of Filebeat.

If everything looks good so far, it would be time to check whether you have the Elasticsearch Wazuh template in place; try running:
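
(The command itself is missing from the archived message; it was presumably a query against Elasticsearch's standard `_template` endpoint, something like the following reconstruction:)

```shell
# List all installed index templates; the Wazuh/OSSEC template should appear here.
curl 'localhost:9200/_template?pretty'

# Or fetch a specific template by name ("wazuh" matches the template_name
# set in the Logstash output above):
curl 'localhost:9200/_template/wazuh?pretty'
```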


If everything is ok I would try starting logstash in foreground to see if it is showing errors:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/01-wazuh.conf --path-settings=/etc/logstash/

For the command above, in your case, the config file may be 01-ossec-singlehost.conf instead of 01-wazuh.conf

Let us know if that helps.

Santiago




--
You received this message because you are subscribed to the Google Groups "Wazuh mailing list" group.
To unsubscribe from this group and stop receiving emails from it, send an email to wazuh+unsubscribe@googlegroups.com.
To post to this group, send email to wa...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/wazuh/2f26e4b1-6890-4c4f-8017-5a549bd2cfac%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Marc Baker

Mar 14, 2017, 7:51:30 AM
to Wazuh mailing list, marcjb...@gmail.com
Santiago,

Thank you for your response and assistance. I have answered your questions in the post below. It appears that the /usr/share/logstash/bin/logstash file does not exist.
 
V/r
 
Marc Baker

On Monday, March 13, 2017 at 7:45:01 PM UTC-4, Santiago Bassett wrote:
Hi Marc,

In order to help here I will need some more info. Is Wazuh running on the same server as the Elastic Stack? (I read you are using a single-node configuration, so I would assume it is.)
We are beginning with only one stack.
Since you already have alerts logged in the alerts.json file, at least we know that everything works well up to that point, meaning that the issue is probably related to how Logstash reads that file and sends it to Elasticsearch.

In order to know if Logstash is actually reading the file, please run:

lsof /var/ossec/logs/alerts/alerts.json
 
java       4480 logstash   15r   REG  252,0 222043493 14811176 /var/ossec/logs/alerts/alerts.json
ossec-ana 25893    ossec    9w   REG  252,0 222043493 14811176 /var/ossec/logs/alerts/alerts.json

You should get something like this:

root@vpc-ossec-manager:~# lsof /var/ossec/logs/alerts/alerts.json 
COMMAND     PID  USER   FD   TYPE DEVICE SIZE/OFF   NODE NAME
filebeat  18432  root    3r   REG  202,1 95656073 400441 /var/ossec/logs/alerts/alerts.json
ossec-ana 27013 ossec    9w   REG  202,1 95656073 400441 /var/ossec/logs/alerts/alerts.json

In my case, Filebeat is the process reading the file, as I use it as a forwarder that feeds into Logstash. In your case, Logstash looks to be configured to read that file directly, so you should see a Logstash process reading the file instead of Filebeat.

If everything looks good so far, it would be time to check whether you have the Elasticsearch Wazuh template in place; try running:

{
  "ossec" : {
    "order" : 0,
    "template" : "ossec*",
    "settings" : {
      "index" : {
        "refresh_interval" : "5s"
      }
    },
    "mappings" : {
      "ossec" : {
        "dynamic_templates" : [ {
          "notanalyzed" : {
            "mapping" : {
              "index" : "not_analyzed",
              "type" : "string",
              "doc_values" : "true"
            },
            "match_mapping_type" : "string",
            "match" : "*"
          }
        } ],
        "properties" : {
          "srcip" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "data" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "dstport" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "program_name" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "rule" : {
            "properties" : {
              "firedtimes" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "cve" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "PCI_DSS" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "description" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "groups" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "AlertLevel" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "sidid" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "CIS" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "info" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "frequency" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              }
            }
          },
          "type" : {
            "type" : "string"
          },
          "full_log" : {
            "type" : "string"
          },
          "protocol" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "dstuser" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "@version" : {
            "type" : "string"
          },
          "host" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "action" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "AlertsFile" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "AgentName" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "dstip" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "id" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "offset" : {
            "type" : "string"
          },
          "systemname" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "decoder" : {
            "properties" : {
              "parent" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "fts" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "ftscomment" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "accumulate" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              }
            }
          },
          "message" : {
            "type" : "string"
          },
          "command" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "url" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "srcuser" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "@timestamp" : {
            "format" : "dateOptionalTime",
            "index" : "not_analyzed",
            "type" : "date"
          },
          "AgentIP" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "location" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "GeoLocation" : {
            "properties" : {
              "timezone" : {
                "type" : "string"
              },
              "area_code" : {
                "type" : "long"
              },
              "ip" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "latitude" : {
                "type" : "double"
              },
              "coordinates" : {
                "type" : "double"
              },
              "continent_code" : {
                "type" : "string"
              },
              "city_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "country_code2" : {
                "type" : "string"
              },
              "country_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "dma_code" : {
                "type" : "long"
              },
              "country_code3" : {
                "type" : "string"
              },
              "location" : {
                "type" : "geo_point"
              },
              "region_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "real_region_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "postal_code" : {
                "type" : "string"
              },
              "longitude" : {
                "type" : "double"
              }
            }
          },
          "SyscheckFile" : {
            "properties" : {
              "path" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "sha1_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "owner_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "perm_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "gowner_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "md5_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "perm_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "sha1_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "md5_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "gowner_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "owner_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              }
            }
          },
          "status" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          }
        }
      }
    },
    "aliases" : { }
  },
  "wazuh" : {
    "order" : 0,
    "template" : "ossec*",
    "settings" : {
      "index" : {
        "refresh_interval" : "5s"
      }
    },
    "mappings" : {
      "ossec" : {
        "dynamic_templates" : [ {
          "notanalyzed" : {
            "mapping" : {
              "index" : "not_analyzed",
              "type" : "string",
              "doc_values" : "true"
            },
            "match_mapping_type" : "string",
            "match" : "*"
          }
        } ],
        "properties" : {
          "srcip" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "data" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "dstport" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "program_name" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "rule" : {
            "properties" : {
              "firedtimes" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "cve" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "PCI_DSS" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "description" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "groups" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "AlertLevel" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "sidid" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "CIS" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "info" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "frequency" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              }
            }
          },
          "type" : {
            "type" : "string"
          },
          "full_log" : {
            "type" : "string"
          },
          "protocol" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "dstuser" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "@version" : {
            "type" : "string"
          },
          "host" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "action" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "AlertsFile" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "AgentName" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "dstip" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "id" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "offset" : {
            "type" : "string"
          },
          "systemname" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "decoder" : {
            "properties" : {
              "parent" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "fts" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              },
              "name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "ftscomment" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "accumulate" : {
                "index" : "not_analyzed",
                "type" : "long",
                "doc_values" : "true"
              }
            }
          },
          "message" : {
            "type" : "string"
          },
          "command" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "url" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "srcuser" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "@timestamp" : {
            "format" : "dateOptionalTime",
            "index" : "not_analyzed",
            "type" : "date"
          },
          "AgentIP" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "location" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          },
          "GeoLocation" : {
            "properties" : {
              "timezone" : {
                "type" : "string"
              },
              "area_code" : {
                "type" : "long"
              },
              "ip" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "latitude" : {
                "type" : "double"
              },
              "coordinates" : {
                "type" : "double"
              },
              "continent_code" : {
                "type" : "string"
              },
              "city_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "country_code2" : {
                "type" : "string"
              },
              "country_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "dma_code" : {
                "type" : "long"
              },
              "country_code3" : {
                "type" : "string"
              },
              "location" : {
                "type" : "geo_point"
              },
              "region_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "real_region_name" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "postal_code" : {
                "type" : "string"
              },
              "longitude" : {
                "type" : "double"
              }
            }
          },
          "SyscheckFile" : {
            "properties" : {
              "path" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "sha1_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "owner_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "perm_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "gowner_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "md5_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "perm_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "sha1_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "md5_after" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "gowner_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              },
              "owner_before" : {
                "index" : "not_analyzed",
                "type" : "string",
                "doc_values" : "true"
              }
            }
          },
          "status" : {
            "index" : "not_analyzed",
            "type" : "string",
            "doc_values" : "true"
          }
        }
      }
    },
    "aliases" : { }
  }
}

If everything is ok I would try starting logstash in foreground to see if it is showing errors:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/01-wazuh.conf --path-settings=/etc/logstash/
Command:  /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/01-ossec-singlehost.conf --path-settings=/etc/logstash/
 
Result: -bash: /usr/share/logstash/bin/logstash: No such file or directory


Kat

Mar 14, 2017, 1:44:02 PM
to Wazuh mailing list
Marc,

Is it possible that somehow the Logstash install process was skipped, or that there was an error? It seems odd that the binary itself is missing. I have done a few dozen Wazuh installs, and following the steps closely always gets a running instance; however, there were a couple of times in the beginning when even I missed a step and ended up with a broken config. In fact, once I used the settings for the cluster instead of a single-node install, and that broke everything and was very hard to debug. If you already have all the steps done except Logstash, you should be able to just install Logstash and it should work.

Cheers
Kat

Jose Luis Ruiz

Mar 14, 2017, 2:08:49 PM
to Marc Baker, Wazuh mailing list, jo...@wazuh.com
Hi Marc,

In single-host deployments, you also need to grant the logstash user access to the OSSEC alerts file:

$ sudo usermod -a -G ossec logstash
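
(A quick way to check that the membership took effect; `groups` and `ls` are standard tooling, and the restart command assumes systemd manages Logstash, so adjust to your service manager:)

```shell
# Confirm the logstash user is now a member of the ossec group
groups logstash

# Check the alerts file permissions: the ossec group needs read access
ls -l /var/ossec/logs/alerts/alerts.json

# Restart Logstash so the running process picks up the new group membership
sudo systemctl restart logstash
```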
Did you do this before?

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com
--

Marc Baker

Mar 14, 2017, 2:33:57 PM
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
Jose,

 

Thank you. I have done that. A permission check on the alerts file returns:
 
lsof /var/ossec/logs/alerts/alerts.json
 
java       4480 logstash   15r   REG  252,0 222043493 14811176 /var/ossec/logs/alerts/alerts.json
ossec-ana 25893    ossec    9w   REG  252,0 222043493 14811176 /var/ossec/logs/alerts/alerts.json
It seems like Logstash has rights to read the file.
 
Santiago asked me to also run /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/01-wazuh.conf --path-settings=/etc/logstash/ which returned the following error:
 
-bash: /usr/share/logstash/bin/logstash: No such file or directory
I am thinking that this may be the issue. This is my first time installing the ELK stack, and I am sure there is just something small being missed. The original instructions followed were on the documentation.wazuh.com website, the ossec index on one Wazuh GitHub repository, and the configuration posted previously on yet another Wazuh GitHub repository. With everything spread across multiple websites it has been an interesting installation, and hopefully, if I can get past this last piece, we will be done.
 
V/r

Marc Baker

Jose Luis Ruiz

Mar 14, 2017, 3:20:47 PM
to Marc Baker, Wazuh mailing list, jo...@wazuh.com
Sorry Marc, I did not see that previous mail, but no problem, we will figure it out.

Let's do some tests:

In your Logstash config, under the output section, add "stdout { codec => rubydebug }":

output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "wazuh-alerts-%{+YYYY.MM.dd}"
        document_type => "wazuh"
        template => "/etc/logstash/wazuh-elastic5-template.json"
        template_name => "wazuh"
        template_overwrite => true
    }
}
Then restart Logstash, and do a tail of /var/log/logstash/logstash.stdout.
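
(That restart-and-tail step might look like this; the restart command assumes systemd, so use your service manager's equivalent if different:)

```shell
# Restart Logstash so it picks up the new stdout output
sudo systemctl restart logstash

# Follow the debug output as events are processed
tail -f /var/log/logstash/logstash.stdout
```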

In another tab, log in to the manager and restart it with /var/ossec/bin/ossec-control restart. If Logstash is working properly, you will see in the log something like the following alert, with different timestamps, hosts, etc.:

{
          "rule" => {
              "sidid" => 502,
         "firedtimes" => 1,
             "groups" => [
            [0] "ossec"
        ],
            "PCI_DSS" => [
            [0] "10.6.1"
        ],
        "description" => "Ossec server started.",
         "AlertLevel" => 3
    },
      "full_log" => "ossec: Ossec started.",
       "decoder" => {
        "name" => "ossec"
    },
      "location" => "ossec-monitord",
      "@version" => "1",
    "@timestamp" => "2017-03-14T19:02:22.000Z",
          "path" => "/var/ossec/logs/alerts/alerts.json",
          "host" => "c768d7e8cb8a",
          "type" => "ossec-alerts",
     "AgentName" => "c768d7e8cb8a"
}

If this previous option works, we need to see if Elasticsearch is running properly; run the next command:

curl 'localhost:9200/_cat/indices?v'

You should see something like:

root@c768d7e8cb8a:~# curl 'localhost:9200/_cat/indices?v'
health status index            pri rep docs.count docs.deleted store.size pri.store.size
green  open   .kibana            1   0         46            2     46.4kb         46.4kb
green  open   ossec-2017.03.14   1   0         71            0     70.8kb         70.8kb
root@c768d7e8cb8a:~#


Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

Marc Baker

Mar 14, 2017, 3:47:18 PM
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
Jose,
 
Is the Logstash config you are referring to 01-ossec-singlehost.conf, or possibly the one that was reported missing? I am still learning the system and which files are which. I ran the second command for Elasticsearch (curl 'localhost:9200/_cat/indices?v') with the following results:
 
health status index                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   ossec-2017.03.13          5   1          1            0      8.2kb          8.2kb
yellow open   wazuh-alerts-2017.03.13   5   1     196953            0    119.3mb        119.3mb
yellow open   .kibana                   1   1        148            3    136.9kb        136.9kb
yellow open   wazuh-alerts-2017.03.14   5   1     312863            0    215.3mb        215.3mb
Thank you,
 
Marc

Marc Baker

Mar 14, 2017, 3:49:30 PM
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
I apologize; I just realized which file the Logstash config is, and am now making the change.
 
Thank you,
 
Marc

Jose Luis Ruiz

Mar 14, 2017, 4:03:49 PM
to Marc Baker, Wazuh mailing list, jo...@wazuh.com
Hi Marc

As I can see, your indices do contain data: 215 MB in one and 119 MB in the other.

Try the next command to verify that Elasticsearch has information from OSSEC:



This command will return the content of that index. Can you send me one of these alerts so I can verify it is from OSSEC?


Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

Marc Baker

unread,
Mar 14, 2017, 4:04:28 PM3/14/17
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
Result of the test: 
 
:timestamp=>"2017-03-14T12:12:31.135000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but an error occurred$
> {:timestamp=>"2017-03-14T12:12:31.273000+0000", :message=>"localhost:9200 failed to respond", :class=>"Manticore::ClientProtocolException", :backtrace=>["/opt/logstash/vendor/b$
-bash: Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200/"]', but an error occurred$
{:timestamp=>2017-03-14T12:12:31.273000+0000, :message=>localhost:9200: No such file or directory
:/tmp# {:timestamp=>"2017-03-14T12:12:33.537000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but Elasticsearch app$
> {:timestamp=>"2017-03-14T12:12:35.564000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but Elasticsearch app$
-bash: Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200/"]', but Elasticsearch app$
{:timestamp=>2017-03-14T12:12:35.564000+0000, :message=>Attempted: No such file or directory
:/tmp# {:timestamp=>"2017-03-14T12:12:37.572000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but Elasticsearch app$
> {:timestamp=>"2017-03-14T12:12:39.576000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but Elasticsearch app$
-bash: Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200/"]', but Elasticsearch app$
{:timestamp=>2017-03-14T12:12:39.576000+0000, :message=>Attempted: No such file or directory
:/tmp# {:timestamp=>"2017-03-14T12:12:41.600000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but Elasticsearch app$
> {:timestamp=>"2017-03-14T12:12:43.606000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but Elasticsearch app$
-bash: Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200/"]', but Elasticsearch app$
{:timestamp=>2017-03-14T12:12:43.606000+0000, :message=>Attempted: No such file or directory
 {:timestamp=>"2017-03-14T14:55:49.318000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but an error occurred$
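The interleaved "-bash: ... No such file or directory" lines above suggest the Logstash error messages were pasted back into the shell as if they were commands; they are log lines, normally read from the Logstash log file (path assumed here from the default Debian layout, /var/log/logstash/). A sketch against a recreated sample line:

```shell
# Recreate one of the pasted log lines as sample data, then filter it the
# way you would filter the real log under /var/log/logstash/:
cat > /tmp/logstash-sample.log <<'EOF'
{:timestamp=>"2017-03-14T12:12:31.135000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but an error occurred"}
EOF
grep -c 'bulk request' /tmp/logstash-sample.log   # prints 1
```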
Elasticsearch appears to be working though:
 
health status index                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   ossec-2017.03.13          5   1          1            0      8.2kb          8.2kb
yellow open   wazuh-alerts-2017.03.13   5   1     196953            0    119.3mb        119.3mb
yellow open   .kibana                   1   1        148            3    136.9kb        136.9kb
yellow open   wazuh-alerts-2017.03.14   5   1     312863            0    215.3mb        215.3mb
Thank you.
 
Marc

Marc Baker

unread,
Mar 14, 2017, 4:11:11 PM3/14/17
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
Response from this command:
The '>' characters are just the shell's continuation prompt from issuing the command.

Jose Luis Ruiz

unread,
Mar 14, 2017, 4:17:28 PM3/14/17
to Marc Baker, Wazuh mailing list, jo...@wazuh.com

I wrote the command in Markdown; I think the mail format did something wrong here.

 curl -XGET 'http://localhost:9200/ossec-2017.03.14/_search?pretty'

or

curl -XGET 'http://localhost:9200/wazuh-alerts–2017.03.14/_search?pretty'  



Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

Marc Baker

unread,
Mar 14, 2017, 4:20:29 PM3/14/17
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
Result of command this time:
 
{
  "error" : {
    "root_cause" : [ {
      "type" : "index_not_found_exception",
      "reason" : "no such index",
      "resource.type" : "index_or_alias",
      "resource.id" : "wazuh-alerts¬タモ2017.03.14",
      "index" : "wazuh-alerts¬タモ2017.03.14"
    } ],
    "type" : "index_not_found_exception",
    "reason" : "no such index",
    "resource.type" : "index_or_alias",
    "resource.id" : "wazuh-alerts¬タモ2017.03.14",
    "index" : "wazuh-alerts¬タモ2017.03.14"
  },
  "status" : 404
}
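The 404 above is caused by the dash in the index name: the pasted command carried a typographic en-dash (rendered as the mojibake "¬タモ" in the error) instead of an ASCII hyphen, so Elasticsearch looked up a non-existent index. A quick byte-count check distinguishes the two (a sketch; any POSIX shell):

```shell
# An en-dash (U+2013) is three UTF-8 bytes and an ASCII hyphen one, so the
# mangled index name is two bytes longer than the correct one.
bad=$(printf 'wazuh-alerts–2017.03.14' | wc -c | tr -d ' ')    # contains the en-dash
good=$(printf 'wazuh-alerts-2017.03.14' | wc -c | tr -d ' ')   # plain ASCII hyphen
echo "$bad $good"   # prints 25 23
```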

Marc Baker

unread,
Mar 14, 2017, 4:22:14 PM3/14/17
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
The index in Kibana is titled:

ossec-*

Indexes were created using the Python script on the Wazuh Github site.

Jose Luis Ruiz

unread,
Mar 14, 2017, 4:24:25 PM3/14/17
to Marc Baker, Wazuh mailing list, jo...@wazuh.com
OK, we know what is happening here: you have a version mismatch. You are using the script from version 1.1.1 with templates for version 2.0.

Can you run the following command?

cat /var/ossec/etc/ossec-init.conf

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

Marc Baker

unread,
Mar 14, 2017, 4:28:44 PM3/14/17
to Wazuh mailing list, marcjb...@gmail.com, jo...@wazuh.com
 
DIRECTORY="/var/ossec"
NAME="Wazuh"
VERSION="v2.0"
DATE="Thu Mar  9 18:20:15 UTC 2017"
TYPE="server"
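Since ossec-init.conf is plain KEY="value" shell syntax, its fields can be read programmatically, which is handy when scripting a version check. A sketch using the sample content above:

```shell
# Recreate the sample content shown above, then source it; because the file
# uses plain KEY="value" shell syntax, the fields become shell variables.
cat > /tmp/ossec-init.conf <<'EOF'
DIRECTORY="/var/ossec"
NAME="Wazuh"
VERSION="v2.0"
DATE="Thu Mar  9 18:20:15 UTC 2017"
TYPE="server"
EOF
. /tmp/ossec-init.conf
echo "$NAME $VERSION"   # prints Wazuh v2.0
```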

Jose Luis Ruiz

unread,
Mar 14, 2017, 5:09:31 PM3/14/17
to Marc Baker, Wazuh mailing list, jo...@wazuh.com
OK Marc, 

You are sending your alerts to the index wazuh-alerts-xxxxxxx

Logstash conf:

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "wazuh-alerts-%{+YYYY.MM.dd}"

        document_type => "wazuh"
        template => "/etc/logstash/wazuh-elastic5-template.json"
        template_name => "wazuh"
        template_overwrite => true
    }
}
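The %{+YYYY.MM.dd} in the index option above is Logstash date math applied to each event's @timestamp, which is why a new index appears per day. A rough shell equivalent for today's index name (a sketch; the real name follows the event timestamp, not the wall clock):

```shell
# Approximate today's wazuh-alerts index name; Logstash actually derives
# the date from each event's @timestamp field, not the current time.
idx="wazuh-alerts-$(date -u +%Y.%m.%d)"
echo "$idx"
```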
I was following the conversation in the issue https://github.com/wazuh/wazuh/issues/85. You are following two sets of documentation, http://documentation.wazuh.com and http://documentation-dev.wazuh.com, and each one has different procedures, templates and versions.

In your case, for version 2.0, you should have installed Elasticsearch 5.2.x, Logstash 5.2.x and Kibana 5.2.x, and that setup reports to the index wazuh-alerts-xxxxx.

So in order to read this index wazuh-alerts-xxx you need to create the index pattern in Kibana. To help you do that:

Old version: Settings -> About

New version: Management.

You have screenshots of both in this mail.




Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

Jose Luis Ruiz

unread,
Mar 14, 2017, 7:01:04 PM3/14/17
to Marc Baker, jo...@wazuh.com, wa...@googlegroups.com
Hi Marc, 

Here we have two options: either keep Wazuh 2.0 with ELK 2.x, or Wazuh 2.0 with ELK 5.x. The first needs fewer steps than the second, of course. Which one would you like to do?



Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

On March 14, 2017 at 6:04:13 PM, Marc Baker (marcjb...@gmail.com) wrote:

Jose,

Thank you for your message. I began the installation following instructions posted on http://documentation.wazuh.com/en/latest/. The site has no instructions concerning indexes and only references configuration files. I had to Google the configuration file names to find your Github sites. One of these sites had a Python script for Kibana indexes, and since this was the only reference to indexes available from Wazuh, it was used. Obviously this was a mistake, as we have now found it is for version 2.0. Does Wazuh have a guide for upgrading, or should I go to the Logstash site for guidance? Also, would it be easier to uninstall Elasticsearch, Logstash, and Kibana to install the newest version, or is the upgrade our best option?

V/r

Marc Baker

Jose Luis Ruiz

unread,
Mar 14, 2017, 7:20:21 PM3/14/17
to Marc Baker, jo...@wazuh.com, Wazuh mailing list
Hi Marc, 

Both are fine. With Kibana 2.x you only need to do one step in the Kibana interface in order to see the alerts, plus a couple more modifications in the dashboards to rename the index to the new one.

With Kibana 5 you will have the latest version and can use the Wazuh app if you install the Wazuh API, but you need to delete the old ELK repositories, add the new ones, upgrade your system and change some configuration files (or start with a fresh installation if you don't need to keep any information). Up to you.

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

On March 14, 2017 at 7:07:58 PM, Marc Baker (marcjb...@gmail.com) wrote:

I am thinking that Elk 5.x may be best since it is the latest version and this is our first instance. Are there any known issues with the upgrade or is it considered stable?

Thank you,

Marc Baker 

Jose Luis Ruiz

unread,
Mar 14, 2017, 7:29:33 PM3/14/17
to Marc Baker, wa...@googlegroups.com
Which OS are you working on? RHEL/CentOS/Debian/Ubuntu?

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

On March 14, 2017 at 7:27:07 PM, Marc Baker (marcjb...@gmail.com) wrote:

Jose,

Since this is a new installation we can upgrade as long as it does not affect the deployed HIDS agents (which I do not think it will). If you can provide direction to delete the old and install the new I will be glad to do that.

Thank you,

Marc

Marc Baker

unread,
Mar 14, 2017, 7:31:52 PM3/14/17
to Jose Luis Ruiz, wa...@googlegroups.com
Jose,

I apologize about not adding the group address. The operating system we are using is Ubuntu 14.04.

Thank you,

Marc Baker

Jose Luis Ruiz

unread,
Mar 14, 2017, 9:04:13 PM3/14/17
to Marc Baker, wa...@googlegroups.com

We need to do some steps here, always assuming you were following our guide:

1.- Stop all ELK services and verify the services are stopped with ps axe | grep service-name.

service logstash stop
service elasticsearch stop
service kibana stop

2.- Delete the old ELK repositories

edit /etc/apt/sources.list

And delete the following lines:

deb https://packages.elasticsearch.org/logstash/2.1/debian stable main
deb http://packages.elastic.co/kibana/4.5/debian stable main

3.- Add the new repositories

curl -s https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add -
apt-get install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | tee /etc/apt/sources.list.d/elastic-5.x.list

4.- Upgrade ELK (if you like you can upgrade your whole system, but because I don't know your environment I am only giving you the command for these three packages)

apt-get update
apt-get install --only-upgrade kibana logstash elasticsearch

5.- If you get a prompt like the following text, reply N:

Configuration file '/etc/elasticsearch/elasticsearch.yml'
 ==> Modified (by you or by a script) since installation.
 ==> Package distributor has shipped an updated version.
   What would you like to do about it ?  Your options are:
    Y or I  : install the package maintainer's version
    N or O  : keep your currently-installed version
      D     : show the differences between the versions
      Z     : start a shell to examine the situation
 The default action is to keep your current version.
*** elasticsearch.yml (Y/I/N/O/D/Z) [default=N] ?

6.- Modify the file /etc/elasticsearch/jvm.options, lines 22 and 23. The best configuration is to give Elasticsearch half of your RAM, but never more than 32 GB. The following configuration is for a 4 GB machine.

-Xms2g
-Xmx2g

7.- Modify the Kibana configuration /etc/kibana/kibana.yml, line 7, in order to accept connections from addresses other than localhost. 0.0.0.0 listens on all interfaces but is less secure; alternatively use your machine's IP. This depends on your environment.

server.host: "0.0.0.0"

8.- Start elasticsearch service:

service elasticsearch start

9.- Now install wazuh-app with the next command (this can take a while)

/usr/share/kibana/bin/kibana-plugin install https://packages.wazuh.com/wazuhapp/wazuhapp.zip

10.- Now start Logstash and Kibana, and verify with ps axe | grep service-name

service logstash restart
service kibana restart

The last part of this manual is to install wazuh-api in order to connect it with wazuh-app. For that, follow the next guide, only the API section:

https://documentation-dev.wazuh.com/installation-guide/installing-wazuh-server/wazuh_server_deb.html#installing-wazuh-api 

and to connect the Wazuh API with the Wazuh app, follow the next link:

https://documentation-dev.wazuh.com/installation-guide/installing-elastic-stack/connect_wazuh_app.html

Your manager IP will be localhost, since you have everything on the same machine.

Please let me know if this helps.
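The heap rule from step 6 above (half of the machine's RAM, never more than 32 GB) can be computed rather than guessed. A sketch using the 4 GB figure from the example; on a live system the MemTotal line would come from /proc/meminfo:

```shell
# MemTotal is reported in kB; halve it, round to whole GB, and cap below
# 32 GB so the JVM keeps compressed object pointers.
meminfo='MemTotal:        4046340 kB'   # sample value for a 4 GB machine
half_gb=$(echo "$meminfo" | awk '{printf "%.0f", $2/1024/1024/2}')
[ "$half_gb" -gt 31 ] && half_gb=31
printf '%s\n' "-Xms${half_gb}g" "-Xmx${half_gb}g"   # prints -Xms2g and -Xmx2g
```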

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

Marc Baker

unread,
Mar 15, 2017, 9:37:04 AM3/15/17
to Wazuh mailing list, marcjb...@gmail.com
Thank you Jose. Everything ran smoothly until the step to start the elasticsearch service, which generated the following error:
 
Error: encountered environment variables that are no longer supported
Use jvm.options or ES_JAVA_OPTS to configure the JVM
ES_HEAP_SIZE=4g: set -Xms4g and -Xmx4g in jvm.options or add "-Xms4g -Xmx4g" to ES_JAVA_OPTS
                                                                                                    
I am digging through the forums as this seems to be a common error when doing an upgrade but have not found a solution yet so any guidance would be greatly appreciated.
 
V/r
 
Marc Baker
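The error text itself points at the fix: since Elasticsearch 5 the heap is set in /etc/elasticsearch/jvm.options, and the old ES_HEAP_SIZE variable in /etc/default/elasticsearch has to be removed or commented out. A sketch of that edit against a sample copy of the file (real paths assumed from the Debian package layout):

```shell
# Sample of the offending line from /etc/default/elasticsearch:
cat > /tmp/default-elasticsearch <<'EOF'
ES_HEAP_SIZE=4g
EOF
# Comment the variable out; the equivalent jvm.options entries would be
# -Xms4g and -Xmx4g.
sed -i.bak 's/^ES_HEAP_SIZE/#ES_HEAP_SIZE/' /tmp/default-elasticsearch
cat /tmp/default-elasticsearch   # prints #ES_HEAP_SIZE=4g
```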

Marc Baker

unread,
Mar 15, 2017, 11:13:18 AM3/15/17
to Wazuh mailing list, marcjb...@gmail.com
Commented out #ES_HEAP_SIZE=4g and was able to start elasticsearch with the following response:
 
/etc/default/elasticsearch: line 32: ES_JAVA_OPTS: command not found
 * Starting Elasticsearch Server
[2017-03-15T15:08:38,146][WARN ][o.e.c.l.LogConfigurator  ] ignoring unsupported logging configuration file [/etc/elasticsearch/logging.yml], logging is configured via [/etc/elasticsearch/log4j2.properties]
Moving on with the rest of the instructions for upgrade now. Please let me know if the above results of starting elasticsearch will cause issues.
 
Thank you,
 
Marc Baker

Marc Baker

unread,
Mar 15, 2017, 11:21:02 AM3/15/17
to Wazuh mailing list
Error was received installing the wazuh-api:
 
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package wazuh-api
Please advise if this has been moved.

Thank you,
 
Marc Baker

Marc Baker

unread,
Mar 15, 2017, 1:08:45 PM3/15/17
to Wazuh mailing list
We have finished the upgrade process and are now attempting to load the Wazuh API as instructed but receive the following error:
 
apt-get install wazuh-api
 
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package wazuh-api
Please advise as to the new location for the wazuh-api.

Thank you,
 
Marc Baker
 

Jose Luis Ruiz

unread,
Mar 15, 2017, 1:50:50 PM3/15/17
to Marc Baker, Wazuh mailing list
Hi Marc,

You should have the Wazuh repositories in your lists.


Add the repositories and install wazuh-api

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

--

You received this message because you are subscribed to the Google Groups "Wazuh mailing list" group.
To unsubscribe from this group and stop receiving emails from it, send an email to wazuh+un...@googlegroups.com.

To post to this group, send email to wa...@googlegroups.com.

Marc Baker

unread,
Mar 15, 2017, 2:21:32 PM3/15/17
to Wazuh mailing list, marcjb...@gmail.com
The following are the results of the steps outlined on the site:
 
1 ) apt-get install curl apt-transport-https lsb-release
 
Reading package lists... Done
Building dependency tree
Reading state information... Done
apt-transport-https is already the newest version.
curl is already the newest version.
lsb-release is already the newest version.
The following packages were automatically installed and are no longer required:
  rlwrap xsltproc
Use 'apt-get autoremove' to remove them.
0 upgraded, 0 newly installed, 0 to remove and 13 not upgraded.
2) curl -s https://packages.wazuh.com/key/GPG-KEY-WAZUH | apt-key add -
OK
3) CODENAME=$(lsb_release -cs)
root@NSMSen05HIDSRXNow:/var/ossec/api# echo "deb https://packages.wazuh.com/apt $CODENAME main" \
> | tee /etc/apt/sources.list.d/wazuh.list
deb https://packages.wazuh.com/apt trusty main
(I copied this as a command - is it supposed to be added to a configuration file?)
 
 4) apt-get update
 
W: Duplicate sources.list entry https://packages.elastic.co/elasticsearch/2.x/debian/ stable/main amd64 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-amd64_Packages)
W: Duplicate sources.list entry https://packages.elastic.co/elasticsearch/2.x/debian/ stable/main i386 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-i386_Packages)
W: Duplicate sources.list entry http://packages.elastic.co/elasticsearch/2.x/debian/ stable/main amd64 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-amd64_Packages)
W: Duplicate sources.list entry http://packages.elastic.co/elasticsearch/2.x/debian/ stable/main i386 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-i386_Packages)
W: You may want to run apt-get update to correct these problems
 
5) apt-get install wazuh-manager
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following package was automatically installed and is no longer required:
  rlwrap
Use 'apt-get autoremove' to remove it.
The following NEW packages will be installed:
  wazuh-manager
0 upgraded, 1 newly installed, 0 to remove and 13 not upgraded.
Need to get 0 B/1,136 kB of archives.
After this operation, 8,501 kB of additional disk space will be used.
(Reading database ... 192628 files and directories currently installed.)
Preparing to unpack .../wazuh-manager_2.0-1trusty_amd64.deb ...
=====================================================================================
= Backup from your ossec.conf has been created at /var/ossec/etc/ossec.conf.deborig =
= Please verify your ossec.conf configuration at /var/ossec/etc/ossec.conf          =
=====================================================================================
Unpacking wazuh-manager (2.0-1trusty) ...
dpkg: error processing archive /var/cache/apt/archives/wazuh-manager_2.0-1trusty_amd64.deb (--unpack):
 trying to overwrite '/var/ossec/agentless/su.exp', which is also in package ossec-hids-server 2.8.2-ubuntu10securityonion3
dpkg-deb: error: subprocess paste was killed by signal (Broken pipe)
 Removing any system startup links for /etc/init.d/wazuh-manager ...
Errors were encountered while processing:
 /var/cache/apt/archives/wazuh-manager_2.0-1trusty_amd64.deb
W: Duplicate sources.list entry https://packages.elastic.co/elasticsearch/2.x/debian/ stable/main amd64 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-amd64_Packages)
W: Duplicate sources.list entry https://packages.elastic.co/elasticsearch/2.x/debian/ stable/main i386 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-i386_Packages)
W: Duplicate sources.list entry http://packages.elastic.co/elasticsearch/2.x/debian/ stable/main amd64 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-amd64_Packages)
W: Duplicate sources.list entry http://packages.elastic.co/elasticsearch/2.x/debian/ stable/main i386 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-i386_Packages)
W: You may want to run apt-get update to correct these problems
E: Sub-process /usr/bin/dpkg returned an error code (1)
6) curl -sL https://deb.nodesource.com/setup_6.x | sudo -E bash -
Hit https://packages.elastic.co stable/main i386 Packages
Fetched 1,745 kB in 4s (374 kB/s)
Reading package lists... Done
W: Duplicate sources.list entry https://packages.elastic.co/elasticsearch/2.x/debian/ stable/main amd64 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-amd64_Packages)
W: Duplicate sources.list entry https://packages.elastic.co/elasticsearch/2.x/debian/ stable/main i386 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-i386_Packages)
W: Duplicate sources.list entry http://packages.elastic.co/elasticsearch/2.x/debian/ stable/main amd64 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-amd64_Packages)
W: Duplicate sources.list entry http://packages.elastic.co/elasticsearch/2.x/debian/ stable/main i386 Packages (/var/lib/apt/lists/packages.elastic.co_elasticsearch_2.x_debian_dists_stable_main_binary-i386_Packages)
W: You may want to run apt-get update to correct these problems
## Run `apt-get install nodejs` (as root) to install Node.js v6.x and npm
7) apt-get install nodejs
 
Reading package lists... Done
Building dependency tree
Reading state information... Done
nodejs is already the newest version.
The following packages were automatically installed and are no longer required:
  rlwrap xsltproc
Use 'apt-get autoremove' to remove them.
0 upgraded, 0 newly installed, 0 to remove and 13 not upgraded.

3) cd /var/ossec/api/configuration/auth
 
bash: cd: /var/ossec/api/configuration/auth: No such file or directory
 
4) sudo node htpasswd -c user newname
 
module.js:471
    throw err;
    ^
Error: Cannot find module '/var/ossec/api/htpasswd'
    at Function.Module._resolveFilename (module.js:469:15)
    at Function.Module._load (module.js:417:25)
    at Module.runMain (module.js:604:10)
    at run (bootstrap_node.js:394:7)
    at startup (bootstrap_node.js:149:9)
    at bootstrap_node.js:509:3
Without the username and password I am unable to save the API configuration.
 
Thank you,
 
Marc Baker

Santiago Bassett

unread,
Mar 15, 2017, 2:29:20 PM3/15/17
to Marc Baker, Wazuh mailing list
Hi Marc,

I didn't read the whole email thread, but it looks like you are experiencing a lot of issues. Some of them seem to be caused because you are using elastic 2.x repository, and because you were trying to install a wazuh-manager package on top of a system that already had it installed (maybe via sources?).

I believe it is going to be much harder to troubleshoot all these issues individually than to re-install the whole thing from scratch. My advice is to start from a clean system and do the whole thing again following our latest docs at:


As well, if you plan to monitor a high number of systems, I would advise using a distributed architecture with the Wazuh server and Elastic Stack running on different systems.

Regards


Marc Baker

unread,
Mar 15, 2017, 2:47:03 PM3/15/17
to Wazuh mailing list, marcjb...@gmail.com
Santiago,
 
Thank you for your response. Our plan was to test the system before moving to a distributed system. We started with https://documentation.wazuh.com, which did not contain links to config files or indexes but referenced file names. Google searches of those file names led to identification of two Github sites, which were used to download a python script for indexes and the config files used. We have now been told that these were for newer versions and the wazuh site contains directions for elastic 2.x. We have worked to upgrade the whole system and are only missing the APIs from what Jose has communicated. My question is: if we uninstall everything and start with the same directions, will we not receive the same results? Is there no way to get the API working by finishing these last steps, so the past week's work will not be lost?
 
V/r
 
Marc Baker

Jose Luis Ruiz

unread,
Mar 15, 2017, 2:52:55 PM3/15/17
to Marc Baker, Wazuh mailing list
Marc, 

In the reinstallation guide I wrote, I told you to remove the old ELK repositories:

2.- Delete the old ELK repositories

edit /etc/apt/sources.list

And delete the following lines:

deb https://packages.elasticsearch.org/logstash/2.1/debian stable main
deb http://packages.elastic.co/kibana/4.5/debian stable main
and then you only need to install wazuh-api, since the manager is already running; but you were reinstalling everything.

Then I see in your packages that you have wazuh-manager installed on top of an ossec-hids 2.8.2 package:
trying to overwrite '/var/ossec/agentless/su.exp', which is also in package ossec-hids-server 2.8.2-ubuntu10securityonion3


Sorry Marc, but I think your system is very unstable. My recommendation: back up /var/ossec/etc/client.keys (these are the keys for your agents, so you don't need to register them again) and /var/ossec/etc/ossec.conf, and if you have special configuration in local rules and decoders, back those up as well.


Then reinstall everything from scratch, following the guide at https://documentation-dev.wazuh.com/installation-guide/index.html, all with the latest versions. If you follow this and only this guide, the installation won't take long, and after restoring ossec.conf and client.keys, if the wazuh-manager IP is the same, all your agents will reconnect without problems.
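The backup advice above can be scripted; a sketch (the destination directory and the local rules/decoder paths are illustrative; adjust them to your layout):

```shell
# Copy the agent keys, manager config, and any local rules/decoders to a
# dated backup directory before reinstalling. Files that don't exist on a
# given system are simply skipped.
backup="/tmp/wazuh-backup-$(date +%F)"
mkdir -p "$backup"
for f in /var/ossec/etc/client.keys /var/ossec/etc/ossec.conf \
         /var/ossec/rules/local_rules.xml /var/ossec/decoders/local_decoder.xml; do
    [ -f "$f" ] && cp -p "$f" "$backup/"
done
ls "$backup"
```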


Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com


Santiago Bassett

unread,
Mar 15, 2017, 2:56:53 PM3/15/17
to Marc Baker, Wazuh mailing list
Hi Marc,

I am pretty confident that a clean installation will work nicely. If you want to save previous work (like client keys or rules) please backup and import later.

Regarding the documentation, it is true that the one on our website is not the latest (although it should also work fine). This is because we are still finalizing the release with support for Elastic Stack 5. Once that is done we will update the website too.

Best regards


Marc Baker

unread,
Mar 16, 2017, 12:38:50 AM3/16/17
to Santiago Bassett, wa...@googlegroups.com
A new server is being prepared. New build will be done using instructions at https://documentation-dev.wazuh.com/installation-guide/installing-wazuh-server/wazuh_server_deb.html. Previous build was done using the old instructions on a new server and resulted in the OS becoming corrupted beyond repair. OS is being reloaded and hopefully the issues encountered the first time through have been corrected in the new documentation. We appreciate the Wazuh.com staff's patience and assistance in walking us through the extremely complicated install process involved with the Wazuh HIDS ELK Stack.

Jose Luis Ruiz

unread,
Mar 16, 2017, 8:20:38 AM3/16/17
to wa...@googlegroups.com, Marc Baker, Santiago Bassett
Hi Marc,

Let us know if you have any problems while following this guide.

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com


Marc Baker

unread,
Mar 16, 2017, 9:19:59 AM3/16/17
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
Thank you, Jose. One question about the steps outlined. Step 3 of the Wazuh repositories section detects the distribution and adds the repository using:

CODENAME=$(lsb_release -cs)
echo "deb https://packages.wazuh.com/apt $CODENAME main" \
| tee /etc/apt/sources.list.d/wazuh.list

I am assuming these two lines are entered at the command line, but I have not had prior experience with "CODENAME" and so wanted to verify before proceeding.

V/r

Marc Baker

Jose Luis Ruiz

unread,
Mar 16, 2017, 9:38:57 AM3/16/17
to Marc Baker, Wazuh mailing list, sant...@wazuh.com

Hi Marc

This command just looks up your OS distribution codename and inserts it into the echo line.

CODENAME is a variable, nothing special; its content is the result of the command lsb_release -cs.

You can run this command manually and see your OS version.

root@debian:~# lsb_release -cs
jessie
root@debian:~#
root@debian:~# CODENAME=$(lsb_release -cs)
root@debian:~# echo $CODENAME
jessie
root@debian:~#
root@debian:~# echo "deb https://packages.wazuh.com/apt $CODENAME main" \
> | tee /etc/apt/sources.list.d/wazuh.list
deb https://packages.wazuh.com/apt jessie main
root@debian:~# cat /etc/apt/sources.list.d/wazuh.list
deb https://packages.wazuh.com/apt jessie main
root@debian:~#



Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com


Marc Baker

unread,
Mar 16, 2017, 12:16:50 PM3/16/17
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
Jose,

Thank you. I am now beginning the installation and encountered an error:

apt-get install wazuh-manager

Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package wazuh-manager

This happened yesterday when trying to upgrade and the recommendation was made to do a fresh install which we are now doing on a new server. Has the wazuh-manager been moved?

V/r

Marc

Jose Luis Ruiz

unread,
Mar 16, 2017, 12:19:17 PM3/16/17
to Marc Baker, Wazuh mailing list, sant...@wazuh.com

Marc

Try the following:

  • apt-get clean all
  • apt-get update
  • apt-get install wazuh-manager



Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com

On March 16, 2017 at 12:16:52 PM, Marc Baker (marcjb...@gmail.com) wrote:

etc/apt/sources.list.d/wazuh.list

Marc Baker

unread,
Mar 16, 2017, 12:29:26 PM3/16/17
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
Thank you. That worked. The following error was returned:

Errors were encountered while processing:
 /var/cache/apt/archives/wazuh-manager_2.0-1trusty_amd64.deb
E: Sub-process /usr/bin/dpkg returned an error code (1)

Will this be an issue with the remaining steps?

V/r

Marc

Marc Baker

unread,
Mar 16, 2017, 12:33:23 PM3/16/17
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com

Jose Luis Ruiz

unread,
Mar 16, 2017, 12:38:34 PM3/16/17
to Marc Baker, Wazuh mailing list, sant...@wazuh.com

It looks like the package gave you a problem during the installation. Is this a new system, or the same one as yesterday? Try the next commands:

  • pkill -f ossec
  • apt-get remove —-purge wazuh-manager
  • apt-get clean all
  • apt-get update
  • apt-get install wazuh-manager



Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com


Marc Baker

unread,
Mar 16, 2017, 12:46:13 PM3/16/17
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
This is a new server. Results of the commands:

pkill -f ossec

apt-get remove —-purge wazuh-manager
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package —-purge

apt-get clean all

apt-get update

apt-get install wazuh-manager
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
  gir1.2-json-1.0 gir1.2-timezonemap-1.0 gir1.2-xkl-1.0 libtimezonemap1
  linux-headers-4.4.0-36 linux-headers-4.4.0-36-generic
  linux-image-4.4.0-36-generic linux-image-extra-4.4.0-36-generic
Use 'apt-get autoremove' to remove them.
The following NEW packages will be installed:
  wazuh-manager
0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.
1 not fully installed or removed.
Need to get 1,136 kB of archives.
After this operation, 8,501 kB of additional disk space will be used.
Fetched 1,136 kB in 0s (2,214 kB/s)
(Reading database ... 192625 files and directories currently installed.)
Preparing to unpack .../wazuh-manager_2.0-1trusty_amd64.deb ...
=====================================================================================
= Backup from your ossec.conf has been created at /var/ossec/etc/ossec.conf.deborig =
= Please verify your ossec.conf configuration at /var/ossec/etc/ossec.conf          =
=====================================================================================
Unpacking wazuh-manager (2.0-1trusty) ...
dpkg: error processing archive /var/cache/apt/archives/wazuh-manager_2.0-1trusty_amd64.deb (--unpack):
dpkg-deb: error: subprocess paste was killed by signal (Broken pipe)
 Removing any system startup links for /etc/init.d/wazuh-manager ...
Errors were encountered while processing:
 /var/cache/apt/archives/wazuh-manager_2.0-1trusty_amd64.deb
E: Sub-process /usr/bin/dpkg returned an error code (1)
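A `dpkg-deb ... Broken pipe` during unpack frequently points at a truncated or corrupt cached archive, and it leaves dpkg half-configured. One common recovery sequence — a generic Debian/Ubuntu sketch, not official Wazuh guidance — is to finish any pending configuration, discard the cached .deb so apt downloads a fresh copy, and retry:

```shell
dpkg --configure -a             # finish any pending package configuration
apt-get install -f              # let apt repair broken/partial installs
rm -f /var/cache/apt/archives/wazuh-manager_2.0-1trusty_amd64.deb   # drop the possibly corrupt archive
apt-get clean all
apt-get update
apt-get install wazuh-manager   # retry with a freshly downloaded package
```

These commands need root; the `rm` path is the cached archive named in the error output above.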

Thank you,

Marc

Marc Baker

unread,
Mar 16, 2017, 1:15:25 PM
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
Jose,

I posted the results of the commands below in a separate post. The error received when attempting to install the Wazuh manager looks similar to what was encountered on the old server; the common link seems to be the wazuh-manager installation itself. OSSEC was installed on the server but now appears to be removed. Is there a way to restore OSSEC to its state prior to attempting the Wazuh manager install?

Thank you,

Marc

Jose Luis Ruiz

unread,
Mar 16, 2017, 1:20:15 PM
to Marc Baker, Wazuh mailing list, sant...@wazuh.com
Marc, did you have OSSEC installed on this server before installing wazuh-manager?

“OSSEC was installed on the server but appears to now be removed”

I thought you were doing a fresh installation.

A new server is being prepared. The new build will be done using the instructions at https://documentation-dev.wazuh.com/installation-guide/installing-wazuh-server/wazuh_server_deb.html. The previous build was done using the old instructions on a new server and resulted in the OS becoming corrupted beyond repair. The OS is being reloaded, and hopefully the issues encountered the first time through have been corrected in the new documentation. We appreciate the Wazuh staff's patience and assistance in walking us through the extremely complicated install process for the Wazuh HIDS ELK stack.

Regards
-----------------------
Jose Luis Ruiz
Wazuh Inc.
jo...@wazuh.com


Marc Baker

unread,
Mar 16, 2017, 1:30:57 PM
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
The OS has OSSEC installed in the base. We built the server and then began following the installation instructions, starting with the step to remove OSSEC: apt-get remove ossec-hids --purge.
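After that purge, a quick check that nothing OSSEC-related is left behind can save a failed reinstall later — a minimal sketch, assuming the default /var/ossec install tree:

```shell
# Confirm no ossec packages or leftover files remain after the purge.
dpkg -l 2>/dev/null | grep -i ossec || echo "no ossec packages remain"
ls /var/ossec 2>/dev/null || echo "/var/ossec is gone"
```

Leftover entries here would explain an install that trips over files it did not create.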



Marc

Marc Baker

unread,
Mar 17, 2017, 3:19:59 PM
to Wazuh mailing list, marcjb...@gmail.com, sant...@wazuh.com
I want to acknowledge Jose's efforts in working with us to get this rolled out. He definitely went above and beyond. With his help we discovered that the issue was a result of the Ubuntu distribution being used. I am deploying an Ubuntu server with no extra software today and will be installing the Wazuh manager over the weekend. 
Marc