ElasticSearch as a log source for Wazuh


Павел Покровский

unread,
Oct 11, 2022, 1:31:36 AM10/11/22
to Wazuh mailing list
Hi.

We have a number of nginx servers that we use for web serving and load balancing for our apps. We ship the nginx logs to our Elasticsearch cluster using vector.dev. We would also like to send these logs to our Wazuh cluster, while avoiding extensive additional configuration. Preferably we would like an option to read directly from the corresponding Elasticsearch index. We can, of course, configure nginx or vector to send logs to a Wazuh agent on the local machine or remotely, but that does not look very clean.

Is there any approach / experience to implement elasticsearch as a log source for Wazuh?

elw...@wazuh.com

unread,
Oct 11, 2022, 5:33:43 AM10/11/22
to Wazuh mailing list
Hello Pavel,

If sinking vector's logs into a file (https://vector.dev/docs/reference/configuration/sinks/file/) and monitoring that file with a Wazuh agent (the straightforward option) is not what you are looking for, you can create an integration at the level of the Wazuh manager that would perform the following:
  • Connect to your Elasticsearch cluster.
  • Use the search API endpoint to retrieve the data from the indices.
  • Push the data to the Wazuh manager to generate alerts.
I am sharing an example that performs the same steps above, but retrieves only the hit count (you will need to extract the log documents instead):

  • The esquery.py script:

    #!/var/ossec/framework/python/bin/python3
    # Query Elasticsearch and push a summary event to the Wazuh analysis queue.
    import json
    import requests
    from requests.auth import HTTPBasicAuth
    from socket import socket, AF_UNIX, SOCK_DGRAM

    socketAddr = '/var/ossec/queue/ossec/queue'

    def send_event(msg):
        # Wazuh expects "<queue>:<location>:<message>"; 1 is the syslog queue.
        string = '1:ES_query:{}'.format(msg)
        sock = socket(AF_UNIX, SOCK_DGRAM)
        sock.connect(socketAddr)
        sock.send(string.encode())
        sock.close()

    # Match everything indexed in the last 5 minutes (same window as the wodle interval).
    query = {"query": {"bool": {"must": [], "filter": [{"match_all": {}}, {"range": {"timestamp": {"gte": "now-5m/m"}}}]}}}
    response = requests.get('https://localhost:9200/wazuh-alerts*/_search',
                            auth=HTTPBasicAuth('admin', 'admin'), verify=False, json=query)
    hits = json.loads(response.text)['hits']['total']['value']
    send_event('Event query on Elasticsearch returned {} hits'.format(hits))

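If you need the actual log documents rather than just the hit count, a variation of the script could iterate over `hits.hits` and forward each document's `_source` to the socket. A rough sketch, where the index pattern `nginx-*`, the `@timestamp` field name, and the credentials are assumptions you would adapt to your environment:

```python
#!/var/ossec/framework/python/bin/python3
# Sketch: forward individual Elasticsearch documents to the Wazuh queue.
# Index pattern, field names and credentials are assumptions; adjust to your setup.
import json
from socket import socket, AF_UNIX, SOCK_DGRAM

SOCKET_ADDR = '/var/ossec/queue/ossec/queue'


def format_event(doc):
    # Wazuh expects "<queue>:<location>:<message>"; 1 is the syslog queue.
    return '1:ES_query:{}'.format(json.dumps(doc))


def send_event(msg):
    sock = socket(AF_UNIX, SOCK_DGRAM)
    sock.connect(SOCKET_ADDR)
    sock.send(msg.encode())
    sock.close()


def fetch_docs():
    # requests is imported here so the formatting helpers above stay
    # importable without the dependency.
    import requests
    from requests.auth import HTTPBasicAuth
    # Pull everything from the last 5 minutes, matching the wodle interval.
    query = {
        "size": 1000,
        "query": {"range": {"@timestamp": {"gte": "now-5m/m"}}}
    }
    response = requests.get('https://localhost:9200/nginx-*/_search',
                            auth=HTTPBasicAuth('admin', 'admin'),
                            verify=False, json=query)
    return [hit['_source'] for hit in response.json()['hits']['hits']]


if __name__ == '__main__':
    for doc in fetch_docs():
        send_event(format_event(doc))
```

Note that `size` caps the number of returned documents; for busier indices you would page through results (e.g. with `search_after`) instead.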

  • Wazuh configuration to run it periodically:

    <wodle name="command">
      <disabled>no</disabled>
      <command>/var/ossec/etc/esquery.py</command>
      <interval>5m</interval>
      <ignore_output>yes</ignore_output>
      <run_on_start>yes</run_on_start>
      <timeout>0</timeout>
    </wodle>


  • A simple rule to trigger everything coming from the script:

     <group name="custom,">
       <rule id="100002" level="3">
         <location>ES_query</location>
         <description>ESquery returned hits</description>
       </rule>
     </group>


I hope this helps.

Regards,
Wali

Павел Покровский

unread,
Oct 12, 2022, 1:37:52 AM10/12/22
to Wazuh mailing list
Hi Wali

Thank you very much for prompt response!

Configuring a vector sink is indeed one of the options. We thought there might be a more straightforward, integrated approach, since Wazuh pretty much relies on Elastic as its database. Developing a custom integration looks a bit complex, but it is also an option we had missed. Thank you for pointing it out.
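For reference, the vector file-sink option could look roughly like this, with a local Wazuh agent then monitoring the output file. This is only a sketch; the input name `nginx_logs` and the output path are assumptions:

```toml
# Sketch of a vector file sink; "nginx_logs" is an assumed
# name for your existing nginx source or transform.
[sinks.nginx_to_file]
type = "file"
inputs = ["nginx_logs"]
path = "/var/log/vector/nginx.log"
encoding.codec = "json"
```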

On Tuesday, October 11, 2022 at 12:33:43 UTC+3, elw...@wazuh.com wrote:

elw...@wazuh.com

unread,
Oct 13, 2022, 2:57:40 AM10/13/22
to Wazuh mailing list
Hello Pavel,

Wazuh indeed relies on OpenSearch/Elasticsearch as the database that stores its alerts; however, in this case the indices live in separate OpenSearch/Elasticsearch clusters/nodes. Another possibility would be to set up Cross-Cluster Search (https://opensearch.org/docs/latest/security-plugin/access-control/cross-cluster-search/), which would allow searching both sets of indices from one single Wazuh dashboard/Kibana.
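For illustration, registering the nginx cluster as a remote on the cluster behind the Wazuh dashboard might look roughly like this. The cluster alias `nginx-cluster`, the seed host, and the index pattern are placeholders:

```
PUT _cluster/settings
{
  "persistent": {
    "cluster.remote": {
      "nginx-cluster": {
        "seeds": ["nginx-es-node.example.com:9300"]
      }
    }
  }
}

GET nginx-cluster:nginx-access-*/_search
```

The `cluster-alias:index-pattern` syntax in the search then lets one dashboard query indices in both clusters.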

Hope this helps.

Regards,
Wali