About Wazuh blog post: "Leveraging artificial intelligence for threat hunting in Wazuh"


suricata

Sep 19, 2025, 3:13:26 AM
to Wazuh | Mailing List
Hello everyone, has anyone successfully implemented the solution described in the Wazuh/Ollama article?

Best Regards, 



Bony V John

Sep 19, 2025, 3:46:52 AM
to Wazuh | Mailing List
Hi,

I haven’t tried it yet. However, I can test it and assist you if you could provide more details about your issue or where exactly you are blocked.
In the meantime, you can also refer to the Wazuh blog and make sure you have followed the steps correctly.

If you have already deployed the threat_hunter.py script in your environment, you can verify whether it is working by checking the log file it generates at: /var/ossec/logs/threat_hunter.log
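If it helps, here is a minimal Python sketch for checking the tail of that log programmatically (the path is the default one from the blog post; adjust it if your deployment writes elsewhere):

```python
from pathlib import Path

# Default path from the Wazuh blog post; adjust for your deployment.
LOG_PATH = Path("/var/ossec/logs/threat_hunter.log")

def tail(path: Path, n: int = 20) -> list[str]:
    """Return the last n lines of a log file, or [] if it doesn't exist."""
    if not path.exists():
        return []
    return path.read_text(errors="replace").splitlines()[-n:]

for line in tail(LOG_PATH):
    print(line)
```

An empty result means the script has not written anything there yet.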

suricata

Sep 19, 2025, 3:58:14 AM
to Wazuh | Mailing List
Hi,

Yes, one of the problems is resource management, so I'm testing by running the server for very short periods to keep the logs small. Depending on how I use it, it produces different types of errors. After many tests, I'm now running the Python script on the Wazuh server with Ollama running remotely. It works, but I'm unable to open the chat dialog window, which is served on the remote server (http://192.168.7.100:8000/).

It also doesn't create the /var/ossec/logs/threat_hunter.log log file.


INFO:     Started server process [61094]
INFO:     Waiting for application startup.
🚀 Starting FastAPI app and loading vector store...
🔄 Initializing QA chain with logs from past 1 days...
✅ 126951 logs loaded from the last 1 days.
📦 Creating vectorstore...
✅ QA chain initialized successfully.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)

Bony V John

Sep 19, 2025, 6:39:30 AM
to Wazuh | Mailing List
Hi,

If you can’t access the chat window, it’s likely due to either the Python script not running properly on the server or a network configuration issue.

Please work through the following checks on the remote server to make sure it can run the LLM reliably:

Make sure ports 22 and 8000 are open and reachable from the Wazuh Manager.

On the Wazuh Manager, test connectivity to the remote server:

nc -vz 192.168.7.100 22
nc -vz 192.168.7.100 8000

Replace 192.168.7.100 with your remote server’s IP address.

Expected success output:

Ncat: Connected to 192.168.7.100:22.
Ncat: Connected to 192.168.7.100:8000.

If these fail, review your firewall/security group/routing settings.
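If nc/Ncat isn't available on the Manager, the same TCP reachability check can be sketched in Python (the host and ports below are the ones from this thread; substitute your own):

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """TCP-connect check, roughly equivalent to `nc -vz host port`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, no route
        return False

# 192.168.7.100 is the remote server from this thread; substitute yours.
for port in (22, 8000):
    state = "reachable" if port_reachable("192.168.7.100", port) else "unreachable"
    print(f"192.168.7.100:{port} is {state}")
```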

Check if port 8000 is occupied on the remote server:
sudo lsof -iTCP:8000 -sTCP:LISTEN
  • If another process is using port 8000, the script won’t be able to bind and the chat window won’t appear.
  • Stop the conflicting process or change the script’s port.
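A rough Python analogue of that lsof check simply attempts to bind the port (port 8000 is assumed from the thread):

```python
import socket

def port_in_use(port: int, host: str = "0.0.0.0") -> bool:
    """Return True if something is already bound to host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # SO_REUSEADDR avoids false positives from sockets in TIME_WAIT;
        # an active listener still makes the bind fail.
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
        except OSError:
            return True
    return False

if port_in_use(8000):
    print("Port 8000 is taken; stop the conflicting process or change the port.")
```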

The path /var/ossec/logs/threat_hunter.log won’t exist on a generic remote server (that path is specific to Wazuh installations).
Update the script’s log_file_path to a standard location, e.g.:  
/var/log/threat_hunter.log
After changing the path, the script should start writing logs there.  
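One way to sketch that path selection in Python (the candidate paths and fallback order here are my assumptions for illustration, not part of the blog's script):

```python
import logging
import os

def pick_log_path() -> str:
    """Prefer the Wazuh path when writable, then /var/log, then the cwd.
    Candidate paths are assumptions based on this thread."""
    for candidate in ("/var/ossec/logs/threat_hunter.log",
                      "/var/log/threat_hunter.log"):
        directory = os.path.dirname(candidate)
        if os.path.isdir(directory) and os.access(directory, os.W_OK):
            return candidate
    return "threat_hunter.log"  # last resort: current directory

logging.basicConfig(filename=pick_log_path(), level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
logging.info("threat_hunter logging initialized")
```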

On the remote server, start the script and point it to your Wazuh Manager IP:  
python3 threat_hunter.py -H <WAZUH_SERVER_IP>

Replace <WAZUH_SERVER_IP> with your Wazuh Manager’s IP address.  

Verify it’s now listening:  
sudo lsof -iTCP:8000 -sTCP:LISTEN

If the issue persists, please share:

  1. The output of the connectivity checks (nc -vz),

  2. The lsof output for port 8000, and

  3. The contents (or errors) from /var/log/threat_hunter.log (if created),

so I can help diagnose further.

suricata

Sep 19, 2025, 7:02:13 AM
to Wazuh | Mailing List
Hi,

I'll review all of that. For now:

Ollama is running remotely. I execute the script on the Wazuh server. The chat opens correctly, and any command I enter (e.g., /help) works perfectly. The issue arises when I ask about an alert, as you can see in the log:

root@wazuh:/var/ossec/integrations# python3.10 ./threat_hunter.py
/var/ossec/integrations/./threat_hunter.py:448: DeprecationWarning:
        on_event is deprecated, use lifespan event handlers instead.

        Read more about it in the
        [FastAPI docs for Lifespan Events](https://fastapi.tiangolo.com/advanced/events/).

  @app.on_event("startup")
INFO:     Started server process [61847]

INFO:     Waiting for application startup.
🚀 Starting FastAPI app and loading vector store...
🔄 Initializing QA chain with logs from past 1 days...
✅ 126951 logs loaded from the last 1 days.
📦 Creating vectorstore...
✅ QA chain initialized successfully.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO:     192.168.7.188:11124 - "GET / HTTP/1.1" 401 Unauthorized
INFO:     192.168.7.188:11125 - "GET / HTTP/1.1" 200 OK
INFO:     192.168.7.188:11125 - "GET /favicon.ico HTTP/1.1" 404 Not Found
INFO:     192.168.7.188:11128 - "WebSocket /ws/chat" [accepted]
INFO:     connection open
🧠 Received question: Hola
❌ Error in websocket: [Errno 111] Connection refused
INFO:     connection closed
INFO:     192.168.7.188:11127 - "GET / HTTP/1.1" 200 OK
INFO:     192.168.7.188:11139 - "WebSocket /ws/chat" [accepted]
INFO:     connection open
🧠 Received question: Give me information about the most active agents.
❌ Error in websocket: [Errno 111] Connection refused
INFO:     connection closed
INFO:     192.168.7.188:11184 - "GET / HTTP/1.1" 200 OK
INFO:     192.168.7.188:11186 - "WebSocket /ws/chat" [accepted]
INFO:     connection open
🧠 Received question: Give me information about the most common location
❌ Error in websocket: [Errno 111] Connection refused
INFO:     connection closed
INFO:     192.168.7.188:11225 - "GET / HTTP/1.1" 200 OK
INFO:     192.168.7.188:11227 - "WebSocket /ws/chat" [accepted]
INFO:     connection open
🧠 Received question: Give me information about brute force alerts on SSH
❌ Error in websocket: [Errno 111] Connection refused
INFO:     connection closed
