Centralized Configuration for agents is not synchronized


Bloom

Nov 13, 2023, 5:24:38 AM
to Wazuh | Mailing List
Hello guys,

I hope you're having a good day!

I have noticed a problem with pushing centralized agent configurations after updating my managers to the new 4.6.0 version.

Here is my current setup:

3 managers (v4.6.0)
200+ agents (v4.5.2 - v4.6.0)

I can change my group configurations, but the agents in that group do not receive the new configuration and do not restart.

All the nodes are detected and have the correct version:

conf-sync4.PNG

Here is the agent_control and agent_groups output for a Linux server:

conf-sync.PNG

The same problem occurs on a Windows server:

conf-sync2.PNG

In the screenshots, I have only shown agents running 4.6.0, but the problem is also present on older versions.

I did not have this problem before when my cluster was on v4.5.2.

The shared configuration is also correct:

conf-sync3.PNG

The logs (both managers and agents) do not show anything related to a group configuration change.

What do you think the problem might be?

Let me know if you need any extra information.

Thanks!


Alejandro Ruiz Becerra

Nov 13, 2023, 8:18:30 AM
to Wazuh | Mailing List
Hello Bloom!

Thanks for the report. I'll try to replicate it and ask the development team whether they are aware of this problem. I'll get back to you as soon as possible.

Do the agents belong to the "test" group? 

Regards
Alex

Bloom

Nov 13, 2023, 8:24:38 AM
to Wazuh | Mailing List
Hello Alex,

Thanks for the quick reply.

Yes, the agents are part of the "test" group. They are my test agents, but all groups are affected by this.

Thanks!

Alejandro Ruiz Becerra

Nov 13, 2023, 1:02:38 PM
to Wazuh | Mailing List
Hello again!


I haven't been able to reproduce your problem; my centralized configuration is being synchronized just fine.


root@wazuh:/var/ossec/bin# ./agent_control -i 002

Wazuh agent_control. Agent information:
Agent ID: 002
Agent Name: 393b41a52d0a
IP address: any
Status: Active

Operating system: Linux |393b41a52d0a |6.2.0-36-generic |#37~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Mon Oct 9 15:34:04 UTC 2 |x86_64
Client version: Wazuh v4.6.0
Configuration hash: 651e32d69c75858b0f3c78e2b5930d9f
Shared file hash: 18c2eb52804ec2175164dafffb179ada
Last keep alive: 1699897901

Syscheck last started at: Mon Nov 13 17:42:32 2023
Syscheck last ended at: Mon Nov 13 17:42:33 2023

root@wazuh:/var/ossec/bin# ./agent_groups -S -i 002
Agent '002' is synchronized.

root@wazuh:/var/ossec/bin# ./agent_groups -s -i 002
The agent '393b41a52d0a' with ID '002' belongs to groups: test.
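
As a quick cross-check (a sketch, assuming a default /var/ossec install and the "test" group from this thread): the "Shared file hash" reported by agent_control should correspond to the MD5 checksum of the group's merged.mg on the manager; if they differ, the agent has not yet received the latest shared configuration.

```
# The "Shared file hash" above should match the MD5 of the merged
# shared configuration for the agent's group on the manager.
root@wazuh:/var/ossec# md5sum etc/shared/test/merged.mg
```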



Can you check that the centralized configuration is not disabled?

Also, please verify the agent.conf file with the tool /var/ossec/bin/verify-agent-conf.
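
For example (a sketch, assuming a default /var/ossec install and the "test" group from this thread):

```
# On the agent: centralized configuration is disabled only if
# agent.remote_conf is set to 0 in local_internal_options.conf
root@agent:/var/ossec# grep remote_conf etc/local_internal_options.conf

# On the manager: validate the shared agent.conf of the "test" group
root@wazuh:/var/ossec/bin# ./verify-agent-conf -f ../etc/shared/test/agent.conf
```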

Bloom

Nov 15, 2023, 9:13:12 AM
to Wazuh | Mailing List
Hello Alex!

The centralized configuration is not disabled; here is my local_internal_options.conf file:

conf1.PNG

I had already run verify-agent-conf before my first email, but here it is again:

conf4.PNG

I have another manager, in a single-node configuration, that also stopped synchronizing the agent.conf after the update.

I did not change anything apart from the update, and it was working correctly before. If I did not perform the update correctly, is there a way to tell?

Is there a way to force the synchronization, or to enable more verbose logging for visibility?

Thanks!

Alejandro Ruiz Becerra

Nov 15, 2023, 1:30:32 PM
to Wazuh | Mailing List
Hello Bloom,

Sorry, it seems I missed the part where you ran the verify-agent-conf tool. I've reported the latest info to the development team and am awaiting a response.

I'll get in touch as soon as possible.



Alejandro Ruiz Becerra

Nov 16, 2023, 5:37:18 AM
to Wazuh | Mailing List
Hi again!

I've got some news. The daemons in charge of the synchronization are wazuh-remoted on the manager and wazuh-agentd on the agent. You can enable verbose logging by running each of them with the -d flag.

Manager
=======
root@wazuh:/var/ossec/bin# ./wazuh-remoted -d

Agent
=====
root@wazuh:/var/ossec/bin# ./wazuh-agentd -d
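
If running the daemons in the foreground is inconvenient, the same effect can be enabled persistently through Wazuh's internal options (a sketch; restart the services afterwards for the change to take effect):

```
# /var/ossec/etc/local_internal_options.conf
# Manager side:
remoted.debug=2
# Agent side:
agent.debug=2
```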

I hope that yields some results, so we can continue debugging this problem.

Regards,
Alex

Bloom

Nov 16, 2023, 10:33:39 AM
to Wazuh | Mailing List
Hello again,

Here are the notable results I found with debug mode enabled:

con1.PNG

This error is repeated over and over for most agents. The only agents without it were those belonging to a single group.

Upon removing an agent from all but one group, the centralized configuration started working again.

There are no errors in the configurations of the different groups; most of them are empty and only used for grouping purposes.
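
The workaround described above can be sketched with the agent_groups tool (group names here are illustrative, not from the thread):

```
# List the groups the agent belongs to...
root@wazuh:/var/ossec/bin# ./agent_groups -s -i 002

# ...then unassign it from every group except the one whose
# configuration it should keep (repeat per extra group)
root@wazuh:/var/ossec/bin# ./agent_groups -r -i 002 -g extra_group
```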

I tried this on an older version (4.3.10), and the configuration was sent even with the agent belonging to multiple groups.

con4.PNG

The error was triggered once, but the configuration was still sent. After that, the configuration was sent with no errors:

con5.PNG

I'll let you run your tests to see whether this is just on my end or really some sort of bug.

Thanks!

Alejandro Ruiz Becerra

Nov 16, 2023, 12:06:27 PM
to Wazuh | Mailing List
Hi,

Good to hear that the workaround solved it. I'll share these findings with the development team so we can try to reproduce the issue.

Regards,
Alex