Increase the maximum number of shards and determine the recommended number.

Braulio Rodríguez

May 9, 2024, 10:55:26 AM
to Wazuh | Mailing List
Hello everyone!

I want to increase the maximum number of shards per node. I've read that this command can help me:

curl -k -u USERNAME:PASSWORD -XPUT ELASTICSEARCH_HOST_ADDRESS/_cluster/settings -H "Content-Type: application/json" \
-d '{ "persistent": { "cluster.max_shards_per_node": "MAX_SHARDS_PER_NODE" } }'

Is it possible to use a similar command in the Wazuh dashboard's devtools?

Another question about shards: How can I determine if increasing the number of shards will cause performance issues?

Marcos Darío Buslaiman

May 9, 2024, 1:27:43 PM
to Wazuh | Mailing List
Hi Braulio,
It's important to mention some definitions about these settings:

cluster.max_shards_per_node

Limits the total number of primary and replica shards for the cluster.
It only applies to actions that create shards and does not limit the number of shards assigned to each node.

To limit the number of shards assigned to each node, use the cluster.routing.allocation.total_shards_per_node setting.
Please check the corresponding section of the Elasticsearch documentation.
You can also review the following issue on the Elasticsearch GitHub repository:
https://github.com/elastic/elasticsearch/issues/51839
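
Before raising the limit, it helps to know how many shards the cluster already holds and how many data nodes are available. A quick sketch from the same Dev Tools console (the filter_path parameter only trims the response and can be omitted):

GET _cluster/health?filter_path=active_shards,active_primary_shards,number_of_data_nodes

The default value of cluster.max_shards_per_node is 1000, so the effective cluster-wide cap is roughly number_of_data_nodes * cluster.max_shards_per_node. Comparing active_shards against that figure tells you how much headroom you have and how aggressively you would need to raise the limit.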

The settings can be updated from the Dev Tools console as shown below.
PUT _cluster/settings
{
  "persistent": {
    "cluster.max_shards_per_node": MAX_SHARDS_PER_NODE
  }
}
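
After applying the change, you can confirm the stored value (as a sketch; filter_path is again optional and only narrows the output):

GET _cluster/settings?filter_path=persistent.cluster.max_shards_per_node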

Alternatively, using cluster.routing.allocation.total_shards_per_node:
PUT _cluster/settings
{
  "persistent" : {
    "cluster.routing.allocation.total_shards_per_node" : 1200
  }
}
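
To see how many shards each node currently holds before tuning this per-node setting, the _cat allocation API gives a per-node summary:

GET _cat/allocation?v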


Take into account that changing these settings is not advisable in the long run, as it can bring more problems in the future. Please check the following documents, which describe best practices such as creating an index management policy to delete old indices.
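
As a rough sketch of such a policy on an Elasticsearch-based deployment, an ILM policy that deletes indices after a given age could look like the following (the policy name and the 90d age are placeholders, not a recommendation; on the OpenSearch-based Wazuh indexer the equivalent feature is Index State Management):

PUT _ilm/policy/wazuh-alerts-cleanup
{
  "policy": {
    "phases": {
      "delete": {
        "min_age": "90d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}

The policy then has to be attached to the relevant indices, for example through an index template that sets index.lifecycle.name.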

Braulio Rodríguez

May 13, 2024, 12:14:04 PM
to Wazuh | Mailing List
Thanks for the information!