[2024-07-24T14:39:19,692][WARN ][o.o.b.JNANatives ] [node-1] Unable to lock JVM Memory: error=12, reason=Cannot allocate memory
[2024-07-24T14:39:19,695][WARN ][o.o.b.JNANatives ] [node-1] This can result in part of the JVM being swapped out.
[2024-07-24T14:39:19,695][WARN ][o.o.b.JNANatives ] [node-1] Increase RLIMIT_MEMLOCK, soft limit: 65536, hard limit: 65536
[2024-07-24T14:39:19,695][WARN ][o.o.b.JNANatives ] [node-1] These can be adjusted by modifying /etc/security/limits.conf, for example:
[2024-07-24T14:39:19,696][WARN ][o.o.b.JNANatives ] [node-1] If you are logged in interactively, you will have to re-login for the new limits to take effect.
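The warning at 14:39:19,695 ends with "for example:" but the example entries did not make it into this capture. A minimal sketch of the /etc/security/limits.conf entries it refers to, assuming the process runs as the wazuh-indexer user (the user name is an assumption):

    # /etc/security/limits.conf -- allow the indexer user to lock memory
    wazuh-indexer soft memlock unlimited
    wazuh-indexer hard memlock unlimited

Note that limits.conf only applies to PAM sessions; for a service started by systemd, see the override sketch after the bootstrap check warning further down.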
[2024-07-24T14:39:19,817][INFO ][o.o.n.Node ] [node-1] JVM arguments [-Xshare:auto, -Dopensearch.networkaddress.cache.ttl=60, -Dopensearch.networkaddress.cache.negative.ttl=10, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -XX:+ShowCodeDetailsInExceptionMessages, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dio.netty.allocator.numDirectArenas=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.locale.providers=SPI,COMPAT, -Xms4g, -Xmx4g, -XX:+UseG1GC, -XX:G1ReservePercent=25, -XX:InitiatingHeapOccupancyPercent=30, -Djava.io.tmpdir=/tmp/opensearch-7420340218558769111, -XX:+HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=/var/lib/wazuh-indexer, -XX:ErrorFile=/var/log/wazuh-indexer/hs_err_pid%p.log, -Xlog:gc*,gc+age=trace,safepoint:file=/var/log/wazuh-indexer/gc.log:utctime,pid,tags:filecount=32,filesize=64m, -Dclk.tck=100, -Djdk.attach.allowAttachSelf=true, -Djava.security.policy=file:///etc/wazuh-indexer/opensearch-performance-analyzer/opensearch_security.policy, --add-opens=jdk.attach/sun.tools.attach=ALL-UNNAMED, -Dclk.tck=100, -Djdk.attach.allowAttachSelf=true, -Djava.security.policy=file:///usr/share/wazuh-indexer/plugins/opendistro-performance-analyzer/pa_config/es_security.policy, -XX:MaxDirectMemorySize=2147483648, -Dopensearch.path.home=/usr/share/wazuh-indexer, -Dopensearch.path.conf=/etc/wazuh-indexer, -Dopensearch.distribution.type=rpm, -Dopensearch.bundled_jdk=true]
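The heap (-Xms4g/-Xmx4g) and GC flags listed above are normally read from the jvm.options file inside the config path shown in -Dopensearch.path.conf. A sketch, assuming the stock file location:

    # /etc/wazuh-indexer/jvm.options (path assumed from -Dopensearch.path.conf above)
    -Xms4g
    -Xmx4g
    -XX:+UseG1GC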
[2024-07-24T14:39:21,341][WARN ][o.o.s.OpenSearchSecurityPlugin] [node-1] Directory /etc/wazuh-indexer/opensearch-performance-analyzer/backup has insecure file permissions (should be 0700)
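The warning itself states the expected mode (0700); a one-line fix, assuming the directory should remain owned by the service account:

    chmod 0700 /etc/wazuh-indexer/opensearch-performance-analyzer/backup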
[2024-07-24T14:39:25,529][WARN ][o.o.s.c.Salt ] [node-1] If you plan to use field masking pls configure compliance salt e1ukloTsQlOgPquJ to be a random string of 16 chars length identical on all nodes
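A sketch of the corresponding opensearch.yml entry, which must be identical on every node; the setting name plugins.security.compliance.salt and the value shown are assumptions, and the salt should be your own random 16-character string:

    # opensearch.yml -- same value on all nodes
    plugins.security.compliance.salt: "abcdefghijklmnop"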
[2024-07-24T14:39:25,585][ERROR][o.o.s.a.s.SinkProvider ] [node-1] Default endpoint could not be created, auditlog will not work properly.
[2024-07-24T14:39:25,586][WARN ][o.o.s.a.r.AuditMessageRouter] [node-1] No default storage available, audit log may not work properly. Please check configuration.
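These two messages usually mean no audit sink is configured. A sketch of a default sink in opensearch.yml, assuming the internal_opensearch storage type is acceptable (the setting name is an assumption):

    # opensearch.yml -- write audit events to an index on this cluster
    plugins.security.audit.type: internal_opensearch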
[2024-07-24T14:39:26,783][WARN ][o.o.s.p.SQLPlugin ] [node-1] Master key is a required config for using create and update datasource APIs. Please set plugins.query.datasources.encryption.masterkey config in opensearch.yml in all the cluster nodes. More details can be found here: https://github.com/opensearch-project/sql/blob/main/docs/user/ppl/admin/datasources.rst#master-key-config-for-encrypting-credential-information
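The setting name is given in the message itself; a sketch of the opensearch.yml entry with a placeholder value (required on every node, and only needed if the datasource APIs are actually used):

    # opensearch.yml -- placeholder value, generate your own key
    plugins.query.datasources.encryption.masterkey: "0123456789abcdef"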
[2024-07-24T14:39:27,248][WARN ][o.o.p.c.ThreadPoolMetricsCollector] [node-1] Fail to read queue capacity via reflection
[... the ThreadPoolMetricsCollector warning above repeated 35 more times between 14:39:27,279 and 14:39:27,285 ...]
[2024-07-24T14:39:28,070][WARN ][o.o.g.DanglingIndicesState] [node-1] gateway.auto_import_dangling_indices is disabled, dangling indices will not be automatically detected or imported and must be managed manually
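With automatic import disabled, dangling indices can still be listed and then imported or deleted by hand through the dangling indices API. A sketch, assuming the node listens on https://localhost:9200 with the security plugin enabled:

    curl -k -u admin:<password> https://localhost:9200/_dangling?pretty
    # import one:  POST   /_dangling/<index-uuid>?accept_data_loss=true
    # delete one:  DELETE /_dangling/<index-uuid>?accept_data_loss=true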
[2024-07-24T14:39:28,945][WARN ][o.o.b.BootstrapChecks ] [node-1] memory locking requested for opensearch process but memory is not locked
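On an RPM install started by systemd, /etc/security/limits.conf is not consulted, so this bootstrap check can keep failing even after the limits.conf change above. A sketch of a systemd override, assuming the unit is named wazuh-indexer:

    systemctl edit wazuh-indexer
    # add the following to the override file:
    [Service]
    LimitMEMLOCK=infinity
    # then reload and restart:
    systemctl daemon-reload
    systemctl restart wazuh-indexer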
[2024-07-24T14:39:29,265][WARN ][o.o.p.c.s.h.ConfigOverridesClusterSettingHandler] [node-1] Config override setting update called with empty string. Ignoring.
[2024-07-24T14:39:29,481][ERROR][o.o.s.a.BackendRegistry ] [node-1] Not yet initialized (you may need to run securityadmin)
[2024-07-24T14:39:29,494][ERROR][o.o.s.a.BackendRegistry ] [node-1] Not yet initialized (you may need to run securityadmin)
[2024-07-24T14:39:29,496][ERROR][o.o.s.a.BackendRegistry ] [node-1] Not yet initialized (you may need to run securityadmin)
[2024-07-24T14:39:29,498][ERROR][o.o.s.a.BackendRegistry ] [node-1] Not yet initialized (you may need to run securityadmin)
[2024-07-24T14:39:29,507][ERROR][o.o.i.i.ManagedIndexCoordinator] [node-1] get managed-index failed: NoShardAvailableActionException[No shard available for [org.opensearch.action.get.MultiGetShardRequest@3a7e91ee]]
[2024-07-24T14:39:29,532][WARN ][o.o.o.i.ObservabilityIndex] [node-1] message: index [.opensearch-observability/WXCWZgv0Q9CXAAztLQN8Cg] already exists
[2024-07-24T14:39:29,534][WARN ][o.o.s.SecurityAnalyticsPlugin] [node-1] Failed to initialize LogType config index and builtin log types
[2024-07-24T14:39:29,543][ERROR][o.o.i.i.ManagedIndexCoordinator] [node-1] Failed to get ISM policies with templates: Failed to execute phase [query], all shards failed
[2024-07-24T14:39:29,783][ERROR][o.o.s.a.BackendRegistry ] [node-1] Not yet initialized (you may need to run securityadmin)
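The repeated "Not yet initialized" errors mean the security index has not been created yet; it is created by running securityadmin against the node, as the message suggests. A sketch, assuming the default Wazuh indexer RPM layout and certificate names (all paths below are assumptions):

    export JAVA_HOME=/usr/share/wazuh-indexer/jdk
    bash /usr/share/wazuh-indexer/plugins/opensearch-security/tools/securityadmin.sh \
      -cd /etc/wazuh-indexer/opensearch-security \
      -icl -nhnv \
      -cacert /etc/wazuh-indexer/certs/root-ca.pem \
      -cert   /etc/wazuh-indexer/certs/admin.pem \
      -key    /etc/wazuh-indexer/certs/admin-key.pem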
[2024-07-24T14:39:32,122][WARN ][r.suppressed ] [node-1] path: /.kibana/_count, params: {index=.kibana}
[2024-07-24T14:39:34,638][WARN ][r.suppressed ] [node-1] path: /.kibana/_count, params: {index=.kibana}
[2024-07-24T14:39:37,145][WARN ][r.suppressed ] [node-1] path: /.kibana/_count, params: {index=.kibana}
[2024-07-24T14:39:39,652][WARN ][r.suppressed ] [node-1] path: /.kibana/_count, params: {index=.kibana}
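These suppressed warnings most likely come from the Wazuh dashboard polling /.kibana roughly every 2.5 seconds before the security index exists. Once securityadmin has run, a quick check that the same request now succeeds (credentials and endpoint are assumptions):

    curl -k -u admin:<password> https://localhost:9200/.kibana/_count?pretty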