Prometheus PushGateway - SocketTimeoutException: Read timed out

Dhrubajyoti Sadhu

Jul 1, 2024, 3:36:56 PM
to Prometheus Users
I have a Spring Boot batch application that sends metrics to the Pushgateway, but at times I get the error below, and as a result only partial data reaches the Prometheus server.

This happens intermittently, not every time.
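
For context, the push that fails in the trace is io.prometheus.client.exporter.PushGateway.pushAdd, which Spring Boot's PrometheusPushGatewayManager invokes while shutting down. Stripped of the Actuator wiring, the push amounts to roughly the sketch below (the gateway address and job name are placeholders, not the real values):

import java.io.IOException;

import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.exporter.PushGateway;

class PushOnShutdownSketch {

    static void pushMetrics(CollectorRegistry registry) {
        // Placeholders: Actuator normally supplies the address and job name
        // from its Pushgateway configuration properties.
        PushGateway pushGateway = new PushGateway("pushgateway-host:9091");
        try {
            // POST to /metrics/job/my-batch-job; this is the request whose
            // response read times out with java.net.SocketTimeoutException.
            pushGateway.pushAdd(registry, "my-batch-job");
        } catch (IOException ex) {
            // SocketTimeoutException is an IOException; roughly what the
            // manager does is log "Unable to push metrics to Prometheus
            // Pushgateway" and carry on with shutdown.
        }
    }
}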

ERROR Message:
org.springframework.boot.actuate.metrics.export.prometheus.PrometheusPushGatewayManager$PushGatewayTaskScheduler:218 - Shutting down ExecutorService
2024-07-01T17:12:03,782 ERROR org.springframework.boot.actuate.metrics.export.prometheus.PrometheusPushGatewayManager:119 - Unable to push metrics to Prometheus Pushgateway
java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method) ~[?:1.8.0_392]
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116) ~[?:1.8.0_392]
at java.net.SocketInputStream.read(SocketInputStream.java:171) ~[?:1.8.0_392]
at java.net.SocketInputStream.read(SocketInputStream.java:141) ~[?:1.8.0_392]
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246) ~[?:1.8.0_392]
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286) ~[?:1.8.0_392]
at java.io.BufferedInputStream.read(BufferedInputStream.java:345) ~[?:1.8.0_392]
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:743) ~[?:1.8.0_392]
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678) ~[?:1.8.0_392]
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1595) ~[?:1.8.0_392]
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1500) ~[?:1.8.0_392]
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480) ~[?:1.8.0_392]
at io.prometheus.client.exporter.PushGateway.doRequest(PushGateway.java:315) ~[hadoop-unjar3434452571338587139/:?]
at io.prometheus.client.exporter.PushGateway.pushAdd(PushGateway.java:182) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.boot.actuate.metrics.export.prometheus.PrometheusPushGatewayManager.push(PrometheusPushGatewayManager.java:108) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.boot.actuate.metrics.export.prometheus.PrometheusPushGatewayManager.shutdown(PrometheusPushGatewayManager.java:146) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.boot.actuate.metrics.export.prometheus.PrometheusPushGatewayManager.shutdown(PrometheusPushGatewayManager.java:136) ~[hadoop-unjar3434452571338587139/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_392]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_392]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_392]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_392]
at org.springframework.beans.factory.support.DisposableBeanAdapter.invokeCustomDestroyMethod(DisposableBeanAdapter.java:339) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:273) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:587) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:559) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:1152) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:520) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:1145) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1111) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1080) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:1026) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.boot.SpringApplication.close(SpringApplication.java:1369) ~[hadoop-unjar3434452571338587139/:?]
at org.springframework.boot.SpringApplication.exit(SpringApplication.java:1356) ~[hadoop-unjar3434452571338587139/:?]
at com.hotels.bdp.cloverleaf.Cloverleaf.run(Cloverleaf.java:104) ~[hadoop-unjar3434452571338587139/:?]
at com.hotels.bdp.cloverleaf.CloverleafRunner$DefaultCloverleafRunner.run(CloverleafOrchestrator.java:131) ~[hadoop-unjar3434452571338587139/:?]
at com.hotels.bdp.cloverleaf.CloverleafOrchestrator.startCloverleaf(CloverleafOrchestrator.java:78) ~[hadoop-unjar3434452571338587139/:?]
at com.hotels.bdp.cloverleaf.CloverleafOrchestrator.lambda$start$0(CloverleafOrchestrator.java:49) ~[hadoop-unjar3434452571338587139/:?]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) [?:1.8.0_392]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384) [?:1.8.0_392]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) [?:1.8.0_392]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) [?:1.8.0_392]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) [?:1.8.0_392]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) [?:1.8.0_392]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) [?:1.8.0_392]
at com.hotels.bdp.cloverleaf.CloverleafOrchestrator.start(CloverleafOrchestrator.java:57) [hadoop-unjar3434452571338587139/:?]
at com.hotels.bdp.cloverleaf.CloverleafOrchestrator.main(CloverleafOrchestrator.java:28) [hadoop-unjar3434452571338587139/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_392]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_392]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_392]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_392]
at org.apache.hadoop.util.RunJar.run(RunJar.java:244) [hadoop-common-2.10.1-amzn-4.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:158) [hadoop-common-2.10.1-amzn-4.jar:?]
24/07/01 17:12:03 INFO metrics.MetricReporterService: Shutting down MetricReporterService...
24/07/01 17:12:03 INFO metrics.MetricReporterService: MetricReporterService shutdown complete.
24/07/01 17:12:03 INFO cloverleaf.CloverleafOrchestrator: Cloverleaf finished for target table 'coupons.coupons_eg_domain_event_v4' with exit code 0.

Thanks for your help,
Dhruv

Bjoern Rabenstein

Jul 3, 2024, 8:20:32 AM
to Dhrubajyoti Sadhu, Prometheus Users
On 01.07.24 12:04, Dhrubajyoti Sadhu wrote:
> I have a Spring boot batch application that sends metrics to pushgateway
> but at times I am getting the below error due to which partial data is
> reaching the Prometheus server.
>
> This is happening intermittently and not all the time.

Just wildly guessing here (because that's a common problem PGW users
encounter): maybe your Pushgateway is simply overloaded? How many
metrics are you trying to push? And how many metrics have already been
pushed? Note that the intended use case is to push a handful of
metrics per job, and to have maybe a few hundred metrics on the PGW at
the same time.
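
To illustrate what "a handful of metrics" means in practice: the
textbook batch-job pattern is a dedicated registry with a couple of
gauges, pushed once at the end of the run. A minimal sketch (job name
and gateway address are made up, not anything from your setup):

import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.Gauge;
import io.prometheus.client.exporter.PushGateway;

class BatchJobPushSketch {

    static void runBatchJob() throws Exception {
        // Fresh registry so only this job's few metrics are pushed.
        CollectorRegistry registry = new CollectorRegistry();
        Gauge duration = Gauge.build()
                .name("my_batch_job_duration_seconds")
                .help("Duration of my batch job in seconds.")
                .register(registry);
        Gauge lastSuccess = Gauge.build()
                .name("my_batch_job_last_success_unixtime")
                .help("Last time my batch job succeeded, in unixtime.")
                .register(registry);

        Gauge.Timer timer = duration.startTimer();
        try {
            // ... the actual batch work goes here ...
            lastSuccess.setToCurrentTime();
        } finally {
            timer.setDuration();
            // One small push per run; the job name and address are placeholders.
            new PushGateway("pushgateway.example.org:9091").pushAdd(registry, "my_batch_job");
        }
    }
}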

--
Björn Rabenstein
[PGP-ID] 0x851C3DA17D748D03
[email] bjo...@rabenste.in