I am recording the size of the "data" directory. It increases at a rate of ~2 bytes per sample: after each "2-hour update" almost 1.6 MB is added. I monitored it over two full days and this value was very reproducible.
The main point is that this kind of data (constant values, few significant digits, fixed scrape interval) should take a negligible amount of storage. 2 bytes per sample doesn't seem right!
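A quick back-of-the-envelope check of that figure, as a small Go sketch; the series count and scrape interval are taken from the test description further down, not re-measured:

package main

import "fmt"

func main() {
	const (
		series         = 1004.0            // 1000 gauges + 4 series Prometheus adds per target
		scrapeInterval = 10.0              // seconds between scrapes
		blockGrowth    = 1.6 * 1024 * 1024 // ~1.6 MB added per 2-hour block, in bytes
		blockSeconds   = 2 * 3600.0        // each block covers 2 hours
	)
	samplesPerBlock := series / scrapeInterval * blockSeconds
	fmt.Printf("%.2f bytes/sample\n", blockGrowth/samplesPerBlock) // prints ≈ 2.32
}

So the observed growth really is on the order of 2 bytes per sample.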
Thank you for the attention. Here is some information about the test.
The scraped text is always the same: 1000 gauge variables every 10 seconds, with "only" 6 significant digits. I expose no other series myself, but Prometheus adds 4 per-target series of its own, hence the 1004 series further down (a sketch of a matching exporter follows the listing):
# HELP Var0Unit0 Informative/explanatory text about the variable
# TYPE Var0Unit0 gauge
Var0Unit0 123.456
# HELP Var1Unit1 Informative/explanatory text about the variable
# TYPE Var1Unit1 gauge
Var1Unit1 123.456
...
... ...
# HELP Var998Unit998 Informative/explanatory text about the variable
# TYPE Var998Unit998 gauge
Var998Unit998 123.456
# HELP Var999Unit999 Informative/explanatory text about the variable
# TYPE Var999Unit999 gauge
Var999Unit999 123.456
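For anyone wanting to replicate this, a minimal sketch of a static "exporter" that serves the same page on every scrape; the port and code structure are my assumptions, not the exact test setup:

package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Serve 1000 constant gauges in the Prometheus text exposition format.
	http.HandleFunc("/metrics", func(w http.ResponseWriter, r *http.Request) {
		for i := 0; i < 1000; i++ {
			fmt.Fprintf(w, "# HELP Var%dUnit%d Informative/explanatory text about the variable\n", i, i)
			fmt.Fprintf(w, "# TYPE Var%dUnit%d gauge\n", i, i)
			fmt.Fprintf(w, "Var%dUnit%d 123.456\n", i, i)
		}
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}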
I restarted the test some days ago. These are the contents of the meta.json files of the first two blocks (the biggest ones) inside the "data" directory (a sketch of how these numbers turn into a bytes-per-sample figure follows the second one):
{
  "ulid": "01CK78AN71HA96JCVPBK9B7EV8",
  "minTime": 1532404800000,
  "maxTime": 1532455200000,
  "stats": {
    "numSamples": 4495912,
    "numSeries": 1004,
    "numChunks": 37148
  },
  "compaction": {
    "level": 3,
    "sources": [
      "01CK5HCTVQY6A2ME51J08PFPPV",
      "01CK5R8J3E02ZCFHQVPD7R8Z30",
      "01CK5Z49B8S05H0DVV24V4AHZ8",
      "01CK6600N006JRK0961YVVZP4F",
      "01CK6CVQVGEHA97XBNY9CQWM3X",
      "01CK6KQF3JV3H8Y2XDRD505QJM",
      "01CK6TK6B2GX1ZWPSNTH5HENVX"
    ]
  },
  "version": 1
}
And the second one:
{
  "ulid": "01CK8Z8NAWXF4RQEWDAWNQSJAB",
  "minTime": 1532455200000,
  "maxTime": 1532520000000,
  "stats": {
    "numSamples": 4669384,
    "numSeries": 1004,
    "numChunks": 39212
  },
  "compaction": {
    "level": 3,
    "sources": [
      "01CK71EXKADASB438GYEMFT15B",
      "01CK78AMV4KX5PKP28JRHQ5113",
      "01CK7F6C2XMY501KXBCHACH63G",
      "01CK7P23B7NV935XWHVQXH9P8E",
      "01CK7WXTK2RE21V4AV989828B7",
      "01CK84QNB5NTPSK5CWBNYNJEKK",
      "01CK8ANFFQ9MGYNDW0SZXM79HF",
      "01CK8HH6Q6QEX8H4F2P3352YX4",
      "01CK8RCXZSCZXKMP9X4F5ET1MT"
    ]
  },
  "version": 1
}
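A rough sketch of one way to turn these stats into a bytes-per-sample figure: divide a block directory's on-disk size by its numSamples. The block path below is just the first ULID above; this is a sketch, not the exact procedure used in the test:

package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
)

// Only the field we need from meta.json.
type meta struct {
	Stats struct {
		NumSamples int64 `json:"numSamples"`
	} `json:"stats"`
}

func main() {
	block := `data/01CK78AN71HA96JCVPBK9B7EV8` // one block directory (assumed path)

	raw, err := os.ReadFile(filepath.Join(block, "meta.json"))
	if err != nil {
		panic(err)
	}
	var m meta
	if err := json.Unmarshal(raw, &m); err != nil {
		panic(err)
	}

	// Sum the sizes of all files in the block (chunks, index, tombstones, ...).
	var size int64
	filepath.Walk(block, func(_ string, info os.FileInfo, err error) error {
		if err == nil && !info.IsDir() {
			size += info.Size()
		}
		return nil
	})

	fmt.Printf("%d bytes / %d samples = %.2f bytes/sample\n",
		size, m.Stats.NumSamples, float64(size)/float64(m.Stats.NumSamples))
}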
And, finally, the following graph shows how the size of the "data" directory evolves over time:
My simplistic test is very easy to replicate (as described in the first message; see also the exporter sketch above). If you try it, could you let me know whether your results are consistent with those in the graph (the "data" folder grows by ~0.75 MB every hour)?
Putting my question in another form:
How do you estimate the storage needed when dealing with values that come from industrial sensors (figures with few significant digits and inherent noise, i.e. ever-changing)? Is there a way to force a float32 representation?
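For the estimate itself, the Prometheus documentation offers a rule of thumb: needed disk space ≈ retention time in seconds × ingested samples per second × bytes per sample. A quick sketch applying it to this test, with the observed ~2.3 bytes/sample plugged in (noisy sensor values would likely compress worse than these constants):

package main

import "fmt"

func main() {
	const (
		retentionSeconds = 280 * 24 * 3600.0 // --storage.tsdb.retention=280d
		samplesPerSecond = 1004.0 / 10.0     // 1004 series scraped every 10 s
		bytesPerSample   = 2.3               // observed in this test
	)
	needed := retentionSeconds * samplesPerSecond * bytesPerSample
	fmt.Printf("≈ %.1f GiB\n", needed/(1024*1024*1024)) // prints ≈ 5.2 GiB
}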
Thanks! ;)
It seems we are (actually "I am"!) close to finishing this subject, but I still have some points to comment on.
I must be stuck in a wrong perspective (but how can I ignore the ever-growing size of the folder? I ran the test for days and the slope was constant)!!
Could you briefly explain your procedure (the calculation, or which file's size you are monitoring)?
Thank you very much!
Same result! As you can see, I am still stuck at ~2 bytes/sample.
Far, far away from 0.066 bits per sample:
As stressed by Björn Rabenstein in his talk: "bits not bytes"!! (Does anybody know how to reproduce that?)
My goal now is to reproduce Julius Volz's 0.68 bytes per sample (see some answers above)!!! "My" present value is three times that!
What could cause the difference? I am using Prometheus 2.3.2 on Windows 10.
Maybe some configuration tweak or flag in the "yml" file? I am using the following command to start the test:
prometheus.exe --web.enable-admin-api --config.file=.\prometheus_size_test.yml --storage.tsdb.path=.\data --storage.tsdb.retention=280d
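For completeness, a minimal sketch of what that prometheus_size_test.yml contains; the target address is an assumption (wherever the static test page is served):

global:
  scrape_interval: 10s

scrape_configs:
  - job_name: 'size_test'
    static_configs:
      - targets: ['localhost:8080']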
I noticed the entry "version": 1 in the "meta.json" files. Is there another version ("2"?) with better compaction?
That's it. I feel trapped in a maze!