The answer is "it depends". The network traffic caused by the scrape
and the CPU usage for sending and receiving the scrape payload
obviously double. However, the storage requirement might not increase
by much, because very similar or very regularly changing samples
compress very well. On the other hand, it might increase by a lot
(almost twice as much) if the metric values change more or less
randomly.
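(Prometheus's TSDB uses Gorilla-style delta/XOR compression rather than
zlib, but a quick zlib sketch illustrates the same effect: a constant or
regular series shrinks to almost nothing, while random values stay close
to their raw size.)

```python
import random
import struct
import zlib

def compressed_size(values):
    """Pack samples as 8-byte doubles, then return the zlib-compressed size."""
    raw = b"".join(struct.pack("<d", v) for v in values)
    return len(zlib.compress(raw))

# A scrape target that always reports the same value: compresses away.
constant = [42.0] * 10_000

# Effectively random values: high-entropy mantissas, barely compressible.
random.seed(1)
noisy = [random.random() for _ in range(10_000)]

print(compressed_size(constant))  # a tiny fraction of the raw 80,000 bytes
print(compressed_size(noisy))     # close to the raw 80,000 bytes
```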
In practice, you simply have to try it out. You can use Prometheus
itself to monitor how its own resource usage changes. :o)
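For that last point, a few example queries against Prometheus's
self-exposed metrics (names as in Prometheus 2.x; the `job="prometheus"`
label is an assumption, adjust it to your setup):

```
# Samples ingested per second (watch this roughly double):
rate(prometheus_tsdb_head_samples_appended_total[5m])

# CPU and memory of the Prometheus server itself:
rate(process_cpu_seconds_total{job="prometheus"}[5m])
process_resident_memory_bytes{job="prometheus"}
```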
--
Björn Rabenstein
[PGP-ID] 0x851C3DA17D748D03
[email]
bjo...@rabenste.in