multi-container python client

Adam Hendel

Aug 10, 2020, 1:08:18 PM
to Prometheus Developers
Hi team,

We ran into an issue when running Python processes across multiple Docker containers with a single MultiProcessCollector behind Flask. We have a draft PR open, but we are still working through a proposed solution.

Let's assume there are 4 containers: 1 container running Flask with the MultiProcessCollector, and 3 containers that are replicas of one process, each writing Summary metrics in this example. All four containers share the "prometheus_multiproc_dir" via a volume mounted across the containers. Each process in the 3 replica containers will write summary_1.db to the prometheus_multiproc_dir and overwrite the others' files: since the containers are separate replicas, the pid is the same in each one. There are a couple of ways we are looking at solving this:

(1) Define our own "process_identifier()". We would also have to handle cleanup with mark_process_dead() to account for the new file naming convention.
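A minimal sketch of what (1) might look like. The hook name "process_identifier" is taken from the message; how the client would consume it is not settled here, so this only shows the identifier itself, which needs nothing beyond the stdlib:

```python
import os
import socket


def process_identifier():
    # Combine the container hostname with the pid. Under Docker,
    # socket.gethostname() returns the (unique) container ID by default,
    # so replicas with identical pids no longer collide.
    return "{}_{}".format(socket.gethostname(), os.getpid())
```

The cleanup side would then need a mark_process_dead() variant keyed on this identifier rather than on the bare pid, so that a dying replica's files can still be located and removed.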

(2) Change the way the .db files are named: essentially, add $HOST to the metric filenames by default. The metric files become <metric>_{host}_{pid}.db, which ensures unique file names across Docker containers.
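The default naming in (2) could be sketched like this; the filename template comes from the message, while the helper function name is hypothetical:

```python
import os
import socket


def db_filename(metric_type):
    # Produces e.g. "summary_<hostname>_<pid>.db". The hostname is the
    # Docker container ID by default, so two replica containers that
    # happen to run the same pid still write distinct files.
    host = socket.gethostname()
    return "{}_{}_{}.db".format(metric_type, host, os.getpid())
```

With this default in place, the shared prometheus_multiproc_dir volume would hold one file per (container, process) pair instead of replicas overwriting a single summary_1.db.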

If (2) is not a good idea to explore, we could instead just document approach (1) in the python client readme.

Adam