Docker-Plugin spawns more containers than needed


Alexis Morelle

Dec 24, 2014, 10:16:14 AM
to jenkins...@googlegroups.com
Hello,

I've been playing around at work with the Jenkins Docker plugin, and right before making it available I hit a strange behavior that keeps me from using it widely for now.

I have a pretty simple setup with a few images I made myself: I only install the tools I need, set up a jenkins user, add our public key, and finally start an SSH server. The plugin is configured to contact a Docker daemon over TCP on a CoreOS instance.
It's very straightforward and works fine. I can even start a container from one of these images manually and create a new node out of it if a persistent slave is needed.
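For reference, my images look roughly like this (a minimal sketch; the base image, tool list, and key path here are placeholders, not my exact setup):

```dockerfile
# Sketch of one of the slave images (placeholder base image and packages)
FROM ubuntu:14.04

# Only the tools a build needs, plus the SSH server the plugin connects to
RUN apt-get update && apt-get install -y openssh-server openjdk-7-jdk git

# Dedicated jenkins user with our public key for SSH logins
RUN useradd -m -s /bin/bash jenkins && mkdir -p /home/jenkins/.ssh
COPY id_rsa.pub /home/jenkins/.ssh/authorized_keys
RUN chown -R jenkins:jenkins /home/jenkins/.ssh \
    && chmod 600 /home/jenkins/.ssh/authorized_keys

# sshd needs its privilege separation directory to exist
RUN mkdir -p /var/run/sshd

EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]
```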

But from time to time (quite often, in fact), the plugin takes a little while to start a container, or maybe to SSH into it, I can't really say (I'm not talking about the first run, where the image needs to be downloaded from the registry). When that happens, it spawns another container, then another one... and another one. It's generally no more than 3 or 4 per build, but the containers are left there and never killed. I can of course stop them manually from the interface or the host, but I don't think that's the expected behavior. It's not a glitch, since the containers are very much alive on the host.
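In the meantime I clean the leftovers up by hand with something along these lines ("my-jenkins-slave" is a placeholder image name; adjust the pattern to your own images):

```shell
# Stop and remove leftover containers started from a given image.
# "my-jenkins-slave" is a placeholder; match your own image names.
docker ps -a | grep my-jenkins-slave | awk '{print $1}' | xargs -r docker rm -f
```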

I've seen this behavior mentioned once in a previous thread from a while ago, but there was no further discussion about it. I'm not sure where to start looking for answers; let me know if this belongs on the Jenkins Developers group instead.

Has anybody experienced this as well, and maybe has some answers/explanations?

Thanks in advance for your answers.
Alexis.