One of the key features that made us decide to migrate to GoCD several months ago was that pipelines can run in parallel and that more than one instance of a pipeline can be started.
Only now are we starting to realise that the same pipeline cannot actually be executed multiple times in parallel, since each stage always runs sequentially, even when it's the same stage in different pipeline instances. I'm basing this only on
this post, which is the only mention of the issue I've been able to find.
So first off, I'd like to verify that this is correct. Is it not possible to run several instances of the same pipeline concurrently (in parallel) without one instance being constrained by another (i.e. so that a second instance of the same pipeline can complete even if a previously initiated instance has not)?
If this is the case, I'd really appreciate any help/ideas on overcoming this limitation in some way.
The pipeline I'm working on has only one stage, which simply runs a Docker container and then deletes the container and image. The container does some work on our machine learning models, and there is no problem running several of these containers at the same time - that's exactly what I'd like to do, i.e. run another container each time the pipeline is triggered (we're using the API to trigger it).
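For context, here's roughly how we trigger it - a minimal sketch using GoCD's pipeline schedule endpoint (`POST /go/api/pipelines/:pipeline_name/schedule`). The server URL, pipeline name, and credentials below are placeholders, and the `Accept`/`X-GoCD-Confirm` headers may differ depending on your GoCD version:

```shell
GOCD_SERVER="https://gocd.example.com"  # placeholder server URL
PIPELINE="ml-model-run"                 # placeholder pipeline name

# The schedule endpoint that starts a new instance of the pipeline:
URL="$GOCD_SERVER/go/api/pipelines/$PIPELINE/schedule"

# Shown with echo so the sketch runs without a live server;
# drop the echo to actually fire the request.
echo curl -u "user:password" \
  -H "Accept: application/vnd.go.cd.v1+json" \
  -H "X-GoCD-Confirm: true" \
  -X POST "$URL"
```

Each call like this starts another pipeline instance, but as described above, their (single) stage still ends up queued and run one at a time rather than concurrently.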
I'd like to see the output created by each container and, of course, see whether each pipeline run finished successfully or failed. The order of execution and even the material version are not relevant; each instance of the pipeline/container has its own job to do.
Finally, I'd like to ask whether you feel that a feature enabling the same stage to run concurrently in different pipeline instances is feasible, and whether a request for it has a chance of being accepted (I will, of course, contribute whatever I can within my technical skills).
I'm sure there are many use cases that would benefit from such a feature.
Thanks