On Monday, 5 August 2019 13:23:01 UTC+3, Frederick Gotham wrote:
> So let's say I have three programs. Normally I would run these three programs at the commandline as follows:
>
> prog1 | prog2 | prog3
More or less; typically there are also some command-line arguments.
>
> So let's say I take the source code for these 3 programs and try to combine them into one.
Before doing it you should think about why. Are pipes inefficient
for your use case? Boost.Interprocess offers plenty of tools for
more efficient inter-process communication. Do you hope to optimize
the interfaces between the modules? Streams won't allow much there
anyway. Do the modules share a lot of code? Use shared objects or
DLLs.
> Have any of you ever done this before? What do you think of my idea? What way would you do it?
I like to keep modules small if possible. I have gone in the other
direction: split a single large code base into several. A frequent
example is kicking filters/converters for old, rarely used file
formats, versions, or functionality out into separate, rarely run
processes. That lets the main processing module use a single input
format/version and a single output format/version, which can
simplify it a lot. It may cost some performance on the rarely used
functionality, but the more frequently needed modules load and
execute quicker and take fewer resources. I have sometimes replaced
pipes with RPC so I can do more than pipes allow and can spread the
modules across different hosts more easily. The only thing such
decisions require is collecting statistics on the frequency and
performance of feature usage. That can be tricky with on-premise or
embedded software (which C++ is often about).
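As a sketch of the "rarely used converter in a separate process"
idea, the main binary can shell out with POSIX popen and capture the
helper's output. The converter command name would be something like
a hypothetical "legacy2new input.dat"; here the function is generic:

```cpp
#include <cstdio>
#include <stdexcept>
#include <string>

// Run an external helper process and capture its stdout. In a real
// system `cmd` would invoke a converter kept out of the main binary,
// so the main process never pays for loading that code.
std::string run_helper(const std::string& cmd) {
    FILE* p = popen(cmd.c_str(), "r");  // POSIX; _popen on Windows
    if (!p)
        throw std::runtime_error("popen failed: " + cmd);
    std::string out;
    char buf[4096];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, p)) > 0)
        out.append(buf, n);
    if (pclose(p) != 0)
        throw std::runtime_error("helper failed: " + cmd);
    return out;
}
```

For example, run_helper("echo hello") returns the helper's stdout,
"hello\n". Error handling here is deliberately minimal; a production
version would also distinguish exit status from signal termination.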