I suppose it is just about the same. My machine only has so many threads.
I saw here that with async steps, stdin redirection could fail.
Does that mean something like

file < do stuff

would fail?
What about piped commands?
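To make sure we're talking about the same two shapes, here is a plain-shell sketch (ordinary shell with a hypothetical input file, not Drake step syntax):

```shell
# Hypothetical input file for illustration
printf 'a\nb\nc\n' > input.txt

# stdin redirection: the command reads its input from a file
wc -l < input.txt

# a pipe: stdout of one command becomes stdin of the next
cat input.txt | wc -l
```

Both forms print the same line count; my question is whether either breaks under async steps.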
I'm asking because I'm thinking about extending a Perl module I'm writing to support Drake workflow outputs. I work in the bioinformatics/genomics field, where most of our commands use pipes or stdin/stdout redirection. For instance:
bcftools view {$self->indir}/{$sample}.vcf.gz | sed 's/ID=AD,Number=./ID=AD,Number=R/' \
| vt decompose -s - \
| vt normalize -r $REFGENOME - \
| java -Xmx4G -jar $SNPEFF/snpEff.jar -c \$SNPEFF/snpEff.config -formatEff -classic GRCh37.75 \
| bgzip -c > \
{$self->{outdir}}/{$sample}.norm.snpeff.gz && tabix {$self->{outdir}}/{$sample}.norm.snpeff.gz
is the kind of command we see quite a bit.
To break it down a bit:
bcftools view {$self->indir}/{$sample}.vcf.gz | sed 's/ID=AD,Number=./ID=AD,Number=R/' | vt decompose -s -
In "-s -", the second "-" means read data from STDIN.
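As a tiny self-contained illustration of that convention, using cat (which also treats "-" as "read from STDIN") in place of vt:

```shell
# "-" as a filename conventionally means "read from STDIN";
# cat honors it the same way vt decompose does, so this echoes
# both lines back unchanged:
printf 'line1\nline2\n' | cat -
```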
Would strange things happen when using Drake with asynchronous workflows?
Best,
Jillian