Thanks. I actually knew about both; not wanting to bother the list with
questions answered in the manual, I read it "cover to cover", so to
speak, but I feel they are only a partial solution. However, I apologize
if I am misunderstanding some nuance of what you propose.
The problem, as far as I can see (and also per the notes on
"Transitive Dependencies" in the "Workflow" section of the manual), is
that, short of creating "virtual" packages, there is no way to load a
consistent set of "top-level" libraries that potentially share nodes in
the dependency DAG, nor is there a way to find the concretized "least
common denominator" for two libraries whose abstract dependency DAGs may
overlap.
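To make that concrete, here is a rough sketch of what I mean; libA and
libB are made-up package names, and the zlib versions are invented for
illustration:

```shell
# libA and libB both depend on zlib somewhere in their DAGs.
spack install libA          # concretized on its own, may pick zlib@1.2.13
spack install libB          # concretized separately, may pick zlib@1.2.11
# Loading both top-level libraries can now pull in two different zlib
# installs; the only workaround I see is manual pinning per spec, e.g.:
spack install libB ^zlib@1.2.13
# ...or writing a "virtual" umbrella package that depends on both so they
# get concretized together.
```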
In general, I am more interested in leveraging existing, pre-packaged
solutions where possible, rather than migrating them to Spack, since
PyPI, Anaconda, R, etc. all have strong packaging communities behind
them.
Perhaps having the concept of "abstract repos" as possible providers for
packages would be an alternative solution.
    spack install pypi.<packagename>

Anyone?
Spack sort of already has a hard-coded version of this for "locally
installed packages" via the packages.yaml mechanism. In the framework I
am proposing, that would just become another repo. Just a thought.
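For reference, this is roughly how packages.yaml declares an external
package today (the openssl version and prefix below are made-up values,
not a recommendation):

```yaml
packages:
  openssl:
    externals:
    - spec: openssl@1.1.1
      prefix: /usr
    buildable: false
```

An "abstract repo" could plausibly slot in at the same level, as one more
provider the concretizer may choose from.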
All the best,
Tomas