Spack and (Ana)Conda - living side by side


Tomas Puverle

Dec 1, 2016, 2:28:37 PM12/1/16
to Spack
Hello,

I have searched the archives but not found a related topic.  I realize that people have talked about packaging spack, but this is not what this post is about.

Is there a recommended approach for the co-existence of the two environments?  In general, our preference is to use Anaconda for Python distributions, given the large number of existing packages.  At the same time, when building Spack packages (or internal libraries), these would ideally be built against the *Anaconda*-provided Python binaries/libraries.  Of course, this presents a challenge, as Anaconda environments don't necessarily live in the same filesystem locations, so, for example, building Boost.Python (and setting its rpaths) becomes problematic.

I realize that since Python is a C library, strictly speaking there generally shouldn't be binary-compatibility concerns, and we could just use the Spack-provided one, but it still somehow feels "wrong".

Has anyone thought about this (or a similar problem)?  I can think of some "hacky" ways of addressing this problem but I wondered if there is a better/recommended approach.

Thanks,

Tom

Elizabeth A. Fischer

Dec 3, 2016, 4:14:01 PM12/3/16
to Tomas Puverle, Spack
Tomas,

Look up `spack module loads --dependencies`.  This will load a consistent set of modules.  Also, look up Spack views.
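For anyone following along, a rough sketch of what these two commands look like (`boost` and `~/myview` are just placeholder examples here, and the exact flags may differ between Spack versions):

```console
# Emit one `module load` line for a spec plus every node in its
# dependency DAG, so the whole set can be loaded consistently:
$ spack module loads --dependencies boost

# Or project a consistent set of installed packages into a single
# merged prefix (a "view") that behaves like one install tree:
$ spack view symlink ~/myview boost
```

The view approach is often the easier of the two when an external tool (a build system, an IDE) just needs one prefix to point at.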

-- Elizabeth

Tomas Puverle

Dec 3, 2016, 4:57:48 PM12/3/16
to elizabet...@columbia.edu, Spack
Thanks. I actually knew about both; not wanting to bother the list with
questions answered in the manual, I read it "cover to cover", so to
speak, but feel they are only a partial solution. However, I apologize
if I am misunderstanding some nuance of what you propose.

The problem, as far as I can see (and also as per the notes on
"Transitive Dependencies" in the "Workflow" section of the manual), is
that short of creating "virtual" packages, there is no way to load a
consistent set of "top-level" libraries which potentially share nodes
in the DAG, nor is there a way to find the concretized "least common
denominator" for two libraries whose abstract dependency DAGs may
overlap.

In general, I am more interested in leveraging existing, pre-packaged
solutions where possible, rather than migrating them to Spack, since
PyPI, Anaconda, R, etc. all have strong packaging communities behind
them. Perhaps having the concept of "abstract repos" as possible
providers for packages would be an alternative solution.

spack install pypi.<packagename>

anyone?

Spack sort of already has a hard-coded version of this concept for
"locally installed packages" via packages.yaml. In this framework, that
would just become another repo. Just a thought.
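To illustrate the packages.yaml mechanism I mean: an Anaconda-provided
Python can be registered as an external, non-buildable package, so
Spack links everything against it instead of building its own. (The
path and version below are hypothetical, and the exact schema may vary
by Spack version.)

```yaml
# packages.yaml -- sketch, paths/versions are placeholders
packages:
  python:
    paths:
      python@2.7.12: /opt/anaconda2   # hypothetical Anaconda prefix
    buildable: False                  # never build python from source
```

An "abstract repo" provider would generalize exactly this: instead of
hand-listing paths, a whole external ecosystem could satisfy specs.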

All the best,

Tomas