On 4/22/15, 2:00 PM, "Ondřej Čertík" <ondrej...@gmail.com> wrote:
>Hi,
>
>I am still trying to understand how package versions work in Spack.
>
>Can I build scipy with lapack 3.4.2? Sure.
>Can I build numpy with lapack 3.4.1? Sure.
>
>Can I then load numpy and scipy into the same environment? I think
>that is asking for trouble. How does Spack prevent that?
Nothing's preventing you from doing this with the modules right now --
more module support would be nice. I think modules are a good way to do
this because they're familiar to users, and you could potentially augment
your numpy installs with modules for hand installs or external system
software. But there's a fair amount of support that would need to be
implemented before the modules prevent you from doing something stupid.
You bring up a good point with lapack. Say you had this:
nu...@0.15.0 ^numpy@1.6
sc...@0.15.0 ^numpy@1.7
If you activated the first scipy, it would also activate its numpy in the
same python install. If you then try to activate the second scipy, Spack
will complain because the already-activated numpy conflicts with the one
scipy brings in. You could still force-deactivate the first numpy (with
the -a flag to take out its dependencies too) and then bring in scipy,
but once you start using -f flags you're on your own. The less risky way
to swap is to deactivate -a the first scipy, then activate the second one.
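The activate/conflict behavior above can be sketched in a few lines of Python. This is a hypothetical model, not Spack's actual code -- the class names and structure here are invented for illustration:

```python
# Sketch (not Spack internals): why activating two scipy installs that
# depend on different numpy versions conflicts in one python prefix.

class Package:
    def __init__(self, name, version, deps=()):
        self.name = name
        self.version = version
        self.deps = list(deps)

    def spec(self):
        return f"{self.name}@{self.version}"

class PythonPrefix:
    """Tracks which extension packages are activated in one python install."""
    def __init__(self):
        self.activated = {}  # name -> Package

    def activate(self, pkg, force=False):
        # Activate dependencies first, then the package itself.
        for dep in pkg.deps:
            self.activate(dep, force=force)
        existing = self.activated.get(pkg.name)
        if existing and existing.version != pkg.version and not force:
            raise RuntimeError(
                f"{existing.spec()} is already activated; "
                f"cannot activate conflicting {pkg.spec()}")
        self.activated[pkg.name] = pkg

    def deactivate_all(self, pkg):
        # Like `spack deactivate -a`: remove the package and its deps.
        for dep in pkg.deps:
            self.deactivate_all(dep)
        self.activated.pop(pkg.name, None)

numpy16 = Package("numpy", "1.6")
numpy17 = Package("numpy", "1.7")
scipy_a = Package("scipy", "0.15.0", [numpy16])
scipy_b = Package("scipy", "0.15.0", [numpy17])

prefix = PythonPrefix()
prefix.activate(scipy_a)           # activates numpy@1.6 too
try:
    prefix.activate(scipy_b)       # conflicts on numpy, raises
except RuntimeError as e:
    print(e)
prefix.deactivate_all(scipy_a)     # the safer swap: take everything out...
prefix.activate(scipy_b)           # ...then bring in the second scipy
```

The key point the sketch captures: the conflict is detected on the shared dependency (numpy), not on scipy itself, which is why the force path is risky.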
I am leaning more and more towards moving the activate functionality into
something resembling profiles. I think that's a better way to do this,
and it gets you virtualenv-ish semantics for all your packages, not just
python ones.
>The same question with compilers --- i.e. if I build lapack with gcc for
>scipy, with intel for numpy and load both.
I'm not ruling this out at the moment, as I consider it to be nice that
you don't have to build a stack with a common compiler. e.g., we have
tools right now that want to be built with gcc that we would also want to
use with an icc build. What might be useful here would be some
consistency checks, which could happen at multiple levels. You could
check in modules whether it makes sense to load something in the current
environment, AND you could check at build time to ensure that the icc and
gcc builds being mixed are actually compatible. One thing Spack could do
automatically with its compiler wrappers is ensure that your icc link is
done right, and that it provides the right -gcc-version flag to go with
the gcc you're cross-linking with.
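A wrapper-level check of that kind might look like the sketch below. The -gcc-version flag is the one mentioned above; the function name and the way the wrapper detects cross-linking are hypothetical:

```python
# Sketch: a compiler-wrapper consistency check. If an icc link is mixing
# in gcc-built objects, make sure icc is told which gcc to target.

def wrap_icc_args(args, gcc_version=None, mixing_gcc_objects=False):
    """Return the argument list, appending -gcc-version=<ver> when the
    link mixes gcc-built objects and the flag isn't already present."""
    args = list(args)
    if mixing_gcc_objects:
        if gcc_version is None:
            raise ValueError(
                "cross-linking with gcc, but no gcc version is known")
        if not any(a.startswith("-gcc-version") for a in args):
            args.append(f"-gcc-version={gcc_version}")
    return args
```

For example, `wrap_icc_args(["-o", "a.out", "foo.o"], gcc_version="450", mixing_gcc_objects=True)` would append `-gcc-version=450`, while a pure-icc build's arguments pass through untouched.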
There's no such check right now, but the default concretization strategy
does keep the compilers consistent. For any package in the DAG that has
no compiler assigned, Spack will try to be consistent with an ancestor
that has a compiler set. This is why when you do "spack install
libelf%intel", all the dependencies end up building with intel by default,
too.
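That propagation rule can be sketched as a simple DAG walk. Again, this is an illustrative model, not Spack's concretizer -- the function and its arguments are invented:

```python
# Sketch: unassigned nodes in a dependency DAG inherit the compiler of
# their nearest assigned ancestor, falling back to a default.

def assign_compilers(deps, assigned, root, default="gcc"):
    """deps: dict mapping package name -> list of dependency names
    (a DAG rooted at `root`). assigned: explicit compiler choices,
    e.g. {"libelf": "intel"}. Returns name -> compiler for every node."""
    result = {}

    def visit(node, inherited):
        compiler = assigned.get(node, inherited)
        result[node] = compiler
        for dep in deps.get(node, []):
            visit(dep, compiler)

    visit(root, default)
    return result

# "spack install libelf%intel": the whole subtree builds with intel.
deps = {"libelf": ["zlib"], "zlib": []}
print(assign_compilers(deps, {"libelf": "intel"}, "libelf"))
# -> {'libelf': 'intel', 'zlib': 'intel'}
```

With no explicit choice anywhere, everything gets the default; a choice deeper in the DAG only affects that node's own subtree.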
-Todd
>
>Ondrej
>