pyspark

Jose Antonio Martin H.

Oct 6, 2015, 6:16:30 AM
to conda - Public
Hello, I want to install pyspark in conda.

I am using PYTHONPATH; I haven't found any other option.
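
Concretely, my current setup looks roughly like this (the path is just a placeholder for wherever Spark is unpacked, and the py4j version varies by Spark release):

export SPARK_HOME=/path/to/spark
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-<version>-src.zip:$PYTHONPATH"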

Is there a way to tell conda to include a hard link to my pyspark folder in conda's site-packages?


Thanks !
Jose

Michael Nazario

Oct 6, 2015, 4:05:59 PM
to conda - Public
PySpark is not currently installable through pip. There's a pull request in Spark for this: https://github.com/apache/spark/pull/8318/files.

I'd suggest using PYTHONPATH since it is the "correct" way to do it for PySpark right now.
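
For what it's worth, once SPARK_HOME and PYTHONPATH are set, a quick sanity check is something like:

python -c "import pyspark; print(pyspark.__file__)"

If that prints a path inside your Spark tree, the environment is wired up correctly.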

Ben Zaitlen

Oct 6, 2015, 4:12:29 PM
to Michael Nazario, conda - Public
I've built spark conda packages in the past and try to stay up to date. You can pull them from anaconda.org:

conda install -c anaconda-cluster spark
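
Once it's installed, a minimal smoke test would look something like this (a sketch; the local master and app name are arbitrary):

from pyspark import SparkContext
sc = SparkContext("local", "smoke-test")      # local master, throwaway app name
print(sc.parallelize(range(10)).sum())        # should print 45
sc.stop()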

I'm currently building spark 1.5.1 for OS X and Linux (py27/py3).

--Ben 
