jupyter pyspark issue


krishnachai...@gmail.com

Jun 27, 2016, 11:56:47 PM
to Anaconda - Public
After activating the Anaconda parcel, I'm getting the error below while executing the script:
[root@hostname~]# PYSPARK_PYTHON=/opt/cloudera/parcels/Anaconda/bin/python spark-submit pi.py 1000
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/spark) overrides detected (/opt/cloudera/parcels/CDH/lib/spark/).
WARNING: Running spark-class from user-defined location.
jupyter: '/home/admin/pi.py' is not a Jupyter command
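[Editor's note] A likely cause (an assumption, not confirmed anywhere in this thread): PYSPARK_DRIVER_PYTHON has been set to jupyter (often together with PYSPARK_DRIVER_PYTHON_OPTS=notebook), so spark-submit hands the script to the jupyter launcher instead of a Python interpreter, which then rejects pi.py as "not a Jupyter command". A minimal check-and-fix sketch:

```shell
# Inspect the driver-Python settings that spark-submit will pick up.
env | grep -i pyspark

# If PYSPARK_DRIVER_PYTHON=jupyter appears, unset it (and its options)
# for this run so the driver falls back to a plain Python interpreter:
unset PYSPARK_DRIVER_PYTHON PYSPARK_DRIVER_PYTHON_OPTS
PYSPARK_PYTHON=/opt/cloudera/parcels/Anaconda/bin/python spark-submit pi.py 1000
```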

Kristopher Overholt

Jun 29, 2016, 2:52:47 PM
to Anaconda - Public, krishnachai...@gmail.com
Aside from the warnings (which don't appear to be causing any errors), I am unable to determine why you are getting an error about Jupyter when using spark-submit. Is there something in pi.py that calls Jupyter? Can you post the contents of pi.py?

ben.z...@continuum.io

Jun 29, 2016, 3:43:37 PM
to Anaconda - Public, krishnachai...@gmail.com
It would also be great if you could post the other environment variables you have set.

For example:

mrmr@asdb:~$ env
TERM_PROGRAM=iTerm.app
TERM=xterm-color
SHELL=/bin/bash
HADOOP_HOME=/usr/local/Cellar/hadoop/2.6.0/
TMPDIR=/var/folders/1t/t94brwgx7sjcn8jgz4gr3_c00000gq/T/
mrmr@asdb:~$
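[Editor's note] To narrow an env dump like the one above to just the variables relevant to this error, a quick filter can help (the variable names here are the standard Spark/PySpark ones, not something confirmed from the original poster's machine):

```shell
# Show only the Spark/Python/Jupyter-related environment settings;
# PYSPARK_DRIVER_PYTHON is the one to check for this error.
# The fallback message keeps the command from exiting nonzero when nothing matches.
env | grep -iE 'spark|python|jupyter' || echo "no Spark/Python/Jupyter variables set"
```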

ss.sha...@gmail.com

Nov 16, 2016, 8:42:18 AM
to Anaconda - Public, krishnachai...@gmail.com
Hi, I have the same problem when running Spark. It keeps saying that the .py file is not a Jupyter command. Any help is appreciated.