I have been searching Stack Overflow and elsewhere for the error below and have tried a few answers, but none of them works here (I will keep searching and update this post):
I have a fresh Ubuntu install with Anaconda3 and Spark 2 installed:
Anaconda3: /home/rxie/anaconda
Spark 2: /home/rxie/Downloads/spark
I am able to start Jupyter Notebook; however, I am not able to create a SparkSession:
from pyspark.conf import SparkConf
ModuleNotFoundError                       Traceback (most recent call last)
in ()
----> 1 from pyspark.conf import SparkConf

ModuleNotFoundError: No module named 'pyspark'
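For reference, a quick check in the same notebook confirms which Python the kernel is running and whether any Spark path is on its module search path (a minimal diagnostic sketch; output will of course vary by machine):

import sys
print(sys.executable)  # the Python binary the notebook kernel is running
print([p for p in sys.path if 'spark' in p.lower()])  # shows whether any Spark path is visible to this kernel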
Here are the relevant environment variables in my .bashrc:
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export SPARK_HOME=/home/rxie/spark/
export SBT_HOME=/usr/share/sbt/bin/sbt-launch.jar
export SCALA_HOME=/usr/local/src/scala/scala-2.10.4
export PATH=$SCALA_HOME/bin:$PATH
export PATH=$SPARK_HOME/bin:$PATH
export PATH=$PATH:$SBT_HOME/bin:$SPARK_HOME/bin
# added by Anaconda3 installer
export PATH="/home/rxie/anaconda3/bin:$PATH"
export PATH=$SPARK_HOME/bin:$PATH
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
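One suggestion that keeps coming up in the answers I found is to point the notebook kernel at the Spark install with findspark before importing pyspark. A minimal sketch of that approach, assuming pip install findspark and using my Downloads install path from above:

import findspark
findspark.init('/home/rxie/Downloads/spark')  # must run before any pyspark import

from pyspark.sql import SparkSession
spark = SparkSession.builder.appName('test').getOrCreate()

As I understand it, findspark simply prepends $SPARK_HOME/python and the bundled py4j zip to sys.path, which is exactly what the bare import above is missing.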
What is wrong with the SparkConf import in the Jupyter notebook?
I would greatly appreciate it if anyone could shed some light on this. Thank you very much.