How to configure/integrate PySpark with Jupyter on Windows


Vikas K.

unread,
Dec 8, 2016, 12:51:42 PM12/8/16
to Project Jupyter

Hey Guys, 
Hope you are doing well.

I am running into an issue while integrating PySpark with Jupyter. I have tried many approaches but have not been able to configure it. Whenever I try, I get the following error; please help me out.
If you know of a proper guide for configuring/integrating it, please share the link. Thanks.

[error screenshot not preserved in this archive]

MinRK

unread,
Dec 19, 2016, 7:45:58 AM12/19/16
to Project Jupyter
The default IPython kernel doesn't set up a Spark context. There are lots of ways to hook Python up to Spark, so I'm not sure which you have tried, but it is quite possible that the Spark integration is done through a custom kernel spec, which you may find in the Kernel drop-down menu.

-Min
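
As a hedged illustration of what such a custom kernel spec might look like (the paths, and the py4j zip name, are assumptions for a local Windows Spark install and vary by Spark version), a minimal kernels/pyspark/kernel.json could be:

```json
{
  "display_name": "PySpark",
  "language": "python",
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "env": {
    "SPARK_HOME": "C:\\spark",
    "PYTHONPATH": "C:\\spark\\python;C:\\spark\\python\\lib\\py4j-0.10.4-src.zip",
    "PYSPARK_PYTHON": "python"
  }
}
```

With a spec like this installed, a "PySpark" entry appears in the Kernel menu, and notebooks started with it can import pyspark directly.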

--
You received this message because you are subscribed to the Google Groups "Project Jupyter" group.
To unsubscribe from this group and stop receiving emails from it, send an email to jupyter+unsubscribe@googlegroups.com.
To post to this group, send email to jup...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/jupyter/e524fb77-e203-446b-8c1a-b14bdb57ec4a%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Sam cris

unread,
Mar 14, 2017, 7:06:11 AM3/14/17
to Project Jupyter
I am facing the same problem. Did you find a solution?
Please help me out.

Vikas Kumar

unread,
Mar 15, 2017, 1:12:17 AM3/15/17
to jup...@googlegroups.com
Nope, not yet.
Suggestion: setting up the environment is a bit difficult, so it is better to use Databricks cloud, which gives you 6 GB of free space.
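
As an aside, a classic way to wire PySpark into Jupyter is through PySpark's own driver environment variables; a minimal Windows cmd sketch, assuming Spark is unpacked at C:\spark (adjust the path to the local install):

```shell
:: Point the PySpark driver at Jupyter Notebook instead of the plain REPL.
set SPARK_HOME=C:\spark
set PATH=%SPARK_HOME%\bin;%PATH%
set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHON_OPTS=notebook

:: Launching pyspark now starts a notebook server with a SparkContext available.
pyspark
```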


Peter Parente

unread,
Mar 19, 2017, 4:39:53 PM3/19/17
to Project Jupyter
If you're open to using Docker on your Windows machine, you can "docker pull jupyter/all-spark-notebook" and have a working environment immediately (see https://github.com/jupyter/docker-stacks/tree/master/all-spark-notebook).
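
As a hedged sketch of the next step (the port mapping is Jupyter's usual default, not something stated in this thread), the pulled image can then be started with:

```shell
# Run the all-spark-notebook image, publishing Jupyter's default port.
# The login token is printed in the container output.
docker run -p 8888:8888 jupyter/all-spark-notebook
```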

If you're set on having a native-to-Windows Jupyter plus Spark installation, and are open to using conda packages, then https://github.com/conda-forge/staged-recipes/pull/2497 will provide a way to do "conda create -n my-spark-env notebook pyspark" in the near future (assuming it merges).

Cheers,
Pete

