Hello!
Glad you discovered us and you like the tool 😄
Sure, you can run your program, because you can inject jars into a notebook. To run it there, you'll need to refactor it to take the sparkContext as an argument, since the context is provided by the notebook itself.
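For example, something along these lines (just a sketch, the object and method names are made up):

    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    object MyJob {
      // don't build your own SparkContext in a main():
      // take the one the notebook already exposes as `sparkContext`
      def run(sc: SparkContext): RDD[(String, Int)] =
        sc.textFile("hdfs:///data/input.txt")   // placeholder path
          .flatMap(_.split("\\s+"))
          .map(word => word -> 1)
          .reduceByKey(_ + _)
    }

and then in a notebook cell you just call MyJob.run(sparkContext).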
Hope it answers your question 😅
Enjoy
Cheers
Andy
OK,
the easiest is to use the :cp route then: just create a cell with :cp pointing to your jar.
The thing is that your jar will need to have all its dependencies in it (an uber jar / assembly).
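So the cell would look something like this (the path is just a placeholder for wherever your assembly jar lives):

    :cp /home/me/my-project/target/scala-2.10/my-project-assembly-0.1.0.jar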
To use :dp, you'll need your project published to an Ivy2 repo (sbt publishLocal) or to a Maven repo. For the former you can use the local repo pointing to your .ivy2 folder; for the latter you'll need to add your local or remote Maven repo to the remote repos.
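Roughly (I don't have the notebook in front of me, so treat the exact syntax as approximate and the coordinates as placeholders), a :dp cell looks like:

    :dp "com.example" % "my-project" % "0.1.0"

with your local Ivy2 folder or your Maven repo declared on top of the default remote repos so the resolution can find your artifact.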
Sorry, writing on a phone now... hard.