:cp and :dp not working

Franklin Zhang

Aug 17, 2015, 1:36:43 PM
to spark-notebook-user
Hello, 

When I run :cp and :dp, I get the following error:

:cp /home/zhangfranklin_org/.ivy2/local/default/neuron_2.11/0.1-SNAPSHOT/jars/neuron_2.11.jar
<console>:55: error: not found: value jars
              jars.toList
              ^
<console>:59: error: not found: value jars
val $ires3 = jars
             ^
<console>:50: error: not found: value jars
              jars = (List("/home/zhangfranklin_org/.ivy2/local/default/neuron_2.11/0.1-SNAPSHOT/jars/neuron_2.11.jar") ::: jars.toList).distinct.toArray


andy petrella

Aug 23, 2015, 1:56:45 PM
to Franklin Zhang, spark-notebook-user

Argh, damn, I didn't see your message; I was just catching up on my mail after my trip to SFO, sorry.

Do you use a distro? 2.11-based? It'll help me debug this.

Cheers
Andy


--
You received this message because you are subscribed to the Google Groups "spark-notebook-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-notebook-...@googlegroups.com.
To post to this group, send email to spark-not...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/spark-notebook-user/3edadc0c-77bd-48bd-ae70-6db8068d60a0%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
--
andy

florian....@googlemail.com

Aug 28, 2015, 11:35:11 AM
to spark-notebook-user, zhangfra...@gmail.com
Hi Andy!

I get the same error when I use :dp to load a dependency.

I have two cells:
:remote-repo oss-sonatype % default % https://oss.sonatype.org/content/repositories/releases/

:dp "net.sourceforge.jregex % jregex % 1.2_01"

The error is the following:
<console>:55: error: not found: value jars
              jars.toList
              ^
<console>:59: error: not found: value jars
val $ires3 = jars
             ^
<console>:50: error: not found: value jars
              jars = (List("/tmp/spark-notebook/aether/12c780b7-11b9-47a9-9c0d-97e66686892e/net/sourceforge/jregex/jregex/1.2_01/jregex-1.2_01.jar") ::: jars.toList).distinct.toArray
              ^

I'm using this version of notebook: spark-notebook-0.6.0-scala-2.11.6-spark-1.4.1-hadoop-1.0.4

Do you have any idea on how to fix it?

Btw: is there any place where I can find example notebooks? Sometimes it helps to browse through some working examples.

Thx!
Cheers,
Florian


andy petrella

Aug 28, 2015, 11:45:14 AM
to Franklin Zhang, spark-notebook-user
Yes, that's something that has been fixed on master, for Scala 2.11 only.

So you can build the master branch, for instance, or switch back to 2.10.

I might publish a patch release for this problem soon, a 0.6.1 actually.

Sorry for the inconvenience :-(
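For the curious: the REPL output above shows what `:cp` generates under the hood. It prepends the new path to a `jars` array that the 2.11 build of 0.6.0 never bound in the interpreter, hence "not found: value jars". A minimal Scala sketch of that generated snippet (the `jars` name and the rewrite are taken from the error output in this thread; the initial binding is the part that was missing):

```scala
// The binding the broken 0.6.0/Scala 2.11 interpreter failed to predefine:
var jars: Array[String] = Array.empty[String]

// Roughly what a cell like `:cp /path/to/neuron_2.11.jar` expands to,
// reconstructed from the error messages above (path is a placeholder):
jars = (List("/path/to/neuron_2.11.jar") ::: jars.toList).distinct.toArray

println(jars.mkString(","))
```

With the binding present the rewrite is harmless; without it, every reference to `jars` fails exactly as in the three errors above.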
--
andy

Florian Witteler

Aug 28, 2015, 12:09:15 PM
to andy petrella, Franklin Zhang, spark-notebook-user
Awesome, thanks a lot. 

Don't be sorry. Thanks for your effort creating and maintaining this awesome tool!

Cheers, Florian

andy petrella

Aug 28, 2015, 7:13:46 PM
to Florian Witteler, Franklin Zhang, spark-notebook-user
Thanks a lot for the kind words :-D

So, just to be sure: do you need this new minor release, or can you live without it until the next 0.7.0? That's probably coming within the next couple of months.
Anyway, I'll publish a 0.6.1 if Spark 1.5.0 is released before then... ^-^

cheers

--
andy

Florian Witteler

unread,
Aug 28, 2015, 10:12:09 PM8/28/15
to andy petrella, Florian Witteler, Franklin Zhang, spark-notebook-user
A new minor release would be awesome.
We're parsing some log files from our nginx instances with Spark in an exploratory manner.
I noticed that Java's regex engine is the bottleneck there.
Using that other regex lib is ~3.5x faster (1.2 min vs. 7.3 min).
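For context, a minimal sketch of the kind of line parsing involved, using java.util.regex on an invented combined-log-style line (the real nginx log_format is an assumption here); jregex exposes a similar Pattern/Matcher-style API, so switching engines is largely a matter of the import plus the :dp dependency:

```scala
import java.util.regex.Pattern

// Hypothetical combined-log-style access line (invented sample data):
val line =
  """127.0.0.1 - - [28/Aug/2015:22:12:09 +0000] "GET /index.html HTTP/1.1" 200 612"""

// Compile once and reuse across lines; re-compiling per line is a
// classic regex bottleneck regardless of which engine is used.
val access = Pattern.compile(
  """^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) \S+" (\d{3}) (\d+)""")

// Extract client IP, request path, and status code from one line.
def parse(l: String): Option[(String, String, Int)] = {
  val m = access.matcher(l)
  if (m.find()) Some((m.group(1), m.group(4), m.group(5).toInt)) else None
}

println(parse(line))
```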

An out-of-the-box working spark-notebook would come in handy. 
Hope there aren't too many manual steps involved. 
Thanks for your time!

Best,
Florian 

andy petrella

Aug 30, 2015, 10:58:17 AM
to Florian Witteler, Franklin Zhang, spark-notebook-user
Okido,

Actually, I guess you could live with the master build then; it's rather stable, and you can easily get a distro you like from http://spark-notebook.io too. Just pick the master version for the notebook, and after ~4 minutes you'll have the download link in your mail :-).

Since it'd require me to backport the fix to the branch and publish it to the generator, a 0.6.1 could take me some time and won't be as fast as the option above. But of course, I'll do it if using the master build is a blocker for you.

Hope it can work :-P

--
andy

razumo...@gmail.com

Aug 31, 2015, 3:27:22 AM
to spark-notebook-user, florian....@googlemail.com, zhangfra...@gmail.com
Hi,
I got the same problem, and just wanted to add that switching to the 2.10 release only fixes the :cp/:dp part of the problem; updating local jars as in the "Simple (hard link) classpath" example still gives the same exception.

Cheers
Max


andy petrella

Sep 2, 2015, 3:31:31 PM
to razumo...@gmail.com, spark-notebook-user, florian....@googlemail.com, zhangfra...@gmail.com

Hi Max,

I just had some time to check the notebook you talked about.
So it needs to be updated, because

jars = ("~/.m2/repository/joda-time/joda-time/2.4/joda-time-2.4.jar" :: jars.toList).toArray
reset()

is wrong and outdated:
1/ the path is wrong :-/
2/ it's not necessary anymore :-) → the jar is put in the Spark context by default since 0.6.0.

So I just updated it locally with a newer version of the jar, which I checked exists locally at that path, and then the notebook worked fine:

:cp ~/.ivy2/cache/joda-time/joda-time/jars/joda-time-2.8.2.jar

HTH

andy
