Get java.lang.ExceptionInInitializerError after command "sbt\sbt package"

PeacefulBY

May 1, 2013, 2:49:50 PM5/1/13
to spark...@googlegroups.com
I want to install Spark.

My machine:
Windows Server 2003
Spark 0.7.0, unzipped to D:\spark
Scala 2.9.2, unzipped to D:\scala (its bin folder added to the Path environment variable)
Java installed the same way

Using cmd I cd to D:\spark, and after running "sbt\sbt package" I get java.lang.ExceptionInInitializerError.
Log (all jars were already downloaded on a previous attempt):
D:\spark>sbt\sbt package
[info] Loading project definition from D:\spark\project\project
[info] Loading project definition from D:\spark\project
[info] Set current project to root (in build file:/D:/spark/)
[info] Compiling twirl template ...\spark\common\layout.scala.html to .../layout.template.scala
[error] {file:/D:/spark/}core/*:twirl-compile: java.lang.ExceptionInInitializerError
[error] Total time: 1 s, completed 2013-5-2 2:40:17

Error log in the file D:\spark\core\target\streams\$global\twirl-compile\$global\out:
[debug] Preparing 16 Twirl template(s) ...
[info] Compiling twirl template ...\spark\common\layout.scala.html to .../layout.template.scala
java.lang.ExceptionInInitializerError
at twirl.compiler.TwirlCompiler$TemplateAsFunctionCompiler$.getFunctionMapping(TwirlCompiler.scala:549)
at twirl.compiler.TwirlCompiler$.generateFinalTemplate(TwirlCompiler.scala:488)
at twirl.compiler.TwirlCompiler$.compile(TwirlCompiler.scala:186)
at twirl.sbt.TemplateCompiler$$anonfun$compile$3.apply(TemplateCompiler.scala:46)
at twirl.sbt.TemplateCompiler$$anonfun$compile$3.apply(TemplateCompiler.scala:44)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:44)
at twirl.sbt.TemplateCompiler$.compile(TemplateCompiler.scala:44)
at twirl.sbt.TwirlPlugin$Twirl$$anonfun$settings$4.apply(TwirlPlugin.scala:50)
at twirl.sbt.TwirlPlugin$Twirl$$anonfun$settings$4.apply(TwirlPlugin.scala:50)
at sbt.Scoped$$anonfun$hf6$1.apply(Structure.scala:477)
at sbt.Scoped$$anonfun$hf6$1.apply(Structure.scala:477)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:41)
at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$11.apply(Structure.scala:295)
at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$11.apply(Structure.scala:295)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$5.work(System.scala:67)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:221)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:221)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:227)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:221)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:221)
at sbt.CompletionService$$anon$1$$anon$2.call(CompletionService.scala:26)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: scala.tools.nsc.MissingRequirementError: object scala not found.
at scala.tools.nsc.symtab.Definitions$definitions$.getModuleOrClass(Definitions.scala:655)
at scala.tools.nsc.symtab.Definitions$definitions$.getModule(Definitions.scala:605)
at scala.tools.nsc.symtab.Definitions$definitions$.ScalaPackage(Definitions.scala:145)
at scala.tools.nsc.symtab.Definitions$definitions$.ScalaPackageClass(Definitions.scala:146)
at scala.tools.nsc.symtab.Definitions$definitions$.AnyClass(Definitions.scala:176)
at scala.tools.nsc.symtab.Definitions$definitions$.init(Definitions.scala:814)
at scala.tools.nsc.Global$Run.<init>(Global.scala:697)
at scala.tools.nsc.interactive.Global$TyperRun.<init>(Global.scala:927)
at scala.tools.nsc.interactive.Global.newTyperRun(Global.scala:950)
at scala.tools.nsc.interactive.Global.<init>(Global.scala:166)
at twirl.compiler.TwirlCompiler$TemplateAsFunctionCompiler$CompilerInstance.compiler(TwirlCompiler.scala:602)
at twirl.compiler.TwirlCompiler$TemplateAsFunctionCompiler$PresentationCompiler$.<init>(TwirlCompiler.scala:646)
at twirl.compiler.TwirlCompiler$TemplateAsFunctionCompiler$PresentationCompiler$.<clinit>(TwirlCompiler.scala)
at twirl.compiler.TwirlCompiler$TemplateAsFunctionCompiler$.getFunctionMapping(TwirlCompiler.scala:549)
at twirl.compiler.TwirlCompiler$.generateFinalTemplate(TwirlCompiler.scala:488)
at twirl.compiler.TwirlCompiler$.compile(TwirlCompiler.scala:186)
at twirl.sbt.TemplateCompiler$$anonfun$compile$3.apply(TemplateCompiler.scala:46)
at twirl.sbt.TemplateCompiler$$anonfun$compile$3.apply(TemplateCompiler.scala:44)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:44)
at twirl.sbt.TemplateCompiler$.compile(TemplateCompiler.scala:44)
at twirl.sbt.TwirlPlugin$Twirl$$anonfun$settings$4.apply(TwirlPlugin.scala:50)
at twirl.sbt.TwirlPlugin$Twirl$$anonfun$settings$4.apply(TwirlPlugin.scala:50)
at sbt.Scoped$$anonfun$hf6$1.apply(Structure.scala:477)
at sbt.Scoped$$anonfun$hf6$1.apply(Structure.scala:477)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:41)
at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$11.apply(Structure.scala:295)
at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$11.apply(Structure.scala:295)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$5.work(System.scala:67)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:221)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:221)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:227)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:221)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:221)
at sbt.CompletionService$$anon$1$$anon$2.call(CompletionService.scala:26)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
[error] {file:/D:/spark/}core/*:twirl-compile: java.lang.ExceptionInInitializerError

How can I fix it?

Patrick Wendell

May 1, 2013, 9:06:16 PM5/1/13
to spark...@googlegroups.com
Hi There,

I haven't seen that particular error, but it looks like an interaction between your system and the Twirl template engine. I see on the Twirl users list that people have hit this error before when using older Java compilers - you might try making sure you have at least Java 6 installed.

Also, in general, if you are building Spark for development you may want to set up a Linux VM or similar; developing on Windows is not widely done, AFAIK.


--
You received this message because you are subscribed to the Google Groups "Spark Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-users...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
 
 

PeacefulBY

May 2, 2013, 5:34:25 AM5/2/13
to spark...@googlegroups.com
Thank you :)
My Java version is 1.7.
Yes, you are right - I suspect something is wrong on Windows Server/XP, but I need to use this environment for now.
On Windows Server/XP the .sbt folder ends up under "C:\Documents and Settings\[User]", which contains spaces - I think that is the problem.
So my question is: how can I configure sbt to change the default .sbt folder path? Should I change build.properties, plugins.sbt, or SparkBuild.scala in D:\spark\project?
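As a toy illustration (not sbt's actual code, and the path below is just an example), here is how a Windows XP-era home directory with spaces can fragment when a tool splits a command line on whitespace without quoting:

```python
# Hypothetical example path, as on Windows Server 2003 / XP.
home = r"C:\Documents and Settings\User\.sbt"

# If a tool builds a command line and splits it on whitespace
# without quoting, the single path breaks into several bogus pieces.
parts = home.split(" ")

print(parts)       # three fragments instead of one path
print(len(parts))  # 3
```

This is why moving .sbt to a directory whose path has no spaces (or quoting every path) avoids the problem.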

On Thursday, May 2, 2013 at 9:06:16 AM UTC+8, Patrick Wendell wrote:

PeacefulBY

May 8, 2013, 4:59:42 AM5/8/13
to spark...@googlegroups.com
Fixed it by editing sbt.cmd from:

set SPARK_HOME=%~dp0..
java -Xmx1200M -XX:MaxPermSize=200m %EXTRA_ARGS% -jar %SPARK_HOME%\sbt\sbt-launch-0.11.3-2.jar "%*"

to:

set SBT_TEMP=-Dsbt.global.base=D:\sbt\.sbt
set IVY_HOME=-Dsbt.ivy.home=D:\sbt\.ivy2
set SPARK_HOME=%~dp0..
java -Xmx1200M -XX:MaxPermSize=200m %EXTRA_ARGS% %SBT_TEMP% %IVY_HOME% -jar %SPARK_HOME%\sbt\sbt-launch-0.11.3-2.jar "%*"
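For reference, the same relocation can also be done as a one-off without editing sbt.cmd, since any -D flags placed before -jar become JVM system properties (the D:\sbt paths here are just examples; any directory whose path contains no spaces should work):

```bat
rem One-off run with relocated sbt and Ivy directories (example paths)
java -Xmx1200M -XX:MaxPermSize=200m -Dsbt.global.base=D:\sbt\.sbt -Dsbt.ivy.home=D:\sbt\.ivy2 -jar sbt\sbt-launch-0.11.3-2.jar package
```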

On Thursday, May 2, 2013 at 5:34:25 PM UTC+8, PeacefulBY wrote: