Play Framework 2.5.x and incompatibility with Spark 1.6.1

Karthik Ram

May 19, 2016, 5:00:02 PM
to play-framework
I'm struggling to get Play 2.5.x to work with Spark 1.6.1.

A simple Scala Spark app works without any issues if I use Play 2.2.0 and Spark 1.0.1 (I know, ancient); anything newer fails at run time.

Can someone help / fix this?

Here's my build.sbt:

name := """spark-test-1"""

version := "1.0-SNAPSHOT"

lazy val root = (project in file(".")).enablePlugins(PlayScala)

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  jdbc,
  cache,
  ws,
  "org.scalatestplus.play" %% "scalatestplus-play" % "1.5.1" % Test,
  "org.apache.spark"  %% "spark-core"              % "1.6.1",
  "com.typesafe.akka" %% "akka-actor"              % "2.2.3", 
  "com.typesafe.akka" %% "akka-slf4j"              % "2.2.3",
  "org.apache.spark"  %% "spark-sql"               % "1.6.1",
  "org.apache.spark"  %% "spark-mllib"             % "1.6.1"
)

resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"


I've tried higher versions of Akka as well (2.4.4), and they all sadly give the same result:

[info] application - ApplicationTimer demo: Starting application at 2016-05-19T20:51:33.009Z.
[info] play.api.Play - Application started (Dev)
[warn] o.a.h.u.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer overrides final method withResolved.(Lcom/fasterxml/jackson/databind/BeanProperty;Lcom/fasterxml/jackson/databind/jsontype/TypeSerializer;Lcom/fasterxml/jackson/databind/JsonSerializer;)Lcom/fasterxml/jackson/databind/ser/std/AsArraySerializerBase;
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.fasterxml.jackson.module.scala.ser.IteratorSerializerModule$class.$init$(IteratorSerializerModule.scala:70)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.<init>(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<init>(DefaultScalaModule.scala:35)
at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<clinit>(DefaultScalaModule.scala)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:81)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
at utils.SparkSQL$.simpleSparkSQLApp(SparkSQL.scala:21)
at controllers.HomeController$$anonfun$index$1$$anonfun$apply$1.apply$mcV$sp(HomeController.scala:25)
at controllers.HomeController$$anonfun$index$1$$anonfun$apply$1.apply(HomeController.scala:25)
at controllers.HomeController$$anonfun$index$1$$anonfun$apply$1.apply(HomeController.scala:25)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
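
For context, the call that blows up is just SparkContext.textFile from utils.SparkSQL.simpleSparkSQLApp, invoked inside a Future from HomeController.index (per the trace above). The sketch below is a reconstruction from that stack trace, not the project's actual source: the object and method names come from the trace, while the body, the file path, and the SparkConf settings are assumptions.

package utils

import org.apache.spark.{SparkConf, SparkContext}

object SparkSQL {
  // Assumed shape of the code: build a local SparkContext and read a text file.
  // SparkContext.textFile initialises RDDOperationScope, which loads Jackson's
  // DefaultScalaModule; that is where the VerifyError above is thrown.
  def simpleSparkSQLApp(): Long = {
    val conf = new SparkConf().setAppName("spark-test-1").setMaster("local[*]")
    val sc   = new SparkContext(conf)
    try sc.textFile("data/sample.txt").count()
    finally sc.stop()
  }
}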

Greg Methvin

May 21, 2016, 4:30:35 PM
to play-framework
It looks like a Jackson version incompatibility. Spark 1.6.1 uses an older version of Jackson and includes jackson-module-scala. Play uses a newer version of Jackson but not jackson-module-scala (since it doesn't use Jackson directly for Scala JSON serialization). I would first try upgrading to the latest jackson-module-scala in your project (2.7.2). I'm not sure what the binary compatibility is like for Jackson, so if that doesn't work you may have to play with the versions. At a minimum, Jackson and jackson-module-scala should be at compatible versions.
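
Something along these lines in build.sbt should pull it in (a minimal sketch assuming sbt 0.13; the dependencyOverrides block is an optional, untested extra just to keep jackson-core and jackson-databind in step with jackson-module-scala, and the matching 2.7.2 versions there are an assumption):

libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.7.2"

// Optional: pin the core Jackson artifacts to the same release so databind
// and the Scala module cannot drift apart (sbt 0.13 Set syntax).
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-core"     % "2.7.2",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.7.2"
)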

Greg Methvin

May 22, 2016, 1:15:13 AM
to play-framework
Hi Karthik,

I just downloaded the test application you posted on GitHub. It seems that adding this line to build.sbt to set the jackson-module-scala version gets it to start correctly:

  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.7.2",

There could still be compatibility issues, but it's probably okay as long as Spark isn't using any deprecated Jackson functionality. We've generally been able to upgrade minor Jackson versions in Play with few or no changes.
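
For completeness, here's the dependency block from the original build.sbt with that one line folded in (unchanged apart from the added jackson-module-scala entry):

libraryDependencies ++= Seq(
  jdbc,
  cache,
  ws,
  "org.scalatestplus.play"       %% "scalatestplus-play"    % "1.5.1" % Test,
  "org.apache.spark"             %% "spark-core"            % "1.6.1",
  "com.typesafe.akka"            %% "akka-actor"            % "2.2.3",
  "com.typesafe.akka"            %% "akka-slf4j"            % "2.2.3",
  "org.apache.spark"             %% "spark-sql"             % "1.6.1",
  "org.apache.spark"             %% "spark-mllib"           % "1.6.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala"  % "2.7.2"
)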

--
Greg Methvin
Senior Software Engineer
