--
You received this message because you are subscribed to the Google Groups "Druid User" group.
To unsubscribe from this group and stop receiving emails from it, send an email to druid-user+...@googlegroups.com.
To post to this group, send email to druid...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/druid-user/1333f559-c887-47fd-a288-0fb2f1e8b004%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Thanks Gian for your response. I have both versions of the jackson jars on the classpath, and Spark is picking up the old one bundled with Spark rather than the newer one bundled with my application jar.

I have received an answer on how to force the newer one at http://stackoverflow.com/questions/34431329/spark-druid-tranquility-library-version-conflict. I'm going to try that when I'm back from vacation next week, and I will update the post once I've tried it.

Thanks again,
Ashish
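(For anyone finding this thread later: the usual way to make Spark prefer the jars bundled with the application over its own copies is the experimental "user classpath first" switches. A sketch, assuming Spark 1.5.x as in this thread; older releases spell these settings differently:)

```
# spark-defaults.conf, or passed via --conf to spark-submit
spark.driver.userClassPathFirst    true
spark.executor.userClassPathFirst  true
```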
On Tue, Dec 29, 2015 at 4:02 AM, Gian Merlino <gi...@imply.io> wrote:
Hey Ashish,

I'm guessing this is due to mixing different versions of jackson-databind with jackson-datatype-joda (possibly 2.6.1 of joda with an older databind). Would it work for you to include the same version of both jackson jars on Spark's classpath? I *think* you shouldn't have to recompile Spark - just get it to load the newer jacksons.

If not, then we can possibly bundle tranquility-spark specifically with an older version of jackson.

Either way, if you could update this thread or https://github.com/druid-io/tranquility/issues/76 with whether you do end up finding a workaround, that would be super helpful.

Thanks!
Gian
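(Gian's suggestion - the same version of both jackson jars - can be expressed as an sbt override in the application build. A sketch; the 2.6.1 number is taken from his guess above and the module list is illustrative, not verified against tranquility's actual dependency tree:)

```scala
// build.sbt (sketch): pin every jackson module to one version so
// jackson-databind and jackson-datatype-joda can't drift apart
val jacksonVersion = "2.6.1" // assumed, from the joda version mentioned above

dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core"     % "jackson-core"          % jacksonVersion,
  "com.fasterxml.jackson.core"     % "jackson-databind"      % jacksonVersion,
  "com.fasterxml.jackson.datatype" % "jackson-datatype-joda" % jacksonVersion
)
```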
On Wed, Dec 23, 2015 at 3:59 AM, Ashish Awasthi <ashish....@gmail.com> wrote:
I've posted an issue I'm facing with using Spark and Tranquility to: http://stackoverflow.com/questions/34431329/spark-druid-tranquility-library-version-conflict

I'm writing here just to get the attention of Druid users; I don't want to duplicate the post here. I wonder if Tranquility-Spark works with a standard Spark build. Any pointers to resolve the conflict?
Caused by: com.google.inject.CreationException: Guice creation errors:

1) An exception was caught and reported. Message: Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)

2) No implementation for javax.validation.Validator was bound.
	at io.druid.guice.ConfigModule.configure(ConfigModule.java:37)

2 errors
	at com.google.inject.internal.Errors.throwCreationExceptionIfErrorsExist(Errors.java:435)
	at com.google.inject.internal.InternalInjectorCreator.initializeStatically(InternalInjectorCreator.java:154)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:106)
	at com.google.inject.Guice.createInjector(Guice.java:95)
	at com.google.inject.Guice.createInjector(Guice.java:72)
	at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
	at com.metamx.tranquility.druid.DruidGuicer$.<init>(DruidGuicer.scala:39)
	at com.metamx.tranquility.druid.DruidGuicer$.<clinit>(DruidGuicer.scala)
	... 17 more
Caused by: javax.validation.ValidationException: Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
	at javax.validation.Validation$GenericBootstrapImpl.configure(Validation.java:271)
	at javax.validation.Validation.buildDefaultValidatorFactory(Validation.java:110)
	at io.druid.guice.ConfigModule.configure(ConfigModule.java:37)
	at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
	at com.google.inject.spi.Elements.getElements(Elements.java:101)
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103)
"io.druid" % "druid" % "0.7.3",
"io.druid" % "druid-processing" % "0.7.3",
"io.druid" % "tranquility-core_2.10" % "0.6.4",
"io.druid" % "tranquility-spark_2.10" % "0.6.4",
"org.hibernate" % "hibernate-validator" % "4.2.0.Final",
"org.hibernate" % "hibernate-validator-annotation-processor" % "4.1.0.Final"
16/01/07 16:04:54 INFO Guice: An exception was caught and reported. Message: javax.validation.ValidationException: Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
javax.validation.ValidationException: Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
	at javax.validation.Validation$GenericBootstrapImpl.configure(Validation.java:271)
	at javax.validation.Validation.buildDefaultValidatorFactory(Validation.java:110)
	at io.druid.guice.ConfigModule.configure(ConfigModule.java:37)
	at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:230)
	at com.google.inject.spi.Elements.getElements(Elements.java:103)
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:136)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
	at com.google.inject.Guice.createInjector(Guice.java:96)
	at com.google.inject.Guice.createInjector(Guice.java:73)
	at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
	at com.metamx.tranquility.druid.DruidGuicer$.<init>(DruidGuicer.scala:39)
	at com.metamx.tranquility.druid.DruidGuicer$.<clinit>(DruidGuicer.scala)
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig$$anon$6.<init>(DruidBeams.scala:261)
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig.buildAll(DruidBeams.scala:259)
	at com.metamx.tranquility.druid.DruidBeams$Builder.buildBeam(DruidBeams.scala:182)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam$lzycompute(MapBeamFactory.scala:47)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam(MapBeamFactory.scala:22)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:37)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:36)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:67)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
16/01/07 16:04:54 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.NoClassDefFoundError: Could not initialize class com.metamx.tranquility.druid.DruidGuicer$
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig$$anon$6.<init>(DruidBeams.scala:261)
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig.buildAll(DruidBeams.scala:259)
	at com.metamx.tranquility.druid.DruidBeams$Builder.buildBeam(DruidBeams.scala:182)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam$lzycompute(MapBeamFactory.scala:47)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam(MapBeamFactory.scala:22)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:37)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:36)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:67)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
16/01/07 16:04:54 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.ExceptionInInitializerError
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig$$anon$6.<init>(DruidBeams.scala:261)
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig.buildAll(DruidBeams.scala:259)
	at com.metamx.tranquility.druid.DruidBeams$Builder.buildBeam(DruidBeams.scala:182)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam$lzycompute(MapBeamFactory.scala:47)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam(MapBeamFactory.scala:22)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:37)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:36)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:67)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.inject.CreationException: Guice creation errors:

1) An exception was caught and reported. Message: Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:136)

2) No implementation for javax.validation.Validator was bound.
	at io.druid.guice.ConfigModule.configure(ConfigModule.java:37)

2 errors
	at com.google.inject.internal.Errors.throwCreationExceptionIfErrorsExist(Errors.java:448)
	at com.google.inject.internal.InternalInjectorCreator.initializeStatically(InternalInjectorCreator.java:155)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:107)
	at com.google.inject.Guice.createInjector(Guice.java:96)
	at com.google.inject.Guice.createInjector(Guice.java:73)
	at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:57)
	at com.metamx.tranquility.druid.DruidGuicer$.<init>(DruidGuicer.scala:39)
	at com.metamx.tranquility.druid.DruidGuicer$.<clinit>(DruidGuicer.scala)
	... 17 more
Caused by: javax.validation.ValidationException: Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
	at javax.validation.Validation$GenericBootstrapImpl.configure(Validation.java:271)
	at javax.validation.Validation.buildDefaultValidatorFactory(Validation.java:110)
	at io.druid.guice.ConfigModule.configure(ConfigModule.java:37)
	at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:230)
	at com.google.inject.spi.Elements.getElements(Elements.java:103)
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:136)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
	... 22 more
16/01/07 16:04:54 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoClassDefFoundError: Could not initialize class com.metamx.tranquility.druid.DruidGuicer$
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig$$anon$6.<init>(DruidBeams.scala:261)
	at com.metamx.tranquility.druid.DruidBeams$BuilderConfig.buildAll(DruidBeams.scala:259)
	at com.metamx.tranquility.druid.DruidBeams$Builder.buildBeam(DruidBeams.scala:182)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam$lzycompute(MapBeamFactory.scala:47)
	at com.hulu.metrics.streaming.MapBeamFactory.makeBeam(MapBeamFactory.scala:22)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:37)
	at com.metamx.tranquility.spark.BeamRDD$$anonfun$propagate$1.apply(BeamRDD.scala:36)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:903)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1935)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:67)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
-val jacksonTwoVersion = "2.6.3"
+val jacksonTwoVersion = "2.4.6"
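(When trying pins like the one above, it helps to confirm which jar a class was actually loaded from at runtime, on the driver and inside the executors. A small sketch - illustrative, not from the thread:)

```scala
// Print the jar a class was loaded from - useful for spotting which of
// two competing jackson versions the JVM actually picked up.
def jarOf(className: String): String = {
  val src = Class.forName(className).getProtectionDomain.getCodeSource
  if (src == null) "(bootstrap classpath)" else src.getLocation.toString
}

// e.g. run on the driver, and inside a foreachPartition on an executor:
println(jarOf("com.fasterxml.jackson.databind.ObjectMapper"))
```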