I am running Sparkling Water 1.5.3 on HDP 2.2.4. Sometimes (not always) I get the following error when starting the H2OContext.
Does anyone have an idea what the "Cloud size under 50" error means and how to fix it?
===================================================
bash-4.1$ ./sparkling-water-shell.sh
-----
Spark master (MASTER) : local-cluster[10,2,8192]
Spark home (SPARK_HOME) : /userapps/hadoop/spark-1.6.0
H2O build version : 3.2.0.9 (slater)
Spark build version : 1.5.0
----
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.0
/_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
16/03/16 02:16:59 WARN HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist
16/03/16 02:16:59 WARN HiveConf: HiveConf of name hive.heapsize does not exist
16/03/16 02:16:59 WARN HiveConf: HiveConf of name hive.semantic.analyzer.factory.impl does not exist
16/03/16 02:16:59 WARN HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
16/03/16 02:16:59 WARN HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
[... same five HiveConf warnings repeated; omitted ...]
Spark context available as sc.
[... same five HiveConf warnings repeated twice more; omitted ...]
16/03/16 02:17:26 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/03/16 02:17:26 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
[... same five HiveConf warnings repeated twice more; omitted ...]
SQL context available as sqlContext.
Loading /home_dir/svdfe001/var/ts-features-wf/test.scala...
import com.typesafe.config._
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.functions._
import org.apache.spark.sql.DataFrameNaFunctions
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.fs.Path
import org.apache.spark.sql.hive.orc._
import org.apache.spark.sql._
import org.apache.spark.h2o._
import water._
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.expressions.WindowSpec
import org.apache.spark.sql.Column
import org.apache.spark.sql.Row
[... same five HiveConf warnings repeated twice more; omitted ...]
16/03/16 02:17:48 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
[... same five HiveConf warnings repeated twice more; omitted ...]
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@7b919e50
import sqlContext.implicits._
16/03/16 02:17:55 WARN H2OContext: Increasing 'spark.locality.wait' to value 30000
java.lang.RuntimeException: Cloud size under 50
at water.H2O.waitForCloudSize(H2O.java:1374)
at org.apache.spark.h2o.H2OContext.start(H2OContext.scala:154)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:64)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:69)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:75)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:77)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:87)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:89)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:91)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:93)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:95)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:97)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:99)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:101)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:103)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:105)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:107)
at $iwC$$iwC$$iwC.<init>(<console>:109)
at $iwC$$iwC.<init>(<console>:111)
at $iwC.<init>(<console>:113)
at <init>(<console>:115)
at .<init>(<console>:119)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
at scala.reflect.io.File.applyReader(File.scala:82)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
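For context on where this comes from: the trace shows the exception is thrown inside `H2OContext.start`, which calls `H2O.waitForCloudSize` — H2O waits for a fixed number of worker nodes (here 50) to join the cloud and throws if they have not all reported in before a timeout. A minimal sketch of the kind of setup involved, assuming the Sparkling Water 1.5.x-style API and assuming the expected cloud size is driven by a `spark.ext.h2o.cluster.size` property (both assumptions on my part, not confirmed from this log):

```scala
// Hypothetical sketch, not my actual script. The property name
// spark.ext.h2o.cluster.size is an assumption; check the Sparkling Water docs
// for your exact version.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.h2o.H2OContext

val conf = new SparkConf()
  .setMaster("local-cluster[10,2,8192]")      // 10 executors, 2 cores, 8192 MB each
  .set("spark.ext.h2o.cluster.size", "10")    // expected H2O cloud size should match
                                              // the executor count, not exceed it
val sc = new SparkContext(conf)
val h2oContext = new H2OContext(sc).start()   // internally calls H2O.waitForCloudSize
```

If the launch script pins the expected cloud size at 50 while the master URL only provides 10 executors, the cloud can never fully form, and slow executor startup would also explain why the failure is intermittent rather than constant.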