Hi all. I'm trying to connect Spark to Cassandra, but I'm stuck on this error:
ERROR TableMetadata: Error parsing schema options for table system_traces.events: Cluster.getMetadata().getKeyspace("system_traces").getTable("events").getOptions() will return null
I'm using Spark 1.4.1 with the Spark Cassandra Connector 1.4.0 on Scala 2.11 (the application itself is built against spark-core 1.4.0, as the classpath in the output shows).
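For reference, this is roughly my build definition. It's a sketch reconstructed from the jars on the classpath in the output below (Scala 2.11.7, spark-core 1.4.0, connector 1.4.0), so the exact file may differ:

// build.sbt (sketch; versions taken from the jars on the classpath below)
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.4.0",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0"
)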
Cassandra is running, and I created the example tables from the quickstart guide.
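The driver program boils down to the minimal sketch below. The keyspace, table, and column names are placeholders, but the shape matches my Main.scala: build a SparkConf pointing at the standalone master and the local Cassandra node (both addresses appear in the output), parallelize a small collection, and call saveToCassandra, which is where the stack traces originate.

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

object Main extends App {
  // master and Cassandra host as they appear in the log below
  val conf = new SparkConf()
    .setAppName("cassandra_one")
    .setMaster("spark://192.168.1.6:7077")
    .set("spark.cassandra.connection.host", "127.0.0.1")
  val sc = new SparkContext(conf)

  // parallelize a small collection (Main.scala:57 in the traces) ...
  val rdd = sc.parallelize(Seq(("key1", 1), ("key2", 2)))

  // ... and write it to Cassandra (Main.scala:58); placeholder
  // keyspace/table/columns, the real ones come from the quickstart guide
  rdd.saveToCassandra("test", "kv", SomeColumns("key", "value"))
}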
Thanks for your time.
Here's the output:
/usr/lib/jvm/java-8-oracle/bin/java -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:53719,suspend=y,server=n -Dfile.encoding=UTF-8 -classpath /usr/lib/jvm/java-8-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-8-oracle/jre/lib/deploy.jar:/usr/lib/jvm/java-8-oracle/jre/lib/plugin.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfxswt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-8-oracle/jre/lib/rt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-8-oracle/jre/lib/javaws.jar:/usr/lib/jvm/java-8-oracle/jre/lib/management-agent.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/jfxrt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/dnsns.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunpkcs11.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/nashorn.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunec.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/localedata.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/sunjce_provider.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/cldrdata.jar:/usr/lib/jvm/java-8-oracle/jre/lib/ext/zipfs.jar:/home/brunofitas/Desktop/cassandra_one/target/scala-2.11/classes:/home/brunofitas/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar:/home/brunofitas/.ivy2/cache/com.codahale.metrics/metrics-core/bundles/metrics-core-3.0.2.jar:/home/brunofitas/.ivy2/cache/com.datastax.cassandra/cassandra-driver-core/bundles/cassandra-driver-core-2.1.5.jar:/home/brunofitas/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/brunofitas/.ivy2/cache/com.twitter/jsr166e/jars/jsr166e-1.1.0.jar:/home/brunofitas/.ivy2/cache/io.netty/netty/bundles/netty-3.9.0.Final.jar:/home/brunofitas/.ivy2/cache/joda-time/joda-time/jars/joda-time-2.3.jar:/home/brunofitas/.ivy2/cache/org.apache.cassandra/cassandra-clientutil/jars/cassandra-clientutil-2.1.5.jar:/home/brunofitas/.ivy2/cache/org.apache.commons/commons-lang3/jars/commons-lang3-3.3.2.jar:/home/brunofitas/.ivy2/cache/org.joda/joda-convert/jars/joda-convert-1.2.jar:/home/brunofitas/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.6.jar:/home/brunofitas/.ivy2/cache/aopalliance/aopalliance/jars/aopalliance-1.0.jar:/home/brunofitas/.ivy2/cache/asm/asm/jars/asm-3.2.jar:/home/brunofitas/.ivy2/cache/com.clearspring.analytics/stream/jars/stream-2.7.0.jar:/home/brunofitas/.ivy2/cache/com.esotericsoftware.kryo/kryo/bundles/kryo-2.21.jar:/home/brunofitas/.ivy2/cache/com.esotericsoftware.minlog/minlog/jars/minlog-1.2.jar:/home/brunofitas/.ivy2/cache/com.esotericsoftware.reflectasm/reflectasm/jars/reflectasm-1.07-shaded.jar:/home/brunofitas/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/brunofitas/.ivy2/cache/com.fasterxml.jackson.core/jackson-core/bundles/jackson-core-2.4.4.jar:/home/brunofitas/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/brunofitas/.ivy2/cache/com.fasterxml.jackson.module/jackson-module-scala_2.11/bundles/jackson-module-scala_2.11-2.4.4.jar:/home/brunofitas/.ivy2/cache/com.google.code.findbugs/jsr305/jars/jsr305-1.3.9.jar:/home/brunofitas/.ivy2/cache/com.google.inject/guice/jars/guice-3.0.jar:/home/brunofitas/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar:/home/brunofitas/.ivy2/cache/com.ning/compress-lzf/bundles/compress-lzf-1.0.3.jar:/home/brunofitas/.ivy2/cache/com.sun.jersey/jersey-core/bundles/jersey-core-1.9.jar:/home/brunofitas/.ivy2/cache/com.sun.jersey/jersey-json/bundles/jersey-json-1.9.jar:/home/brunofitas/.ivy2/cache/com.sun.jersey/jersey-server/bundles/jersey-server-1.9.jar:/home/brunofitas/.ivy2/cache/com.sun.jersey.contribs/jersey-guice/jars/jersey-guice-1.9.jar:/home/brunofitas/.ivy2/cache/com.sun.jersey.jersey-test-framework/jersey-test-framework-grizzly2/jars/jersey-test-framework-grizzly2-1.9.jar:/home/brunofitas/.ivy2/cache/com.sun.xml.bind/jaxb-impl/jars/jaxb-impl-2.2.3-1.jar:/home/brunofitas/.ivy2/cache/com.thoughtworks.paranamer/paranamer/jars/paranamer-2.6.jar:/home/brunofitas/.ivy2/cache/com.twitter/chill-java/jars/chill-java-0.5.0.jar:/home/brunofitas/.ivy2/cache/com.twitter/chill_2.11/jars/chill_2.11-0.5.0.jar:/home/brunofitas/.ivy2/cache/com.typesafe/config/bundles/config-1.2.1.jar:/home/brunofitas/.ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.7.0.jar:/home/brunofitas/.ivy2/cache/commons-beanutils/commons-beanutils-core/jars/commons-beanutils-core-1.8.0.jar:/home/brunofitas/.ivy2/cache/commons-cli/commons-cli/jars/commons-cli-1.2.jar:/home/brunofitas/.ivy2/cache/commons-collections/commons-collections/jars/commons-collections-3.2.1.jar:/home/brunofitas/.ivy2/cache/commons-configuration/commons-configuration/jars/commons-configuration-1.6.jar:/home/brunofitas/.ivy2/cache/commons-digester/commons-digester/jars/commons-digester-1.8.jar:/home/brunofitas/.ivy2/cache/commons-httpclient/commons-httpclient/jars/commons-httpclient-3.1.jar:/home/brunofitas/.ivy2/cache/commons-io/commons-io/jars/commons-io-2.4.jar:/home/brunofitas/.ivy2/cache/commons-lang/commons-lang/jars/commons-lang-2.5.jar:/home/brunofitas/.ivy2/cache/commons-net/commons-net/jars/commons-net-2.2.jar:/home/brunofitas/.ivy2/cache/javax.activation/activation/jars/activation-1.1.jar:/home/brunofitas/.ivy2/cache/javax.inject/javax.inject/jars/javax.inject-1.jar:/home/brunofitas/.ivy2/cache/javax.xml.bind/jaxb-api/jars/jaxb-api-2.2.2.jar:/home/brunofitas/.ivy2/cache/jline/jline/jars/jline-0.9.94.jar:/home/brunofitas/.ivy2/cache/log4j/log4j/bundles/log4j-1.2.17.jar:/home/brunofitas/.ivy2/cache/net.java.dev.jets3t/jets3t/jars/jets3t-0.7.1.jar:/home/brunofitas/.ivy2/cache/net.razorvine/pyrolite/jars/pyrolite-4.4.jar:/home/brunofitas/.ivy2/cache/net.sf.py4j/py4j/jars/py4j-0.8.2.1.jar:/home/brunofitas/.ivy2/cache/org.apache.commons/commons-compress/jars/commons-compress-1.4.1.jar:/home/brunofitas/.ivy2/cache/org.apache.commons/commons-math/jars/commons-math-2.1.jar:/home/brunofitas/.ivy2/cache/org.apache.commons/commons-math3/jars/commons-math3-3.4.1.jar:/home/brunofitas/.ivy2/cache/org.apache.curator/curator-client/bundles/curator-client-2.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.curator/curator-framework/bundles/curator-framework-2.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.curator/curator-recipes/bundles/curator-recipes-2.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-annotations/jars/hadoop-annotations-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-auth/jars/hadoop-auth-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-client/jars/hadoop-client-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-common/jars/hadoop-common-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/jars/hadoop-hdfs-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-app/jars/hadoop-mapreduce-client-app-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-common/jars/hadoop-mapreduce-client-common-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-core/jars/hadoop-mapreduce-client-core-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-jobclient/jars/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-shuffle/jars/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-yarn-api/jars/hadoop-yarn-api-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-yarn-client/jars/hadoop-yarn-client-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-yarn-common/jars/hadoop-yarn-common-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.hadoop/hadoop-yarn-server-common/jars/hadoop-yarn-server-common-2.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.ivy/ivy/jars/ivy-2.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.mesos/mesos/jars/mesos-0.21.1-shaded-protobuf.jar:/home/brunofitas/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.5.jar:/home/brunofitas/.ivy2/cache/org.codehaus.jackson/jackson-jaxrs/jars/jackson-jaxrs-1.8.8.jar:/home/brunofitas/.ivy2/cache/org.codehaus.jackson/jackson-xc/jars/jackson-xc-1.8.8.jar:/home/brunofitas/.ivy2/cache/org.codehaus.jettison/jettison/bundles/jettison-1.1.jar:/home/brunofitas/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201112011016.jar:/home/brunofitas/.ivy2/cache/org.json4s/json4s-ast_2.11/jars/json4s-ast_2.11-3.2.10.jar:/home/brunofitas/.ivy2/cache/org.json4s/json4s-core_2.11/jars/json4s-core_2.11-3.2.10.jar:/home/brunofitas/.ivy2/cache/org.json4s/json4s-jackson_2.11/jars/json4s-jackson_2.11-3.2.10.jar:/home/brunofitas/.ivy2/cache/org.mortbay.jetty/jetty-util/jars/jetty-util-6.1.26.jar:/home/brunofitas/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-1.2.jar:/home/brunofitas/.ivy2/cache/org.roaringbitmap/RoaringBitmap/bundles/RoaringBitmap-0.4.5.jar:/home/brunofitas/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.11.0.jar:/home/brunofitas/.ivy2/cache/org.scala-lang/scalap/jars/scalap-2.11.0.jar:/home/brunofitas/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.11/bundles/scala-parser-combinators_2.11-1.0.1.jar:/home/brunofitas/.ivy2/cache/org.scala-lang.modules/scala-xml_2.11/bundles/scala-xml_2.11-1.0.1.jar:/home/brunofitas/.ivy2/cache/org.slf4j/jcl-over-slf4j/jars/jcl-over-slf4j-1.7.10.jar:/home/brunofitas/.ivy2/cache/org.slf4j/jul-to-slf4j/jars/jul-to-slf4j-1.7.10.jar:/home/brunofitas/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.10.jar:/home/brunofitas/.ivy2/cache/org.sonatype.sisu.inject/cglib/jars/cglib-2.2.1-v20090111.jar:/home/brunofitas/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/brunofitas/.ivy2/cache/org.tukaani/xz/jars/xz-1.0.jar:/home/brunofitas/.ivy2/cache/org.uncommons.maths/uncommons-maths/jars/uncommons-maths-1.2.2a.jar:/home/brunofitas/.ivy2/cache/org.xerial.snappy/snappy-java/bundles/snappy-java-1.1.1.7.jar:/home/brunofitas/.ivy2/cache/oro/oro/jars/oro-2.0.8.jar:/home/brunofitas/.ivy2/cache/stax/stax-api/jars/stax-api-1.0.1.jar:/home/brunofitas/.ivy2/cache/xmlenc/xmlenc/jars/xmlenc-0.52.jar:/home/brunofitas/.ivy2/cache/com.datastax.spark/spark-cassandra-connector_2.11/jars/spark-cassandra-connector_2.11-1.4.0.jar:/home/brunofitas/.ivy2/cache/commons-codec/commons-codec/jars/commons-codec-1.5.jar:/home/brunofitas/.ivy2/cache/commons-logging/commons-logging/jars/commons-logging-1.1.1.jar:/home/brunofitas/.ivy2/cache/io.dropwizard.metrics/metrics-core/bundles/metrics-core-3.1.0.jar:/home/brunofitas/.ivy2/cache/io.dropwizard.metrics/metrics-graphite/bundles/metrics-graphite-3.1.0.jar:/home/brunofitas/.ivy2/cache/io.dropwizard.metrics/metrics-json/bundles/metrics-json-3.1.0.jar:/home/brunofitas/.ivy2/cache/io.dropwizard.metrics/metrics-jvm/bundles/metrics-jvm-3.1.0.jar:/home/brunofitas/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.23.Final.jar:/home/brunofitas/.ivy2/cache/net.jpountz.lz4/lz4/jars/lz4-1.2.0.jar:/home/brunofitas/.ivy2/cache/org.apache.avro/avro/jars/avro-1.7.4.jar:/home/brunofitas/.ivy2/cache/org.codehaus.jackson/jackson-core-asl/jars/jackson-core-asl-1.8.8.jar:/home/brunofitas/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar:/home/brunofitas/.ivy2/cache/org.spark-project.akka/akka-actor_2.11/jars/akka-actor_2.11-2.3.4-spark.jar:/home/brunofitas/.ivy2/cache/org.spark-project.akka/akka-remote_2.11/jars/akka-remote_2.11-2.3.4-spark.jar:/home/brunofitas/.ivy2/cache/org.spark-project.akka/akka-slf4j_2.11/jars/akka-slf4j_2.11-2.3.4-spark.jar:/home/brunofitas/.ivy2/cache/org.spark-project.protobuf/protobuf-java/bundles/protobuf-java-2.5.0-spark.jar:/home/brunofitas/.ivy2/cache/org.tachyonproject/tachyon/jars/tachyon-0.6.4.jar:/home/brunofitas/.ivy2/cache/org.tachyonproject/tachyon-client/jars/tachyon-client-0.6.4.jar:/home/brunofitas/.ivy2/cache/org.apache.spark/spark-core_2.11/jars/spark-core_2.11-1.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.spark/spark-launcher_2.11/jars/spark-launcher_2.11-1.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.spark/spark-network-common_2.11/jars/spark-network-common_2.11-1.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.spark/spark-network-shuffle_2.11/jars/spark-network-shuffle_2.11-1.4.0.jar:/home/brunofitas/.ivy2/cache/org.apache.spark/spark-unsafe_2.11/jars/spark-unsafe_2.11-1.4.0.jar:/home/brunofitas/Applications/idea-IC-141.1532.4/lib/idea_rt.jar eu.ionseed.service.spark.Main
Connected to the target VM, address: '127.0.0.1:53719', transport: 'socket'
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/09/23 17:35:56 INFO SparkContext: Running Spark version 1.4.0
15/09/23 17:35:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/23 17:35:56 WARN Utils: Your hostname, IonseedOne resolves to a loopback address: 127.0.1.1; using 192.168.1.6 instead (on interface eth0)
15/09/23 17:35:56 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/09/23 17:35:56 INFO SecurityManager: Changing view acls to: brunofitas
15/09/23 17:35:56 INFO SecurityManager: Changing modify acls to: brunofitas
15/09/23 17:35:56 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(brunofitas); users with modify permissions: Set(brunofitas)
15/09/23 17:35:56 INFO Slf4jLogger: Slf4jLogger started
15/09/23 17:35:56 INFO Remoting: Starting remoting
15/09/23 17:35:56 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark...@192.168.1.6:34910]
15/09/23 17:35:56 INFO Utils: Successfully started service 'sparkDriver' on port 34910.
15/09/23 17:35:56 INFO SparkEnv: Registering MapOutputTracker
15/09/23 17:35:56 INFO SparkEnv: Registering BlockManagerMaster
15/09/23 17:35:56 INFO DiskBlockManager: Created local directory at /tmp/spark-f506f583-b062-4d1b-a5dc-dfae8bd0b7de/blockmgr-f778421b-a297-4691-b44c-33c9df5cfae3
15/09/23 17:35:56 INFO MemoryStore: MemoryStore started with capacity 1910.5 MB
15/09/23 17:35:56 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f506f583-b062-4d1b-a5dc-dfae8bd0b7de/httpd-9f74c6de-710b-4625-876a-42addbb354d5
15/09/23 17:35:56 INFO HttpServer: Starting HTTP Server
15/09/23 17:35:57 INFO Utils: Successfully started service 'HTTP file server' on port 48329.
15/09/23 17:35:57 INFO SparkEnv: Registering OutputCommitCoordinator
15/09/23 17:35:57 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/09/23 17:35:57 INFO SparkUI: Started SparkUI at http://192.168.1.6:4040
15/09/23 17:35:57 INFO AppClient$ClientActor: Connecting to master akka.tcp://spark...@192.168.1.6:7077/user/Master...
15/09/23 17:35:57 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150923173557-0011
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor added: app-20150923173557-0011/0 on worker-20150923165508-192.168.1.6-48263 (192.168.1.6:48263) with 2 cores
15/09/23 17:35:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150923173557-0011/0 on hostPort 192.168.1.6:48263 with 2 cores, 512.0 MB RAM
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor added: app-20150923173557-0011/1 on worker-20150923165510-192.168.1.6-57050 (192.168.1.6:57050) with 2 cores
15/09/23 17:35:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150923173557-0011/1 on hostPort 192.168.1.6:57050 with 2 cores, 512.0 MB RAM
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor added: app-20150923173557-0011/2 on worker-20150923165512-192.168.1.6-40645 (192.168.1.6:40645) with 2 cores
15/09/23 17:35:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150923173557-0011/2 on hostPort 192.168.1.6:40645 with 2 cores, 512.0 MB RAM
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor added: app-20150923173557-0011/3 on worker-20150923165516-192.168.1.6-44624 (192.168.1.6:44624) with 2 cores
15/09/23 17:35:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150923173557-0011/3 on hostPort 192.168.1.6:44624 with 2 cores, 512.0 MB RAM
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor added: app-20150923173557-0011/4 on worker-20150923165514-192.168.1.6-51777 (192.168.1.6:51777) with 2 cores
15/09/23 17:35:57 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150923173557-0011/4 on hostPort 192.168.1.6:51777 with 2 cores, 512.0 MB RAM
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/4 is now LOADING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/1 is now LOADING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/2 is now LOADING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/3 is now LOADING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/0 is now LOADING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/0 is now RUNNING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/1 is now RUNNING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/2 is now RUNNING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/3 is now RUNNING
15/09/23 17:35:57 INFO AppClient$ClientActor: Executor updated: app-20150923173557-0011/4 is now RUNNING
15/09/23 17:35:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54460.
15/09/23 17:35:57 INFO NettyBlockTransferService: Server created on 54460
15/09/23 17:35:57 INFO BlockManagerMaster: Trying to register BlockManager
15/09/23 17:35:57 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:54460 with 1910.5 MB RAM, BlockManagerId(driver, 192.168.1.6, 54460)
15/09/23 17:35:57 INFO BlockManagerMaster: Registered BlockManager
15/09/23 17:35:58 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
15/09/23 17:36:00 ERROR TableMetadata: Error parsing schema options for table system_traces.events: Cluster.getMetadata().getKeyspace("system_traces").getTable("events").getOptions() will return null
java.lang.IllegalArgumentException: Not a JSON map: KEYS_ONLY
at com.datastax.driver.core.SimpleJSONParser.parseStringMap(SimpleJSONParser.java:77)
at com.datastax.driver.core.TableMetadata$Options.<init>(TableMetadata.java:569)
at com.datastax.driver.core.TableMetadata.build(TableMetadata.java:127)
at com.datastax.driver.core.Metadata.buildTableMetadata(Metadata.java:183)
at com.datastax.driver.core.Metadata.rebuildSchema(Metadata.java:115)
at com.datastax.driver.core.ControlConnection.refreshSchema(ControlConnection.java:358)
at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:267)
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:190)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:78)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1230)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:333)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:157)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:264)
at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
at eu.ionseed.service.spark.Main$.delayedEndpoint$eu$ionseed$service$spark$Main$1(Main.scala:58)
at eu.ionseed.service.spark.Main$delayedInit$body.apply(Main.scala:12)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at eu.ionseed.service.spark.Main$.main(Main.scala:12)
at eu.ionseed.service.spark.Main.main(Main.scala)
15/09/23 17:36:00 ERROR TableMetadata: Error parsing schema options for table system_traces.sessions: Cluster.getMetadata().getKeyspace("system_traces").getTable("sessions").getOptions() will return null
java.lang.IllegalArgumentException: Not a JSON map: KEYS_ONLY
at com.datastax.driver.core.SimpleJSONParser.parseStringMap(SimpleJSONParser.java:77)
at com.datastax.driver.core.TableMetadata$Options.<init>(TableMetadata.java:569)
at com.datastax.driver.core.TableMetadata.build(TableMetadata.java:127)
at com.datastax.driver.core.Metadata.buildTableMetadata(Metadata.java:183)
at com.datastax.driver.core.Metadata.rebuildSchema(Metadata.java:115)
at com.datastax.driver.core.ControlConnection.refreshSchema(ControlConnection.java:358)
at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:267)
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:190)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:78)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1230)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:333)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:157)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:264)
at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
at eu.ionseed.service.spark.Main$.delayedEndpoint$eu$ionseed$service$spark$Main$1(Main.scala:58)
at eu.ionseed.service.spark.Main$delayedInit$body.apply(Main.scala:12)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at eu.ionseed.service.spark.Main$.main(Main.scala:12)
at eu.ionseed.service.spark.Main.main(Main.scala)
15/09/23 17:36:00 INFO Cluster: New Cassandra host /127.0.0.1:9042 added
15/09/23 17:36:00 INFO CassandraConnector: Connected to Cassandra cluster: Test Cluster
15/09/23 17:36:00 INFO SparkContext: Starting job: runJob at RDDFunctions.scala:37
15/09/23 17:36:00 INFO DAGScheduler: Got job 0 (runJob at RDDFunctions.scala:37) with 2 output partitions (allowLocal=false)
15/09/23 17:36:00 INFO DAGScheduler: Final stage: ResultStage 0(runJob at RDDFunctions.scala:37)
15/09/23 17:36:00 INFO DAGScheduler: Parents of final stage: List()
15/09/23 17:36:00 INFO DAGScheduler: Missing parents: List()
15/09/23 17:36:00 INFO DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at Main.scala:57), which has no missing parents
15/09/23 17:36:00 INFO MemoryStore: ensureFreeSpace(10056) called with curMem=0, maxMem=2003325419
15/09/23 17:36:00 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 9.8 KB, free 1910.5 MB)
15/09/23 17:36:00 INFO MemoryStore: ensureFreeSpace(4594) called with curMem=10056, maxMem=2003325419
15/09/23 17:36:00 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.5 KB, free 1910.5 MB)
15/09/23 17:36:00 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.6:54460 (size: 4.5 KB, free: 1910.5 MB)
15/09/23 17:36:00 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:874
15/09/23 17:36:00 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at Main.scala:57)
15/09/23 17:36:00 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
15/09/23 17:36:00 INFO CassandraConnector: Disconnected from Cassandra cluster: Test Cluster
15/09/23 17:36:01 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkE...@192.168.1.6:46308/user/Executor#-1645418260]) with ID 3
15/09/23 17:36:01 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:01 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:01 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkE...@192.168.1.6:38097/user/Executor#-1851351009]) with ID 0
15/09/23 17:36:01 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:47955 with 265.1 MB RAM, BlockManagerId(0, 192.168.1.6, 47955)
15/09/23 17:36:01 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:40911 with 265.1 MB RAM, BlockManagerId(3, 192.168.1.6, 40911)
15/09/23 17:36:01 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkE...@192.168.1.6:33470/user/Executor#-1608501099]) with ID 2
15/09/23 17:36:01 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkE...@192.168.1.6:35503/user/Executor#-1367373589]) with ID 1
15/09/23 17:36:01 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:53844 with 265.1 MB RAM, BlockManagerId(2, 192.168.1.6, 53844)
15/09/23 17:36:02 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.6:40911 (size: 4.5 KB, free: 265.1 MB)
15/09/23 17:36:02 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:52787 with 265.1 MB RAM, BlockManagerId(1, 192.168.1.6, 52787)
15/09/23 17:36:02 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.1.6): java.lang.ClassNotFoundException: com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:95)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
15/09/23 17:36:02 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:02 INFO TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1) on executor 192.168.1.6: java.lang.ClassNotFoundException (com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1) [duplicate 1]
15/09/23 17:36:02 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:02 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on executor 192.168.1.6: java.lang.ClassNotFoundException (com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1) [duplicate 2]
15/09/23 17:36:02 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 4, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:02 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on executor 192.168.1.6: java.lang.ClassNotFoundException (com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1) [duplicate 3]
15/09/23 17:36:02 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 5, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:02 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkE...@192.168.1.6:39325/user/Executor#-1764794120]) with ID 4
15/09/23 17:36:02 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.6:47955 (size: 4.5 KB, free: 265.1 MB)
15/09/23 17:36:02 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.6:52787 (size: 4.5 KB, free: 265.1 MB)
15/09/23 17:36:02 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:58130 with 265.1 MB RAM, BlockManagerId(4, 192.168.1.6, 58130)
15/09/23 17:36:02 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 4) on executor 192.168.1.6: java.lang.ClassNotFoundException (com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1) [duplicate 4]
15/09/23 17:36:02 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 6, 192.168.1.6, PROCESS_LOCAL, 1516 bytes)
15/09/23 17:36:03 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 6) on executor 192.168.1.6: java.lang.ClassNotFoundException (com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1) [duplicate 5]
15/09/23 17:36:03 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4 times; aborting job
15/09/23 17:36:03 INFO TaskSchedulerImpl: Cancelling stage 0
15/09/23 17:36:03 INFO TaskSchedulerImpl: Stage 0 was cancelled
15/09/23 17:36:03 INFO DAGScheduler: ResultStage 0 (runJob at RDDFunctions.scala:37) failed in 2.189 s
15/09/23 17:36:03 INFO DAGScheduler: Job 0 failed: runJob at RDDFunctions.scala:37, took 2.378297 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 6, 192.168.1.6): java.lang.ClassNotFoundException: com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:95)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
15/09/23 17:36:03 INFO SparkContext: Invoking stop() from shutdown hook
15/09/23 17:36:03 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 5) on executor 192.168.1.6: java.lang.ClassNotFoundException (com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1) [duplicate 6]
15/09/23 17:36:03 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/09/23 17:36:03 INFO SparkUI: Stopped Spark web UI at http://192.168.1.6:4040
15/09/23 17:36:03 INFO DAGScheduler: Stopping DAGScheduler
15/09/23 17:36:03 INFO SparkDeploySchedulerBackend: Shutting down all executors
15/09/23 17:36:03 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
15/09/23 17:36:03 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/23 17:36:03 INFO Utils: path = /tmp/spark-f506f583-b062-4d1b-a5dc-dfae8bd0b7de/blockmgr-f778421b-a297-4691-b44c-33c9df5cfae3, already present as root for deletion.
15/09/23 17:36:03 INFO MemoryStore: MemoryStore cleared
15/09/23 17:36:03 INFO BlockManager: BlockManager stopped
15/09/23 17:36:03 INFO BlockManagerMaster: BlockManagerMaster stopped
15/09/23 17:36:03 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/23 17:36:03 INFO SparkContext: Successfully stopped SparkContext
15/09/23 17:36:03 INFO Utils: Shutdown hook called
15/09/23 17:36:03 INFO Utils: Deleting directory /tmp/spark-f506f583-b062-4d1b-a5dc-dfae8bd0b7de
15/09/23 17:36:03 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/23 17:36:03 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
Disconnected from the target VM, address: '127.0.0.1:53719', transport: 'socket'
Process finished with exit code 1