Are these results reasonable? Is there any benchmark available against which to compare them?
I use the NYC taxi data set: approximately 30 million rows, about 5 GB on disk. I use sparklyr with the crassy package to read/write data between Spark and Cassandra. The experiments run on a cluster of 4 nodes (8 cores, 16 GB RAM each), with Spark and Cassandra collocated on every node. Reading the data from Cassandra into a Spark DataFrame took 4.032 seconds. I then add two more columns to the DataFrame and write it back to Cassandra; that write took 7176.550 seconds. Why is the read so fast and the write so slow? I used the default values for all Spark Cassandra Connector parameters; even when I changed config$spark.cassandra.input.split.size_in_mb, the read latency stayed almost the same.
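For reference, the connector knobs involved here are plain Spark conf keys (in sparklyr they are set as `config$<key> <- <value>` before `spark_connect()`). The sketch below just lists them in `--conf` form; the key names are standard Spark Cassandra Connector 2.x options, but the values are illustrative guesses, not recommendations:

```python
# Spark Cassandra Connector options relevant to this thread, as Spark conf keys.
# Values below are illustrative, NOT tuned recommendations.
conf = {
    # Read side: target size (MB) of each Spark partition pulled from Cassandra.
    "spark.cassandra.input.split.size_in_mb": "64",
    # Write side: these usually dominate write latency far more than read settings.
    "spark.cassandra.output.concurrent.writes": "16",
    "spark.cassandra.output.batch.size.rows": "auto",
    "spark.cassandra.output.throughput_mb_per_sec": "128",
}

# Print them in spark-submit style, one --conf flag per option.
for key, value in conf.items():
    print(f"--conf {key}={value}")
```

In sparklyr the equivalent would be e.g. `config$spark.cassandra.output.concurrent.writes <- 16` on the `spark_config()` object passed to `spark_connect()`.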
--
You received this message because you are subscribed to the Google Groups "DataStax Spark Connector for Apache Cassandra" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-connector-...@lists.datastax.com.
Actually, I am able to write 300K rows in 60 seconds, and the data includes Maps and Lists; each row has more than 150 columns. We used only a 4-node cluster with replication factor 2 (2 cores, 4 GB RAM per node). For writing we used two additional nodes that are not part of the cluster (4 cores, 4 GB RAM). Even with that smaller configuration we were able to write the 30M rows, with large columns, within 6000 seconds. We used SparkR.
My observation on your problem: when writing data into Cassandra, I see much higher RAM and CPU usage than when reading. If you use separate nodes for writing, you can write the data much faster than you do now, and your configuration is already better than ours.
We have also tested with Julia; our team was able to write data into Cassandra about 10 times faster than with R. We can write our 30M-row data set in 700 seconds, using Julia with the C++ driver.
In our conversion we are writing 900M records to Cassandra. I don't see a problem with writing; if you use separate nodes for writing, you will get more throughput.
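Putting the figures quoted in this thread side by side (a rough back-of-the-envelope comparison, assuming the stated row counts and wall-clock times are accurate):

```python
# Rows-per-second implied by the numbers mentioned above.
runs = {
    "sparklyr write, 30M rows in 7176.55 s": 30_000_000 / 7176.55,
    "SparkR write, 30M rows in ~6000 s":     30_000_000 / 6000,
    "Julia + C++ driver, 30M rows in 700 s": 30_000_000 / 700,
}

for name, rows_per_sec in runs.items():
    print(f"{name}: ~{rows_per_sec:,.0f} rows/s")
```

Note that 300K rows in 60 seconds (5,000 rows/s) is exactly consistent with 30M rows in 6000 seconds, so the SparkR figures line up with each other.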
Thanks Ravi, this will help me a lot.
:compileJava
:compileScala
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/apache/scala/com/datastax/sparkstress/ContextHelper.scala:6: error: ConnectHelper is already defined as object ConnectHelper
[ant:scalac] object ConnectHelper {
[ant:scalac] ^
[ant:scalac] one error found
:compileScala FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileScala'.
> Compile failed with 1 error; see the compiler error output for details.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 13.827 secs
Here's the list of errors I get when trying to build with "./gradlew jar -Pagainst=source". It does build fine with "./gradlew jar -Pagainst=maven":
[success] Total time: 30 s, completed May 24, 2018 12:03:39 PM
:compileJava
:compileScala
[ant:scalac] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[ant:scalac] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[ant:scalac] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[ant:scalac] warning: Class org.joda.convert.FromString not found - continuing with a stub.
[ant:scalac] warning: Class org.joda.convert.ToString not found - continuing with a stub.
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:7: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.cql.CassandraConnector
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:9: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:12: error: object cassandra is not a member of package org.apache.spark.sql
[ant:scalac] import org.apache.spark.sql.cassandra._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:25: error: not found: value CassandraConnector
[ant:scalac] val numberNodes = CassandraConnector(sc.getConf).withClusterDo(_.getMetadata.getAllHosts.size)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:52: error: value cassandraFormat is not a member of org.apache.spark.sql.DataFrameReader
[ant:scalac] possible cause: maybe a semicolon is missing before `value cassandraFormat'?
[ant:scalac] .cassandraFormat(table, keyspace)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:74: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] case RDD => sc.cassandraTable[String](keyspace, table).select("color", "size").count
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:89: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] case RDD => sc.cassandraTable[String](keyspace, table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:106: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] case RDD => sc.cassandraTable[String](keyspace, table).select("color", "size", "qty",
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:121: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] case RDD => sc.cassandraTable(keyspace, table).cassandraCount()
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:135: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] case RDD => sc.cassandraTable[String](keyspace, table).select("color").count
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:149: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] case RDD => sc.cassandraTable[PerfRowClass](keyspace, table).count
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:165: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] sc.cassandraTable[(UUID, Int, String, String, org.joda.time.DateTime)](keyspace,
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:182: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] sc.cassandraTable[PerfRowClass](keyspace, table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:202: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] sc.cassandraTable[(UUID, Int, String, String, org.joda.time.DateTime)](keyspace, table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:224: error: value joinWithCassandraTable is not a member of org.apache.spark.rdd.RDD[(String,)]
[ant:scalac] possible cause: maybe a semicolon is missing before `value joinWithCassandraTable'?
[ant:scalac] .joinWithCassandraTable[PerfRowClass](keyspace, table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:248: error: value repartitionByCassandraReplica is not a member of org.apache.spark.rdd.RDD[(String,)]
[ant:scalac] possible cause: maybe a semicolon is missing before `value repartitionByCassandraReplica'?
[ant:scalac] .repartitionByCassandraReplica(keyspace, table, coresPerNode)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:265: error: value joinWithCassandraTable is not a member of org.apache.spark.rdd.RDD[(String,)]
[ant:scalac] possible cause: maybe a semicolon is missing before `value joinWithCassandraTable'?
[ant:scalac] .joinWithCassandraTable[PerfRowClass](keyspace, table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/ReadTask.scala:288: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] sc.cassandraTable[String](keyspace, table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StreamingTask.scala:4: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.streaming._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StreamingTask.scala:5: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.writer.RowWriterFactory
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StreamingTask.scala:3: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.cql.CassandraConnector
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StreamingTask.scala:42: error: not found: value CassandraConnector
[ant:scalac] val cc = CassandraConnector(ss.sparkContext.getConf)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:10: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:17: error: object cassandra is not a member of package org.apache.spark.sql
[ant:scalac] import org.apache.spark.sql.cassandra._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StreamingTask.scala:101: error: value saveToCassandra is not a member of org.apache.spark.streaming.dstream.DStream[com.datastax.sparkstress.RowTypes.PerfRowClass]
[ant:scalac] override def dstreamOps(dstream: DStream[PerfRowClass]): Unit = dstream.saveToCassandra(config.keyspace, config.table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StressTask.scala:4: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.cql.CassandraConnector
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/StressTask.scala:20: error: not found: type CassandraConnector
[ant:scalac] def getLocalDC(cc: CassandraConnector): String = {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:3: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.cql.CassandraConnector
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:4: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.writer.RowWriterFactory
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:26: error: not found: type RowWriterFactory
[ant:scalac] (implicit rwf: RowWriterFactory[rowType]) extends StressTask {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:33: error: not found: value CassandraConnector
[ant:scalac] val cc = CassandraConnector(sc.getConf)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/apache/com/datastax/bdp/spark/writer/BulkTableWriter.scala:6: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/apache/com/datastax/bdp/spark/writer/BulkTableWriter.scala:7: error: object spark is not a member of package com.datastax
[ant:scalac] import com.datastax.spark.connector.writer._
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:71: error: value cassandraFormat is not a member of org.apache.spark.sql.DataFrameWriter[org.apache.spark.sql.Row]
[ant:scalac] possible cause: maybe a semicolon is missing before `value cassandraFormat'?
[ant:scalac] .cassandraFormat(destination.table, destination.keyspace)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/apache/com/datastax/bdp/spark/writer/BulkTableWriter.scala:23: error: not found: type ColumnSelector
[ant:scalac] columns: ColumnSelector = AllColumns,
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:90: error: value saveToCassandra is not a member of org.apache.spark.rdd.RDD[rowType]
[ant:scalac] case SaveMethod.Driver => getRDD.saveToCassandra(destination.keyspace, destination.table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:142: error: not found: type RowWriterFactory
[ant:scalac] WriteTask[ShortRowClass](config, ss)(implicitly[RowWriterFactory[ShortRowClass]]) {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:168: error: not found: type RowWriterFactory
[ant:scalac] WriteTask[PerfRowClass](config, ss)(implicitly[RowWriterFactory[PerfRowClass]]) {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:202: error: not found: type RowWriterFactory
[ant:scalac] WriteTask[WideRowClass](config, ss)(implicitly[RowWriterFactory[WideRowClass]]) {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:232: error: not found: type RowWriterFactory
[ant:scalac] WriteTask[WideRowClass](config, ss)(implicitly[RowWriterFactory[WideRowClass]]) {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:261: error: not found: type RowWriterFactory
[ant:scalac] WriteTask[WideRowClass](config, ss)(implicitly[RowWriterFactory[WideRowClass]]) {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:286: error: not found: type RowWriterFactory
[ant:scalac] WriteTask[PerfRowClass](config, ss)(implicitly[RowWriterFactory[PerfRowClass]]) {
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:303: error: value cassandraTable is not a member of org.apache.spark.SparkContext
[ant:scalac] sc.cassandraTable[PerfRowClass](config.keyspace, config.table)
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/main/scala/com/datastax/sparkstress/WriteTask.scala:306: error: value cassandraFormat is not a member of org.apache.spark.sql.DataFrameReader
[ant:scalac] ss.read.cassandraFormat(config.table, config.keyspace).load()
[ant:scalac] ^
[ant:scalac] /Users/engstrom/spark_cassandra_stress/spark-cassandra-stress/src/apache/com/datastax/bdp/spark/writer/BulkTableWriter.scala:23: error: not found: value AllColumns
[ant:scalac] columns: ColumnSelector = AllColumns,
[ant:scalac] ^
[ant:scalac] 5 warnings found
[ant:scalac] 45 errors found
:compileScala FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileScala'.
> Compile failed with 45 errors; see the compiler error output for details.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 58.299 secs
engstrommac:spark-cassandra-stress engstrom$ ./run.sh apache --help
Submit Script:: spark-submit --class com.datastax.sparkstress.SparkCassandraStress build/libs/SparkCassandraStress-1.0.jar --help
18/05/24 15:00:10.562 INFO Reflections: Reflections took 224 ms to scan 1 urls, producing 32 keys and 329 values
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;
at org.reflections.Reflections.expandSuperTypes(Reflections.java:380)
at org.reflections.Reflections.<init>(Reflections.java:126)
at org.reflections.Reflections.<init>(Reflections.java:168)
at org.reflections.Reflections.<init>(Reflections.java:141)
at com.datastax.sparkstress.SparkCassandraStress$.<init>(SparkCassandraStress.scala:67)
at com.datastax.sparkstress.SparkCassandraStress$.<clinit>(SparkCassandraStress.scala)
at com.datastax.sparkstress.SparkCassandraStress.main(SparkCassandraStress.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:744)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
engstrommac:spark-cassandra-stress engstrom$
engstrommac:spark-cassandra-stress engstrom$ ./run.sh apache -p 4 -y 262144 -o 262144 -d -n 4 -S results.tsv -m driver writerandomwiderow
Submit Script:: spark-submit --class com.datastax.sparkstress.SparkCassandraStress build/libs/SparkCassandraStress-1.0.jar -p 4 -y 262144 -o 262144 -d -n 4 -S results.tsv -m driver writerandomwiderow
18/05/24 15:06:33.693 INFO Reflections: Reflections took 211 ms to scan 1 urls, producing 32 keys and 329 values
Russell - seeing as I can build it with maven, it's certainly not stopping me from doing anything. If you get a chance to look at it at some point that'd be great; otherwise I'm happy to use maven.
Any chance that could be caused by something in my environment? Or is it more likely a mismatch in the guava version being pulled down by maven?
Thanks, I appreciate that.
Also, just to let you know: when building with '-Pagainst=source' it IS building the SCC. I'm sure because before the build there is no SCC assembly jar file, and after the build there is; plus there's output as part of the gradle build showing the SCC being built and tested. But when it comes time to compile the Scala in spark-cassandra-stress, it appears unable to find it. I tried adding a 'compile fileTree(...)' to the build.gradle file, specifying the directory under SPARKCC_HOME where the assembly jar gets built, and ended up with the same Scala compile errors.
I'd appreciate any help, tips, or suggestions you could provide.
Thanks
Russell, I just tried building with the changes in build.gradle and apparently something about my environment hates the change from "sbt/sbt assembly" to "sbt/sbt jar". Here's the output when I try to build against source:
engstrommac:spark-cassandra-stress engstrom$ ./gradlew jar -Pagainst=source
Checking dependency flag: source
Using Assembly Jar from Source Repo
2.0.6
:build_connector
Attempting to fetch sbt
Launching sbt from sbt/sbt-launch-0.13.12.jar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=350m; support was removed in 8.0
[info] Loading project definition from /Users/engstrom/spark_cassandra_stress/spark-cassandra-connector/project
[info] Updating {file:/Users/engstrom/spark_cassandra_stress/spark-cassandra-connector/project/}spark-cassandra-connector-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * net.virtual-void:sbt-dependency-graph:0.7.4 -> 0.8.2
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 7 Scala sources to /Users/engstrom/spark_cassandra_stress/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes...
[warn] there were 4 deprecation warning(s); re-run with -deprecation for details
[warn] there were 7 feature warning(s); re-run with -feature for details
[warn] two warnings found
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.10.6 [To build against Scala 2.11 use '-Dscala-2.11=true']
Scala Binary: 2.10
Java: target=1.7 user=1.8.0_92
Cassandra version for testing: 3.6 [can be overridden by specifying '-Dtest.cassandra.version=<version>']
[info] Set current project to root (in build file:/Users/engstrom/spark_cassandra_stress/spark-cassandra-connector/)
[error] Not a valid command: jar
[error] Not a valid key: jar (similar: run, apiUrl, target)
[error] jar
[error] ^
:build_connector FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':build_connector'.
> Process 'command 'sbt/sbt'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 37.935 secs
engstrommac:spark-cassandra-stress engstrom$