app-20151014175609-0015.1.executor.filesystem.file.largeRead_ops.csv
app-20151014175609-0015.1.executor.filesystem.file.read_bytes.csv
app-20151014175609-0015.1.executor.filesystem.file.read_ops.csv
app-20151014175609-0015.1.executor.filesystem.file.write_bytes.csv
app-20151014175609-0015.1.executor.filesystem.file.write_ops.csv
app-20151014175609-0015.1.executor.filesystem.hdfs.largeRead_ops.csv
app-20151014175609-0015.1.executor.filesystem.hdfs.read_bytes.csv
app-20151014175609-0015.1.executor.filesystem.hdfs.read_ops.csv
app-20151014175609-0015.1.executor.filesystem.hdfs.write_bytes.csv
app-20151014175609-0015.1.executor.filesystem.hdfs.write_ops.csv
app-20151014175609-0015.1.executor.threadpool.activeTasks.csv
app-20151014175609-0015.1.executor.threadpool.completeTasks.csv
app-20151014175609-0015.1.executor.threadpool.currentPool_size.csv
app-20151014175609-0015.1.executor.threadpool.maxPool_size.csv
These are the only metrics files listed on the workers.
My metrics.properties file looks like this:
executor.source.cassandra-connector.class=org.apache.spark.metrics.CassandraConnectorSource
driver.source.cassandra-connector.class=org.apache.spark.metrics.CassandraConnectorSource
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
# Polling period for CsvSink
*.sink.csv.period=1
*.sink.csv.unit=minutes
# Polling directory for CsvSink
*.sink.csv.directory=/tmp/
I'm on spark-cassandra-connector 1.3.0. Any thoughts on what is going wrong?
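For what it's worth, the connector's meters (e.g. the write-byte-meter mentioned later in this thread) only get updated while Cassandra reads or writes are actually running, so a quick way to exercise a sink setup is to run a small job that touches Cassandra and then look for new metrics files. A minimal Scala sketch, assuming a hypothetical keyspace "test" with a table kv(k text PRIMARY KEY, v int) and a Cassandra node on localhost:

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

object ConnectorMetricsSmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("connector-metrics-smoke-test")
      .set("spark.cassandra.connection.host", "127.0.0.1") // assumption: local Cassandra node
    val sc = new SparkContext(conf)

    // Exercise the write path; this is what should drive the connector's write meters.
    sc.parallelize(1 to 1000)
      .map(i => (i.toString, i))
      .saveToCassandra("test", "kv", SomeColumns("k", "v"))

    // Exercise the read path as well, so the read meters have something to report.
    val rows = sc.cassandraTable("test", "kv").count()
    println(s"rows read back: $rows")

    sc.stop()
  }
}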
I have the same issue with 1.5.0-M2; however, this commit might have fixed it in M3. I'll have to try.
Thanks,
Joe
I ran into the same problem. I'm using spark-cassandra-connector_2.10:1.5.0 with Spark 1.5.1, and I'm trying to send the metrics to StatsD. My metrics.properties is as follows:
*.sink.statsd.class=org.apache.spark.metrics.sink.StatsDSink
*.sink.statsd.host=localhost
*.sink.statsd.port=18125
executor.source.cassandra-connector.class=org.apache.spark.metrics.CassandraConnectorSource
driver.source.cassandra-connector.class=org.apache.spark.metrics.CassandraConnectorSource
I'm able to see other metrics, e.g. from the DAGScheduler, but none from the CassandraConnectorSource. For example, I searched for "write-byte-meter" but didn't find it. I didn't see the metrics in the Spark UI either, and I couldn't find anything in the logs indicating that the CassandraConnectorSource was actually registered with the Spark metrics system. Any pointers would be very much appreciated!
Thanks,
Sa
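Not sure this is the cause here, but one thing worth checking: Spark's metrics system instantiates the classes named in metrics.properties by reflection on each driver and executor as it starts, so both the metrics.properties file and the connector jar containing org.apache.spark.metrics.CassandraConnectorSource need to be visible on every executor, not just on the driver. A driver-side sketch of making that explicit (the jar and properties paths below are assumptions and must exist at the same location on every worker):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("statsd-metrics-check")
  // Point Spark's metrics system at the properties file on each node.
  .set("spark.metrics.conf", "/etc/spark/metrics.properties")
  // Put the connector jar (which contains
  // org.apache.spark.metrics.CassandraConnectorSource) on the executor classpath.
  .set("spark.executor.extraClassPath",
       "/opt/jars/spark-cassandra-connector-assembly-1.5.0.jar")
val sc = new SparkContext(conf)

The same effect can be had with spark-submit's --files and --conf options; the point is only that the source class has to resolve on the executors, otherwise executor-side metrics are missing while driver-side ones (DAGScheduler, etc.) still show up.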