Customizing Spark-metrics name


asuresh
Oct 4, 2017, 2:37:14 PM
to metrics-user

Hi,

I have been using this library to capture metrics across executors in my Spark Streaming applications:

spark-metrics - https://github.com/groupon/spark-metrics

It allows Dropwizard/Codahale metrics to be used in Spark applications to publish the required metrics.


The metrics sink is the Slf4j reporter. Below is a sample metrics line captured in the log file:


(type=METER, name=local-1505766776874.driver.WebStream.dqmetricnamespace.stream.dq.publish, count=11, mean_rate=0.2160861001716871, m1=1.0392064160302328, m5=1.8935575481351274, m15=2.09270473390157, rate_unit=events/second)


Is there any way to add a prefix to these keys when they are captured?
For example: count would become foo_count,
mean_rate would become foo_mean_rate, and so on.


I want the log line to be captured as below:

(foo_type=METER, foo_name=local-1505766776874.driver.WebStream.dqmetricnamespace.stream.dq.publish, foo_count=11, foo_mean_rate=0.2160861001716871, foo_m1=1.0392064160302328, foo_m5=1.8935575481351274, foo_m15=2.09270473390157, foo_rate_unit=events/second)
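For what it's worth, here is a minimal sketch of the rewrite I am after, as a post-processing step on the reporter's log line. The MetricKeyPrefixer class and its prefixKeys method are my own hypothetical helper, not part of spark-metrics or Dropwizard, and it assumes the Slf4jReporter line can be intercepted before (or after) it reaches the log file:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MetricKeyPrefixer {
    // Matches each "key=" token inside the reporter's "(key=value, key=value, ...)" line:
    // a key is a run of word characters immediately preceded by "(" or ", ".
    private static final Pattern KEY = Pattern.compile("(?<=\\(|, )([A-Za-z0-9_]+)=");

    // Rewrites every key in a Dropwizard-style log line with the given prefix.
    public static String prefixKeys(String line, String prefix) {
        Matcher m = KEY.matcher(line);
        return m.replaceAll(prefix + "$1=");
    }

    public static void main(String[] args) {
        String line = "(type=METER, name=app.driver.stream, count=11, "
                + "mean_rate=0.21, rate_unit=events/second)";
        System.out.println(prefixKeys(line, "foo_"));
        // -> (foo_type=METER, foo_name=app.driver.stream, foo_count=11,
        //    foo_mean_rate=0.21, foo_rate_unit=events/second)
    }
}
```

As far as I can tell, Dropwizard's Slf4jReporter builder does offer prefixedWith(...), but that prefixes the metric name (the value of name=), not the attribute keys like count and mean_rate, which is why something like the post-processing above seems necessary. If there is a supported way to do this inside the reporter itself, that would be much better.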


It would really help to have a prefix attached to these keys when they are captured in the log file.
I would really appreciate any help on this.
Please let me know if you have any questions.


Thanks,
Amar
