Alerts seem to be overridden based on same labels with unique metric names


mali...@gmail.com

Jan 9, 2017, 6:49:27 AM
to Prometheus Users
Hi There,

I've got an example where I have several metrics with unique names, but they all have the same labels, see below.

The metrics have a value of 0 or 1

example_name_1 {job="another_job", environment="test"}
example_name_2 {job="another_job", environment="test"}
example_name_3 {job="another_job", environment="test"}

The alert rule in Prometheus for the above is defined similarly to:

ALERT Failed_Example_Alert
  IF {job="another_job", alertname!="Failed_Example_Alert"} == 1
  FOR 11m
  LABELS { severity = "warn", name = "{{ $labels.__name__ }}", environment = "{{ $labels.environment }}" }
  ANNOTATIONS {
    summary = "The example test has failed",
    description = "{{ $labels.__name__ }} has failed",
    action = "Check build",
  }


The query {job="another_job", alertname != "Failed_Example_Alert"} == 1 in Prometheus correctly displays the 3 results. When all of the above metrics have a value of 1 I expected to see 3 alerts created, but I only see 1 alert created by Prometheus. Does Prometheus override the alert because the label values are the same? If I include the name of the metric as a label name and value (when building up the metric) I get 3 alerts as expected. Apologies, I haven't looked at the codebase yet to figure out what Prometheus does. I understand that adding the name of the metric again as a label may not follow the Prometheus metric naming conventions, although it created the correct number of alerts.
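For illustration, the workaround I mean produces series roughly like this in Prometheus (the extra name label is just what I happened to pick, any label that differs per metric would do):

example_name_1{job="another_job", environment="test", name="example_name_1"} 1
example_name_2{job="another_job", environment="test", name="example_name_2"} 1
example_name_3{job="another_job", environment="test", name="example_name_3"} 1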

Thanks




Brian Brazil

Jan 9, 2017, 7:12:34 AM
to mali...@gmail.com, Prometheus Users
This is expected behaviour, as the alerts have identical labels and are thus indistinguishable.

The issue here is that you're not providing a metric name in your expression, which you should pretty much never do. I suspect you need to move some of the metric name into a label.
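As a rough sketch (not your actual metrics, just one possible shape this could take), that would mean exposing a single metric name with a distinguishing label:

example_result{check="name_1", job="another_job", environment="test"} 1
example_result{check="name_2", job="another_job", environment="test"} 1
example_result{check="name_3", job="another_job", environment="test"} 1

and then alerting with the metric name in the selector:

ALERT Failed_Example_Alert
  IF example_result == 1
  FOR 11m
  LABELS { severity = "warn" }

Each firing series then differs in its check label, so the alerts remain distinguishable.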



mali...@gmail.com

Jan 9, 2017, 8:32:15 AM
to Prometheus Users, mali...@gmail.com
Thanks Brian. Yup, I've added the metric name as a label to the metric, which makes the results of the query distinguishable, hence I get the expected number of alerts.
 
We've got several alert rules based mainly on the metric name (in the IF condition), and you've said not to create alerts off the metric name. Is this because metric names can change, which could potentially break alert rules?

which you should pretty much never do

Thanks again 

Brian Brazil

Jan 9, 2017, 9:16:51 AM
to mali...@gmail.com, Prometheus Users
On 9 January 2017 at 13:32, <mali...@gmail.com> wrote:
Thanks Brian. Yup, I've added the metric name as a label to the metric, which makes the results of the query distinguishable, hence I get the expected number of alerts.
 
We've got several alert rules based mainly on the metric name (in the IF condition), and you've said not to create alerts off the metric name. Is this because metric names can change, which could potentially break alert rules?

No, it's because you should always be providing metric names in expressions.

The only time you wouldn't do that is for certain types of performance/correctness debugging (e.g. https://www.robustperception.io/which-are-my-biggest-metrics/). It is also likely that some future long-term storage solutions will mandate that metric names be included in selectors.
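For reference, the sort of debugging query in that article looks something along these lines (from memory, so check the article itself):

topk(10, count by (__name__)({__name__=~".+"}))

That's the rare case where leaving the metric name out of the selector is the whole point.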

Brian
 




mali...@gmail.com

Jan 9, 2017, 9:53:45 AM
to Prometheus Users, mali...@gmail.com
Cheers Brian


