Hi There,
I've got an example where I have several metrics with unique names but identical labels; see below.
Each metric has a value of 0 or 1:
example_name_1 {job="another_job", environment="test"}
example_name_2 {job="another_job", environment="test"}
example_name_3 {job="another_job", environment="test"}
The alert rule in Prometheus for the above is defined similarly to:
ALERT Failed_Example_Alert
IF {job="another_job", alertname!="Failed_Example_Alert"} == 1
FOR 11m
LABELS { severity = "warn", name = "{{ $labels.__name__ }}", environment = "{{ $labels.environment }}" }
ANNOTATIONS {
summary = "The example test has failed",
description = "{{ $labels.__name__ }} has failed",
action = "Check build",
}
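For reference, here is my rough attempt at the same rule in the newer (2.x) YAML rule-file format — untested, with the expression and label names simply carried over from the rule above:

```yaml
groups:
  - name: example
    rules:
      - alert: Failed_Example_Alert
        expr: '{job="another_job", alertname!="Failed_Example_Alert"} == 1'
        for: 11m
        labels:
          severity: warn
          name: '{{ $labels.__name__ }}'
          environment: '{{ $labels.environment }}'
        annotations:
          summary: The example test has failed
          description: '{{ $labels.__name__ }} has failed'
          action: Check build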
The query {job="another_job", alertname!="Failed_Example_Alert"} == 1 in Prometheus correctly displays the 3 results. When all three metrics have a value of 1, I expected to see 3 alerts created, but I only see 1 alert created by Prometheus. Does Prometheus override the alert because the label values are the same?

If I include the name of the metric as a label name and value (when building up the metric), I get 3 alerts as expected. Apologies, I haven't looked at the codebase yet to figure out what Prometheus does. I understand that adding the metric's name again as a label may not follow the Prometheus metric naming conventions, although it did create the correct number of alerts.
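To illustrate what I suspect is happening (this is my guess at the behaviour, not Prometheus's actual code): if alerts are keyed by their label set, and `__name__` is not an ordinary label, then the three firing series would all produce the same key and collapse into one alert. A minimal Python sketch of that idea:

```python
def alert_key(labels):
    """Identity of an alert: its full label set (order-independent)."""
    return frozenset(labels.items())

# The three firing series as I imagine Prometheus sees them once the
# metric name (__name__) is excluded from the ordinary labels.
firing_series = [
    {"job": "another_job", "environment": "test"},  # example_name_1
    {"job": "another_job", "environment": "test"},  # example_name_2
    {"job": "another_job", "environment": "test"},  # example_name_3
]

# All three series map to the same key, so only one alert survives.
alerts = {alert_key(s) for s in firing_series}
print(len(alerts))  # 1

# Adding the metric name as a real label makes the key sets distinct,
# which matches the 3 alerts I see when I bake the name into the metric.
firing_with_name = [
    dict(s, name=f"example_name_{i + 1}") for i, s in enumerate(firing_series)
]
alerts_with_name = {alert_key(s) for s in firing_with_name}
print(len(alerts_with_name))  # 3
```

This at least reproduces the 1-vs-3 counts I'm observing.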
Thanks