Silencing does not work


dc3o

Mar 26, 2021, 8:03:47 AM
to Prometheus Users
I'm setting a silence using the matcher job=~".*" or job=~".+". In Alertmanager I see the same number of alerts on the silence tab as on the alerts view, yet for some of the alerts that appear as silenced I still receive Slack or PagerDuty notifications. Going by the AM dashboard, I'd expect these not to be delivered at all. Any idea?
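For reference, a silence with a regex matcher like the one described can be created against the Alertmanager v2 API roughly as in this minimal Python sketch (the localhost:9093 address, the two-hour window, and the comment are made-up values for illustration, not taken from the setup above):

```python
# Sketch: create a silence with a regex matcher via the Alertmanager v2 API.
import json
import urllib.request
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
silence = {
    "matchers": [
        # isRegex tells Alertmanager to treat "value" as a regular expression.
        {"name": "job", "value": ".+", "isRegex": True},
    ],
    "startsAt": now.isoformat(),
    "endsAt": (now + timedelta(hours=2)).isoformat(),  # assumed 2h duration
    "createdBy": "dc3o",
    "comment": "silence everything while investigating",
}

req = urllib.request.Request(
    "http://localhost:9093/api/v2/silences",  # assumed Alertmanager address
    data=json.dumps(silence).encode(),
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(req).read().decode())  # response contains the silence ID
```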

Nemanja Delic

Mar 26, 2021, 9:38:30 AM
to Prometheus Users
What I'm noticing is that it keeps re-sending notifications only for some alerts and only for specific label combinations.


Michal Kobus

May 4, 2021, 6:20:26 AM
to Prometheus Users
I'm experiencing the same problem, with the official Alertmanager Docker image, version v0.21.0.

I create a silence matching all alerts (alertname=~".*"). After the silence is saved I see all affected alerts listed below it (about 50 matched). I configured a dummy HTTP webhook counter and set it as the receiver in Alertmanager, and I still observe alerts being sent to that receiver (both firing and resolved).
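For anyone wanting to reproduce this, a dummy webhook counter can be as small as the following Python sketch. It is a stand-in, not the exact counter used above; port 5001 is an arbitrary choice, and the matching webhook_configs url in alertmanager.yml would point at it:

```python
# Tiny webhook receiver that counts every notification Alertmanager delivers,
# so you can verify whether silenced alerts still reach the receiver.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

count = 0

class WebhookCounter(BaseHTTPRequestHandler):
    def do_POST(self):
        global count
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        payload = json.loads(body)
        count += 1
        # The webhook payload carries the group status and the list of alerts.
        print(f"notification #{count}: status={payload.get('status')}, "
              f"alerts={len(payload.get('alerts', []))}")
        self.send_response(200)
        self.end_headers()

HTTPServer(("", 5001), WebhookCounter).serve_forever()
```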

Michal Kobus

May 4, 2021, 7:52:59 AM
to Prometheus Users
I wrote a script that creates one silence per alertname; that way I could silence all alerts. It seems there's an issue with the regex matcher.
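The original script wasn't posted, but a per-alertname silencing script along those lines could look roughly like this Python sketch (the Alertmanager address, the two-hour duration, and the comment text are assumptions). It lists firing alerts via the v2 API and creates one equality silence per distinct alertname, avoiding regex matchers entirely:

```python
# Sketch: one equality silence per alertname, instead of a single regex silence.
import json
import urllib.request
from datetime import datetime, timedelta, timezone

AM = "http://localhost:9093/api/v2"  # assumed Alertmanager address

def get_json(url):
    return json.loads(urllib.request.urlopen(url).read())

def post_json(url, payload):
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    return json.loads(urllib.request.urlopen(req).read())

now = datetime.now(timezone.utc)
alertnames = {a["labels"]["alertname"] for a in get_json(f"{AM}/alerts")}

for name in sorted(alertnames):
    post_json(f"{AM}/silences", {
        "matchers": [{"name": "alertname", "value": name, "isRegex": False}],
        "startsAt": now.isoformat(),
        "endsAt": (now + timedelta(hours=2)).isoformat(),
        "createdBy": "script",
        "comment": f"bulk silence for {name}",
    })
```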

Michal Kobus

May 5, 2021, 3:47:46 AM
to Prometheus Users
There we have it: https://github.com/prometheus/alertmanager/blob/1f3796c5cc58bdcf6fedfb427580c7bfab1f88ba/silence/silence.go#L464
So at least one defined matcher must not match the empty string, which explains why the alertname=~".*" matcher fails.

However, it doesn't explain why matchers with the ".+" or ".+[ae].+" regexps fail (not at the validation stage, though).

So by setting alertname=~".+", the matcher is compiled to "^(?:.+)$", which should work as well. It mostly does, but some alerts are still sent to receivers, not all of them and only sometimes, and the debug logs show nothing special.
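To illustrate the two behaviours discussed here, this is a rough Python approximation; the real logic is the Go code linked above, so this only mimics it. A regex matcher value is anchored roughly as "^(?:value)$", and a silence is only accepted if at least one matcher does not match the empty string:

```python
import re

def matches_empty(matcher_value: str) -> bool:
    # Regex matchers are anchored, roughly like "^(?:<value>)$".
    return re.match(r"^(?:" + matcher_value + r")$", "") is not None

def validate_matchers(values):
    # Paraphrase of the check linked above: at least one matcher must NOT
    # match the empty string, otherwise the silence is rejected.
    if all(matches_empty(v) for v in values):
        raise ValueError("at least one matcher must not match the empty string")

validate_matchers([".+"])   # passes: ".+" does not match ""
validate_matchers([".*"])   # raises: ".*" matches "", hence the rejection above

# The anchored form used for matching a concrete label value:
anchored = re.compile(r"^(?:.+)$")
print(bool(anchored.match("HighCPU")))  # True, so ".+" should silence this alert
```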