Barman is not deleting the old backups as per retention policy

Rajasekar Arumugam

Feb 20, 2023, 8:42:06 AM
to Barman, Backup and Recovery Manager for PostgreSQL
Hi Team, 

Barman is not deleting the old backups as per the retention policy. The database being backed up is in recovery mode, as this is a DR site.

[db-data]
backup_directory = /backups/db-data
backup_method = postgres
retention_policy_mode = auto
retention_policy = RECOVERY WINDOW OF 14 days
streaming_archiver = on
; PATH setting for this server
create_slot = auto



barman check db-data
Server db-data:
    PostgreSQL: OK
    superuser or standard user with backup privileges: OK
    PostgreSQL streaming: OK
    wal_level: OK
    replication slot: OK
    directories: OK
    retention policy settings: OK
    backup maximum age: OK (no last_backup_maximum_age provided)
    backup minimum size: OK (39.3 GiB)
    wal maximum age: OK (no last_wal_maximum_age provided)
    wal size: OK (1.8 GiB)
    compression settings: OK
    failed backups: OK (there are 0 failed backups)
    minimum redundancy requirements: OK (have 2 backups, expected at least 0)
    pg_basebackup: OK
    pg_basebackup compatible: OK
    pg_basebackup supports tablespaces mapping: OK
    systemid coherence: OK
    pg_receivexlog: OK
    pg_receivexlog compatible: OK
    receive-wal running: OK
    archiver errors: OK


barman list-backup db-data|wc -l
199



barman list-backup db-data
db-data 20230220T010002 - Mon Feb 20 01:12:52 2023 - Size: 39.4 GiB - WAL Size: 1.8 GiB - WAITING_FOR_WALS
db-data 20230219T010003 - Sun Feb 19 01:15:40 2023 - Size: 39.3 GiB - WAL Size: 4.1 GiB - WAITING_FOR_WALS
db-data 20230218T010003 - Sat Feb 18 01:14:24 2023 - Size: 39.1 GiB - WAL Size: 4.4 GiB - WAITING_FOR_WALS
db-data 20230217T010003 - Fri Feb 17 01:14:16 2023 - Size: 38.1 GiB - WAL Size: 4.7 GiB - WAITING_FOR_WALS
db-data 20230216T010003 - Thu Feb 16 01:18:21 2023 - Size: 38.0 GiB - WAL Size: 4.3 GiB - WAITING_FOR_WALS
db-data 20230215T010003 - Wed Feb 15 01:16:33 2023 - Size: 37.9 GiB - WAL Size: 4.6 GiB - WAITING_FOR_WALS
db-data 20230214T010003 - Tue Feb 14 01:15:50 2023 - Size: 37.8 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20230213T010003 - Mon Feb 13 01:16:14 2023 - Size: 37.8 GiB - WAL Size: 4.3 GiB - WAITING_FOR_WALS
db-data 20230212T010003 - Sun Feb 12 01:12:40 2023 - Size: 37.5 GiB - WAL Size: 4.4 GiB - WAITING_FOR_WALS
db-data 20230211T010002 - Sat Feb 11 01:17:55 2023 - Size: 37.4 GiB - WAL Size: 4.4 GiB - WAITING_FOR_WALS
db-data 20230210T010003 - Fri Feb 10 01:17:57 2023 - Size: 37.3 GiB - WAL Size: 4.4 GiB - WAITING_FOR_WALS
db-data 20230209T010003 - Thu Feb  9 01:16:33 2023 - Size: 37.2 GiB - WAL Size: 4.0 GiB
db-data 20230208T010003 - Wed Feb  8 01:20:55 2023 - Size: 37.1 GiB - WAL Size: 4.3 GiB - WAITING_FOR_WALS
db-data 20230207T010003 - Tue Feb  7 01:20:06 2023 - Size: 37.0 GiB - WAL Size: 5.1 GiB - WAITING_FOR_WALS
db-data 20230206T010003 - Mon Feb  6 01:13:51 2023 - Size: 36.8 GiB - WAL Size: 6.0 GiB - WAITING_FOR_WALS
db-data 20230205T010003 - Sun Feb  5 01:16:05 2023 - Size: 36.6 GiB - WAL Size: 4.6 GiB - WAITING_FOR_WALS
db-data 20230204T010002 - Sat Feb  4 01:12:45 2023 - Size: 36.4 GiB - WAL Size: 3.9 GiB - WAITING_FOR_WALS
db-data 20230203T010002 - Fri Feb  3 01:17:19 2023 - Size: 36.3 GiB - WAL Size: 4.4 GiB - WAITING_FOR_WALS
db-data 20230202T010003 - Thu Feb  2 01:13:12 2023 - Size: 36.1 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20230201T010002 - Wed Feb  1 01:13:49 2023 - Size: 35.9 GiB - WAL Size: 4.3 GiB - WAITING_FOR_WALS
db-data 20230131T010003 - Tue Jan 31 01:15:00 2023 - Size: 35.8 GiB - WAL Size: 5.0 GiB - WAITING_FOR_WALS
db-data 20230130T010002 - Mon Jan 30 01:14:45 2023 - Size: 35.6 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20230129T010003 - Sun Jan 29 01:13:05 2023 - Size: 35.5 GiB - WAL Size: 4.6 GiB - WAITING_FOR_WALS
db-data 20230128T010003 - Sat Jan 28 01:13:20 2023 - Size: 35.3 GiB - WAL Size: 5.2 GiB - WAITING_FOR_WALS
db-data 20230127T010003 - Fri Jan 27 01:11:38 2023 - Size: 35.2 GiB - WAL Size: 4.3 GiB - WAITING_FOR_WALS
db-data 20230126T010003 - Thu Jan 26 01:16:18 2023 - Size: 35.0 GiB - WAL Size: 3.8 GiB - WAITING_FOR_WALS
db-data 20230125T010003 - Wed Jan 25 01:12:29 2023 - Size: 34.9 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20230124T010003 - Tue Jan 24 01:13:21 2023 - Size: 34.6 GiB - WAL Size: 4.2 GiB
db-data 20230123T010014 - Mon Jan 23 01:14:02 2023 - Size: 34.9 GiB - WAL Size: 4.9 GiB - WAITING_FOR_WALS
db-data 20230122T010003 - Sun Jan 22 01:10:19 2023 - Size: 34.7 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230121T010002 - Sat Jan 21 01:10:41 2023 - Size: 34.5 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230120T010003 - Fri Jan 20 01:12:56 2023 - Size: 34.2 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20230119T010002 - Thu Jan 19 01:13:45 2023 - Size: 34.0 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230118T010003 - Wed Jan 18 01:11:50 2023 - Size: 33.8 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230117T010003 - Tue Jan 17 01:12:13 2023 - Size: 33.7 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230116T010003 - Mon Jan 16 01:13:08 2023 - Size: 33.6 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230115T010002 - Sun Jan 15 01:10:40 2023 - Size: 33.4 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230114T010002 - Sat Jan 14 01:11:48 2023 - Size: 33.2 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20230113T010003 - Fri Jan 13 01:13:58 2023 - Size: 33.1 GiB - WAL Size: 3.7 GiB - WAITING_FOR_WALS
db-data 20230112T010003 - Thu Jan 12 01:10:51 2023 - Size: 33.0 GiB - WAL Size: 3.5 GiB - WAITING_FOR_WALS
db-data 20230111T010002 - Wed Jan 11 01:12:17 2023 - Size: 32.9 GiB - WAL Size: 3.4 GiB - WAITING_FOR_WALS
db-data 20230110T010003 - Tue Jan 10 01:10:52 2023 - Size: 32.8 GiB - WAL Size: 4.0 GiB - WAITING_FOR_WALS
db-data 20230109T010003 - Mon Jan  9 01:12:31 2023 - Size: 32.7 GiB - WAL Size: 3.6 GiB - WAITING_FOR_WALS
db-data 20230108T010002 - Sun Jan  8 01:16:04 2023 - Size: 32.5 GiB - WAL Size: 3.5 GiB - WAITING_FOR_WALS
db-data 20230107T010002 - Sat Jan  7 01:10:58 2023 - Size: 32.4 GiB - WAL Size: 3.8 GiB - WAITING_FOR_WALS
db-data 20230106T010002 - Fri Jan  6 01:10:36 2023 - Size: 32.3 GiB - WAL Size: 601.6 MiB - WAITING_FOR_WALS
db-data 20230105T010003 - Thu Jan  5 01:13:28 2023 - Size: 32.2 GiB - WAL Size: 4.9 GiB - WAITING_FOR_WALS
db-data 20230104T010002 - Wed Jan  4 01:11:44 2023 - Size: 32.1 GiB - WAL Size: 2.8 GiB - WAITING_FOR_WALS
db-data 20230103T010002 - Tue Jan  3 01:10:00 2023 - Size: 32.0 GiB - WAL Size: 4.1 GiB - WAITING_FOR_WALS
db-data 20230102T010002 - Mon Jan  2 01:10:09 2023 - Size: 31.9 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20230101T010002 - Sun Jan  1 01:09:54 2023 - Size: 31.8 GiB - WAL Size: 1.8 GiB - WAITING_FOR_WALS
db-data 20221230T010002 - Fri Dec 30 01:11:25 2022 - Size: 31.7 GiB - WAL Size: 4.2 GiB - WAITING_FOR_WALS
db-data 20221229T010002 - Thu Dec 29 01:11:14 2022 - Size: 31.6 GiB - WAL Size: 2.7 GiB - WAITING_FOR_WALS
db-data 20221228T010002 - Wed Dec 28 01:13:40 2022 - Size: 31.5 GiB - WAL Size: 2.5 GiB - WAITING_FOR_WALS
db-data 20221227T010002 - Tue Dec 27 01:12:24 2022 - Size: 31.4 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20221226T010002 - Mon Dec 26 01:11:08 2022 - Size: 31.3 GiB - WAL Size: 2.0 GiB - WAITING_FOR_WALS
db-data 20221225T010002 - Sun Dec 25 01:10:59 2022 - Size: 31.2 GiB - WAL Size: 2.1 GiB - WAITING_FOR_WALS
db-data 20221224T010002 - Sat Dec 24 01:10:05 2022 - Size: 31.1 GiB - WAL Size: 1.6 GiB - WAITING_FOR_WALS
db-data 20221223T010002 - Fri Dec 23 01:13:00 2022 - Size: 31.0 GiB - WAL Size: 2.2 GiB - WAITING_FOR_WALS
db-data 20221222T010002 - Thu Dec 22 01:12:53 2022 - Size: 31.0 GiB - WAL Size: 2.7 GiB - WAITING_FOR_WALS
db-data 20221221T010003 - Wed Dec 21 01:09:55 2022 - Size: 31.0 GiB - WAL Size: 3.3 GiB - WAITING_FOR_WALS
db-data 20221220T010002 - Tue Dec 20 01:12:37 2022 - Size: 30.9 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221219T010002 - Mon Dec 19 01:11:04 2022 - Size: 30.9 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221218T010002 - Sun Dec 18 01:09:44 2022 - Size: 30.8 GiB - WAL Size: 1.7 GiB - WAITING_FOR_WALS
db-data 20221217T010002 - Sat Dec 17 01:10:11 2022 - Size: 30.7 GiB - WAL Size: 2.0 GiB - WAITING_FOR_WALS
db-data 20221216T010002 - Fri Dec 16 01:12:15 2022 - Size: 30.7 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20221215T010002 - Thu Dec 15 01:12:09 2022 - Size: 30.6 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20221214T010002 - Wed Dec 14 01:09:29 2022 - Size: 29.5 GiB - WAL Size: 6.3 GiB - WAITING_FOR_WALS
db-data 20221213T010002 - Tue Dec 13 01:09:26 2022 - Size: 29.4 GiB - WAL Size: 2.7 GiB - WAITING_FOR_WALS
db-data 20221212T010002 - Mon Dec 12 01:09:04 2022 - Size: 29.3 GiB - WAL Size: 2.8 GiB - WAITING_FOR_WALS
db-data 20221211T010002 - Sun Dec 11 01:12:45 2022 - Size: 29.3 GiB - WAL Size: 1.9 GiB - WAITING_FOR_WALS
db-data 20221210T010002 - Sat Dec 10 01:11:53 2022 - Size: 29.2 GiB - WAL Size: 1.6 GiB - WAITING_FOR_WALS
db-data 20221209T010002 - Fri Dec  9 01:11:10 2022 - Size: 29.1 GiB - WAL Size: 3.0 GiB - WAITING_FOR_WALS
db-data 20221208T010002 - Thu Dec  8 01:11:41 2022 - Size: 29.1 GiB - WAL Size: 2.8 GiB - WAITING_FOR_WALS
db-data 20221207T010002 - Wed Dec  7 01:10:41 2022 - Size: 29.0 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221206T010002 - Tue Dec  6 01:11:28 2022 - Size: 29.0 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221205T010003 - Mon Dec  5 01:12:07 2022 - Size: 28.9 GiB - WAL Size: 3.2 GiB - WAITING_FOR_WALS
db-data 20221204T010002 - Sun Dec  4 01:10:08 2022 - Size: 28.8 GiB - WAL Size: 2.1 GiB - WAITING_FOR_WALS
db-data 20221203T010003 - Sat Dec  3 01:11:06 2022 - Size: 28.8 GiB - WAL Size: 1.9 GiB - WAITING_FOR_WALS
db-data 20221202T010002 - Fri Dec  2 01:09:39 2022 - Size: 28.7 GiB - WAL Size: 2.7 GiB - WAITING_FOR_WALS
db-data 20221201T010002 - Thu Dec  1 01:09:02 2022 - Size: 28.7 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20221130T010002 - Wed Nov 30 01:09:16 2022 - Size: 28.6 GiB - WAL Size: 2.5 GiB - WAITING_FOR_WALS
db-data 20221129T010003 - Tue Nov 29 01:10:48 2022 - Size: 28.4 GiB - WAL Size: 2.6 GiB - WAITING_FOR_WALS
db-data 20221128T010002 - Mon Nov 28 01:11:24 2022 - Size: 28.4 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221127T010002 - Sun Nov 27 01:09:49 2022 - Size: 28.4 GiB - WAL Size: 1.9 GiB - WAITING_FOR_WALS
db-data 20221126T010002 - Sat Nov 26 01:11:06 2022 - Size: 28.3 GiB - WAL Size: 1.8 GiB - WAITING_FOR_WALS
db-data 20221125T010002 - Fri Nov 25 01:10:41 2022 - Size: 28.3 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20221124T010002 - Thu Nov 24 01:11:37 2022 - Size: 28.3 GiB - WAL Size: 2.2 GiB - WAITING_FOR_WALS
db-data 20221123T010002 - Wed Nov 23 01:09:35 2022 - Size: 28.1 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221122T010002 - Tue Nov 22 01:09:33 2022 - Size: 28.1 GiB - WAL Size: 2.6 GiB - WAITING_FOR_WALS
db-data 20221121T010002 - Mon Nov 21 01:08:23 2022 - Size: 27.9 GiB - WAL Size: 2.3 GiB - WAITING_FOR_WALS
db-data 20221120T010003 - Sun Nov 20 01:12:27 2022 - Size: 27.8 GiB - WAL Size: 2.0 GiB - WAITING_FOR_WALS
db-data 20221119T010002 - Sat Nov 19 01:09:15 2022 - Size: 27.8 GiB - WAL Size: 1.7 GiB - WAITING_FOR_WALS
db-data 20221118T010002 - Fri Nov 18 01:11:43 2022 - Size: 27.7 GiB - WAL Size: 2.6 GiB - WAITING_FOR_WALS
db-data 20221117T010002 - Thu Nov 17 01:12:55 2022 - Size: 27.7 GiB - WAL Size: 2.5 GiB - WAITING_FOR_WALS
db-data 20221116T010002 - Wed Nov 16 01:07:40 2022 - Size: 27.6 GiB - WAL Size: 2.5 GiB - WAITING_FOR_WALS
db-data 20221115T010002 - Tue Nov 15 01:07:25 2022 - Size: 27.6 GiB - WAL Size: 2.2 GiB - WAITING_FOR_WALS
db-data 20221114T010002 - Mon Nov 14 01:09:40 2022 - Size: 27.6 GiB - WAL Size: 2.4 GiB - WAITING_FOR_WALS
db-data 20221113T010002 - Sun Nov 13 01:06:57 2022 - Size: 27.5 GiB - WAL Size: 1.9 GiB - WAITING_FOR_WALS
db-data 20221112T010003 - Sat Nov 12 01:07:45 2022 - Size: 27.5 GiB - WAL Size: 1.8 GiB - WAITING_FOR_WALS
 

Luca Ferrari

Feb 20, 2023, 9:19:20 AM
to Barman, Backup and Recovery Manager for PostgreSQL
barman cron not running?

Rafał Karpiński

Feb 20, 2023, 9:28:27 AM
to pgba...@googlegroups.com
Is barman cron configured and running?
After installing from, for example, an RPM package, there is an example config in /etc/cron.d.


Rajasekar Arumugam

Feb 20, 2023, 10:40:42 AM
to Barman, Backup and Recovery Manager for PostgreSQL
Hi, yes, cron is working:

# /etc/cron.d/barman: crontab entries for the barman package
MAILTO=root
* * * * * barman [ -x /usr/bin/barman ] && /usr/bin/barman -q cron

Rajasekar Arumugam

Feb 20, 2023, 10:41:36 AM
to Barman, Backup and Recovery Manager for PostgreSQL
barman -v
2.19 Barman by EnterpriseDB (www.enterprisedb.com)

Mike Wallace

Feb 20, 2023, 11:22:29 AM
to pgba...@googlegroups.com
If the cron job is running then it is likely that the retention policy
is working as intended and there is an issue with WAL streaming.

The retention policy will not delete any of the backups in a
`WAITING_FOR_WALS` state - this is because there is an assumption that
such backups will eventually move to a `DONE` state (via `barman
cron`) at which point they will be considered as candidates for
deletion when the retention policy is applied. The retention policy
will also not consider the `WAITING_FOR_WALS` backups as being enough
to satisfy the retention policy because the `WAITING_FOR_WALS` state
means that Barman's WAL archive does not have the WAL segments
required in order for PostgreSQL to recover from the backup.
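
As a quick illustration using the listing above (both commands are
already shown earlier in the thread), you can count how many backups
are in each state:

    barman list-backup db-data | grep -c WAITING_FOR_WALS    # backups still waiting for WALs
    barman list-backup db-data | grep -vc WAITING_FOR_WALS   # lines without that flag (the complete backups in this listing)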

This means that in the available output there are only two complete
backups which are candidates for deletion: 20230209T010003 and
20230124T010003. Because the retention policy is `RECOVERY WINDOW OF
14 days`, Barman will be keeping the newest completed backup before
the start of that 14 day window so that any point within the window
can be reached via point-in-time recovery. Since the backup dated 9th
February is within the last 14 days, that means the backup dated 24th
January is required and therefore not deleted by the retention policy.
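
To make the arithmetic concrete (dates taken from the listing above;
this is a simplified sketch of the policy, not Barman's exact logic):

    report date      = 2023-02-20
    window start     = 2023-02-20 - 14 days = 2023-02-06
    DONE backups     = 20230209T010003, 20230124T010003
    kept (in window) = 20230209T010003
    kept (newest DONE backup on or before 2023-02-06, needed to
          recover to the start of the window) = 20230124T010003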

The underlying issue therefore seems to be that so many backups
remain in the `WAITING_FOR_WALS` state and never move to a `DONE`
state. That suggests a problem with WAL streaming - for example, the
`pg_receivewal` process running on the Barman server may not be
keeping up with the rate of WAL generation, which could lead to WALs
being lost if `max_slot_wal_keep_size` is exceeded for the replication
slot being used by Barman. This is just one possibility - but if
`barman cron` is running regularly yet backups are not ending up in
the `DONE` state, investigating the health of Barman's WAL streaming
seems like a sensible next step.
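
If it helps, Barman itself can report on the streaming side - a couple
of commands worth running (both exist in Barman 2.19, though the exact
output varies by version):

    barman replication-status db-data   # streaming clients and their lag, as seen by PostgreSQL
    barman status db-data               # per-server summary (last backup, WAL information, etc.)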

Rajasekar Arumugam

Feb 20, 2023, 11:59:29 AM
to Barman, Backup and Recovery Manager for PostgreSQL
Thanks Mike. Are there any recommended config parameters to address this `WAITING_FOR_WALS` state? This is only happening on the standby server - we have a separate Barman server for prod, where I don't see this issue for the prod DB.

Rafał Karpiński

Feb 20, 2023, 12:02:32 PM
to pgba...@googlegroups.com
Any logs from barman?
/var/log/barman

Rajasekar Arumugam

Feb 20, 2023, 12:21:35 PM
to Barman, Backup and Recovery Manager for PostgreSQL

No errors in the log file.

2023-02-20 11:07:02,541 [2266] barman.wal_archiver INFO: Found 1 xlog segments from streaming for db-data. Archive all segments in one run.
2023-02-20 11:07:02,541 [2266] barman.wal_archiver INFO: Archiving segment 1 of 1 from streaming: db-data/000000070000037B00000001
2023-02-20 11:08:02,595 [2311] barman.wal_archiver INFO: No xlog segments found from streaming for db-data.
2023-02-20 11:09:02,810 [2432] barman.wal_archiver INFO: No xlog segments found from streaming for db-data.
2023-02-20 11:09:05,214 [13010] barman.command_wrappers INFO: db-data: pg_receivewal: finished segment at 37B/3000000 (timeline 7)
2023-02-20 11:09:52,303 [13010] barman.command_wrappers INFO: db-data: pg_receivewal: finished segment at 37B/4000000 (timeline 7)
2023-02-20 11:10:01,900 [2478] barman.wal_archiver INFO: Found 2 xlog segments from streaming for db-data. Archive all segments in one run.
2023-02-20 11:10:01,901 [2478] barman.wal_archiver INFO: Archiving segment 1 of 2 from streaming: db-data/000000070000037B00000002
2023-02-20 11:10:02,926 [2478] barman.wal_archiver INFO: Archiving segment 2 of 2 from streaming: db-data/000000070000037B00000003
2023-02-20 11:11:02,173 [2565] barman.wal_archiver INFO: No xlog segments found from streaming for db-data.

Mike Wallace

Feb 22, 2023, 8:10:52 AM
to pgba...@googlegroups.com
Hi Rajasekar,

It's difficult to make any configuration recommendations without a better understanding of what's happening. If you haven't already, check the complete Barman log over the last couple of months for any errors relating to `pg_receivewal`, e.g.:

    2023-02-22 12:31:02,560 [1244] barman.server ERROR: ArchiverFailure:pg_receivewal terminated with error code: 1

or:

    2023-02-22 12:34:05,242 [1795] barman.command_wrappers INFO: main: pg_receivewal: starting log streaming at 2/B7000000 (timeline 1)
    2023-02-22 12:34:05,242 [1795] barman.command_wrappers INFO: main: pg_receivewal: error: unexpected termination of replication stream: ERROR:  requested WAL segment 0000000100000002000000B7 has already been removed
    2023-02-22 12:34:05,242 [1795] barman.command_wrappers INFO: main: pg_receivewal: error: disconnected
    2023-02-22 12:34:05,242 [1795] barman.command_wrappers DEBUG: Command return code: 1
    2023-02-22 12:34:05,243 [1795] barman.server ERROR: ArchiverFailure:pg_receivewal terminated with error code: 1

The second example indicates that Barman's WAL streaming has fallen so far behind that the next WAL it requires is no longer available on the PostgreSQL server. This seems unlikely given backups are still being taken (Barman will refuse to take backups if pg_receivewal is not running) but it's a possibility.
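
A sketch of how you might search for those messages (the log file name below is the packaged default under /var/log/barman - adjust it to your `log_file` setting):

    grep -E 'ArchiverFailure|pg_receivewal.*(error|terminated)' /var/log/barman/barman.log*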

You could also check the replication slot status on the standby by running the following on the standby:

    select * from pg_replication_slots where slot_name = 'BARMAN_SLOT_NAME';

where BARMAN_SLOT_NAME is the value of `slot_name` in your Barman configuration. The `restart_lsn` is the oldest LSN still required by the WAL streaming process, and the `wal_status` should be either `reserved` or `extended`.
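
You can also estimate how much WAL the slot is holding back relative to what the standby has already received - a sketch, assuming the default slot name `barman` (replace it with your `slot_name`):

    select slot_name, active, wal_status, restart_lsn,
           pg_size_pretty(pg_wal_lsn_diff(pg_last_wal_receive_lsn(), restart_lsn)) as retained_wal
    from pg_replication_slots
    where slot_name = 'barman';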

Another thing to check is the WAL archive itself - check the `Begin WAL` and `End WAL` values in the output of `barman show-backup SERVER_NAME BACKUP_ID` for one of the backups in a `WAITING_FOR_WALS` state, and verify that all WALs in that (inclusive) range are available in the `BARMAN_HOME/SERVER_NAME/wals` directory. If all WALs are present then it's possible that the WAL metadata which Barman stores in `BARMAN_HOME/SERVER_NAME/wals/xlog.db` is corrupt or incomplete, in which case you could rebuild it with `barman rebuild-xlogdb SERVER_NAME`. If there are WALs which are not in the directory at all then this points again to WAL streaming being the issue.
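
As a concrete sketch (the backup ID and paths below are only examples, taken from the listing and the `backup_directory` earlier in the thread):

    barman show-backup db-data 20221112T010003 | grep -E 'Begin WAL|End WAL'
    # archived WALs live under <backup_directory>/wals/<first 16 characters of the segment name>/
    ls /backups/db-data/wals/000000070000037B/ | head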

It's also worth checking whether any WALs have accumulated in `BARMAN_HOME/SERVER_NAME/streaming` which are not being moved into the WAL archive by `barman cron`.
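
For example (path derived from the `backup_directory` shown at the top of the thread):

    ls -lt /backups/db-data/streaming/ | head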

Sorry I can't give any more specific advice and I hope some of this is helpful. If you confirm there is an issue with Barman or the standby keeping up with WAL streaming then this blog post may help with the `wal_keep_size` and `max_slot_wal_keep_size` PostgreSQL configuration variables: https://www.2ndquadrant.com/en/blog/pg13-slot-size-limit/
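
For reference, both of those are set in postgresql.conf on the PostgreSQL server being streamed from (the values below are purely illustrative - size them for your own WAL volume):

    wal_keep_size = '16GB'             # minimum WAL kept regardless of slots (PostgreSQL 13+)
    max_slot_wal_keep_size = '100GB'   # cap on WAL a replication slot may retain; -1 means unlimited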

Best,

Mike


Rajasekar Arumugam

Mar 16, 2023, 6:49:59 AM
to Barman, Backup and Recovery Manager for PostgreSQL
Hi Mike,

I have validated the config but I am not able to find anything. I can see that all the backups smaller than 150 GB are completing fine, and the rest are all waiting for WALs. I am not sure what to do about the large clusters, as receive-wal and all the other setup is correct.

Thanks
Raj

Mike Wallace

Mar 16, 2023, 5:35:05 PM
to pgba...@googlegroups.com
Hi Raj,

I was thinking about this some more and wondered if maybe `barman
cron` is hitting some issues - can you try running `barman cron` by
hand as the barman user and see whether there is anything notable in
either the output or the Barman log files? It would be helpful to see
as much of the output and logs as you can share as well as to know the
length of time it takes to complete.
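
For example (user and binary path as in the cron entry shown earlier
in the thread; the log file name is the packaged default - adjust to
your `log_file` setting):

    time sudo -u barman /usr/bin/barman cron
    tail -n 200 /var/log/barman/barman.log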

Best,

Mike
