Hello Debezium Community,
I am having trouble configuring ad-hoc snapshots using signals with Debezium Server, Postgres (Cloud SQL), and a Google Pub/Sub sink.
My goal is to trigger an incremental snapshot by inserting a row into `public.debezium_signal`.
### Environment
* **Debezium:** `quay.io/debezium/server:2.6` (running as a StatefulSet on GKE)
* **Database:** Postgres 16.10 (on Google Cloud SQL)
* **Sink:** Google Cloud Pub/Sub
* **Connector:** `io.debezium.connector.postgresql.PostgresConnector` (pgoutput)
### Debezium Server Configuration (`application.properties`)
Here is the configuration from my ConfigMap:
```properties
debezium.sink.type=pubsub
debezium.sink.pubsub.project.id=digestojud
debezium.sink.pubsub.ordering.enabled=true
# ... other sink properties ...
debezium.errors.tolerance=all
# Note this setting; it is relevant below
debezium.event.processing.failure.handling.mode=fail
debezium.source.connector.class=io.debezium.connector.postgresql.PostgresConnector
debezium.source.offset.storage.file.filename=data/offsets.dat
debezium.source.offset.flush.interval.ms=0
debezium.source.snapshot.mode=never
debezium.source.database.hostname=...
debezium.source.database.port=5432
debezium.source.database.user=...
debezium.source.database.password=...
debezium.source.database.dbname=op
debezium.source.database.server.name=op-staging
debezium.source.topic.prefix=op-staging
debezium.source.plugin.name=pgoutput
debezium.source.slot.name=dbzv13_stg
debezium.source.publication.name=dbz
debezium.source.publication.autocreate.mode=disabled
# My included tables
debezium.source.table.include.list=public.user_company,public.customer,public.api_call_log,public.monitored_event,public.monitored_person,public.cadastro,public.monthly_event_count
# My Signal Table configuration
debezium.source.signal.data.collection=public.debezium_signal
quarkus.log.level=DEBUG
```
### Signal Table Structure
```
op=> \d debezium_signal
          Table "public.debezium_signal"
 Column |     Type     | Collation | Nullable | Default
--------+--------------+-----------+----------+---------
 id     | varchar(255) |           | not null |
 type   | varchar(64)  |           | not null |
 data   | text         |           |          |
Indexes:
    "debezium_signal_pkey" PRIMARY KEY, btree (id)
```
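For reference, the table was created with the equivalent of the following DDL (reconstructed from the `\d` output above):
```sql
-- Reconstructed from the \d output above
CREATE TABLE public.debezium_signal (
    id   VARCHAR(255) NOT NULL,  -- unique signal id (I use a UUID)
    type VARCHAR(64)  NOT NULL,  -- signal type, e.g. 'execute-snapshot'
    data TEXT,                   -- JSON payload with the signal parameters
    CONSTRAINT debezium_signal_pkey PRIMARY KEY (id)
);
```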
### The Signal Insert
```sql
INSERT INTO public.debezium_signal (id, type, data)
VALUES (
    '766a36af-5b42-4b3f-aa76-063deef149be',
    'execute-snapshot',
    '{
        "type": "incremental",
        "data-collections": ["public.monthly_event_count"],
        "additional-conditions": [
            {
                "data-collection": "public.monthly_event_count",
                "filter": "evt_type = 204 AND year = 2025 AND month = 10 AND is_trial = FALSE"
            }
        ]
    }'
);
```
### The Problem & What I've Found
**Scenario 1 (Expected):** If `public.debezium_signal` is **NOT** added to my Postgres publication (`dbz`) and **NOT** in the `table.include.list`, nothing happens when I `INSERT` a signal. No logs, no snapshot.
**Scenario 2 (The Error):** To fix Scenario 1, I add `public.debezium_signal` to my Postgres publication (`dbz`) and also to the `debezium.source.table.include.list` in my config, as shown below.
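For completeness, here is the exact publication change, plus a catalog query to confirm the table is now published (the include-list change is just appending `public.debezium_signal` to the property shown above):
```sql
-- Add the signal table to the publication used by pgoutput
ALTER PUBLICATION dbz ADD TABLE public.debezium_signal;

-- Confirm which tables the publication now covers
SELECT schemaname, tablename
FROM pg_publication_tables
WHERE pubname = 'dbz';
```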
When I do this, the connector successfully sees the signal! However, it immediately fails with the following error from the Pub/Sub sink:
```
io.debezium.DebeziumException: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.NotFoundException: io.grpc.StatusRuntimeException: NOT_FOUND: Resource not found (resource=op-staging.public.debezium_signal).
```
### My Analysis
It seems Debezium is correctly identifying the event as a signal (to be processed internally) but *also* treating it as a standard CDC event that needs to be sent to the sink.
Because the Pub/Sub topic `op-staging.public.debezium_signal` doesn't exist (and I don't want it to), the sink fails. And because I have `failure.handling.mode=fail`, this error stops the entire batch, which causes the signal processing itself to fail.
I also see this log line, confirming the signal was received but its processing failed:
```
Action execute-snapshot failed. The signal SignalRecord{id='592a36af...'} may not have been processed.
```
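As a possible workaround, I considered dropping the signal table's change events before they reach the sink with a Kafka Connect SMT. Below is an untested sketch; I am assuming Debezium Server wires up Connect's `Filter` transform and `TopicNameMatches` predicate through `debezium.transforms` / `debezium.predicates`, and I have left the regex dots unescaped for simplicity:
```properties
# Untested sketch: drop outgoing change events for the signal table so the
# Pub/Sub sink never looks up the op-staging.public.debezium_signal topic.
# Assumes Debezium Server supports Connect SMTs/predicates via these properties.
debezium.transforms=dropSignalEvents
debezium.transforms.dropSignalEvents.type=org.apache.kafka.connect.transforms.Filter
debezium.transforms.dropSignalEvents.predicate=isSignalTopic
debezium.predicates=isSignalTopic
debezium.predicates.isSignalTopic.type=org.apache.kafka.connect.transforms.predicates.TopicNameMatches
debezium.predicates.isSignalTopic.pattern=op-staging.public.debezium_signal
```
But this feels like a hack, hence question 2 below.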
### My Questions
1. Is my analysis correct?
2. What is the standard way to configure Debezium Server to **process** events from `signal.data.collection` but **not** send those same events to the sink?
3. Do I really need to add the `debezium_signal` table to the publication?
4. I verified that a query using the same filter as the signal's `additional-conditions` (shown after this list) returns exactly one row, yet the Debezium logs indicate that no rows were found. What are the likely reasons for this discrepancy?
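Regarding question 4, this is the verification query (reconstructed here from the signal's `filter` above); run directly against the database, it returns exactly one row:
```sql
-- Same predicate as the "filter" in the execute-snapshot signal
SELECT count(*)
FROM public.monthly_event_count
WHERE evt_type = 204
  AND year = 2025
  AND month = 10
  AND is_trial = FALSE;
```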
Thank you for your help!