Security Command Center findings to BigQuery


Jason Gumbs

Feb 8, 2022, 5:53:28 PM
to pubsub-discuss

I'm attempting to send Security Command Center findings to BigQuery by way of a Dataflow job. When I run the newly created job, a second BigQuery table is automatically created with the same name as the table I created, except with "_error_records" appended to the end.

I'm guessing the solution lies in the schema of my original table, but I'm not sure, and I have no idea how to fix it. Any suggestions would be helpful.

shameemf

Feb 8, 2022, 7:08:15 PM
to pubsub-discuss
Looks like this is intended behavior. A dead-letter BigQuery table is automatically created to catch messages that fail for various reasons, including: message schemas that do not match the BigQuery table schema, malformed JSON, and messages that throw errors while being transformed by the JavaScript UDF.
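
If you want to see what is actually failing and why, you can query the dead-letter table directly. A minimal sketch, assuming the auto-generated _error_records suffix and placeholder project/dataset/table names:

# Inspect the most recent failed records (all names are placeholders)
bq query --use_legacy_sql=false \
'SELECT * FROM `my-project.my_dataset.my_table_error_records` LIMIT 10'

The error rows typically carry the original message payload plus an error message, which should make a schema mismatch fairly obvious.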

Please see the documentation at [1] and [2]:
- "outputDeadletterTable
(Optional) Name of the output BigQuery table to hold failed records. If not provided, pipeline creates table during execution with name {output_table_name}_error_records. For example: my-project-ID:my_dataset_name.my-table-name." [1]

- "outputDeadletterTable
The BigQuery table for messages that failed to reach the output table, in the format of <my-project>:<my-dataset>.<my-table>. If it doesn't exist, it is created during pipeline execution. If not specified, OUTPUT_TABLE_SPEC_error_records is used instead." [2]
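
If you would rather send failures to a table you name yourself, you can pass that parameter when launching the job. A minimal sketch using gcloud, assuming the Google-provided Pub/Sub Subscription to BigQuery template; the job, subscription, and table names below are placeholders:

# Launch the template with an explicit dead-letter table
gcloud dataflow jobs run scc-findings-to-bq \
    --gcs-location gs://dataflow-templates/latest/PubSub_Subscription_to_BigQuery \
    --region us-central1 \
    --parameters \
inputSubscription=projects/my-project/subscriptions/scc-findings-sub,\
outputTableSpec=my-project:my_dataset.scc_findings,\
outputDeadletterTable=my-project:my_dataset.scc_findings_dlq

With outputDeadletterTable set, the pipeline writes failed records there instead of creating the {output_table_name}_error_records table for you.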
