Hi,
I am using BigQueryInsertJobOperator inside a dynamic task group on Airflow 2.5.3 (Composer 2.3.5).
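For context, here is a minimal sketch of my setup (the DAG id, queries, and location are placeholders, not my real configuration):

import pendulum
from airflow import DAG
from airflow.decorators import task_group
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="bq_insert_job_mapped",  # placeholder name
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:

    @task_group(group_id="bq_group")
    def bq_group():
        # One BigQueryInsertJobOperator per query, created with dynamic
        # task mapping inside the task group.
        BigQueryInsertJobOperator.partial(
            task_id="insert_job",
            location="US",  # placeholder
        ).expand(
            configuration=[
                {"query": {"query": "SELECT 1", "useLegacySql": False}},
                {"query": {"query": "SELECT 2", "useLegacySql": False}},
            ]
        )

    bq_group()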
The operator successfully submits the new job to BigQuery, but the task then fails with the following error:
Traceback (most recent call last):
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1384, in _run_raw_task
    self._execute_task_with_callbacks(context, test_mode)
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1533, in _execute_task_with_callbacks
    self.task.post_execute(context=context, result=result)
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/lineage/__init__.py", line 77, in wrapper
    ret_val = func(self, context, *args, **kwargs)
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1135, in post_execute
    post_execute_prepare_lineage(self, context)
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/composer/data_lineage/operators/__init__.py", line 98, in post_execute_prepare_lineage
    mixin.post_execute_prepare_lineage(task, context)  # type: ignore[attr-defined]
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/composer/data_lineage/operators/google/cloud/bigquery.py", line 43, in post_execute_prepare_lineage
    job_id = job_id_path.split(":")[-1]
AttributeError: 'LazyXComAccess' object has no attribute 'split'

It looks like a bug in the latest Composer version. Has anyone encountered a similar issue? Any workaround or thoughts would be appreciated :)
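One workaround I am considering, based only on reading the traceback, so please treat it as an untested sketch: subclass the operator and override post_execute with a no-op. The crash happens in Composer's post_execute_prepare_lineage hook, which BaseOperator.post_execute triggers, so skipping post_execute should let the mapped tasks finish. The obvious trade-off is that data lineage would no longer be reported for these tasks:

from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

class BigQueryInsertJobNoLineageOperator(BigQueryInsertJobOperator):
    # Hypothetical workaround: override post_execute so the base
    # implementation never runs. Per the traceback, that is where
    # Composer's post_execute_prepare_lineage hook fails on the
    # LazyXComAccess returned by dynamically mapped tasks.
    def post_execute(self, context, result=None):
        # Intentionally a no-op; this also disables lineage
        # reporting for this task.
        pass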