Dataproc job does not exist in project

Solutions Larroude

Mar 16, 2024, 12:38:34 PM
to CDAP User
Hey, guys,

I would like to ask for help. I'm trying to run pipelines in Data Fusion, but when testing them, each one fails with the same error:
"Dataproc job {client_project_name}_STG-GSHEETS-LOGISTIC-CONTROL-REPORT-LOGISTIC_08c9077f-e3a7-11ee-9a34-86d8c898b2dd does not exist in project {client_project_name}, region us-central1."

I checked the Dataproc jobs, and the ID {e06833dc-e2fe-11ee-893f-86d8c898b2dd} of this new pipeline is not there.
I checked the permissions, and I have the Dataproc Admin role on my user account.
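
For reference, both checks can also be done from the command line with gcloud (a sketch, assuming the project ID and region from the error message; replace the placeholders with your own values):

```shell
# List recent Dataproc jobs in the region to confirm the job ID is absent.
gcloud dataproc jobs list \
    --project=<client_project_name> \
    --region=us-central1 \
    --limit=20

# Show which Dataproc roles are bound to your user account.
gcloud projects get-iam-policy <client_project_name> \
    --flatten="bindings[].members" \
    --filter="bindings.members:user:YOUR_EMAIL AND bindings.role:roles/dataproc" \
    --format="table(bindings.role)"
```

These commands require an authenticated gcloud session with access to the project.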

I tried to redo the pipeline and it presented the following error: "Deploy failed: No space left on device"


Below is the complete log of the pipeline execution:

Executing PROVISION subtask REQUESTING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Not checking cluster reuse, enabled: true, skip delete: false, idle ttl: 240, reuse threshold: 15

Creating Dataproc cluster cdap-stg-gshee-e06833dc-e2fe-11ee-893f-86d8c898b2dd in project {nome_projeto_cliente}, in region us-central1, with image 2.1, with labels {goog-datafusion-version=6_9, cdap-version=6_9_2-1693551895972, goog-datafusion-edition=developer, gdfusion=larroude-cluster}, endpoint dataproc.googleapis.com:443

Encountered 1 warning while creating Dataproc cluster: The firewall rules for specified network or subnetwork would allow ingress traffic from 0.0.0.0/0, which could be a security risk.

Completed PROVISION subtask REQUESTING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed PROVISION subtask POLLING_CREATE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed PROVISION task for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing DEPROVISION subtask REQUESTING_DELETE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Dataproc job {nome_projeto_cliente}_STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY_Data_e06833dc-e2fe-11ee-893f-86d8c898b2dd does not exist in project larroude-data-prod, region us-central1.

Completed DEPROVISION subtask REQUESTING_DELETE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Executing DEPROVISION subtask POLLING_DELETE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed DEPROVISION subtask POLLING_DELETE for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Completed DEPROVISION task for program run program_run:{nome_projeto_cliente}.STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY.f1003d27-a995-11ee-a937-42d43072b892.workflow.DataPipelineWorkflow.e06833dc-e2fe-11ee-893f-86d8c898b2dd.

Hederson Pereira dos Santos

Apr 8, 2024, 11:54:08 AM
to CDAP User
Hello,

The {nome_projeto_cliente} should be a macro (runtime variable).

You can reference it as ${nome_projeto_cliente} and configure a preference named nome_projeto_cliente with the project name in the Control Center.

To access the Control Center, click the menu icon (three horizontal bars), then Control Center. Choose the pipeline and click Set Preferences.
A new window will open; in the Key field, type the variable name nome_projeto_cliente, and in the Value field, type the variable's value.
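
If you prefer to set the preference programmatically, CDAP also exposes a preferences REST endpoint. A minimal sketch, assuming a local instance URL, the default namespace, and the app name from the log (adjust all three to your instance; authentication is omitted):

```shell
# Hypothetical CDAP instance URL and app name -- replace with your own.
CDAP_URL="http://localhost:11015"
APP="STG-GSHEETS-SALES-REPORT-DROPSHIP-DAILY"

# Set the nome_projeto_cliente preference for the application so that the
# ${nome_projeto_cliente} macro resolves at runtime.
curl -X PUT "${CDAP_URL}/v3/namespaces/default/apps/${APP}/preferences" \
    -H "Content-Type: application/json" \
    -d '{"nome_projeto_cliente": "my-gcp-project-id"}'

# Read the preferences back to confirm.
curl "${CDAP_URL}/v3/namespaces/default/apps/${APP}/preferences"
```

Preferences set at the application level apply to every run of that pipeline.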

I hope that it can help you.

Solutions Larroude

Apr 12, 2024, 10:36:57 AM
to CDAP User
Hey guys,

Problem solved. I contacted Google support; the problem was caused by a lack of cache space in Data Fusion.
Solution: I deleted the pipelines I no longer use, freeing cache space and allowing executions to run normally again.
What resolved it was contacting Google support directly.
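
For anyone hitting the same "No space left on device" error, unused pipelines can be deleted through the Data Fusion UI or, as a sketch, via the CDAP application lifecycle REST API (the instance URL and pipeline name below are placeholders; authentication is omitted):

```shell
# Hypothetical CDAP instance URL -- replace with your own.
CDAP_URL="http://localhost:11015"

# List the deployed applications (pipelines) in the default namespace.
curl "${CDAP_URL}/v3/namespaces/default/apps"

# Delete a pipeline you no longer use to free space.
curl -X DELETE "${CDAP_URL}/v3/namespaces/default/apps/OLD-PIPELINE-NAME"
```

Deleting an application removes its deployed artifacts but not datasets it wrote to.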



Best regards.
Solutions
