Error while creating Dataproc cluster with Cloud SQL connectivity.

Anand Varne

Nov 15, 2022, 9:01:44 AM
to Google Cloud Dataproc Discussions

Hi Team,

We are trying to create a Dataproc cluster with Cloud SQL connectivity.

Business impact we are facing:

Description / Issue / Errors: Error in the Dataproc initialization script; it is unable to reach the network for Cloud SQL (MySQL).

Start and end time of incident (and timezone): 10 Nov 2022, 17:28:49, asia-south1-a

Cluster name: xxx-gcp-uat-xxx-events

Cluster UUID: xxx-27xxxbe-4484-xxxx-0498da65d192

If applicable: How are you submitting your Jobs?: Using the CLI

Cluster creation is failing at the cluster initialization stage; the error output from the failed operation is below.


ERROR: (gcloud.dataproc.clusters.create) Operation [projects/xx-uat-xxxx-prj-spk-4d/regions/asia-south1/operations/571c939d-d612-3885-98b6-70c950e719ef] failed: Multiple Errors:

- Initialization action failed. Failed action 'gs://xxx-gcp-uat-xxx-artifacts-bucket/xxx-analytics-artifacts/scripts/cloud-sql-proxy.sh', see output in: gs://xxx-gcp-uat-xxx-analytics-bucket/google-cloud-dataproc-metainfo/9e6c916d-1a86-43e3-bae5-37ae0dd2ad25/xxx-gcp-uat-xxx-analytics-m/dataproc-initialization-script-0_output
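
The full output of the failed init action is in the log referenced above; it can be fetched with gsutil (assuming read access to the staging bucket), e.g.:

gsutil cat gs://xxx-gcp-uat-xxx-analytics-bucket/google-cloud-dataproc-metainfo/9e6c916d-1a86-43e3-bae5-37ae0dd2ad25/xxx-gcp-uat-xxx-analytics-m/dataproc-initialization-script-0_output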


The CLI command we are using to create the Dataproc cluster:


gcloud dataproc clusters create xxx-gcp-uat-xxx-events \
  --scopes sql-admin,cloud-platform \
  --subnet ${SUBNETWORK} \
  --no-address \
  --region ${REGION} --zone ${ZONE} \
  --enable-component-gateway --bucket ${BUCKET} \
  --master-machine-type n1-standard-2 --master-boot-disk-size 100 \
  --master-boot-disk-type pd-ssd \
  --num-workers ${NUM_WORKERS_PROCESSING} --worker-machine-type n1-standard-4 --worker-boot-disk-size 100 \
  --worker-boot-disk-type pd-ssd \
  --image-version 2.0-debian10 \
  --properties ^#^spark:spark.jars.packages='org.apache.spark:spark-sql-kafka-0-10_2.12:3.1.2,io.delta:delta-core_2.12:1.0.0,com.typesafe:config:1.4.2'#spark:spark.sql.extensions='io.delta.sql.DeltaSparkSessionExtension'#spark:spark.sql.catalog.spark_catalog='org.apache.spark.sql.delta.catalog.DeltaCatalog'#hive:hive.metastore.warehouse.dir=gs://${BUCKET}/warehouse \
  --initialization-actions gs://${ARTIFACTS_BUCKET}/scripts/cloud-sql-proxy.sh \
  --metadata "hive-metastore-instance=${PROJECT}:${REGION}:${METASTORE_DB}" \
  --metadata "enable-cloud-sql-proxy-on-workers=false" \
  --metadata=block-project-ssh-keys=true \
  --metadata "db-admin-secret=xxx-xxx-gcp-uat-xxx-db-root-password-secret" \
  --metadata "db-admin-secret-version=1" \
  --tags ${NETWORK_TAG} \
  --gce-pd-kms-key projects/xxx-srs-infosec-prj-spk-xx/locations/asia-south1/keyRings/xxx-gcp-app-uat-xxx-as1-kr/cryptoKeys/xxx-xxx-UAT-APP-Key \
  --service-account=sa-xxx-uat-c...@xxx-uat-xxx-prj-spk-4d.iam.gserviceaccount.com \
  --project ${PROJECT}
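
One thing we noticed: the cluster is created with --no-address, so the nodes have no external IPs, and the cloud-sql-proxy init action can only reach the Cloud SQL API if the subnet has Private Google Access (or a Cloud NAT route). We are not sure this is the cause; as a sketch, the subnet setting can be checked and, if needed, enabled with:

gcloud compute networks subnets describe ${SUBNETWORK} --region ${REGION} --format="get(privateIpGoogleAccess)"

gcloud compute networks subnets update ${SUBNETWORK} --region ${REGION} --enable-private-ip-google-access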


Thanks,

Anand Varne
