8+ ONLY - Big Data/Hadoop Developer with GCP @ Dearborn - MI


azhar uddin

Mar 31, 2022, 10:38:38 AM

Hi Professionals,

Role: Big Data/Hadoop Developer with GCP

Location: Dearborn, MI

Visa: No H1 transfers

Job Roles / Responsibilities:

Required Skills

Technical Skills: Python, Hadoop Administration, Big Data Management


• Minimum 8+ years of working experience in Big Data, Hadoop, Spark, Python, Scala, Kafka, SQL, ETL development, and data modelling
• Hands-on experience with GCP: BigQuery, GCS buckets, Cloud Functions, Cloud Dataflow, Pub/Sub, Cloud Shell, gsutil, the bq command-line utility, Dataproc, and Stackdriver
• Experience writing a program using Cloud Functions to load data into BigQuery for on-arrival files in a GCS bucket (see the first sketch after this list)
• Experience writing a program to maintain raw-file archival in a GCS bucket
• Designing schemas in BigQuery (full scans for OLAP/BI use cases), with experience in techniques for disk I/O throughput and cloud-platform economy of scale, including combining MapReduce and BigQuery for better performance
• Loading data on an incremental basis into the BigQuery raw and UDM layers using SOQL, Google Dataproc, GCS buckets, Hive, Spark, Scala, Python, gsutil, and shell scripts (see the second sketch after this list)
• Experience writing a program to download database dumps (SQL Server, Oracle, DB2) and load them into a GCS bucket, from the GCS bucket into a database hosted in Google Cloud, and onward into BigQuery using Python/Spark/Scala/Dataproc
• Experience processing and loading bounded and unbounded data from Google Pub/Sub into BigQuery using Cloud Dataflow with a scripting language (see the third sketch after this list)
• Using the BigQuery REST API with Python/Spark/Scala to ingest data from external sites into BigQuery, and building App Engine-based dashboards (see the fourth sketch after this list)
• Participating in the architecture council for database architecture recommendations
• Deep analysis of SQL execution plans, recommending hints, restructuring, or new indexes or materialized views for better performance
• Opening an SSH tunnel to Google Dataproc to access the YARN manager and monitor Spark jobs
• Submitting Spark jobs with gsutil and spark-submit for execution on a Dataproc cluster
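
First sketch: loading an on-arrival file from a GCS bucket into BigQuery with a Cloud Function, as in the Cloud Function bullet above. This is a minimal illustration under stated assumptions, not a definitive implementation: the function name, the target table, and the CSV input format are all hypothetical.

    # Minimal sketch: background Cloud Function triggered when an object is
    # finalized in a GCS bucket; loads that object into BigQuery.
    from google.cloud import bigquery

    # Hypothetical target table; replace with your own project.dataset.table.
    BQ_TABLE = "my-project.raw_layer.inbound_files"

    def gcs_to_bigquery(event, context):
        """Load the newly arrived GCS object described by the trigger event."""
        client = bigquery.Client()
        uri = f"gs://{event['bucket']}/{event['name']}"
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,  # assumes CSV input files
            skip_leading_rows=1,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        # Start the load job and block until it completes (raises on failure).
        client.load_table_from_uri(uri, BQ_TABLE, job_config=job_config).result()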
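
Second sketch: an incremental load into the BigQuery raw layer, written as a PySpark job for Dataproc and assuming the spark-bigquery connector is available on the cluster. The Hive table, watermark value, and bucket names are hypothetical.

    # Minimal sketch: incremental load from a Hive table into the BigQuery
    # raw layer, intended to run on a Dataproc cluster.
    from pyspark.sql import SparkSession, functions as F

    spark = (SparkSession.builder
             .appName("incremental-raw-load")
             .enableHiveSupport()
             .getOrCreate())

    # Hypothetical watermark: in practice, read the last loaded value from a
    # control table instead of hard-coding it.
    watermark = "2022-03-30"

    incremental = (spark.table("source_db.orders")  # hypothetical Hive table
                   .where(F.col("load_date") > watermark))

    (incremental.write.format("bigquery")
        .option("table", "my-project.raw.orders")           # hypothetical target
        .option("temporaryGcsBucket", "my-staging-bucket")  # connector staging
        .mode("append")
        .save())

On Dataproc, a job like this would typically be submitted with gcloud dataproc jobs submit pyspark, per the spark-submit bullet above.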
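
Third sketch: a streaming Apache Beam pipeline for Cloud Dataflow that reads unbounded data from Pub/Sub and writes it to BigQuery. The subscription and table names are hypothetical, and messages are assumed to be JSON.

    # Minimal sketch: streaming Beam pipeline from Pub/Sub to BigQuery
    # (run on Cloud Dataflow with the DataflowRunner).
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        options = PipelineOptions(streaming=True)  # unbounded source => streaming
        with beam.Pipeline(options=options) as p:
            (p
             | "ReadPubSub" >> beam.io.ReadFromPubSub(
                   subscription="projects/my-project/subscriptions/events-sub")
             | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
             | "WriteToBQ" >> beam.io.WriteToBigQuery(
                   "my-project:events.raw_events",  # hypothetical target table
                   write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                   create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER))

    if __name__ == "__main__":
        run()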

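Fourth sketch: ingesting rows through the BigQuery REST API from Python, here via the tabledata.insertAll streaming endpoint with application-default credentials. The dataset, table, and row payload are hypothetical.

    # Minimal sketch: push rows to BigQuery via the REST API's insertAll
    # endpoint, authenticating with application-default credentials.
    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    creds, project = google.auth.default(
        scopes=["https://www.googleapis.com/auth/bigquery"])
    session = AuthorizedSession(creds)

    url = (f"https://bigquery.googleapis.com/bigquery/v2/projects/{project}"
           "/datasets/udm/tables/site_events/insertAll")  # hypothetical table
    payload = {"rows": [{"json": {"event_id": 1, "source": "external-site"}}]}

    resp = session.post(url, json=payload)
    resp.raise_for_status()
    # insertAll can report per-row errors in the body even on HTTP 200.
    print(resp.json().get("insertErrors", []))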


Qualifications:

• 2+ years of experience with Google Cloud Platform technologies
• Experience in private, hybrid, or public cloud technology
• Process GCP and other cloud implementations


Thanks & Regards

Mohd Azhar uddin

Tel: 703-831-8282 Ext. 2526, (M) (315) 543-4232

 
