Hi,
Please find the job description below and let me know your interest.
Position: Sr Data Engineer with Python & GCP
Location: Phoenix, AZ (local candidates only; face-to-face interview required)
Duration: 12+ Months
Client: Wells Fargo
Experience: 10+ Years
Visa: H1B
Job Description:
Responsibilities
· Design, develop, and optimize data pipelines using Python, PySpark, and GCP services (BigQuery, Dataflow, Dataproc, Pub/Sub).
· Build ETL/ELT workflows for risk analytics, ensuring scalability, security, and cost efficiency.
· Collaborate with analytics teams on data modeling, query optimization, and platform monitoring.
· Implement IaC (Terraform) for GCP infrastructure and automate CI/CD for data platforms.
Must-Have Qualifications
· 10+ years in data engineering; 5+ years hands-on Python/PySpark and GCP (cert preferred: Professional Data Engineer).
· Expertise in GCP data tools: BigQuery (SQL optimization), Dataflow (Apache Beam), Cloud Storage/Dataproc.
· Strong SQL, ETL experience, and knowledge of data governance/security (IAM, VPC).
Nice-to-Have
· FinOps/cost optimization in GCP; leadership/mentoring.
Thanks & Regards,
Vidyasagar K
Sr Technical Recruiter
Rohatech LLC.
2550 W Union Hills Dr, Suite 350, Phoenix, AZ 85027
Email: vsa...@rohatech.com
Mobile: 602-666-8288