Onsite Job opportunity for the role of Data Engineer in Mountain View / Oakland CA (Mandatory 3 days/week onsite)

Rahul Pandey

Feb 20, 2026, 1:38:36 PM (2 days ago) Feb 20
to TI_LINE

Greetings,

This is Rahul from Quantum World Technologies, where I work as a Senior Technical Recruiter. I have a job opportunity with one of our clients. Please share your resume if you are interested in the position described below:

Job Title: Data Engineer

Location: Mountain View / Oakland CA (Mandatory 3 days/week onsite – No flexibility)

Job Type: Contract

Skills Required: GCP, MySQL, Python, Kafka, Terraform

Only local California candidates will be considered.

Banking and financial sector experience is a must.

Job Summary

  • Data Pipeline Ownership: Design, develop, and maintain robust ETL (Extract, Transform, Load) data pipelines to process raw GCP Billing Export data and other large datasets.
  • Cost Attribution Logic: Implement and optimize complex backend logic and data models to accurately attribute shared infrastructure costs (e.g., MySQL, Kafka, BigQuery, and GCS usage) to the appropriate business verticals.
  • Backend Engineering: Own the development lifecycle for core backend services, ensuring high performance, scalability, and stability, with a strict focus on data accuracy.
  • Organizational Mapping: Collaborate with finance and platform teams to integrate organizational structure and mapping into the cost attribution system.
  • System Optimization: Perform deep-dive performance tuning on data processing jobs and database interactions to ensure efficient handling of large datasets.
  • Infrastructure & Automation: Use infrastructure-as-code principles (e.g., Terraform) for managing underlying resources and develop automation scripts (e.g., Python, Bash) for operational tasks.
  • Reliability & Monitoring: Implement monitoring and alerting for all pipelines to ensure data quality and uninterrupted service delivery.
  • Documentation: Maintain up-to-date documentation and runbooks detailing the data models, ETL logic, and cost attribution methodology.

Required Qualifications

  • Software Engineering Expertise: 6+ years of experience in backend software development, focusing on large-scale data processing and high-volume systems.
  • Language Proficiency: Expert-level proficiency in at least one of the following: Python, Go, or TypeScript.
  • Cloud Cost Management: Hands-on experience with cloud financial data, specifically processing and utilizing GCP Billing Export data for cost analysis and attribution.
  • Data Platform Experience: Strong working experience with key data technologies, including MySQL, Kafka, BigQuery, and GCS.
  • Backend & Data Processing: Deep understanding of backend engineering principles, data pipelines, ETL processes, and processing large, complex datasets.
  • Cloud Infrastructure: Hands-on experience with Google Cloud Platform (GCP) services.

Thanks & Regards

Rahul Pandey

rahul....@quantumworldit.com

Senior Technical Recruiter            