Hiring - GCP Data Engineer - San Jose, CA


bharat kancharla

Dec 10, 2025, 2:20:52 PM
to kj...@samrusystems.com

Hello,

 

Hope you are doing well.

 

Please find the requirement below and reply with suitable resumes.

 

Role: GCP Data Engineer

Location: San Jose, CA (Day 1 Onsite)

Exp required: 10+ years

Visa status: Any visa except OPT; please avoid modified GC cards.

For GC holders, E-Verify is mandatory.

Max rate: $55/hr on C2C

 

Job Description:

We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives.

You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

 

Core Responsibilities

Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.

Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.

Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.

Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.

 

Required Skills & Experience

Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:

BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.

Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.

Programming & Querying:

Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.

SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.

Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar).

DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.

 

Need the below details ASAP:
 

Candidate Details

Full Name

 

Contact Number

 

Personal Email ID

 

Skype ID

 

Technology

 

Total Exp

 

US Experience

 

DOB (MM/DD/YYYY)

 

LinkedIn ID

 

Work authorization & Validity

 

US Entry

 

Visa on arrival to USA

 

Currently working (Y/N)

If yes, reason for change?

 

Passport Number

 

Last four digits of SSN

 

Current Location

 

Relocation

 

Any Interviews in pipeline

 

Availability to Join Project

 

Master’s (Stream, University & Year of Completion)

 

Bachelor’s (Stream, University & Year of Completion)

 

Reference 1 (Current Client)

 

Reference 2 (Previous Client)

 

Comments

NA
