Onsite job opportunity for the role of GCP Data Engineer in Paramus, NJ


Rahul Pandey

May 14, 2026, 3:02:43 PM
to Recruiting Simplifies

Greetings,

This is Rahul from Quantum World Technologies, where I work as a Senior Technical Recruiter. I have an onsite job opportunity with one of our clients. If you are interested in the job details given below, please share your resume:

Job Title: GCP Data Engineer

Location: Paramus, NJ

 

Job Description:

Designs, builds, and optimizes secure, scalable, and high-performance data pipelines and analytics solutions using Google Cloud Platform tools such as BigQuery, Dataflow, Dataproc, Composer, GCS, Cloud Functions, Cloud Run, and Pub/Sub. This role requires 5+ years of experience, expertise in Python/SQL, and experience implementing data governance and CI/CD pipelines.

Key Responsibilities

Pipeline Development: Design, build, and optimize end-to-end data pipelines using GCP-native services (Dataflow, Dataproc, Cloud Storage) and Python.

Data Modeling & Architecture: Create high-quality, reproducible data models in BigQuery using partitioning, clustering, and materialized views to enhance performance and manage costs.

Streaming & Real-time: Implement real-time streaming pipelines using Pub/Sub and Apache Beam/Spark Streaming.

Infrastructure & DevOps: Establish CI/CD pipelines for data workflows.

Security & Governance: Implement best practices for data security, including IAM roles, encryption (CMEK), and VPC Service Controls.

Collaboration: Work with stakeholders to define requirements, mentor junior engineers, and produce technical documentation. 
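To illustrate the data-modeling responsibility above, here is a minimal sketch of DDL for a partitioned, clustered BigQuery table. The dataset, table, and column names (`analytics.orders`, `order_date`, `customer_id`) are hypothetical examples, not from the posting; in practice this statement would be submitted via the BigQuery console or a client library.

```python
# Minimal sketch: build DDL for a day-partitioned, clustered BigQuery table.
# All identifiers below are hypothetical illustrations.
def orders_table_ddl(dataset: str = "analytics", table: str = "orders") -> str:
    """Return DDL that partitions by date and clusters by customer."""
    return (
        f"CREATE TABLE IF NOT EXISTS `{dataset}.{table}` (\n"
        "  order_id STRING,\n"
        "  customer_id STRING,\n"
        "  order_date DATE,\n"
        "  amount NUMERIC\n"
        ")\n"
        "PARTITION BY order_date\n"   # prune scans to the relevant days
        "CLUSTER BY customer_id\n"    # co-locate rows for common filters
        "OPTIONS (partition_expiration_days = 365)"
    )

print(orders_table_ddl())
```

Partitioning by the date column limits each query's scan to the partitions it touches (controlling cost), while clustering physically sorts rows by the filter column to speed up lookups.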

Required Technical Skills

Languages: Strong SQL and Python proficiency.

Platforms: Deep expertise in Google Cloud Platform (GCP).

Tools: BigQuery, Dataflow, Cloud Composer (Airflow), Pub/Sub, Cloud Storage, Dataproc, Cloud SQL, Cloud Run, Cloud Functions, Cloud Logging and Monitoring.

Data Modeling: Database design, ETL/ELT workflows.
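As a rough sketch of the ETL/ELT workflow skill listed above, the following stdlib-only Python example extracts CSV rows, transforms their types, and "loads" them into an in-memory store. In a real GCP pipeline the load step would write to BigQuery or GCS; the field names here are hypothetical.

```python
import csv
import io

# Hypothetical raw input standing in for a file landed in Cloud Storage.
RAW = """order_id,amount
a1,10.50
a2,4.25
a3,7.00
"""

def extract(raw: str) -> list[dict]:
    """Parse CSV text into dict rows (the E step)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast amounts from strings to floats (the T step)."""
    return [{"order_id": r["order_id"], "amount": float(r["amount"])} for r in rows]

def load(rows: list[dict], sink: list) -> int:
    """Append rows to the sink, a stand-in for a warehouse write (the L step)."""
    sink.extend(rows)
    return len(rows)

warehouse: list = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded, sum(r["amount"] for r in warehouse))  # → 3 21.75
```

In an ELT variant the raw rows would be loaded first and the type casting pushed down into SQL inside the warehouse.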

Qualifications

Experience: 5+ years in GCP data engineering

Must have delivered at least two to three end-to-end projects as a data engineer using GCP services.

Strong SQL and Python proficiency.

Strong understanding of database design, data modeling (relational, dimensional, NoSQL).

Expertise in data integration, ETL/ELT, and data pipeline development.

Knowledge of cloud security best practices, identity management, and networking.

Familiarity with DevOps, CI/CD, and containerization (Docker, Kubernetes).

Excellent communication and problem-solving skills. 

Education: Bachelor’s degree in Computer Science, Engineering, or a related field.

Certifications: Google Cloud Professional Data Engineer certification is highly preferred.


 

Thanks & Regards

Rahul Pandey

rahul....@quantumworldit.com
