GCP Data Engineer with LUMI
Location: Phoenix (local candidates preferred; candidates open to relocation will also be considered)
Experience: 12+ years
Mode: Hybrid - 3 days onsite per week
Job Description:
We are looking for two highly skilled Senior Data Engineers with solid experience building Big Data and GCP cloud-based ETL pipelines and Spark applications, strong problem-solving skills, articulate communication, and a collaborative mindset.
AXP LUMI experience is highly preferred due to the time-sensitive, critical nature of the work.
Hands-on software development experience with Big Data and analytics solutions such as Spark, Python, shell scripting, and GCP services (BigQuery, Airflow, Dataproc, Pub/Sub), as well as Kafka, Git, Jenkins, etc.
Strong experience designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies.
Proficiency in SQL and database systems, with experience designing and optimizing data models for performance and scalability. Knowledge of distributed (multi-tiered) systems, algorithms, and relational databases.
Strong object-oriented programming skills and knowledge of design patterns.
Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git).