I have an urgent requirement with my client. Please send me your updated resume along with your hourly rate / yearly salary expectations if interested. If you are not interested, please feel free to pass this position along to any friends who may be a potential fit.
Hadoop Developer with GCP
Atlanta, GA
Contract
Required Skills:
· Strong Spark experience with Java/Python/Scala
· Hands-on experience with Hadoop concepts
· Ability to design big data pipelines using GCP big data products
· Familiar with standard pipeline architectures for streaming and batch data using various combinations of tools such as Dataflow, BigQuery, PubSub, etc.
· Able to analyze the pros and cons of various architectures from different perspectives, such as cost, compute, and robustness
· Comfortable with both hands-on development and discussing design aspects with customer leadership, such as enterprise architects and program managers
· Strong experience in Python, PySpark required
· Working knowledge of Hadoop ecosystem including Hive, HBase, Kafka
· Should have a basic understanding of Machine Learning concepts such as feature engineering, and should have worked in a Data Science / Data Engineering environment
· Comfortable with Docker and Kubernetes, including companion orchestration tools such as Argo or Kubeflow
· Led teams in the development of scalable, production-quality ETL workflows for data ingestion and transformation
· Strong communication skills, and the ability to see the impact of development on the overall product
· Proactively identify and manage opportunities for optimizing processes
· Able to coach, guide, and mentor junior members of the team
· Responsible for the development, support, maintenance, and implementation of big data projects
· GCP Data Engineer certification preferred
Thanks and Regards,
Dev Chauhan
Direct No. 609-551-3117
Unsubscribe Link: https://forms.gle/QD3FQLLvVEqP7s2y8