Urgent need: GCP Data Engineer Lead | Remote | 12 months+
syedg...@gmail.com
Sep 7, 2022, 12:20:59 PM9/7/22
to a...@signox.net
Hi,

How are you doing?

We have a job requirement. Please have a look at it, and if you have any consultants matching it, please send resumes to a...@signox.net.
Job Title: GCP Data Engineer Lead
Location: USA (Remote)
Experience: 8-10 yrs

Preferred Experience:
• 3+ years of hands-on experience architecting and building cloud data solutions in BigQuery, Redshift, or Snowflake.
• 5+ years of developing data solutions with Python.
• Proficient working with large, structured data sets (1+ billion rows / 5+ terabytes).
• Demonstrated use and knowledge of Kafka.
• Familiarity with Google Cloud Platform.
• Strong understanding of Data Governance and Data Change Control.
Required Experience:
• A deep understanding of data architecture principles and data warehouse methodologies, specifically Kimball or Data Vault.
• Adept at ETL/ELT development and optimization.
• Skilled in a cloud database technology.
• 5+ years of data architecture and data modelling experience.
• Experience with real-time streaming.
• Experience or familiarity with writing DevOps scripts for data engineering pipelines (either in GCP or AWS, but GCP preferred).
• BigQuery experience (API and SQL).
• Hadoop ecosystem experience (PySpark, Hive, etc.).
• Work experience in startups as a data engineer.
• Some application/API experience.
• Highly skilled in SQL.
• Any object-oriented, high-level programming language.
• Bachelor's Degree in computer science or a related field.
• Keywords: GCP, BigQuery, GKE, Data Engineer
Note: Phani has limited PySpark and real-time streaming experience; no DevOps.