Hi, please let me know if you have a suitable profile for this position.
Position: Data Engineer/GCP
Location: Remote
The Role:
Emids is looking for an experienced Data Engineer to help move on-prem Hadoop workloads to Google Cloud Platform (GCP).
Responsibilities:
• Export data from the Hadoop ecosystem to ORC or Parquet files
• Build scripts to move data from on-prem systems to GCP
• Build Python/PySpark pipelines (a minimal sketch follows this list)
• Transform the data per the outlined data model
• Proactively improve pipeline performance and efficiency
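For context, here is a minimal PySpark sketch of the export-and-move step described above. It assumes the cluster has Hive support and the GCS connector available; the table name, column, and bucket path are hypothetical placeholders, not part of the actual project:

    # A minimal sketch, assuming Hive support and a GCS connector on the
    # cluster; table and bucket names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("hive-to-gcs-export")
        .enableHiveSupport()  # read on-prem Hive tables
        .getOrCreate()
    )

    # Read a Hive table, apply a light transformation, and land it
    # as partitioned Parquet in a GCS bucket (placeholder path).
    df = spark.table("warehouse.orders")
    df = df.withColumn("load_date", F.current_date())
    (
        df.write
        .mode("overwrite")
        .partitionBy("load_date")
        .parquet("gs://example-bucket/exports/orders/")
    )

Writing partitioned Parquet (or ORC, via .orc(...)) keeps the exported files columnar and splittable, which matters for the downstream performance work this role calls for.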
‘Must Have’ Experience:
• 4+ years of data engineering work experience
• 2+ years building Python/PySpark pipelines
• 2+ years working with Hadoop/Hive
• 4+ years of experience with SQL
• Cloud experience – GCP
• Experience with data warehousing and data lakes
• Understanding of data modeling
• Understanding of data file formats such as ORC, Parquet, and Avro
‘Nice to Have’ Experience:
• Google Cloud experience – Cloud Storage, Cloud Composer, Dataproc & BigQuery
• Experience using cloud data warehouses such as BigQuery (preferred), Amazon Redshift, Snowflake, etc.
• Working knowledge of distributed file systems such as GCS, S3, HDFS, etc.
• Understanding of Airflow / Cloud Composer (see the DAG sketch after this list)
• CI/CD and DevOps experience
• ETL tools, e.g., Informatica (IICS), Ab Initio, Infoworks, SSIS
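For candidates less familiar with Cloud Composer, below is a minimal sketch of an Airflow 2.x DAG orchestrating the kind of staging-and-load flow this role describes. The DAG id, schedule, paths, and table names are hypothetical, not taken from the project:

    # A minimal Airflow DAG sketch as it might run on Cloud Composer;
    # all ids, paths, and table names below are hypothetical.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="hadoop_to_gcp_migration",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Stage exported Parquet files into GCS (placeholder paths).
        stage = BashOperator(
            task_id="stage_to_gcs",
            bash_command="gsutil -m rsync -r /data/exports gs://example-bucket/exports",
        )
        # Load the staged files into BigQuery (placeholder dataset.table).
        load = BashOperator(
            task_id="load_to_bigquery",
            bash_command=(
                "bq load --source_format=PARQUET "
                "analytics.orders gs://example-bucket/exports/orders/*.parquet"
            ),
        )
        stage >> load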
Thanks and regards,
Utkarsh Dwivedi| 1Point System LLC
Direct: __________ • utk...@1pointsys.com
115 Stone Village Drive • Suite C • Fort Mill, SC • 29708
An E-Verified company | An Equal Opportunity Employer
DISCLAIMER: If you have received this email in error or prefer not to receive such emails in the future, please reply with "REMOVE" in the subject line, and your email address will be removed from the mailing list immediately.