Role : GCP Data Engineer
Location : Remote
Visa : NO H1 Transfers
Job Roles / Responsibilities:
Requirements
· Minimum 8-10 years of IT experience
· Experience using Python and SQL for data filtering, transformation, and loading
· Experience developing ETL/ELT using tools such as Airflow/Cloud Composer
· Ability to set up and monitor real-time streaming data solutions (Kafka)
· Experience working with relational and MPP databases such as Postgres, Hive, and BigQuery
· Ability to leverage software development lifecycle capabilities including Git version control, unit testing, and CI/CD pipelines.
Projects they would work on
· Build ETLs to create shared data sources (digital thread); write ETL jobs to move data from the data lake into Postgres for more robust applications
· Set up streaming data sources and consumption for manufacturing and other data sources; help current applications scale through optimization
· Create data engineering pipelines and templates for data sources in GCP
Responsibilities
· Create and automate ETL mappings to consume and aggregate data from multiple different data sources
· Monitor performance, troubleshoot and tune ETL processes as appropriate
· Execute end-to-end implementation of the underlying data ingestion workflows
· Solve complex data problems to deliver insights that help the business achieve its goals
Thanks & Regards
Mohd Azhar uddin
Tel: 703-831-8282 Ext. 2526, (M) (315) 543-4232