IMMEDIATE NEED: DATA ENGINEER


Deepak Gulia

Sep 16, 2020, 11:00:40 AM
to dee...@godigitive.com

Please send profiles to dee...@godigitive.com

 

Position: Data Engineer (3 months to start with)

Location: South San Francisco, CA

 

JD

 

  • Data Engineer role with primary knowledge of API integration tools: how they work, data manipulation, etc.
  • Ability to learn Workato (an API integration tool).
  • Collect and document requirements, and create workflows.
  • Python and AWS knowledge is optional.
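The API integration work described above can be sketched in Python: pull records from a source system's REST API, map them into a target system's schema, and run that mapping as a simple workflow. This is a generic illustration of the pattern tools like Workato automate, not Workato itself; the URL, token, and field names are hypothetical.

```python
import json
import urllib.request


def fetch_records(url, token):
    """Extract: fetch JSON records from a source system's REST API.
    The endpoint and bearer-token auth are assumptions for illustration."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def map_contact(record):
    """Transform: map a source contact record to the target system's schema.
    Field names ('first_name', 'full_name', ...) are hypothetical."""
    return {
        "full_name": f"{record['first_name']} {record['last_name']}",
        "email": record["email"].lower(),
    }


def run_workflow(records):
    """Workflow step: transform each record, skipping malformed ones."""
    out = []
    for record in records:
        try:
            out.append(map_contact(record))
        except KeyError:
            continue  # in practice: log and route the bad record for review
    return out
```

A real integration tool adds scheduling, retries, and connector auth on top of this extract/transform/route shape.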


Deepak Gulia | Digitive LLC – cloud. made simple
Fax: 408-935-8696 | Email: dee...@godigitive.com
GTalk: deepakguli...@gmail.com

Deepak Gulia

Sep 16, 2020, 1:36:02 PM
to dee...@godigitive.com

Deepak Gulia

Sep 25, 2020, 12:41:06 PM
to dee...@godigitive.com

Role: Data Engineer

Location: Remote (West Coast preferred, as the candidate will be working in PST)

Description:

  • Design, develop, test, deploy, support, and enhance data integration solutions that seamlessly connect and integrate enterprise systems within our Enterprise Data Platform.
  • Innovate on data integration in an Apache Spark-based platform to ensure technology solutions leverage cutting-edge integration capabilities.
  • Experience with ETL and data pipeline creation to load data from multiple data sources.
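The ETL pattern described above can be sketched with standard-library Python: extract rows from two hypothetical sources (a CSV export and a JSON dump), transform them into a normalized schema, and load them into a sink. In the actual role this would be PySpark running on EMR or Glue and writing to S3/Redshift; the field names here are illustrative only.

```python
import csv
import io
import json


def extract(csv_text, json_text):
    """Extract: read rows from two sources — a CSV export and a JSON API dump."""
    csv_rows = list(csv.DictReader(io.StringIO(csv_text)))
    json_rows = json.loads(json_text)
    return csv_rows + json_rows


def transform(rows):
    """Transform: normalize types and drop rows missing the primary key.
    The 'id'/'amount' schema is a hypothetical example."""
    out = []
    for row in rows:
        if not row.get("id"):
            continue  # reject rows without a primary key
        out.append({"id": int(row["id"]), "amount": float(row.get("amount", 0))})
    return out


def load(rows, sink):
    """Load: append records to a sink (a stand-in for a Redshift/S3 write)."""
    sink.extend(rows)
    return len(rows)
```

The same extract/transform/load shape carries over to Spark, where `extract` becomes DataFrame reads and `load` becomes a partitioned write.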

Primary Skills:

  • 4+ years of working experience in data integration and pipeline development.
  • BS degree in CS, CE, or EE.
  • 2+ years of experience with AWS Cloud data integration using Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda in the S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
  • Strong hands-on experience in Python development, especially PySpark in an AWS Cloud environment.
  • Design, develop, test, deploy, maintain, and improve data integration pipelines.
  • Experience with Python and common Python libraries.
  • Strong analytical database experience: writing complex queries, query optimization, debugging, user-defined functions, views, indexes, etc.
  • Strong experience with source control systems such as Git and Bitbucket, and with Jenkins build and continuous-integration tools.
  • Databricks and Redshift experience is a plus.

 

Note: Please submit candidates with 4-5 years of work experience only, as the budget here is limited.
