Multiple Positions - GCP Tech Lead & GCP Data Engineer - TEXAS CITY, Texas (Remote) - Fulltime

myportal myjob
Jul 18, 2022, 11:02:43 AM

Hello All,

Hope you are doing great.

Below are the job descriptions and the skills required. Let me know if you would be interested in applying for either of these positions.

Position 1: GCP Tech Lead
TEXAS CITY, Texas (Remote)
12 Months
Fulltime

Job Description:
8 to 10 years of experience designing, developing, and deploying high-performing BI/DW data solutions (on-prem & cloud), with a minimum of 2 to 3 years' experience in GCP
Experience in providing end to end data solution from ingestion through visualization for large and complex programs
Experience in model building, tuning, and implementation of Google BigQuery solutions in GCP
Design schemas for efficient storage and query execution 
Have a good understanding of Google best practices/recommendations and be able to align them with customer requirements to deliver a best-in-class solution for the customer's analytics requirements.
A strong background and exposure to databases, including both relational (e.g. PostgreSQL, MySQL) and NoSQL (e.g. Redis, Cassandra, MongoDB) database systems
Familiarity with real time streaming and processing of various data sources, including logs, time series telemetry data, unstructured social data and relational data
2+ years of experience writing complex queries and stored procedures, and exposure to version control tools such as SVN or Git



Position 2: GCP Data Engineer
TEXAS CITY, Texas (Remote)
12 Months
Fulltime

Job Description:

GCP Data Engineer with 6 to 8+ years of experience in Data Analytics and Big Data.
Responsible for extract, transform, and load (ETL) processes and for creating applications that can connect to remote APIs, preferably including streaming data into environments such as BigQuery on Google Cloud Platform.
Preferred: experience implementing data pipelines leveraging Google Cloud products such as BigQuery, GCS, Cloud Dataflow, Cloud Pub/Sub, and Cloud Bigtable.
Have a good understanding of Google best practices/recommendations and be able to align them with customer requirements to deliver a best-in-class solution for the customer's analytics requirements.
A strong background and exposure to databases, including both relational (e.g. PostgreSQL, MySQL) and NoSQL (e.g. Redis, Cassandra, MongoDB) database systems
Familiarity with real time streaming and processing of various data sources, including logs, time series telemetry data, unstructured social data and relational data
Preferred: participation in the implementation/migration of a Data Warehouse and Big Data (Hadoop) project from on-prem to GCP (using Google BigQuery, Dataflow, Dataproc, etc.)
Work closely with the Operations team to tune existing and new architectures.

Thanks & regards,

Syed Abbas
