Hi,

Hope you are doing great! I have an urgent requirement with our client. Please let me know your interest.
Role: Azure Data Engineer
Location: Pleasanton, CA, USA
Experience: 6 to 10 years
Good to Have: Experience in Azure Data Migration projects.

Must Have:
- 2-3 years of experience working on Databricks, Azure Data Factory, and other Azure data solution ecosystems. (Mandatory)
- 2-3 years of experience working on Spark SQL, Hive SQL, and U-SQL. (Mandatory)
- 2-3 years of experience working on Spark, Scala, and Python. (Mandatory)
- 2-3 years of experience creating frameworks for building data pipelines. (Mandatory)
- Must have experience configuring data streams between Event Hub and Azure Service Bus with other integration systems such as Databricks. (Mandatory)
- Must have experience working with an Onshore/Offshore model. (Mandatory)
- Azure Fundamentals (AZ-900) and Azure Data Solution (DP-200 & DP-201) certifications. (Preferred)
- 2-3 years of experience working on Java or other object-oriented programming languages. (Preferred)
- Extensive experience working on Big Data technologies such as Hive, Pig, and MapReduce is preferred.
- Experience working with structured and unstructured data is a must.
- A good understanding of data-oriented projects for integration and analytics is a must.
- Provide Business Intelligence and Data Warehousing solutions and support by leveraging project standards and leading analytics platforms.
- Evaluate and define functional requirements for BI and DW solutions.
- Build conceptual and logical models based on the functional flow of the business in a scalable manner.
- Work directly with business leadership and application SMEs to understand requirements and analyze the sources needed to fulfill them.
- Propose and develop data solutions that enable effective decision-making, drive business objectives, and address enterprise integration requirements.
- Analyze data quality, data governance, compliance, and other legal requirements on data storage; address all required operational (non-business) requirements during the design and build of data pipelines.
- Identify avenues for cost savings, either by using in-house Cognizant accelerators or by building reusable frameworks across projects.
- Interpret data by building models, charts, and tables on the reporting platform to meet business intelligence requirements.
- Act as the expert and key point of contact between data analysts, data scientists, and the business/application teams.
Thanks & Regards,
E-Verify | Certified Minority Business Enterprise (MBE) | Dun & Bradstreet Credibility Corp Certified