Remote || Data Architect with Snowflake || Long Term Contract


sandeep aggarwal

Oct 27, 2023, 10:30:16 AM
to sand...@centraprise.com

Hello,

Hope you are doing well.

Please have a look at the job description below and share relevant profiles for this opportunity.

 

Role: Data Architect with Snowflake

Location: Remote

Duration: Long Term Contract

Experience: 12+ years

 

Note: SnowPro certification is desirable.

 

Job Description:

 

Required:

·         Must have 4 years of strong experience in data modelling.

·         Able to understand, analyze, and design enterprise data models, with expertise in the erwin data modelling tool.

·         Experience designing canonical logical and physical data models using the erwin Data Modeler tool.

·         Experience with the Snowflake data warehouse: data modeling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.

·         Expertise in SQL (PL/SQL) programming.

·         Has built processes supporting data transformation, data structures, metadata, dependency, and workload management.

·         Experience analyzing source data, transforming data, and creating mapping documents, design documents, data dictionaries, etc.

·         Good experience in end-to-end data warehousing, ETL, and BI projects.

·         Hands-on experience with any ETL tool (e.g., Attunity, Informatica, or DataStage).

·         Hands-on experience with at least one CDC tool (e.g., Oracle GoldenGate, Qlik).

·         Has led or been involved in end-to-end data migration projects.

·         Proficient in Oracle Database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.

·         Experience with data security and the design of data access controls.

·         Experience with AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, or Google Cloud Storage.

·         Provides resolution to an extensive range of complicated data-pipeline-related problems, proactively and as issues surface.

 

Good to have:

·         Has created flexible and scalable Snowflake models.

·         Experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.

·         Has experience with at least two end-to-end implementations of the Snowflake cloud data warehouse.

·         Understands and can apply advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel.

·         Should have a good understanding of 3NF and star schemas.

·         Preferred: expertise in at least one programming language (Python, Java, or Scala).

·         Able to develop and maintain documentation of the data warehouse's data architecture, data flows, and data models, appropriate for various audiences. Provides direction on the adoption of cloud technologies (Snowflake) and industry best practices in data warehouse architecture and modeling.

·         Able to troubleshoot problems across infrastructure, platform, and application domains.

·         Should be able to provide technical leadership on large enterprise-scale projects.

 

Roles & Responsibilities:

·         Understand, analyze, and design enterprise data models.

·         Provide resolution to an extensive range of complicated data-pipeline-related problems, proactively and as issues surface.

·         Build processes supporting data transformation, data structures, metadata, dependency, and workload management.


--

Warm Regards!


Sandeep Aggarwal

Direct: +1 (848) 668-9626

Email: sand...@centraprise.com

Connect with me: www.linkedin.com/in/sandeep-aggarwal-434413245

 
