Job Title: Cloud ETL Developer
Location: Richmond, VA (local candidates within 40-50 miles preferred)
Work Arrangement: Hybrid (3 days onsite, 2 days remote)
Must-have skills: ETL processes, business intelligence, Azure Databricks, Data Factory v2, Data Lake Store, Data Lake Analytics
Job Description:
· The candidate must have a minimum of 10 years of experience delivering business data analysis artifacts
· 5+ years of experience as an Agile Business Analyst; strong understanding of Scrum concepts and methodology
· Experience organizing and maintaining Product and Sprint backlogs
· Experience translating client and product strategy requirements into dataset requirements and user stories
· Proficient in defining acceptance criteria and managing the acceptance process
· Extensive experience writing complex SQL queries for SQL Server and Oracle (a brief illustrative example follows this list)
· Experience with Azure Databricks
· Experience with ESRI ArcGIS
· Experience with enterprise data management
· Expertise with Microsoft Office products (Word, Excel, Access, Outlook, Visio, PowerPoint, Project Server)
· Experience with reporting systems – operational data stores, data warehouses, data lakes, data marts
· The candidate must have exceptional written and oral communications skills and have the proven ability to work well with a diverse set of peers and customers
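Purely as an illustration of the kind of query work this role involves, here is a minimal Python/pyodbc sketch; the driver, server, database, and table names are hypothetical placeholders, not details from this posting:

    # Illustrative only: connect to SQL Server via pyodbc and run an
    # analytical query with a window function. All names are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-server;DATABASE=example_db;Trusted_Connection=yes;"
    )

    query = """
    SELECT region,
           order_month,
           SUM(order_total) AS monthly_revenue,
           RANK() OVER (PARTITION BY order_month
                        ORDER BY SUM(order_total) DESC) AS region_rank
    FROM dbo.orders
    GROUP BY region, order_month;
    """

    # pyodbc rows support attribute access by column name
    for row in conn.cursor().execute(query):
        print(row.region, row.order_month, row.monthly_revenue, row.region_rank)

    conn.close()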
Technologies Required:
· Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse
· IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse
· Operating System Environments (Windows, Unix, etc.).
· Scripting experience with Windows scripting, Python, and/or Linux shell scripting
Required:
· Design and develop systems for the maintenance of the Data Asset Program, ETL processes, and business intelligence (a minimal illustrative sketch follows this list).
· Design and support the data warehouse database and table schemas for new and existing data sources for the data hub and warehouse.
· Work closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate the data hub and data warehouse.
· Advanced understanding of data integration.
· Strong knowledge of database architectures; strong understanding of ingesting spatial data.
· Ability to negotiate and resolve conflicts.
· Ability to effectively prioritize and handle multiple tasks and projects.
· Excellent computer skills and high proficiency with MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server.
· Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions across a range of technologies.
· Expertise in Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse.
· IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), ORACLE, T-SQL, Azure SQL Database, Azure SQL Datawarehouse.
· Operating System Environments (Windows, Unix, etc.).· Scripting experience with Windows and/or Python, Linux Shell scripting.
· Experience in Azure cloud engineering.
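As a rough illustration of the Databricks/Data Lake ETL responsibilities above, here is a minimal PySpark sketch; the storage account, container path, and table names are hypothetical placeholders, not details from this posting:

    # Illustrative sketch only: a typical Databricks ETL step that reads raw
    # data from Azure Data Lake Storage, applies simple cleanup, and writes
    # a curated Delta table. All paths and names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

    raw = (
        spark.read.format("csv")
        .option("header", "true")
        .load("abfss://raw@exampledatalake.dfs.core.windows.net/sales/")
    )

    curated = (
        raw.withColumn("order_date", F.to_date("order_date"))
           .withColumn("order_total", F.col("order_total").cast("double"))
           .dropDuplicates(["order_id"])
    )

    (
        curated.write.format("delta")
        .mode("overwrite")
        .saveAsTable("curated.sales_orders")
    )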
Thanks,
Shaik Jaffar, IT Recruiter | www.xeroictech.com
A: XeroicTech Inc, 2803 Philadelphia Pike, Suite B, 19703
The content of this email is confidential and intended only for the recipient specified in the message. It is strictly forbidden to share any part of this message with any third party without the written consent of the sender. If you received this message by mistake, please reply to this message and then delete it, so that we can ensure such a mistake does not occur in the future.