Need local to Chicago - Data Engineer / ETL Developer (Strong SQL + Snowflake + Python)


Kiran Kumar

Mar 4, 2026, 2:59:35 PM
to Kiran Kumar

Data Engineer / ETL Developer (Strong SQL + Snowflake + Python)

Local to Chicago (primary and strongly preferred).

 

• Reports to the Wabash location.

• Hybrid schedule: 3 days/week on site (current anchor days: Tuesday, Wednesday, Thursday).

• Standard working hours: 8:00 AM – 5:00 PM CT.

 

• 8+ years in data engineering / warehousing

• Strong SQL + Snowflake

• Solid Python (for framework work + data transformation)

• Experience with large-scale migrations or platform modernization

• Background in financial services, risk, or regulatory reporting (nice to have, not mandatory)

 

Project & Technical Scope

• Role supports a new enterprise data platform build-out to replace/sunset legacy IBM DataStage + Netezza/RAQ systems.

• Tech stack: Snowflake, SQL, Python (strongly desired), Airflow (orchestration), and custom internal Python-based frameworks.

• Focus areas:

o Data engineering + ETL development

o Participation in design/architecture (the majority of the role is still hands-on development)

o Data modeling (nice to have)

 

Domain Focus

• Supports financial risk and regulatory reporting domains, including:

o Credit Risk, Market Risk, Model Risk

o CCAR, FR Y-14, CECL, and other regulatory submissions

• Financial industry experience is highly preferred (it helps candidates acclimate much faster).

 

Job Description

Project Overview: Lead the design and implementation of the enterprise data ecosystem, driving the modernization and consolidation of data platforms to support business intelligence, advanced analytics, and regulatory reporting. This includes a critical focus on the design, development, and maintenance of data warehousing solutions that support credit risk modeling.

Contractor's Role:
· Provide strategic vision for data architecture, platforms, and governance, translating business objectives into effective, scalable, and secure technical solutions.
· Lead the end-to-end design and implementation of a scalable enterprise data ecosystem, migrating legacy systems into a modern, consolidated cloud environment.
· Architect and maintain robust data warehouse structures in Snowflake to support complex credit risk models, ensuring data lineage, accuracy, and auditability for regulatory compliance.
· Build and optimize high-performance ETL/ELT pipelines using Python, ensuring seamless data flow from disparate sources into the central warehouse.
· Design and manage complex workflows using data orchestration tools (e.g., Airflow) to ensure high availability and reliability of data feeds.
· Drive the transition from fragmented data silos to a unified platform, implementing best practices in data modeling.
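As a purely illustrative sketch of the extract–transform–load pattern this role centers on (all table and column names here are hypothetical, and sqlite3 stands in for Snowflake):

```python
import sqlite3
import csv
import io

# Hypothetical source feed; in the actual role this would be one of many
# disparate upstream systems feeding the central warehouse.
SOURCE_CSV = """account_id,exposure,rating
A001,1500.50,BBB
A002,250.00,AA
A003,9800.75,B
"""

def extract(raw: str):
    """Extract: parse the raw source feed into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and flag high-exposure accounts."""
    out = []
    for r in rows:
        exposure = float(r["exposure"])
        out.append({
            "account_id": r["account_id"],
            "exposure": exposure,
            "rating": r["rating"],
            "high_exposure": exposure > 1000,  # illustrative threshold
        })
    return out

def load(rows, conn):
    """Load: write transformed rows into the target warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS credit_exposure ("
        "account_id TEXT PRIMARY KEY, exposure REAL, "
        "rating TEXT, high_exposure INTEGER)"
    )
    conn.executemany(
        "INSERT INTO credit_exposure "
        "VALUES (:account_id, :exposure, :rating, :high_exposure)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
flagged = conn.execute(
    "SELECT account_id FROM credit_exposure "
    "WHERE high_exposure = 1 ORDER BY account_id"
).fetchall()
```

In practice each stage would be a separate Airflow task so failures can be retried independently; this single-process version only shows the shape of the pipeline.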

Experience Level: 3 (Senior)
· 8+ years in Data Engineering, with a proven track record of leading large-scale data modernization or consolidation projects.
· Deep experience in Snowflake architecture, including performance tuning, data sharing, and cost management.
· Expert-level knowledge of SQL and dimensional data modeling techniques.
· Mastery of Python for data manipulation, automation, and API integrations.
· Extensive experience with enterprise-grade orchestration tools (e.g., Apache Airflow).

Qualifications
· Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent professional experience.
· In-depth knowledge of data architecture principles and data modeling techniques (including dimensional modeling for regulatory reporting).
· Strong communication and presentation skills, with the ability to convey complex technical concepts and strategy, including regulatory requirements related to data, to both technical and executive-level stakeholders.
· Expertise in data warehousing and ETL/ELT patterns, particularly within a financial risk context.

Nice to Have
· DataStage ETL experience.
· Experience with CI/CD pipeline deployment processes.

Daily Tasks and Responsibilities
· Lead technical workshops to capture business and technical requirements, particularly those related to data warehousing solutions that support credit risk modeling.
· Code, test, and deploy processes to extract, transform, and load data from various sources.
· Validate data to ensure accuracy and consistency, which is crucial for reliable analysis.
· Perform technical evaluations of new data technologies and features, building proof-of-concepts to validate architectural approaches.
· Serve as the key technical liaison across the engineering teams.

 

Thanks & Regards,

Kiran Kumar
Email: Ki...@sapphiresoftwaresolutions.com

Sapphire Software Solutions Inc | Certified Minority Business Enterprise (MBE)

 
