● Implement a data architecture and infrastructure that aligns with business objectives. Collaborate closely with Application Engineers and Product Managers to ensure that the technical infrastructure robustly supports client requirements
● Create and test ETL and data pipeline solutions that load data into the warehouse efficiently, ensuring reliability and optimal performance
● Collect, validate, and provide high-quality data, ensuring data integrity
● Champion data democratization efforts, facilitating accessibility to data for relevant stakeholders
● Guide the team on technical best practices and contribute substantially to the architecture of our systems
● Support operational work, such as onboarding new customers to our data products, and participate in the team's on-call rotation
● Engage with cross-functional teams, including Solutions Delivery, Business Intelligence, Predictive Analytics, and Enterprise Services, to address and support any data-related technical issues or requirements
● 6-8 years of experience working with data warehouses, data lakes, and ETL pipelines
● Proven experience building optimized data pipelines using Snowflake and dbt
● Expertise in orchestrating data pipelines using Apache Airflow, including authoring, scheduling, and monitoring workflows
● Exposure to AWS, with proficiency in cloud services such as EKS (Kubernetes), ECS, S3, RDS, and IAM
● Experience designing and implementing CI/CD workflows using tools such as GitHub Actions, Codeship, or Jenkins
● Experience with tools like Terraform, Docker, and Kafka
● Strong experience with Spark using Scala and Python
● Advanced SQL knowledge, including authoring and tuning complex queries, with strong familiarity with Snowflake and relational databases such as Redshift and Postgres
● Experience with data modeling and system design, architecting scalable data platforms and applications for large enterprise clients
● A dedicated focus on building high-performance systems
● Exposure to building data quality frameworks
● Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data engineering issues and system failures
● Excellent communication skills, with the ability to communicate technical information to non-technical stakeholders and collaborate effectively with cross-functional teams
● The ability to envision and construct scalable solutions that meet diverse needs for enterprise clients with dedicated data teams
● Previous engagement with healthcare and/or social determinants of health data products
● Experience leveraging agentic-assisted coding tools (e.g., Cursor, Codex AI, Amazon Q, GitHub Copilot)
● Experience working with R
● Experience processing healthcare eligibility and claims data
● Exposure to Matillion ETL
● Experience using and building solutions to support various reporting and data user tools (Tableau, Looker, etc.)
--
Thanks & Regards
Hangouts: sek...@tekwings.com / usekh...@gmail.com
Tekwings Requirements Email group: https://groups.google.com/d/forum/tekwings_requrements_group1
LinkedIn Group: https://www.linkedin.com/groups/10421204/