Immediate start
• Strong proficiency in Python and SQL, and hands-on experience with AWS services.
• Experience designing, building, and maintaining ETL/ELT data pipelines for large-scale data processing.
• Solid understanding of data modeling and data warehousing concepts, and of optimizing pipelines for performance and scalability.
• Experience with version control (Git, e.g. hosted on Bitbucket) and CI/CD pipelines for automated deployments.
• Ability to implement data quality checks, monitoring, and troubleshooting to ensure reliability and accuracy of data pipelines.
• Familiarity with Agile project management practices and tools (Jira, Confluence).
• Experience collaborating with cross-functional teams to translate business requirements into scalable data solutions.
• Direct experience with the Treasure Data CDP platform or the MedTech domain.