- Good understanding of the AWS data stack, including services such as S3, Redshift, EMR, Lambda, Athena, and Glue
- Good understanding of Hadoop/HDFS
- Designing and developing ETL jobs across multiple platforms
- Designing and developing schema definitions and supporting data warehouses/marts to integrate disparate data sources from within Intuit and outside, aggregate the data, and make it available for analysis
- Mastery of data warehousing technologies, including data modelling, ETL, and reporting
- Ideally 3+ years of experience in end-to-end data warehouse implementations, including at least 2 projects with 4TB+ data volume
- Experience developing DB schemas and creating ETLs; familiarity with MPP/Hadoop systems
- Good knowledge of operating systems (Unix or Linux)
- Good understanding of data warehouse methodologies
- Hands-on experience in at least one programming language (shell scripting, Scala, Python, Java, etc.)
- Good written and oral communication and presentation skills
- Knowledge of the big data ecosystem (Hadoop MapReduce, Pig, Hive) is a strong plus
- Good knowledge of Agile principles and experience working in Scrum teams using Jira
- Experience at a services company deployed within larger product-based companies is a plus