NOTE: Passport Number and Client ID (BADGE) are required for profiles with under 10 years of experience
Technology stack:
• Strong hands-on experience in PySpark Core, Spark SQL, Python, Hive, and Apache Airflow DAG development
• Ability to understand and write complex SQL queries per client requirements
• Good knowledge of AWS EC2 instances, EMR clusters, S3 storage, Athena, Snowflake, and DynamoDB
• Knowledge of Bitbucket and Jenkins for code deployment
• Knowledge of the JIRA ticketing tool and Agile methodology
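As a rough illustration of the "complex SQL" and Spark SQL skills listed above, here is a minimal sketch of building a windowed deduplication query as a plain Python string; the table and column names are hypothetical, and in a real PySpark job the string would be passed to spark.sql():

```python
def latest_rows_query(table: str, key: str, ts_col: str) -> str:
    """Build a Spark SQL query that keeps only the most recent row
    per key using a ROW_NUMBER() window -- a common dedup pattern.
    Table and column names are illustrative placeholders."""
    return (
        f"SELECT * FROM ("
        f"  SELECT *, ROW_NUMBER() OVER ("
        f"    PARTITION BY {key} ORDER BY {ts_col} DESC"
        f"  ) AS rn FROM {table}"
        f") WHERE rn = 1"
    )

query = latest_rows_query("orders", "order_id", "updated_at")
# In a PySpark job: deduped_df = spark.sql(query)
```

This kind of pattern (window function plus filter) is typical of the per-requirement queries the role calls for.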
Roles and Responsibilities:
• Develop Spark jobs using the Python (PySpark) API
• Write Spark SQL and HQL queries to meet requirements
• Fine-tune Spark jobs for performance
• Create Hive DDLs
• Develop Airflow DAGs to define workflows
• Test and validate the developed Spark scripts and DAGs
• Deploy code to the dev and prod environments
• Monitor the max table DAGs in production
• Monitor the Apollo process DAGs in production
• Work on enhancements to the max tables and the Apollo process
• Generate weekly and monthly post edge Apollo reports
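The "Create Hive DDLs" duty above can be sketched with a small, hypothetical helper that renders a partitioned, Parquet-backed external table statement; the schema, database, and S3 location are made-up examples, not part of the actual role:

```python
def hive_ddl(db: str, table: str, columns: dict,
             partition: str, location: str) -> str:
    """Render a Hive CREATE EXTERNAL TABLE DDL string.
    All names and paths here are illustrative placeholders."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {db}.{table} (\n"
        f"  {cols}\n"
        f")\n"
        f"PARTITIONED BY ({partition} STRING)\n"  # daily partition column
        f"STORED AS PARQUET\n"
        f"LOCATION '{location}'"
    )

ddl = hive_ddl(
    "analytics", "orders",
    {"order_id": "BIGINT", "amount": "DOUBLE"},
    "ds", "s3://example-bucket/orders/",  # hypothetical S3 path
)
```

Generating DDLs programmatically like this keeps table definitions consistent across the dev and prod environments mentioned above.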
Thanks & Regards,
Mohd Azhar uddin
4229 Lafayette Center Dr., Suite #1625, Chantilly, VA 20151
Tel: 703-831-8282 Ext. 2526, Fax: 703-439-2550