Hope you are doing well,
Role: Big Data Admin / Hadoop Admin / Apache Airflow Administrator
Location: 100% Remote
Visa: No H-1B transfers
Job Roles / Responsibilities:
• Experience with data pipeline and workflow management tools such as Apache Airflow
• Experience developing guidelines for Airflow clusters and DAGs
• Experience with performance tuning of DAGs and task implementations
• Develop DAGs (data pipelines) to onboard datasets and manage changes to them (a minimal sketch follows this list)
• Experience installing, configuring, and monitoring Apache Airflow clusters
• Understanding of the Airflow REST API and integration of Airflow with the platform ecosystem
• Working SQL knowledge and experience with relational databases, including query authoring and familiarity with a variety of databases.
• Experience building and optimizing big data pipelines, architectures, and datasets.
• Strong analytic skills related to working with unstructured datasets.
• Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
• Proficiency in at least one modern programming language (Java, C#, Python, JavaScript/TypeScript) and experience with open-source technologies
• Apply professional software engineering best practices across the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
• Fast learner and a team player
• Knowledge of AWS/Azure/GCP setup
• Apache Airflow Fundamentals
• ITIL Foundation
• Agile/Scrum principles
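
Below is a minimal sketch of the kind of DAG this role would develop, assuming Apache Airflow 2.4+ and its standard DAG and PythonOperator APIs; the DAG name, schedule, and onboarding callable are illustrative placeholders, not part of the posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def onboard_dataset(**context):
    # Placeholder for dataset onboarding logic (extract, validate, load).
    print(f"Onboarding run for logical date {context['ds']}")


with DAG(
    dag_id="dataset_onboarding",      # hypothetical DAG name
    schedule="@daily",                # run once per day (Airflow 2.4+ 'schedule' argument)
    start_date=datetime(2024, 1, 1),
    catchup=False,                    # do not backfill past runs
    tags=["onboarding"],
) as dag:
    PythonOperator(
        task_id="onboard_dataset",
        python_callable=onboard_dataset,
    )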
Thanks & Regards
Mohd Azhar Uddin
Tel: 703-831-8282 Ext. 2526 / (M) (315) 543-4232
Email: m.a...@canopyone.com