Hi Folks,
My name is David (Dave). I am a Senior Technical Recruiter focused on finding the most qualified candidates for the US IT industry.
We have the following job opening. Please review it and share details of suitable candidates.
Position: Big Data Engineer
Work Location: Alpharetta, GA – local candidates only
Required Work Authorization: USC, GC & H1B Only
Project Type: Long Term Project
Type of Hire: Corp to Corp (C2C)
· 3 days onsite.
· Must be willing to attend an in-person interview.
Job Description:
Department Profile:
Wealth Management's Core Platform Services group provides horizontal services to all Wealth Management development teams. Our mission is to provide stable and scalable infrastructure and technology solutions for the entire Wealth Management organization. We serve as a centralized interface through which Wealth Management teams can obtain infrastructure solutions and project support, working closely with application owners throughout the SDLC to ensure that established products and services are leveraged and new requirements are fulfilled.
Position Description:
· This position is for a Big Data Engineer on the Wealth Management Framework CoE team, based in the Company's Alpharetta or New York offices.
· The CoE team is responsible for defining and governing the data platforms.
· We are looking for colleagues with a strong sense of ownership and the ability to drive solutions.
· The role is primarily responsible for automating existing processes and bringing new ideas and innovation.
· The candidate is expected to code, conduct code reviews, and test the framework as needed, and to participate in application architecture, design, and other phases of the automation effort.
· The ideal candidate will be a self-motivated team player committed to delivering on time and able to work with minimal supervision.
Responsibilities:
· Design and develop a new automation framework for ETL processing.
· Support the existing framework and serve as the technical point of contact for all related teams.
· Enhance the existing ETL automation framework per user requirements.
· Performance-tune Spark and Snowflake ETL jobs.
· Build proofs of concept for new technologies and analyze their suitability for cloud migration.
· Optimize processes through automation and the development of new utilities.
· Collaborate with other teams on issues and new features.
· Support resolution of any batch issues.
· Support application teams with any queries.
Required Skills:
· 7+ years of data engineering experience.
· Must have strong UNIX shell and Python scripting skills.
· Must be strong in Spark.
· Must have strong knowledge of SQL.
· Hands-on knowledge of how HDFS, Hive, Impala, and Spark work.
· Strong logical reasoning capabilities.
· Working knowledge of GitHub, DevOps, CI/CD, and enterprise code management tools.
· Strong collaboration and communication skills.
· Must be a strong team player with excellent written and verbal communication skills.
· Ability to create and maintain a positive environment of shared success.
· Ability to execute and prioritize tasks and resolve issues without aid from a direct manager or project sponsor.
· Good to have: working experience with Snowflake and a data integration tool, e.g., Informatica Cloud.
Primary skills:
· Python, Big Data & Apache Spark.
Skills:
· Snowflake/Azure/AWS (any cloud).
· IDMC/any ETL tool.
Many thanks,
Dave