Role: Senior/Lead AWS Data Engineer
Location: Wilmington, DE (Onsite)
Duration: 12 months
Ideally local to Wilmington, DE, but open to relocation candidates
Lead-level candidate – ideally 10+ years of experience
· Main Skillset:
o Lambda, EMR, SQS, SNS, S3, EC2, etc.
o Spark
o Python
o CI/CD
o Big Data
Job Description:
Data Engineering Role – Urgent Hire
Santosh discussed the need to fill a critical data engineering position as soon as possible, emphasizing skills in AWS cloud services, Python, and Spark. He prefers candidates with recent, lead-level experience, with Wilmington as the preferred location. Santosh is actively reviewing candidates, including those rolling off other projects, and is willing to bypass a supplier call to expedite the process.
Discover Data Pipeline Governance Discussion
Santosh explained that the Discover data pipeline is a configuration-driven application that protects sensitive data elements and processes incoming files, rather than being a traditional ETL pipeline. Will inquired about data protection requirements for CAP1 governance, to which Santosh confirmed that downstream data protection aligns with existing governance requirements. The team discussed future vendor requirements, with Santosh indicating that while managed services might be considered in the future, they are not currently at a point to make decisions about co-managed or full managed services engagements.
Basic Qualifications:
-Bachelor’s degree or military experience
-At least 1 year of experience with leading big data technologies such as Spark, Hadoop, and HDFS
-At least 2 years of professional experience with data engineering concepts
-At least 2 years’ experience writing high-quality software in languages such as Java, Python, or Scala
Preferred Qualifications:
-2+ years of experience with the AWS cloud
-2+ years of experience in Python, Java, or Scala
-2+ years of experience with Unix/Linux systems, with scripting experience in Shell, Perl, or Python
-2+ years of experience building data pipelines
-2+ years of automated deployment and CI/CD experience with tools like Jenkins
-At least 1 year of Cloud (AWS, Azure, Google) development experience
-Experience with streaming and/or NoSQL implementations (MongoDB, Cassandra, etc.) is a plus