Rate: $60/hr (C2C)
No OPT / GC. Passport number, visa, and driver's license required for submission.
Job Title: Lead AWS Glue Data Engineer
Location: Fort Mill, SC or San Diego, CA (Hybrid, 3-4 days onsite per week)
12+ Month Contract
Job Summary
We are seeking a highly skilled Lead AWS Glue Data Engineer to design, develop, and optimize large-scale data pipelines and ETL workflows on AWS. The ideal candidate will have strong expertise in AWS cloud-native data services, data modeling, and pipeline orchestration, along with hands-on experience building robust, scalable data solutions for enterprise environments.
Key Responsibilities
· Design, develop, and maintain ETL pipelines using AWS Glue, Glue Studio, and the Glue Data Catalog.
· Ingest, transform, and load large datasets from structured and unstructured sources into AWS data lakes/warehouses.
· Work with S3, Redshift, Athena, Lambda, and Step Functions for data storage, querying, and orchestration.
· Build and optimize PySpark/Scala scripts within AWS Glue for complex transformations.
· Implement data quality checks, lineage, and monitoring across pipelines.
· Collaborate with business analysts, data scientists, and product teams to deliver reliable data solutions.
· Ensure compliance with data security, governance, and regulatory requirements, ideally within BFSI (banking, financial services, and insurance).
· Troubleshoot production issues and optimize pipeline performance.
Required Qualifications
· 15+ years of experience in Data Engineering, including 8+ years with AWS cloud data services.
· Strong expertise in AWS Glue, S3, Redshift, Athena, Lambda, Step Functions, and CloudWatch.
· Proficiency in PySpark, Python, and SQL for ETL and data transformations.
· Experience in data modeling (star, snowflake, dimensional models) and performance tuning.
· Hands-on experience with data lake/data warehouse architecture and implementation.
· Strong problem-solving skills and ability to work in Agile/Scrum environments.
Preferred Qualifications
· Experience in BFSI / Wealth Management domain.
· AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
· Familiarity with CI/CD pipelines for data engineering (CodePipeline, Jenkins, GitHub Actions).
· Knowledge of BI/visualization tools such as Tableau, Power BI, and QuickSight.
Education
· Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
· Master’s degree preferred.