Job Opening || Data Architect with AWS Glue || Charlotte, NC


pradeep bhondwe

May 7, 2026, 11:15:29 AM
to pradeep...@ktekresourcing.com
Job Title: Data Architect with AWS Glue
Location: Charlotte, NC
Duration: Long Term


Need only USC, Green Card, H4 EAD, L2, and J2 EAD candidates.

Job Description:

We are seeking an experienced and highly skilled Data Solutions Architect with strong expertise in AWS Glue, ETL Pipeline Development, PySpark, and Python. The ideal candidate will lead the design, development, optimization, and maintenance of scalable data processing solutions within the AWS ecosystem.

The candidate should possess deep technical expertise in building enterprise-grade ETL pipelines, managing cloud-based data platforms, and implementing efficient big data processing solutions using PySpark and AWS Glue.


Key Responsibilities

AWS Glue & ETL Development

  • Design, develop, and maintain scalable ETL workflows using AWS Glue.
  • Create and optimize Glue Jobs, Crawlers, Triggers, and Data Catalog configurations.
  • Implement automated job scheduling, monitoring, and failure handling mechanisms.
  • Integrate data from multiple structured and unstructured sources.

ETL Pipeline Design

  • Design robust and scalable ETL pipelines for batch and incremental data processing.
  • Ensure data quality, validation, integrity, and reconciliation across systems.
  • Build reusable ETL frameworks and orchestration patterns.
  • Optimize pipeline performance and reliability for large-scale data processing.

Python Development

  • Develop clean, modular, scalable, and maintainable Python code.
  • Implement object-oriented programming concepts and reusable libraries.
  • Handle logging, exception management, and performance optimization.
  • Work with APIs, file handling, and automation scripts.
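
As a rough illustration of the Python hygiene listed above (logging, exception management, reusable helpers), here is a minimal sketch; the names `retry` and `load_records` are hypothetical, not part of the client's codebase:

```python
import json
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def retry(attempts=3, delay=0.1):
    """Reusable decorator: retry a flaky I/O call, logging each failure."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for i in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except OSError as exc:
                    log.warning("%s failed (attempt %d/%d): %s",
                                fn.__name__, i, attempts, exc)
                    if i == attempts:
                        raise  # surface the error after the last attempt
                    time.sleep(delay)
        return wrapper
    return decorator

@retry(attempts=3)
def load_records(path):
    """Read a JSON-lines file into a list of dicts."""
    with open(path, encoding="utf-8") as fh:
        return [json.loads(line) for line in fh if line.strip()]
```

The decorator pattern keeps the retry/logging policy in one place so every file- or API-reading function in a pipeline can reuse it.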

PySpark Development

  • Develop and optimize PySpark applications for large-scale distributed data processing.
  • Work extensively with Spark DataFrames, Spark SQL, and RDD transformations.
  • Tune Spark jobs for memory management, partitioning, and execution efficiency.
  • Implement data transformation and aggregation logic using PySpark.

Database & SQL Management

  • Write complex SQL queries for data extraction, transformation, and analysis.
  • Work with PostgreSQL databases including DML operations and query optimization.
  • Understand Snowflake architecture, database design, and SQL optimization techniques.
  • Ensure database performance tuning and efficient schema management.
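
To illustrate the kind of "complex SQL" meant above, here is a sketch of a common reconciliation pattern: keeping only the latest row per key with a window function. An in-memory SQLite database stands in for PostgreSQL purely so the snippet is self-contained; the `orders` table and its values are invented for the example, but the `ROW_NUMBER() OVER (PARTITION BY ...)` idiom is the same one used in PostgreSQL and Snowflake:

```python
import sqlite3

# In-memory SQLite as a stand-in for PostgreSQL; sample data is invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT,
                         updated_at TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme',   '2026-05-01', 100.0),
        (1, 'acme',   '2026-05-03', 120.0),
        (2, 'globex', '2026-05-02',  80.0);
""")

# Keep only the most recent version of each order.
LATEST_PER_ORDER = """
    SELECT order_id, customer, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY updated_at DESC
               ) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY order_id;
"""

rows = conn.execute(LATEST_PER_ORDER).fetchall()
```

Here `rows` retains one record per `order_id`: the 2026-05-03 revision of order 1 and the single row for order 2.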

Data Platform Management

  • Manage and optimize AWS-based data infrastructure and ecosystem components.
  • Ensure platform security, scalability, monitoring, and governance standards.
  • Collaborate with DevOps, Data Analysts, Architects, and Business stakeholders.
  • Support CI/CD and deployment automation for data engineering solutions.


Areas for Evaluation, with Weightage

  • AWS Glue (Expert, weightage 20): ETL, AWS Glue, Data Catalog, Job Scheduling. Expectation: demonstrate the ability to design, implement, and manage ETL processes using AWS Glue.
  • ETL Pipeline Design (Expert, weightage 20): ETL, Data Validation, Data Integrity. Expectation: design and implement robust ETL pipelines.
  • Python (Proficient, weightage 20): Python, OOP, Libraries, Error Handling. Expectation: write clean, efficient, and scalable Python code.
  • PySpark (Proficient, weightage 20): PySpark, DataFrames, RDD, Spark SQL. Expectation: write and optimize PySpark scripts for data processing.
  • RDBMS (PostgreSQL) (Competent, weightage 15): PostgreSQL query and DML; Snowflake SQL, query optimization, and database design. Expectation: write complex SQL queries and manage PostgreSQL databases.
  • Data Platforms Management (Competent, weightage 5): AWS Ecosystem, Data Infrastructure, Security. Expectation: manage and optimize data platforms within the AWS ecosystem.






Pradeep Bhondve   

Talent Acquisition Specialist,

KTEK Resourcing LLC

 

E Pradeep.bhondve@ktekresourcing.com

LinkedIn: https://www.linkedin.com/in/pradeep-bhondve-aba57b166/

W www.ktekresourcing.com

A 9494 Southwest Freeway, Suite #350, Houston, TX -77074



pradeep bhondwe

May 12, 2026, 12:25:45 PM
to pradeep...@ktekresourcing.com
Hello,

My name is Pradeep Bhondve, and I work as a Technical Recruiter for K-Tek Resourcing.
 
We are searching for professionals for the below business requirement from one of our clients. Please read through the requirements and connect with us in case it suits your profile.

Please see the job description, and if you are interested, send me your updated resume at Pradeep.bhondve@ktekresourcing.com or give me a call.



Job Title: Data Architect with AWS Glue
Location: Charlotte, NC
Duration: Long Term

Need only visa-independent candidates: USC, Green Card, H4 EAD, L2S, and J2S.
