RE: C2C role :: Azure Data Architect (Streamlit exp) :: Iselin, NJ(Hybrid)


Yogesh Singh

Mar 24, 2026, 3:13:01 PM

to Yogesh Singh

Hi,

 

Kindly share resumes of H1/H4 candidates for the role below.

LinkedIn profile must have been created before 2020.

Driver's license (DL) must show an address within 100 miles of NJ.

 

C2C role:

Role: Azure Data Architect (no AWS profiles; Streamlit experience is a must)

Location: Iselin, NJ (Hybrid)

Rate: $70-$75/hr C2C, max

 

Required Experience

12 to 15 years

 

Job Overview:
We are looking for a hands-on Data Architect to design and lead the implementation of a robust data platform on Azure and Databricks. The successful candidate will be responsible for creating, testing, and improving data frameworks that optimize how the business operates. Your primary focus will be ensuring our data is accurate, auditable, and resilient through advanced data observability frameworks.

 

Responsibilities:

Architecture Design: Architect scalable Medallion architectures (Bronze/Silver/Gold) in Azure Databricks.

Data Observability & Quality: Design automated frameworks for data quality to validate schemas, data, and business logic in real-time.

Reconciliation & Integrity: Build end-to-end reconciliation engines to ensure data consistency between source systems, the lakehouse, and downstream reports.

Exception Handling: Develop a centralized exception handling strategy that captures, logs, and routes data processing errors without halting entire pipelines.

User Interface: Build an interactive Streamlit-based user interface that gives stakeholders the ability to view/edit data quality rules and handle data exceptions.

DevOps & Automation: Implement CI/CD pipelines for data infrastructure using Azure DevOps/GitLab, focusing on automated testing and zero-touch deployments.
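To illustrate the data quality and exception handling responsibilities above, here is a minimal, hypothetical Python sketch (not the client's actual framework): metadata-driven rules validate each record, and failures are captured and routed to a centralized exception sink instead of halting the pipeline.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    """A metadata-driven quality rule: a name plus a predicate over a record."""
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes

@dataclass
class ExceptionSink:
    """Centralized capture point for data processing exceptions."""
    captured: list = field(default_factory=list)

    def route(self, record: dict, rule_name: str) -> None:
        # Log/route the failure; the pipeline itself keeps running.
        self.captured.append({"record": record, "failed_rule": rule_name})

def run_checks(records: list, rules: list, sink: ExceptionSink) -> list:
    """Apply every rule to every record; quarantine failures via the sink."""
    clean = []
    for record in records:
        failed = [r.name for r in rules if not r.check(record)]
        if failed:
            for name in failed:
                sink.route(record, name)
        else:
            clean.append(record)
    return clean

# Hypothetical rules and sample records for illustration only.
rules = [
    Rule("amount_non_negative", lambda r: r.get("amount", 0) >= 0),
    Rule("id_present", lambda r: "id" in r),
]
sink = ExceptionSink()
clean = run_checks(
    [{"id": 1, "amount": 10}, {"id": 2, "amount": -5}, {"amount": 3}],
    rules, sink,
)
```

In a real Databricks implementation the same pattern would run over PySpark DataFrames, with failed rows written to a quarantine table and surfaced through the Streamlit UI described above.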

 

Required Skills:

Platform Mastery: Deep expertise in Azure and Databricks.

Observability Mindset: Proven track record of building metadata-driven frameworks for monitoring data "incidents" rather than just system uptime.

Coding: High proficiency in Python, PySpark, and SQL.

Visualization: Experience building data apps or internal tools via Streamlit.

Agile Methodologies: Familiarity with agile development methodologies.

DevOps: Strong understanding of Git-based workflows, Infrastructure as Code (Terraform), and automated testing.

Communication Skills: Excellent written and verbal communication skills.

The candidate must have a bachelor’s degree in Computer Science, Information Technology, or a related field; a Master's degree is preferred.

 

Preferred Skills:

Experience with the Databricks DQX framework to manage data quality at scale.

Certifications: Azure Solutions Architect and/or Databricks Certified Data Engineer Professional.

 

Thanks

Yogesh Pratap Singh


The information transmitted is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Any review, retransmission, dissemination or other use of, or taking of any action in reliance upon this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender and delete the material from any computer.