C2C Role - Databricks Architect - REMOTE


Rakesh Sharma

Apr 3, 2026, 4:22:04 PM
to Rakesh Sharma

Dear Vendors,


Please share resumes for the role below.


Role: Databricks Architect

Location: Troy, MI / Remote

Experience: 12+ years


Role Overview

We are looking for a Databricks Architect to design and lead modern Lakehouse data platforms on Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.


Key Responsibilities

  • Architect and implement Databricks Lakehouse solutions for large-scale data platforms
  • Design and optimize batch and streaming data pipelines using Apache Spark (PySpark/SQL)
  • Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning)
  • Build and manage Databricks jobs, workflows, notebooks, and clusters
  • Enable data governance using Unity Catalog (access control, lineage)
  • Integrate Databricks with cloud data services (ADLS / S3, ADF, Synapse, etc.)
  • Support analytics, BI, and AI/ML workloads (MLflow exposure is a plus)
  • Lead solution design discussions and mentor data engineering teams


Must-Have Skills

  • 10+ years in data engineering / data architecture
  • 5+ years of strong hands-on experience with Databricks
  • Expert in Apache Spark, PySpark, SQL
  • Strong experience with Delta Lake & Lakehouse architecture
  • Cloud experience on Azure Databricks / AWS Databricks
  • Proven experience in designing high-volume, scalable data pipelines


Good-to-Have

  • Unity Catalog, MLflow, Databricks Workflows
  • Streaming experience (Kafka / Event Hubs)
  • CI/CD for Databricks (Azure DevOps / GitHub)


Regards,

Rakesh Kumar

VedaSoft Inc.

www.vedasoftinc.com
