Hi,
Kindly share your updated resume if you are a good fit for the JD below. I will be happy to share more information.
Role: Azure Databricks Architect
Location: Remote
Duration: 6+ months
Job Summary
We are seeking an experienced Azure Databricks Architect with a strong background in Azure cloud services to design, implement, and optimize scalable data solutions. The ideal candidate will lead the architecture and deployment of Databricks-based data platforms, enabling advanced analytics and data-driven decision-making across the organization.
Key Responsibilities
Design and architect end-to-end data solutions leveraging Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and other Azure services.
Lead the development and deployment of scalable, high-performance big data pipelines using Apache Spark on Databricks.
Collaborate with data engineers, data scientists, and stakeholders to translate business requirements into technical solutions.
Implement best practices for data governance, security, and compliance on Azure data platforms.
Optimize cluster configurations and Spark jobs to improve performance and reduce costs.
Provide technical leadership and mentorship to data engineering teams on Databricks and Azure cloud technologies.
Stay updated with the latest Azure and Databricks features and recommend enhancements.
Troubleshoot, debug, and resolve issues related to data ingestion, transformation, and processing workflows.
Develop documentation, architecture diagrams, and technical specifications.
Assist in the evaluation and integration of third-party tools and technologies with Databricks.
Required Skills and Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
Extensive experience architecting and implementing solutions on Azure Databricks and Apache Spark.
Strong knowledge of Azure cloud services: Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure Data Factory, Azure SQL Database, etc.
Proficient in programming languages such as Python, Scala, or SQL for big data processing.
Experience with data lake design, ETL/ELT processes, and data pipeline orchestration.
Strong understanding of cloud security, data governance, and compliance standards.
Familiarity with infrastructure-as-code tools (e.g., Terraform, ARM templates) is a plus.
Excellent problem-solving skills and ability to manage multiple priorities.
Strong communication and interpersonal skills to work with cross-functional teams.
Thanks and regards,
Ramashish Kumar
Valzo Soft Solutions LLP
A: 12600 Deerfield Parkway,
Suite 2123, Alpharetta, GA 30004
LinkedIn: linkedin.com/in/ramashishkumar86