Databricks Architect/Admin
Department: Data & Analytics Platform
Reports To: Senior Director, Data Platform
Location: Hartford, CT
Job Type: 6-month contract-to-hire
REQUIRED QUALIFICATIONS
- 7+ years of experience in data engineering or data platform roles, with a minimum of 4 years hands-on Databricks implementation experience.
- Demonstrated expertise with Databricks platform capabilities: Unity Catalog, Delta Lake, Databricks Workflows, Delta Live Tables, and SQL Warehouses.
- Strong Unix/Linux proficiency — shell scripting, process management, file system operations, cron scheduling, and environment configuration.
- Proficiency in Python and PySpark for distributed data processing, pipeline development, and platform automation.
- Experience with cloud infrastructure (AWS, Azure, or GCP), including compute, storage, networking, and IAM/security constructs.
- Demonstrated ability to design for scale, cost efficiency, and operational reliability in an enterprise data environment.
- Experience designing automation frameworks for data platform operations, including job orchestration, monitoring, alerting, and pipeline self-healing.
- Familiarity with AI/ML concepts and tooling within the Databricks ecosystem, including MLflow, AutoML, and Model Serving; exposure to generative AI or LLM-integrated workflows is a plus.
- Experience with Oracle database environments, including SQL development, schema design, and integration patterns for data extraction and pipeline sourcing.
- Proficiency in Git-based version control — branching strategies, pull request workflows, repository management, and CI/CD pipeline integration for data platform code.
- Experience working within ITSM and project delivery processes, using tools such as ServiceNow and Jira.
- Strong written and verbal communication skills, with the ability to convey complex architectural concepts to both technical and non-technical audiences.