Quick interviews: Sr. Lead Data Engineer – Snowflake & Data Platform @ ONSITE – Raleigh NC, Dallas TX, Phoenix AZ

Madhava Perungulam

Mar 4, 2026, 8:52:10 AM
Good morning, Professionals,

Please share relevant profiles for the role below.

Note: All the skills highlighted below are mandatory.

EMAIL IS THE BEST WAY TO REACH OUT 


Lead Data Engineer – Snowflake & Data Platform

Location: ONSITE – Raleigh NC, Dallas TX, or Phoenix AZ


Experience required: 12–14+ years, including hands-on experience with Snowflake, dbt, data engineering, and cloud data platforms.


Summary

We are looking for a Senior Data Engineer to design, build, and operate secure and scalable data pipelines on the Snowflake platform. The role involves end‑to‑end data engineering—from raw ingestion to Data Vault 2.0 modeling to business consumption layers. The ideal candidate has strong experience in data ingestion, Snowflake architecture, dbt Cloud, Airflow orchestration, and implementing governance and audit‑ready controls.


Key Responsibilities

Data Engineering & Modeling

  • Build scalable data ingestion pipelines using Qlik, AWS Glue, or similar ETL tools.
  • Implement Raw → Data Vault 2.0 (Hubs/Links/Satellites) → Consumption models in dbt Cloud.
  • Develop optimized Snowflake structures (tables, streams, tasks, materialized views); a minimal sketch follows this list.
  • Apply strong dbt testing practices (unique, not_null, relationships, source freshness).
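
To make the stream/task structures and Data Vault loading pattern above concrete, here is a minimal sketch in Snowflake SQL. It is illustrative only: every object name (raw_db, dv_db, transform_wh, hub_customer, the Qlik-sourced staging table) and the hash-key logic are hypothetical assumptions, not the client's actual design.

  -- Capture changes from a raw landing table (all names are hypothetical).
  CREATE OR REPLACE STREAM raw_db.stg.customer_raw_stream
    ON TABLE raw_db.stg.customer_raw;

  -- Scheduled task that loads new rows into a Data Vault hub.
  CREATE OR REPLACE TASK raw_db.stg.load_customer_hub
    WAREHOUSE = transform_wh
    SCHEDULE  = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw_db.stg.customer_raw_stream')
  AS
    INSERT INTO dv_db.vault.hub_customer (customer_hk, customer_id, load_dts, record_source)
    SELECT MD5(customer_id), customer_id, CURRENT_TIMESTAMP(), 'QLIK_CDC'
    FROM raw_db.stg.customer_raw_stream
    WHERE METADATA$ACTION = 'INSERT';

  ALTER TASK raw_db.stg.load_customer_hub RESUME;

In dbt Cloud, the equivalent hub/satellite models would more likely be expressed as incremental models with tests attached, rather than hand-written tasks.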

Orchestration

  • Build and maintain Airflow (MWAA) DAGs and/or dbt Cloud scheduled jobs.
  • Design reliable, auditable, lineage‑tracked workflows with SLAs.

Security & Governance

  • Implement RBAC/ABAC roles, masking policies, network rules, and data classification (see the sketch after this list).
  • Establish audit‑ready controls: change management, approvals, runbooks, evidence collection.
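
As an illustration of the masking, RBAC, and network controls listed above, a minimal Snowflake SQL sketch follows. Role, policy, schema, and IP values are hypothetical placeholders.

  -- Column masking: only privileged roles see raw email values.
  CREATE OR REPLACE MASKING POLICY govern_db.policies.mask_email AS
    (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_READER', 'SYSADMIN') THEN val
         ELSE '***MASKED***' END;

  ALTER TABLE consume_db.mart.dim_customer
    MODIFY COLUMN email SET MASKING POLICY govern_db.policies.mask_email;

  -- Functional role granted only what it needs (RBAC).
  CREATE ROLE IF NOT EXISTS analyst_role;
  GRANT USAGE ON DATABASE consume_db TO ROLE analyst_role;
  GRANT USAGE ON SCHEMA consume_db.mart TO ROLE analyst_role;
  GRANT SELECT ON ALL TABLES IN SCHEMA consume_db.mart TO ROLE analyst_role;

  -- Restrict where connections may originate (network rule + policy).
  CREATE NETWORK RULE corp_ip_rule
    MODE = INGRESS TYPE = IPV4 VALUE_LIST = ('203.0.113.0/24');
  CREATE NETWORK POLICY corp_only ALLOWED_NETWORK_RULE_LIST = ('corp_ip_rule');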

Infrastructure & DevOps

  • Use CI/CD, Git branching, and Terraform for infrastructure‑as‑code.
  • Manage promotion from DEV → QA → UAT → PROD environments.

Data Quality & Observability

  • Embed data quality tests in dbt; build reconciliations and anomaly checks.
  • Monitor performance using Snowflake ACCOUNT_USAGE views, logs, and metrics (e.g., Splunk, Datadog); see the query sketch below.
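
For the ACCOUNT_USAGE-based monitoring mentioned above, here is a sample query of the kind a pipeline health check might run. The 5-minute threshold is an arbitrary assumption, and reading the SNOWFLAKE database requires an appropriately privileged role.

  -- Long-running or spilling queries over the last 24 hours.
  SELECT query_id,
         user_name,
         warehouse_name,
         total_elapsed_time / 1000 AS elapsed_seconds,
         bytes_spilled_to_local_storage,
         bytes_spilled_to_remote_storage
  FROM snowflake.account_usage.query_history
  WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    AND total_elapsed_time > 300000   -- longer than 5 minutes (milliseconds)
  ORDER BY total_elapsed_time DESC
  LIMIT 50;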

Cost & Performance Optimization

  • Optimize Snowflake warehouses, auto-suspend/resume, multi-cluster settings, and resource monitors (illustrated below).
  • Tune SQL queries and manage storage efficiently.
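
A minimal sketch of the warehouse and resource-monitor settings referenced above; the warehouse name, size, and credit quota are hypothetical, and multi-cluster warehouses assume Enterprise edition or above.

  -- Cap monthly spend and suspend the warehouse if the quota is hit.
  CREATE RESOURCE MONITOR IF NOT EXISTS transform_rm
    WITH CREDIT_QUOTA = 200
         FREQUENCY = MONTHLY
         START_TIMESTAMP = IMMEDIATELY
    TRIGGERS ON 80 PERCENT DO NOTIFY
             ON 100 PERCENT DO SUSPEND;

  ALTER WAREHOUSE transform_wh SET
    WAREHOUSE_SIZE    = 'MEDIUM'
    AUTO_SUSPEND      = 60     -- seconds of inactivity before suspending
    AUTO_RESUME       = TRUE
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 3      -- scale out for concurrency spikes
    RESOURCE_MONITOR  = transform_rm;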

Compliance

  • Build data and system controls to meet audit standards (SOX, GLBA, FFIEC, PCI, or similar); a sample audit query follows this list.
  • Maintain secure, compliant data operations.
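
As one example of audit-ready evidence collection, the ACCESS_HISTORY view can show who read a sensitive table. The table name and 90-day window below are hypothetical, and ACCESS_HISTORY requires Enterprise edition.

  -- Who accessed a sensitive table in the last 90 days (audit evidence).
  SELECT ah.user_name,
         ah.query_start_time,
         obj.value:objectName::STRING AS object_name
  FROM snowflake.account_usage.access_history AS ah,
       LATERAL FLATTEN(input => ah.direct_objects_accessed) AS obj
  WHERE obj.value:objectName::STRING = 'CONSUME_DB.MART.DIM_CUSTOMER'
    AND ah.query_start_time >= DATEADD('day', -90, CURRENT_TIMESTAMP())
  ORDER BY ah.query_start_time DESC;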

 

Strong hands‑on experience with:

  • Snowflake platform (security, governance, workloads)
  • dbt Cloud (modeling, testing, macros)
  • Airflow (MWAA) orchestration
  • Python, SQL, and AWS basics

 

Regards,

Madhava Perungulam






