Lead Data Engineer - 6 months Contract


Swaroop Mohapatra

Mar 3, 2026, 8:09:46 AM
to US Contract Jobs
Please send resumes to swa...@qualis1inc.com
Location: ONSITE - Raleigh NC, Dallas TX, or Phoenix AZ
Duration: 6 months
Visa: H1B

Role: Lead Data Engineer

Responsibilities
Modeling & Warehousing
•    Design and implement scalable data ingestion frameworks.
•    Implement Raw → DV 2.0 (Hubs/Links/Satellites) → Consumption patterns in dbt Cloud with robust tests (unique/not null/relationships/freshness).
•    Build performant Snowflake objects (tables, streams, tasks, materialized views) and optimize clustering/micro-partitioning.
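The DV 2.0 (Hubs/Links/Satellites) pattern above hinges on deterministic hash keys derived from normalized business keys. A minimal Python sketch of the common MD5 hash-key convention (the function name, delimiter, and sample keys are illustrative, not part of any specific dbt package):

```python
import hashlib

def dv_hash_key(*business_keys: str) -> str:
    """Build a deterministic Data Vault 2.0 hash key.

    Business keys are trimmed, upper-cased, and joined with a
    delimiter before hashing, so the same natural key always
    produces the same hub/link key on every load (rerun-safe).
    """
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hub key for a customer; link key for a customer-order relationship
customer_hk = dv_hash_key("  cust-001 ")
link_hk = dv_hash_key("cust-001", "ord-9001")
```

In dbt this logic typically lives in a macro (packages such as AutomateDV provide it), so every hub and link model hashes keys identically.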
Orchestration
•    Author and operate Airflow (MWAA) DAGs and/or dbt Cloud jobs; design idempotent, rerunnable, lineage-tracked workflows with SLAs/SLOs.
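"Idempotent, rerunnable" usually means each task fully replaces its own logical-date partition, so a retry or backfill converges to the same end state. A minimal sketch under that assumption (the in-memory `target` dict stands in for a real table; names are illustrative):

```python
def load_partition(target: dict, source_rows: list, run_date: str) -> dict:
    """Idempotently (re)load one logical-date partition.

    Dropping the partition before reinserting makes the task safe
    to rerun or backfill: a second run for the same run_date
    produces exactly the same end state.
    """
    # Remove any rows previously loaded for this logical date
    kept = {k: v for k, v in target.items() if v["run_date"] != run_date}
    # Reinsert the partition from source
    for row in source_rows:
        kept[row["id"]] = {**row, "run_date": run_date}
    return kept
```

In Snowflake the same pattern maps to a delete-then-insert (or `MERGE`) scoped to the run's logical date.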
Security & Governance
•    Enforce RBAC/ABAC, network policies/rules, masking/row access policies, tags, data classification, and least-privilege roles.
•    Operationalize audit-ready controls (change management, approvals, runbooks, separation of duties, evidence capture).
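Snowflake's Dynamic Data Masking expresses this kind of rule as a SQL masking policy evaluated against the session role; the decision logic it encodes looks roughly like this Python sketch (role names and the SSN format are hypothetical examples, not Snowflake built-ins):

```python
def mask_ssn(value: str, current_role: str) -> str:
    """Return the raw value only to privileged roles.

    Mirrors the shape of a Snowflake masking policy:
    CASE WHEN CURRENT_ROLE() IN (...) THEN val ELSE masked END.
    """
    if current_role in {"PII_ADMIN", "COMPLIANCE_AUDITOR"}:
        return value
    # Everyone else sees only the last four digits
    return "***-**-" + value[-4:]
```

The real control is attached once to the column (`ALTER TABLE ... MODIFY COLUMN ... SET MASKING POLICY ...`) and enforced for every query path, which is what makes it audit-friendly.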
IaC & DevOps
•    Use CI/CD pipelines, Terraform, and Git branching for code promotion.
Data Quality & Observability
•    Bake tests into dbt; implement contract checks, reconciliations, and anomaly alerts.
•    Monitor with Snowflake ACCOUNT_USAGE/INFORMATION_SCHEMA and event tables, and forward logs/metrics to SIEM/APM tooling (e.g., Splunk, Datadog).
•    Build controls and evidence to satisfy internal audit, SOX/GLBA/FFIEC/PCI-like expectations.
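A common anomaly-alert primitive behind bullets like these is a z-score check on a metric's recent history (e.g., daily row counts pulled from ACCOUNT_USAGE). A minimal sketch, assuming the history is already queried into a list:

```python
from statistics import mean, stdev

def is_anomalous(history: list, today: float, threshold: float = 3.0) -> bool:
    """Flag today's value if it sits more than `threshold`
    standard deviations from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # Flat history: any deviation at all is suspicious
        return today != mu
    return abs(today - mu) / sigma > threshold
```

In practice the flagged metric would be routed to the SIEM/APM layer mentioned above rather than handled inline.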

Snowflake Platform (hands-on, production):
•    Secure account setup: databases/schemas/stages, RBAC/ABAC role design, grants, network policies/rules, storage integrations.
•    Data protection: Dynamic Data Masking, Row Access Policies, Tag-based masking, PII classification/lineage tagging.
•    Workloads & features: Streams/Tasks, Snowpipe, external tables, file formats, copy options, retries & dedupe patterns.
•    Operations: warehouse sizing, multi-cluster, resource monitors, Time Travel & Fail-safe, cross-region/account replication.
•    Networking concepts: AWS PrivateLink/S3 access patterns, external stages, and (at least) high-level familiarity with VPC/DNS/endpoint flows.
dbt Cloud:
•    Dimensional + Data Vault 2.0 modeling in dbt (H/L/S), snapshots, seeds, exposures, Jinja/macros, packages, artifacts.
•    Testing and documentation discipline; deployment environments (DEV/QA/UAT/PROD) and job orchestration.
Orchestration:
•    Airflow (MWAA): Operators/Sensors (dbt, Snowflake, S3), XComs, SLAs, retries, backfills, alerting, and modular DAG design.
•    Experience deciding when to run in dbt Cloud orchestration vs Airflow, and integrating both cleanly.
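Backfills in Airflow boil down to enumerating the logical run dates in a window and executing one DAG run per date. A self-contained sketch of that enumeration (a simplification of what Airflow's scheduler does internally for a daily schedule, not its actual API):

```python
from datetime import date, timedelta

def backfill_dates(start: date, end: date) -> list:
    """Enumerate logical run dates for a backfill window,
    one per daily DAG run, inclusive of both endpoints."""
    days = (end - start).days
    return [start + timedelta(days=i) for i in range(days + 1)]
```

Pairing this with the idempotent partition-load pattern is what makes backfills safe: each historical date simply replaces its own slice.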
Data Quality & Observability:
•    Contract tests, reconciliations, freshness SLAs, anomaly detection; surfacing lineage and test results to stakeholders.
•    Query tuning (profiling, pruning, statistics awareness, result caching).
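A reconciliation check is typically a per-table comparison of source and target counts (or checksums), with mismatches surfaced to stakeholders. A minimal sketch, assuming the counts have already been collected into dicts (table names are illustrative):

```python
def reconcile(source_counts: dict, target_counts: dict) -> list:
    """Compare per-table row counts between source and target;
    return (table, expected, actual) tuples for every mismatch."""
    issues = []
    for table, expected in source_counts.items():
        actual = target_counts.get(table, 0)
        if actual != expected:
            issues.append((table, expected, actual))
    return issues
```

An empty result means the layers agree; anything else feeds the alerting path described under Data Quality & Observability.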