59591: Lead Data Engineer with Qlik Sense
Raleigh NC, Dallas TX, Phoenix AZ (Onsite)
6 months
Description:
You will design, build, and operate secure, audited, and cost-efficient data pipelines on Snowflake, from raw ingestion to Data Vault 2.0 models and onward to business-friendly consumption layers (mart/semantic). You'll use Qlik/Glue/ETL tooling for ingestion, dbt Cloud for modeling and testing, MWAA/Airflow and/or dbt Cloud's orchestration for scheduling, and Terraform (with HashiCorp practices) for infrastructure-as-code. The ideal candidate must have hands-on experience with data ingestion frameworks and with Snowflake database/schema design, security, networking, and governance that satisfy regulatory and compliance audit requirements.
Responsibilities
Modeling & Warehousing
• Design and implement scalable data ingestion frameworks.
• Implement Raw → DV 2.0 (Hubs/Links/Satellites) → Consumption patterns in dbt Cloud with robust tests (unique/not null/relationships/freshness).
• Build performant Snowflake objects (tables, streams, tasks, materialized views) and optimize clustering/micro-partitioning.
Orchestration
• Author and operate Airflow (MWAA) DAGs and/or dbt Cloud jobs; design idempotent, rerunnable, lineage-tracked workflows with SLAs/SLOs.
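"Idempotent, rerunnable" in practice usually means each run is keyed by its logical date and re-execution replaces that slice rather than appending to it. A minimal Python sketch of the pattern, with no Airflow dependency — the function and table names are illustrative assumptions:

```python
# Each partition of the warehouse is keyed by the run's logical date,
# so retries and backfills overwrite rather than duplicate.
warehouse = {}  # logical date -> list of rows

def extract(logical_date):
    # Hypothetical source pull for one day's slice.
    return [{"day": logical_date, "amount": 100}]

def load_partition(logical_date, rows):
    # Delete-then-insert (or a MERGE in Snowflake) keyed by the
    # logical date makes the task safe to rerun.
    warehouse[logical_date] = rows

def run(logical_date):
    load_partition(logical_date, extract(logical_date))

run("2024-01-01")
run("2024-01-01")  # rerun of the same interval: no duplicates
```

In an Airflow (MWAA) DAG the same idea shows up as tasks parameterized by the execution's logical date, with retries and backfills relying on that replace-not-append behavior.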
Security & Governance
• Enforce RBAC/ABAC, network policies/rules, masking/row access policies, tags, data classification, and least-privilege role hierarchies.
• Operationalize audit-ready controls (change management, approvals, runbooks, separation of duties, evidence capture).
IaC & DevOps
• Use CI/CD pipelines, Terraform, and Git branching strategies for code promotion.
Data Quality & Observability
• Bake tests into dbt; implement contract checks, reconciliations, and anomaly alerts.
• Monitor with Snowflake ACCOUNT_USAGE/INFORMATION_SCHEMA and event tables; forward logs/metrics to SIEM/APM tools (e.g., Splunk, Datadog).
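A reconciliation check of the kind mentioned here typically compares row counts and a column aggregate between source and target, raising an alert on mismatch. A minimal sketch, with hypothetical data:

```python
# Source-to-target reconciliation: compare row counts and a column
# sum; a mismatch on either would be surfaced as an anomaly alert.
source_rows = [{"id": 1, "amount": 50}, {"id": 2, "amount": 75}]
target_rows = [{"id": 1, "amount": 50}, {"id": 2, "amount": 75}]

def reconcile(source, target, col):
    return {
        "count_match": len(source) == len(target),
        "sum_match": sum(r[col] for r in source)
                     == sum(r[col] for r in target),
    }

result = reconcile(source_rows, target_rows, "amount")
assert all(result.values())  # both checks pass for matching data
```

In production the two sides would be query results (e.g., a source extract count versus a Snowflake table count), and a failed check would page on-call or fail the pipeline run.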
Cost & Performance
• Right-size warehouses; configure auto-suspend/auto-resume, multi-cluster scaling for concurrency, and resource monitors; tune queries for cost and performance.
Compliance
• Build controls and evidence to satisfy internal audit and SOX/GLBA/FFIEC/PCI-like expectations.
Qualifications
Bachelor's Degree and 6 years of experience in advanced data engineering, enterprise architecture, or project leadership; OR
High School Diploma or GED and 10 years of experience in advanced data engineering, enterprise architecture, or project leadership.
Preferred:
Snowflake Platform (hands-on, production):
• Secure account setup:
databases/schemas/stages, RBAC/ABAC role design, grants, network
policies/rules, storage integrations.
• Data protection: Dynamic Data Masking,
Row Access Policies, Tag-based masking, PII classification/lineage tagging.
• Workloads & features: Streams/Tasks,
Snowpipe, external tables, file formats, copy options, retries & dedupe
patterns.
• Operations: warehouse sizing,
multi-cluster, resource monitors, Time Travel & Fail-safe,
cross-region/account replication.
• Networking concepts: AWS PrivateLink/S3 access patterns, external stages, and (at least) high-level familiarity with VPC/DNS/endpoint flows.
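The "retries & dedupe patterns" called out under Workloads & features usually mean making ingestion safe against duplicate file deliveries (Snowpipe tracks loaded file names internally for exactly this reason). A simplified Python sketch of the idea — names are hypothetical:

```python
# Retry-safe ingestion: remember which files were already loaded, so a
# retried or redelivered batch is skipped instead of double-loaded.
loaded_files = set()

def ingest(file_name, rows, table):
    if file_name in loaded_files:
        return 0  # duplicate delivery from a retry; skip it
    table.extend(rows)
    loaded_files.add(file_name)
    return len(rows)

staging = []
ingest("2024-01-01/orders.csv", [{"id": 1}], staging)
ingest("2024-01-01/orders.csv", [{"id": 1}], staging)  # retried delivery
assert len(staging) == 1
```

Snowflake's COPY INTO provides the same guarantee via its load history; the application-level version matters when building custom ingestion around queues or event notifications where at-least-once delivery is the norm.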
dbt Cloud:
• Dimensional + Data Vault 2.0 modeling in
dbt (H/L/S), snapshots, seeds, exposures, Jinja/macros, packages, artifacts.
• Testing and documentation discipline;
deployment environments (DEV/QA/UAT/PROD) and job orchestration.
Orchestration:
• Airflow (MWAA): Operators/Sensors (dbt,
Snowflake, S3), XComs, SLAs, retries, backfills, alerting, and modular DAG
design.
• Experience deciding when to run in dbt
Cloud orchestration vs Airflow, and integrating both cleanly.
Data Quality & Observability:
• Contract tests, reconciliations,
freshness SLAs, anomaly detection; surfacing lineage and test results to
stakeholders.
• Query tuning (profiling, pruning,
statistics awareness, result caching).
Audit & Controls:
• Change control with approvals/evidence,
break-glass procedures, production access separation, audit log
retention/immutability.
• Runbooks, PIR/RCAs, control mapping
(e.g., to SOX/GLBA/PCI-like controls where relevant).
Programming & Cloud:
• Python (ETL utils, Airflow tasks), SQL
(advanced), and AWS basics (S3, IAM, CloudWatch, MWAA fundamentals).
Bonus Skills:
• Snowflake governance: data classification
at scale, Universal Search, tags + masking automation.
• Iceberg/external table strategies; Kafka
or event-driven ingestion patterns.
• Great Expectations, Monte
Carlo/Anomalo/Atlan/Collibra/BigID integrations.
• dbt: advanced macros, dbt mesh, custom materializations,
Slim CI, state comparison, deferral, exposures to BI lineage.
• BI/Semantic: ThoughtSpot/Looker/Power BI
metric-layer design; semantic modeling concepts.
• Packaging & distribution: internal
dbt packages, reusable Terraform modules, cookie-cutter project templates.
• Platform engineering: FinOps for
Snowflake, cost charge-back/show-back, warehouse auto-tuning utilities.
• Security engineering: SCIM/SSO (Okta),
MFA patterns, service-account hardening, ephemeral credentials.
• SRE practices: SLIs/SLOs, on-call
runbooks, incident management.
Skills: Qlik Sense, Snowflake, Data Build Tool (dbt)
Experience Required: 6-8 years
Regards,
Satya
Technical Recruiter
Key Business Solutions, Inc|| Office: 916 646 2080 Ext 216 || Fax: 916 646 2081 || Email: sa...@keybusinessglobal.com || Website: www.key-soft.com || Yahoo IM/G Talk: satyakeysoft
Notice: This email is not intended to be a solicitation. Please accept our apologies and reply in the subject heading with REMOVE to be removed from our Mailing list. Thank You.