$50/hr C2C - H1B only (no other visas) - Onsite NC - AWS Infra/DevOps - i-94 required


Shruthi M

Mar 5, 2026, 4:53:26 PM
to Shruthi M
2026-01-13 3:00 ~ J251513 ~ Raleigh, NC ~ DevOps Role (Data Engineering Focus) ~ $50/hr
• Cloud Platform: AWS
• Infrastructure as Code: Terraform
• CI/CD Tools: Jenkins
• Data Platforms: AWS Glue
• Data Analytics Tools: Qlik
• Design, build, and maintain CI/CD pipelines to support data engineering workloads
• Automate infrastructure provisioning and environment management using Terraform
• Collaborate closely with Data Engineering, Analytics, and Platform Infrastructure teams
• Explain how you would ensure reliability and performance in a data lake architecture
• Discuss your experience with monitoring and logging for data pipelines
NEW Role: Data Engineer (Snowflake, dbt, Python, and Qlik)
Description:
Technical/Business Skills:
Strong hands-on experience building robust, metadata-driven, automated, secured, and governed data pipeline solutions leveraging modern cloud data technologies and tools for large data platforms.
Strong experience in Snowflake architecture, data classification, tagging and masking automation, Universal Search, performance tuning, security, and cost optimization.
Strong hands-on experience building Snowflake objects (tables, streams, tasks, materialized views) and optimizing clustering/micro-partitioning.
Strong experience using Astronomer Airflow for orchestrating and scheduling data pipelines, and Terraform (following HashiCorp practices) for IaC automation of pipelines.
Excellent proficiency in Python, PySpark, and advanced SQL for ingestion frameworks and automation.
Experience implementing logging, monitoring, and alerting in Snowflake using ACCOUNT_USAGE/INFORMATION_SCHEMA and event tables, and integrating with Splunk and Datadog.
Strong experience designing and implementing tokenization, RBAC, masking policies, dynamic and conditional masking, and data access controls across Snowflake and supporting systems.
Experience with SCIM/SSO (Okta), MFA, ephemeral credentials, and service-account hardening.
Experience with internal dbt packages, reusable Terraform modules, and project templates is a plus.
Experience with Iceberg/external tables and Kafka or event-driven ingestion patterns is a plus.
Experience with FinOps for Snowflake, cost charge-back/show-back, and warehouse auto-tuning is a plus.
Experience with Power BI/Looker metric-layer design and semantic modeling concepts is a plus.
Experience with dbt mesh, custom materializations, Slim CI, and state comparison is a plus.
Experience in financial services/banking is a plus.
Must have one or more certifications in the relevant technology fields.
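For candidates unfamiliar with the dynamic masking requirement above, a minimal sketch of the kind of Snowflake masking policy it refers to, rendered as SQL from Python. The policy name, roles, and mask value are hypothetical, not part of this role description.

```python
# Illustrative only: render a Snowflake dynamic masking policy that reveals
# the column value to a set of allowed roles and masks it for everyone else.
def masking_policy_sql(policy: str, allowed_roles: list[str]) -> str:
    """Build a CREATE MASKING POLICY statement (hypothetical names)."""
    roles = ", ".join(f"'{r}'" for r in allowed_roles)
    return (
        f"CREATE MASKING POLICY {policy} AS (val STRING) RETURNS STRING ->\n"
        f"  CASE WHEN CURRENT_ROLE() IN ({roles}) THEN val "
        f"ELSE '***MASKED***' END;"
    )

print(masking_policy_sql("PII_EMAIL_MASK", ["SECURITY_ADMIN", "DATA_STEWARD"]))
```

Conditional masking follows the same pattern, with the CASE expression keyed on a second column instead of only the current role.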
Functional Skills:
Team Player: Support peers, team, and department management.
Communication: Excellent verbal, written, and interpersonal communication skills.
Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders.
Attention to Detail: Ensure accuracy and thoroughness in all tasks.
Responsibilities
Data warehousing & modeling - Build performant Snowflake objects (databases, tables, streams, tasks, materialized views, warehouses) and optimize clustering/micro-partitioning.
Performance optimization and troubleshooting - Analyze and optimize system performance for large-scale data operations; troubleshoot complex data issues and implement robust solutions.
Code deployment & release management - Adopt release management processes to promote code deployments to various environments, including production and disaster recovery, and support related activities.
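The "metadata-driven" pipeline work described above can be pictured with a small sketch: a config record drives the DDL for a Snowflake table, including the clustering keys mentioned in the requirements. The database, schema, table, and columns here are illustrative assumptions only.

```python
# Minimal sketch of metadata-driven Snowflake object creation: a config
# record is rendered into CREATE TABLE DDL, with optional clustering keys
# (which drive micro-partition pruning on large tables).
def build_create_table_sql(meta: dict) -> str:
    """Render a CREATE TABLE statement from a metadata record."""
    cols = ",\n  ".join(f"{c['name']} {c['type']}" for c in meta["columns"])
    ddl = (
        f"CREATE TABLE IF NOT EXISTS "
        f"{meta['database']}.{meta['schema']}.{meta['table']} (\n  {cols}\n)"
    )
    if meta.get("cluster_by"):
        ddl += f"\nCLUSTER BY ({', '.join(meta['cluster_by'])})"
    return ddl + ";"

# Hypothetical metadata record:
orders_meta = {
    "database": "ANALYTICS",
    "schema": "RAW",
    "table": "ORDERS",
    "columns": [
        {"name": "ORDER_ID", "type": "NUMBER"},
        {"name": "ORDER_TS", "type": "TIMESTAMP_NTZ"},
    ],
    "cluster_by": ["ORDER_TS"],
}

print(build_create_table_sql(orders_meta))
```

In practice the same metadata record would also drive streams, tasks, and dbt models, so one config change propagates through the whole pipeline.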
Skills: Digital: DevOps ~ Digital: Terraform
Shruthi M
Sierra 
Lead Recruiter