LOCAL TO HOUSTON // SNOWFLAKE DATA ENGINEER


Vishal Kumar

Oct 27, 2025, 3:01:09 PM

Hello,

My name is Vishal Kumar, and I am a Staffing Specialist at Arkhya Tech Inc. I am reaching out about an exciting job opportunity with one of our clients.

Job Title: Snowflake Data Engineer

Location: Houston, TX (Onsite). Local candidates only (NO RELOCATION)

Job Type: Contract

 

Job Summary

We are seeking a highly skilled Snowflake Data Engineer to design, develop, and optimize data pipelines and cloud-based data solutions. The ideal candidate will have hands-on experience with Snowflake, SQL, and modern data engineering practices, along with a strong ability to integrate data from multiple sources into scalable, high-performing data platforms.

 

Key Responsibilities

· Design, build, and maintain ETL/ELT pipelines using Snowflake and cloud-native services (AWS/Azure/GCP).

· Develop and optimize Snowflake schemas, tables, views, and stored procedures for efficient data storage and retrieval.

· Implement data ingestion from structured, semi-structured, and unstructured sources (CSV, JSON, Parquet, APIs, Kafka, etc.); a brief illustration follows this list.

· Ensure data quality, governance, and security standards using Snowflake features (RBAC, masking, auditing, etc.).

· Optimize query performance and manage Snowflake compute resources (warehouses, clustering, caching).

· Collaborate with business analysts, data scientists, and stakeholders to deliver reliable and scalable data solutions.

· Automate workflows and orchestrate pipelines using tools like Airflow, DBT, ADF, or Informatica.

· Participate in code reviews, performance tuning, and best-practice enforcement for Snowflake and SQL development.
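
For context on what the ingestion and masking items above can look like in practice, here is a minimal, hypothetical sketch using the snowflake-connector-python package. All object names (ORDERS_STAGE, RAW_ORDERS, CUSTOMERS, the PII_READER role) and credentials are placeholders, and column-level masking policies require Snowflake Enterprise edition; this is illustrative only, not the client's actual codebase.

# Illustrative only: load staged JSON into a VARIANT table, then apply a
# column masking policy. All object names and credentials are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

statements = [
    # Land the semi-structured data as-is in a single VARIANT column.
    "CREATE TABLE IF NOT EXISTS RAW_ORDERS (payload VARIANT)",
    # Bulk-load JSON files that were previously uploaded to a named stage.
    "COPY INTO RAW_ORDERS FROM @ORDERS_STAGE FILE_FORMAT = (TYPE = 'JSON')",
    # Column-level security: show the real email only to a designated PII role.
    "CREATE MASKING POLICY EMAIL_MASK AS (val STRING) RETURNS STRING -> "
    "CASE WHEN CURRENT_ROLE() = 'PII_READER' THEN val ELSE '***MASKED***' END",
    "ALTER TABLE CUSTOMERS MODIFY COLUMN email SET MASKING POLICY EMAIL_MASK",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()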

Required Skills & Qualifications

· Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.

· 3–8 years of data engineering experience, including at least 2 years of hands-on work with Snowflake.

· Strong expertise in SQL, ETL/ELT, and data warehousing concepts.

· Experience with cloud platforms (AWS, Azure, or GCP) and their data integration services.

· Proficiency in Python/Scala/Spark for data transformation and pipeline automation.

· Knowledge of data modeling (Star/Snowflake schema, Data Vault, etc.); a brief sketch follows this list.

· Experience with CI/CD, Git, and DevOps practices for data engineering.

· Familiarity with BI/Analytics tools (Tableau, Power BI, Looker) is a plus.

· Strong problem-solving skills and the ability to work in fast-paced, agile environments.
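
As an illustration of the Python/Spark and dimensional-modeling items above, here is a minimal, hypothetical PySpark sketch that shapes raw order events into a star-schema dimension and fact table. File paths, column names, and table names are invented for the example and are not part of this role's actual environment.

# Illustrative PySpark sketch: raw JSON order events -> star-schema
# dimension and fact tables written as Parquet. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_star_schema").getOrCreate()

# Raw, semi-structured input (one JSON object per line).
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Dimension: one row per customer, with a surrogate key.
dim_customer = (
    orders.select("customer_id", "customer_name", "customer_segment")
          .dropDuplicates(["customer_id"])
          .withColumn("customer_sk", F.monotonically_increasing_id())
)

# Fact: one row per order, keyed to the customer dimension.
fact_orders = (
    orders.join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
          .select(
              "order_id",
              "customer_sk",
              F.to_date("order_ts").alias("order_date"),
              (F.col("quantity") * F.col("unit_price")).alias("gross_amount"),
          )
)

# Persist as Parquet for downstream loading into Snowflake.
dim_customer.write.mode("overwrite").parquet("s3://example-bucket/warehouse/dim_customer/")
fact_orders.write.mode("overwrite").parquet("s3://example-bucket/warehouse/fact_orders/")

spark.stop()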

Preferred Qualifications

· Snowflake SnowPro certification or other relevant cloud/data certifications.

· Hands-on experience with DBT, Matillion, or Fivetran.

· Experience in real-time data streaming (Kafka, Kinesis, Event Hub).

· Exposure to data governance and cataloging tools (Collibra, Alation, Purview).

Regards,

 

Vishal Kumar

Technical IT Recruiter

Email ID: vish...@arkhyatech.com

Connect with me on LinkedIn

 

Arkhya Tech Inc.
www.arkhyatech.com

