Snowflake Data Engineer - Local to Houston, TX


Vishal Kumar

Oct 3, 2025, 1:19:25 PM
Hello,

My name is Vishal Kumar, and I am a Staffing Specialist at Arkhya Tech Inc. I am reaching out to you about an exciting job opportunity with one of our clients.

Role: Snowflake Data Engineer

Location: Houston, TX (local candidates only)

SnowPro certified candidates only

Job Description

We are seeking a skilled Snowflake Data Engineer with SnowPro Certification to design, build, and optimize scalable data pipelines and solutions. The ideal candidate will have strong experience in data warehousing, cloud platforms, and advanced SQL development, along with hands-on expertise in Snowflake features and best practices.

Key Responsibilities:

  • Design, develop, and maintain ETL/ELT pipelines using Snowflake, SQL, and modern data engineering tools.
  • Implement data ingestion strategies from multiple sources (batch & streaming) into Snowflake.
  • Optimize Snowflake performance (query tuning, clustering keys, micro-partition pruning, result caching, etc.).
  • Manage data security, governance, and access policies in alignment with best practices.
  • Collaborate with business analysts, data scientists, and stakeholders to deliver reliable data solutions.
  • Work with cloud platforms (AWS, Azure, or GCP) for integration and deployment.
  • Implement data modeling (star schema, snowflake schema, dimensional modeling) for analytics and BI needs.
  • Ensure data quality, monitoring, and automation across data pipelines.


Required Skills & Experience:

  • Proven experience as a Data Engineer with focus on Snowflake.
  • SnowPro Core / Advanced Certification is mandatory.
  • Strong SQL development and performance tuning expertise.
  • Hands-on with ETL tools (Informatica, Talend, dbt, Fivetran, Matillion, or similar).
  • Experience in cloud platforms (AWS S3, Glue, Lambda / Azure Data Factory / GCP Dataflow) for data integration.
  • Knowledge of Python/Scala for scripting and pipeline automation.
  • Experience with data governance, security policies, and compliance standards.
  • Strong understanding of data warehousing concepts and BI integration.
Good to Have:

  • Experience with Snowpark, Streams, Tasks, and Time Travel features in Snowflake.
  • Exposure to CI/CD pipelines, GitHub, and DevOps practices.
  • Knowledge of Kafka, Spark, or other streaming technologies.


Education:

  • Bachelor’s/Master’s degree in Computer Science, Information Systems, or related field.
  • SnowPro Certification (Core/Advanced) – mandatory.




Regards,

 

Vishal Kumar

Technical IT Recruiter

Email ID: vish...@arkhyatech.com

Follow us on:  LinkedIn

 

Arkhya Tech Inc.
www.arkhyatech.com

