REQ: Senior Operations & Data Engineer (Snowflake Specialist) in Denver, CO (Remote)


Viswa codetech
Apr 1, 2026, 10:34:51 AM
Hi all,

Please let me know your interest in the requirement below.

Senior Operations & Data Engineer (Snowflake Specialist)
Location: Denver, CO (Remote; local candidates required)
Visa: H1B, H4 EAD, GC EAD, USC
Duration: 6+ month contract

Position Objective
The Office of Information Technology (OIT) is seeking a highly specialized Senior Operations and Data Engineer to serve as the primary administrator and technical lead for our Snowflake ecosystem. This role is a hybrid of platform operations and high-level data engineering, ensuring that sensitive state and federal data (FTI/CJIS) is managed within a secure, high-uptime, and cost-effective environment.

Preferred Qualifications
To be considered for this role, candidates should provide proof of the following:
Active Snowflake Certification

Background Clearance Readiness: Eligibility to pass OIT, FTI (Federal Tax Information), and CJIS (Criminal Justice Information Services) background checks.
 

Key Responsibilities
Platform Operations & Administration
Snowflake Mastery: Act as the lead administrator for Snowflake environments; manage platform uptime, vendor escalations, and patch/versioning communications.
Environment Provisioning: Configure Snowflake, including complex RBAC (Role-Based Access Control) and security permissions.
Governance & CI/CD: Implement and manage DataOps and CI/CD pipelines to automate deployments for the broader implementation team.
Financial Stewardship: Configure cost-management features such as Snowflake resource monitors, budgets, and consumption tracking; consult on chargeback models.
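As an illustration of the administration duties above, the following is a minimal, stdlib-only Python sketch that composes Snowflake DDL for a resource monitor and a basic read-only RBAC grant. All identifiers and numbers here (PIPELINE_MONITOR, the 100-credit quota, ETL_WH, the DATA_ENGINEER role) are hypothetical placeholders, not details from this requisition:

```python
# Sketch: compose Snowflake DDL strings for a monthly resource monitor
# and read-only RBAC grants. Names and quotas are hypothetical examples.

def resource_monitor_ddl(name: str, credit_quota: int, warehouse: str) -> list[str]:
    """Return DDL that creates a monthly resource monitor (notify at 90%,
    suspend at 100% of quota) and attaches it to a warehouse."""
    return [
        f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {credit_quota} "
        f"FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY "
        f"TRIGGERS ON 90 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND;",
        f"ALTER WAREHOUSE {warehouse} SET RESOURCE_MONITOR = {name};",
    ]

def rbac_grant_ddl(role: str, database: str, schema: str) -> list[str]:
    """Return grants giving a role read access to one schema."""
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ]

if __name__ == "__main__":
    for stmt in resource_monitor_ddl("PIPELINE_MONITOR", 100, "ETL_WH"):
        print(stmt)
    for stmt in rbac_grant_ddl("DATA_ENGINEER", "ANALYTICS", "PUBLIC"):
        print(stmt)
```

In practice these statements would be run through a governed CI/CD pipeline rather than ad hoc, consistent with the DataOps responsibility above.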

Data Engineering & Transformation
Pipeline Architecture: Develop robust ETL/ELT pipelines to ingest data from transactional Line-of-Business systems into the analytical Snowflake environment.
Analytical Modeling: Translate Data Architect visions into technical reality by building complex transformations and target schemas.
Quality Management: Design and deploy automated data cleansing and quality-check pipelines.
Performance Engineering: Optimize data flows for specific latency and frequency requirements while maintaining credit efficiency.
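The automated cleansing and quality-check responsibility above can be sketched in a few lines of stdlib Python. The record shape and rules here (an "id" key, an "amount" field, dedupe-by-id) are hypothetical examples, not requirements from this posting:

```python
# Sketch of an automated cleansing / quality-check step: deduplicate
# records by key, drop rows missing required fields, and keep the
# rejects for reporting. Field names and rules are hypothetical.

from typing import Iterable

REQUIRED_FIELDS = ("id", "amount")  # hypothetical required columns

def clean(rows: Iterable[dict]) -> tuple[list[dict], list[dict]]:
    """Return (accepted, rejected): drop rows with missing required
    fields, then drop duplicate 'id's, keeping the first occurrence."""
    seen: set = set()
    accepted: list[dict] = []
    rejected: list[dict] = []
    for row in rows:
        if any(row.get(f) is None for f in REQUIRED_FIELDS):
            rejected.append(row)   # fails completeness check
        elif row["id"] in seen:
            rejected.append(row)   # duplicate key
        else:
            seen.add(row["id"])
            accepted.append(row)
    return accepted, rejected
```

In a real deployment this logic would typically live in the pipeline itself (e.g. as a transformation step), with the rejected rows routed to a quarantine table for the quality reports mentioned under the deliverables.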

Primary Deliverables
Architectural Contributions: Design reviews, Architectural Plans, and Scope Documents.
Deployment Assets: New account/environment deployments, security schemas, and permission assignments.
Engineering Assets: Comprehensive ETL Pipeline Design Documents, Mapping Documents, and production-ready Pipelines.
Operational Reporting: Product backlog and support ticket management, performance reports, and weekly status reports.


--

Regards,

Viswa

Code Tech Inc

24155 Drake Rd, Suite 205,

Farmington Hills, MI 48335

Email: viswa@codetech-inc.com

www.codetech-inc.com

Linkedin: https://www.linkedin.com/in/vissu-viswa-446156246/

An "E-Verify Employer"
