Hiring for Data Engineer @ Jersey City, NJ (Onsite)

Oncorre Technologies PVT LTD

Apr 29, 2026, 3:57:37 PM
Job Title: Data Engineer
Location: Jersey City, NJ
Mode of Work: Onsite
 
Key Skills:
Snowflake (including Snowflake AI / Cortex), SQL, AWS (Glue), PySpark, Big Data Concepts, Data Warehousing, Machine Learning Integration
 
Responsibilities:
  • Lead the design, development, and implementation of scalable data solutions using AWS and Snowflake, including Snowflake AI capabilities.
  • Leverage Snowflake AI (Cortex) features to build intelligent data applications, including NLP-based querying, data summarization, and predictive analytics.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical and AI-driven data solutions.
  • Develop and maintain robust data pipelines, ensuring data quality, integrity, security, and governance.
  • Optimize data storage, transformation, and retrieval processes to support advanced analytics and AI workloads.
  • Integrate AI/ML models within Snowflake for real-time and batch data processing use cases.
  • Provide technical leadership and mentorship to junior data engineers.
  • Work closely with stakeholders to gather requirements and deliver actionable, data-driven insights.
  • Ensure compliance with industry standards and best practices in data engineering, AI governance, and cloud architecture.
Must Have:
  • 8+ years of relevant experience in data engineering and delivery.
  • 8+ years of experience with big data concepts and cloud implementations.
  • Strong experience with SQL, Python, and PySpark.
  • Hands-on experience with Snowflake, including Snowflake AI / Cortex capabilities.
  • Strong knowledge of data ingestion, transformation, and processing frameworks.
  • Experience with AWS services (Glue, EMR, S3, Aurora, RDS, and overall AWS architecture).
  • Experience integrating AI/ML solutions within data platforms.
  • Strong problem-solving, analytical skills, and ownership mindset.
  • Ability to code, debug, performance-tune, and deploy applications to production environments.
  • Experience working in Agile methodologies.
Good to Have:
  • Experience with DevOps tools (Jenkins, Git, etc.) and CI/CD pipelines.
  • Exposure to data migration, Data Vault 2.0, and modern data architecture patterns.
  • Experience with AI/ML frameworks and tools integrated with Snowflake (e.g., model deployment, feature engineering).
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer with strong expertise in AWS and Snowflake.
  • Hands-on experience with Snowflake AI capabilities (Cortex, Snowpark ML, or similar).
  • Strong understanding of data warehousing concepts and modern AI-driven data platforms.
  • Excellent communication skills to explain complex technical and AI concepts to non-technical stakeholders.
  • Experience in the insurance domain, preferably with claims and loss processes.
  • Proficiency in SQL, Python, and related programming languages.
  • Strong problem-solving skills and attention to detail.
  • Ability to work independently and in a fast-paced team environment.
Preferred Qualifications:
  • Experience with data modeling, ETL/ELT processes, and AI-enhanced pipelines.
  • Familiarity with data governance, security, and AI ethics practices.
  • AWS or Snowflake certifications (including AI-related certifications) are a plus.
Mandatory Skills:
  • 2+ years of experience with AWS
  • 5+ years of experience with Snowflake (including exposure to Snowflake AI features)
  • 2+ years of Insurance domain experience
  • 7+ years of experience with PL/SQL