H1B only || Data Architect || Jersey City, NJ || Hybrid || Local


Mohit <hs.quantum5@gmail.com>
9:41 AM (6 hours ago)
to recruitervikrant913@googlegroups.com, acrux-active-c2c-requirements@googlegroups.com, swaraj111@googlegroups.com, c2c-daily-requirements--usa@googlegroups.com, magrichards11@googlegroups.com, c2c-daily-requirementsusit@googlegroups.com, c2crequirements2021@googlegroups.com, c2cgroups-dcpv@googlegroups.com, mydailyrequirementc2c@googlegroups.com, allurgentrupeshc2creqs@googlegroups.com, c2c-3-daily-c2c-requirements@googlegroups.com, krishvendors@googlegroups.com, shaikyesdan@googlegroups.com, c2chotlist-requirement-posting@googlegroups.com, it-requirements-on-c2c@googlegroups.com, c2c-hotlistrshi@googlegroups.com, rajeshkantic2c@googlegroups.com, hotlist-coderepo@googlegroups.com
Role - Data Architect
Location - New Jersey (Whitehouse Station). Must be able to come in for an F2F interview and be willing to work in the office 3-4 days a week.
Pay Rate - $70/hr on C2C (With VMS)

JD:

We are seeking a Senior Data & Integration Architect to lead data model design and downstream integration strategy for a large-scale policy administration system modernization. You will reverse engineer legacy mainframe data structures, forward engineer them into modern SQL/MongoDB document schemas, and define how data flows to downstream consumers. You will leverage purpose-built AI agents to accelerate reverse engineering, model generation, and documentation — bringing human judgment to validate and refine AI-produced outputs.

Key Responsibilities

-       Design and maintain technology-agnostic Logical Data Models (entities, relationships, cardinality, PK/FK)

-       Transform LDMs into modern physical schemas, applying aggregate-oriented and DDD patterns

-       Reverse engineer IMS hierarchical segments and DB2 tables — extract business entities from physical storage structures without original design documentation

-       Interpret COBOL copybooks as data structure definitions and map legacy field types to modern equivalents (see the first sketch after this list)

-       Define embedding vs. referencing strategies, versioning patterns, and collection boundaries for target database platforms (see the second sketch after this list)

-       Design downstream integration patterns — REST APIs, event streaming (Kafka/MQ), Change Data Capture (CDC), and data distribution to consuming systems (see the third sketch after this list)

-       Direct and validate AI agent pipelines for automated reverse engineering, ERD generation, data dictionary synthesis, and schema artifact production

-       Produce data dictionaries, ERDs, ETL field-mapping specifications, and integration contracts

-       Collaborate with SMEs to validate models and integration flows against undocumented business logic
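
For flavor, here is a minimal sketch of the copybook-interpretation work from the responsibilities above; the copybook excerpt, field names, and type mapping are hypothetical simplifications, not client artifacts:

```python
import re

# Hypothetical excerpt of a COBOL copybook (simplified for illustration).
COPYBOOK = """
01  POLICY-RECORD.
    05  POLICY-NUMBER      PIC X(10).
    05  EFFECTIVE-DATE     PIC 9(8).
    05  ANNUAL-PREMIUM     PIC S9(7)V99 COMP-3.
"""

def modern_type(pic: str) -> str:
    """Map a PIC clause to a rough modern equivalent (illustrative only)."""
    if pic.startswith("X"):
        return "string"
    if "V" in pic or "COMP-3" in pic:
        return "decimal"  # packed/implied-decimal -> exact decimal type
    return "integer"

FIELD_RE = re.compile(r"^\s*05\s+(\S+)\s+PIC\s+(.+?)\.\s*$")

for line in COPYBOOK.splitlines():
    m = FIELD_RE.match(line)
    if m:
        name, pic = m.groups()
        print(f"{name:<18} {pic:<20} -> {modern_type(pic)}")
```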
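
The embedding-vs.-referencing decision called out above might produce a target document like this hypothetical policy aggregate; the collection layout and field names are invented for illustration:

```python
# Hypothetical MongoDB "policies" document (all names invented).
# Coverages are EMBEDDED: they are owned by the policy and always read
# with it, so they belong inside the same aggregate/document.
# The insured party is REFERENCED: it is shared across policies and
# changes independently, so embedding it would duplicate data.
policy_doc = {
    "_id": "POL-0000123",
    "schemaVersion": 2,               # explicit document-versioning pattern
    "effectiveDate": "2024-01-01",
    "coverages": [                    # embedded, policy-owned
        {"code": "BI", "limit": 500_000},
        {"code": "PD", "limit": 100_000},
    ],
    "insuredPartyId": "PTY-0000042",  # reference to a "parties" collection
}
```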
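
And the CDC-based distribution pattern might turn a row-level change into a business event roughly as follows; the event shape is a Debezium-style approximation with invented fields:

```python
import json

# Hypothetical CDC event for a DB2 policy row (Debezium-style, invented).
cdc_event = json.loads("""
{
  "op": "u",
  "before": {"POLICY_NUMBER": "POL-0000123", "ANNUAL_PREMIUM": "1200.00"},
  "after":  {"POLICY_NUMBER": "POL-0000123", "ANNUAL_PREMIUM": "1250.00"}
}
""")

# Promote the raw row update to a business-level change event for
# downstream consumers (e.g. published to a Kafka topic).
if cdc_event["op"] == "u":
    changed = {
        field: {"old": cdc_event["before"].get(field), "new": new_value}
        for field, new_value in cdc_event["after"].items()
        if cdc_event["before"].get(field) != new_value
    }
    print("policy.changed:", changed)
```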


Required Qualifications

-       8+ years of experience in data architecture and system integration within OLTP / transactional domains (insurance, banking, billing, or similar)

-       Hands-on experience with IBM IMS, DB2 for z/OS, and COBOL copybooks — able to read a segment hierarchy or copybook independently

-       5+ years designing physical data models for modern relational or document-oriented databases

-       Strong grasp of logical modeling: ERD notation, composition vs. reference, cardinality, key design

-       Proven experience designing integration architectures: REST APIs, event streaming (Kafka, MQ), CDC pipelines, and message-based data distribution

-       Experience with ELT processing, including designing and implementing ELT workflows, data transformation, data cleansing, and data validation

-       Experience with real-time data processing, including designing and implementing real-time pipelines with event-driven architectures

-       Comfort working in an AI-augmented workflow — directing LLM-based agents, reviewing AI-generated artifacts, and applying domain expertise to close gaps AI cannot resolve

-       Scripting proficiency (Python or equivalent) for schema validation and artifact generation (see the sketch after this list)

-       Ability to abstract legacy physical data structures into business-oriented target models — separating IMS/DB2 storage implementation details from true business keys and domain entities
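
As a hypothetical illustration of the scripting expected here, sketched with the jsonschema package (the schema and sample document are invented):

```python
from jsonschema import Draft7Validator

# Invented target schema for a policy document.
POLICY_SCHEMA = {
    "type": "object",
    "required": ["_id", "schemaVersion", "coverages"],
    "properties": {
        "_id": {"type": "string", "pattern": "^POL-"},
        "schemaVersion": {"type": "integer", "minimum": 1},
        "coverages": {"type": "array", "minItems": 1},
    },
}

def validate_policy(doc: dict) -> list[str]:
    """Return human-readable validation errors; an empty list means valid."""
    validator = Draft7Validator(POLICY_SCHEMA)
    return [
        f"{'/'.join(map(str, err.path)) or '<root>'}: {err.message}"
        for err in validator.iter_errors(doc)
    ]

print(validate_policy({"_id": "POL-1", "schemaVersion": 0, "coverages": []}))
```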

Preferred

-       Insurance domain knowledge (policy lifecycle, coverages, LOBs, premium rating)

-       Domain-Driven Design (DDD) - aggregates, bounded contexts, event-driven design

-       Experience synthesizing a unified model from multiple heterogeneous sources (IMS + DB2 + application logic)

-       Prior experience working with AI coding assistants (Claude, GitHub Copilot, or similar) in a software engineering or data architecture context

Thanks & Regards

Mohit Saxena

Sr. IT Recruiter

Quantum World Technologies Inc.

A-129, Sector 63, Noida, Uttar Pradesh, India 
