Ab Initio ETL Developer (in-person interview if required by client) :: Onsite

mike

Dec 12, 2025, 12:28:26 PM
to mike

Ab Initio ETL Developer (in-person interview if required by client)

LOCATION: Dallas, TX (onsite)

Only H-1B candidates will be considered.

 

Required skills:
Please submit only candidates who are authorized to work in the United States.
Only applicants who are currently local to Dallas, Texas, or are willing to relocate will be considered.
· Design, develop, and deploy ETL processes using Ab Initio GDE.
· Build high-performance data integration and transformation pipelines.
· Work with Ab Initio Co-Operating System, EME (Enterprise Meta Environment), and metadata-driven development (a parameter-driven analogue is sketched after this list).
· Develop and optimize graphs for batch and real-time processing.
· Integrate with RDBMS (Oracle, SQL Server, Teradata, DB2, etc.) and external data sources.
· Implement continuous flows, web services, and message-based integration with Ab Initio.
o Continuous Flows (Co-Op & GDE)
o Plans and Psets
o Conduct-It for job scheduling and orchestration
o Graphs and Parameter Sets
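
Ab Initio graphs and psets are assembled in the GDE rather than written as portable source, so no actual graph code can be shown here. As a rough analogue of the metadata/parameter-driven style described above, the Python sketch below runs a batch extract whose source database, table, and target file come from parameters (environment variables) instead of being hard-coded, much as a pset supplies values to a generic graph. All names in it (SRC_DB, SRC_TABLE, TARGET_FILE) are hypothetical.

    import csv
    import os
    import sqlite3

    # Pset-style parameters: in Ab Initio these would live in a parameter
    # set applied to a generic graph; here they come from the environment.
    SRC_DB = os.environ.get("SRC_DB", "source.db")
    SRC_TABLE = os.environ.get("SRC_TABLE", "customers")
    TARGET_FILE = os.environ.get("TARGET_FILE", "customers_out.csv")

    def run_batch_extract() -> int:
        """Unload every row of SRC_TABLE to a delimited file, the way a
        minimal Input Table -> Output File graph would."""
        conn = sqlite3.connect(SRC_DB)
        try:
            # SRC_TABLE is an operator-supplied parameter, not user input.
            cursor = conn.execute(f"SELECT * FROM {SRC_TABLE}")
            headers = [col[0] for col in cursor.description]
            rows = 0
            with open(TARGET_FILE, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(headers)
                for row in cursor:
                    writer.writerow(row)
                    rows += 1
        finally:
            conn.close()
        return rows

    if __name__ == "__main__":
        print(f"Unloaded {run_batch_extract()} rows from {SRC_TABLE}")

Pointing the job at a different table is just a matter of changing the environment, which is the same idea as swapping psets on one generic graph.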

Nice to have skills:
· Exposure to AWS, Azure, or GCP for cloud-based data solutions.
· Experience with big data ecosystems (Hadoop, Spark, Hive, Kafka) is a strong plus (see the sketch after this list).
· Containerization (Docker, Kubernetes) knowledge desirable.
Monitoring & Security:
· Job monitoring and scheduling experience (Control-M, Autosys, or similar).
· Familiarity with security standards, encryption, and access management.
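
For the Kafka item above: in Ab Initio, message-based integration would typically be a continuous flow pairing subscribe and publish components around a transform. The sketch below is a plain-Python analogue of that read-transform-publish loop, using the kafka-python client; the broker address, topic names, and record fields are all hypothetical.

    import json

    from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

    BROKERS = "localhost:9092"      # hypothetical broker
    IN_TOPIC = "orders.raw"         # hypothetical source topic
    OUT_TOPIC = "orders.enriched"   # hypothetical target topic

    consumer = KafkaConsumer(
        IN_TOPIC,
        bootstrap_servers=BROKERS,
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        auto_offset_reset="earliest",
    )
    producer = KafkaProducer(
        bootstrap_servers=BROKERS,
        value_serializer=lambda obj: json.dumps(obj).encode("utf-8"),
    )

    # Runs until interrupted, like a continuous graph: consume, transform,
    # and publish each message as it arrives.
    for message in consumer:
        record = message.value
        record["amount_usd"] = round(record.get("amount_cents", 0) / 100.0, 2)
        producer.send(OUT_TOPIC, record)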

 

