H1b Candidates || Oracle Fusion EPM Implementation Lead || Cisco Voice Network Engineer || Adobe Analytics Engineer || Azure Data Engineer w/ TIBCO BW || Data Engineer || Onsite

Anil Pal

Dec 11, 2025, 5:28:00 PM
to C2C requirements 2021
Hi

NEED H1B CANDIDATES FOR THE 5 REQUIREMENTS BELOW


JOB DESCRIPTION 1:
Role: Oracle Fusion EPM Implementation Lead
Location: Palm Beach Gardens, FL || Onsite
Position Type: Contract
Years of Experience: 10+ Years

Essential Skills:
  • Act as the internal lead for Oracle Fusion EPM implementation projects, managing scope, timelines, and deliverables.
  • Ensure adherence to project governance, change control, and compliance standards.
  • Conduct status reviews, risk assessments, and escalation management with both internal leadership and vendor teams.
  • Validate solution architecture proposed by the implementation partner for Planning, FCCS, PCMCS, Narrative Reporting, and EDMCS.
  • Ensure wholesale distribution-specific requirements (inventory, freight, rebates, SKU-level profitability) are incorporated into the design.
  • Oversee integration strategy between EPM and ERP systems (Oracle Fusion Cloud ERP, WMS/TMS, and external data sources).

Desirable Skills:
  • Provide expertise in payroll processing for US and Mexico associates in on-premises SAP HCM and/or ECP, identifying and troubleshooting issues.
  • Interact with business users to understand business requirements, prepare functional specifications, configure in SAP HCM or SuccessFactors, and document, test, and deploy solutions.
  • Work with SAP technical developers to get end-to-end solutions developed.
  • Maintain Company Structures, Job Components, and Position Management, and enhance/add fields in employee profile portlets in EC.
  • Maintain requisition templates, the recruitment pipeline, applicants, and candidate profiles in RCM.
  • Configure business rules, workflows, dynamic roles, messages, alerts and notifications, role-based permissions, and groups.
  • Update and create new panels, maintain point-to-point integration with EC, and upload acknowledgement documents in ONB.
  • Configure rules and maintain Kronos Workforce Central to capture associates' time and allow managers to approve timesheets.

JOB DESCRIPTION 2:
Role: Cisco Voice Network Engineer
Location: Ada, MI || Onsite
Position Type: Contract
Years of Experience: 10+ Years

Role Description:
  1. Five9 Platform Expertise
    • Perform end-to-end administration of the Five9 environment, including users, agents, skillsets, queues, campaigns, and system variables.
    • Manage system configurations, routing profiles, and operational controls to ensure optimal platform performance.
  2. Call Flow / IVR Development
    • Design, build, and maintain IVR call flows using Five9 Studio.
    • Develop and optimize IVR scripting to support business requirements, self-service enhancements, and improved call routing accuracy.
  3. API Integration Work
    • Work with Five9 REST APIs for system integrations, automation, and custom workflows (a hedged sketch follows this list).
    • Integrate Five9 with CRM systems, SSO applications, and recording solutions such as Calabrio or Verint.
    • Support integration with 2Ring dashboards/wallboards and other related reporting/analytics platforms.
  4. Reporting & Analytics
    • Create, configure, and manage operational and performance reports, dashboards, and analytics.
    • Monitor KPI trends and provide insights to stakeholders to drive operational improvements.
  5. Operational Support & Optimization
    • Troubleshoot Five9-related issues across IVR, call routing, agent desktop, and integrations.
    • Partner with cross-functional teams to gather requirements, implement enhancements, and ensure best-practice usage of the platform.
    • Conduct system audits, recommend improvements, and support release and change management activities.
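
For reference, below is a minimal Python sketch of the kind of REST integration work described in item 3 above. The host, endpoint path, and response fields are hypothetical placeholders, not the actual Five9 REST API; the real paths, authentication scheme, and schemas come from Five9's documentation and will differ.

# Hedged sketch: call a contact-center REST endpoint and read the response.
# NOTE: BASE_URL and the /v1/queues/... path are hypothetical placeholders.
import requests

BASE_URL = "https://api.example-contact-center.invalid"  # placeholder host
API_TOKEN = "REPLACE_ME"                                 # issued by the platform admin

def get_queue_stats(queue_name: str) -> dict:
    """Fetch statistics for one queue (illustrative only)."""
    resp = requests.get(
        f"{BASE_URL}/v1/queues/{queue_name}/stats",      # hypothetical path
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(get_queue_stats("customer_service"))

The same pattern (authenticated GET/POST calls plus JSON parsing) is what CRM, SSO, and reporting integrations generally build on.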

JOB DESCRIPTION 3:
Role: Adobe Analytics Engineer 
Location: Minneapolis, MN || Onsite
Position Type: Contract
Years of Experience: 10+ Years

Description:
Adobe Analytics engineer with front-end development experience and functional knowledge of web technologies (e.g., HTML, CSS) and client-side programming languages (e.g., JavaScript, iOS, Android).

"Key Responsibilities
  • Collaborate with software developers, product managers, UI designers, and data scientists to develop consumer tracking and analytics solutions across all U.S. Bank digital channels.
  • Troubleshoot and resolve technical and engineering issues related to the setup and configuration of analytics tools and reports.
  • Define appropriate data models and implement instrumentation for collecting and analyzing analytics data.
  • Create documentation for tools and libraries within the analytics platform.
  • Promote repeatable best practices that empower software engineers to independently implement analytics solutions.

Required Qualifications
  • Bachelor’s degree in computer science or engineering.
  • 5+ years of front-end development experience, with functional knowledge of web technologies (e.g., HTML, CSS) and client-side programming languages (e.g., JavaScript, iOS, Android).
  • Official Adobe Analytics Expert Developer Certification and Tealium EventStream iQ Tag Management Certification required.
  • Additional certifications required for various digital analytics roles, including but not limited to:
  • Tealium EventStream API Hub Certification
  • Adobe Customer Journey Analytics Developer Expert Certification
  • 2+ years of practical implementation experience with one or more web/digital analytics tools (e.g., Adobe Analytics, Google Analytics) and tag management systems (e.g., Tealium, Ensighten, Adobe Launch, Google Tag Manager).
  • Basic SQL skills for data troubleshooting.

JOB DESCRIPTION 4:
Role: Azure Data Engineer w/ TIBCO BW
Location: Mt Laurel, NJ || Onsite
Position Type: Contract
Years of Experience: 10+ Years

Role Description:
  • Minimum 7 years of experience working with relational and distributed databases, query authoring (SQL), and working familiarity with a variety of databases.
  • Advanced experience programming ETL code and building and optimizing data pipelines, architectures, and datasets using Microsoft Azure technologies including Spark (a hedged PySpark sketch follows the Skills line below).
  • Experience with data migration projects within a Microsoft and Azure environment.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Ability to plan, prioritize and manage workload within a time sensitive environment.
  • Excellent oral, written, and interpersonal communication skills. 
  • Ability to use audience appropriate communication and language to present information, analyze results and convey concepts.

Skills: Microsoft Azure, TIBCO BW, Oracle SQL, Informatica Data Quality (IDQ), Database Technology, Unix/Linux Basics and Commands, System i (DB2/400)
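
For illustration, here is a minimal PySpark sketch of the kind of Azure ETL step referenced in the Role Description above. The storage paths, container names, and column names are assumptions made up for the example, not details of the client environment.

# Hedged sketch: read raw files from a data lake, clean them, write a curated dataset.
# The abfss:// paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Read raw CSV files landed in the lake (placeholder path).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Basic cleanup plus one derived column.
curated = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date"))
           .withColumn("net_amount", F.col("gross_amount") - F.col("discount")))

# Write a partitioned curated dataset back to the lake (placeholder path).
(curated.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))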


JOB DESCRIPTION 5:
Role: Data Engineer
Location: Mt Laurel, NJ || Onsite
Position Type: Contract
Years of Experience: 10+ Years

Role Description:
  • 8 years in the banking domain, with strong documentation skills: data dictionaries, metadata, requirements, and mappings.
  • 8 years applying data quality and data governance principles, metadata, lineage, and security standards across cloud and on-prem data platforms.
  • 8 years translating business needs into detailed functional and technical specifications for Informatica PowerCenter mappings, workflows, and transformations.
  • 8 years validating ETL loads, troubleshooting data discrepancies, and working with technical teams to resolve issues.
  • 8 years supporting the development, testing, and implementation of Informatica PowerCenter jobs, ensuring alignment with business rules and data governance standards.
  • 8 years performing end-to-end data analysis, including source-to-target mapping, data profiling, and gap identification, using Informatica and SQL.
  • 8 years delivering business value and understanding how data is structured, connected, governed, and used across data products.
  • 8 years with a good understanding of data models, the data product life cycle, and stakeholder management.
  • 8 years supporting ingestion and transformation of large datasets using Databricks (PySpark, SQL) and Informatica PowerCenter workflows (a hedged sketch follows this list).
  • 8 years creating comprehensive IDCs (Incoming Data Collection) for cloud ingestion, outlining data sources, ingestion frequency, file formats, orchestration, and Azure target layers.
  • Participates fully as a member of the team, supports a positive work environment that promotes service to the business, quality, innovation, and teamwork, and ensures timely communication of issues and points of interest.
  • Provides thought leadership and/or industry knowledge for own area of expertise and participates in knowledge transfer within the team and business unit.
  • Keeps current on emerging trends and developments and grows knowledge of the business and related tools and techniques.
  • Participates in personal performance management and development activities, including cross-training within own team.
  • Keeps others informed and up to date about the status and progress of projects and/or all relevant or useful information related to day-to-day activities.
  • Contributes to team development of skills and capabilities through mentorship of others, by sharing knowledge and experiences and leveraging best practices.
  • Leads, motivates, and develops relationships with internal and external business partners and stakeholders to build productive working relationships.
  • Contributes to a fair, positive, and equitable environment that supports a diverse workforce.
  • Acts as a brand ambassador for your business area/function and the bank, both internally and externally.
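
For illustration, here is a minimal Databricks (PySpark) sketch of the source-to-target validation referenced in the list above. The table names and key column are hypothetical placeholders assumed for the example.

# Hedged sketch: reconcile a staging table against its curated target.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("recon_sketch").getOrCreate()

source = spark.table("staging.customer_txn")   # placeholder source table
target = spark.table("curated.customer_txn")   # placeholder target table

# 1. Row-count comparison between source and target.
src_count, tgt_count = source.count(), target.count()
print(f"source={src_count}, target={tgt_count}, diff={src_count - tgt_count}")

# 2. Keys present in the source but missing from the target.
missing = source.select("txn_id").subtract(target.select("txn_id"))
print(f"missing keys: {missing.count()}")
missing.show(20, truncate=False)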
Thanks 
Anil Kumar Pal