1. Role: UI React Developer
Location: San Jose, CA
"Front End UI -with ReactJS/HTML exp is must.
Full Stack
NodeJs experience is a must.
Experience 7+ years.
1. Should be good in Basic and Advanced(ES6) JavaScript concepts.
2. Apply the knowledge of #1 in problem solving.
3. CSS3/HTML5, how they work.
4. ReactJS 17(released in October 2020) with hands on experience on Hooks concepts."
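To make the expectations above concrete, here is a minimal sketch of the kind of Hooks-based, ES6-style component work implied by items 1-4; the component name, prop, and endpoint are illustrative assumptions, not part of the role description:

```tsx
import React, { useEffect, useState } from "react";

// Hypothetical shape of the data the component fetches.
interface User {
  id: number;
  name: string;
}

// Functional component using Hooks (useState/useEffect) and ES6 features
// (arrow functions, destructuring, template literals, async/await).
const UserList = ({ endpoint }: { endpoint: string }) => {
  const [users, setUsers] = useState<User[]>([]);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    // Re-fetch whenever the endpoint prop changes.
    const load = async () => {
      const response = await fetch(endpoint);
      const data: User[] = await response.json();
      setUsers(data);
      setLoading(false);
    };
    load();
  }, [endpoint]);

  if (loading) return <p>Loading…</p>;

  return (
    <ul>
      {users.map(({ id, name }) => (
        <li key={id}>{`${id}: ${name}`}</li>
      ))}
    </ul>
  );
};

export default UserList;
```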
2. Role: SAP FICO Consultant (Finance + S/4HANA + Governance)
Location: Irvine, CA
Competencies: Supply Chain Planning & Scheduling
Essential Skills: SAP
Role Description:
SAP FICO consultant with a strong focus on the FI module (GL/AR/AP) for an onshore opportunity based in Irvine, CA.
The consultant should bring deep S/4HANA experience in both finance and governance and be capable of leading and guiding teams across functional configurations and RICEFW objects.
3. Role: Informatica Developer
Location: Alpharetta, GA / New York, NY - onsite
We are looking for an experienced Informatica Developer to design, develop, and maintain ETL processes and data integration solutions. The ideal candidate will have strong expertise in Informatica PowerCenter, data warehousing concepts, and performance optimization.
Key Responsibilities
- Design, develop, and implement ETL workflows using Informatica PowerCenter.
- Create and maintain data mappings, sessions, and workflows for data integration.
- Collaborate with business analysts and data architects to understand requirements.
- Perform data validation, quality checks, and troubleshoot issues.
- Optimize ETL processes for performance and scalability.
- Support production deployments and resolve incidents promptly.
- Document technical specifications and maintain version control.
Required Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, or related field.
Experience:
- 3+ years of hands-on experience with Informatica PowerCenter.
- Strong knowledge of SQL, PL/SQL, and relational databases (Oracle, SQL Server).
- Experience with data warehousing concepts and ETL best practices.
Skills:
- Informatica PowerCenter, Workflow Manager, Mapping Designer.
- Performance tuning and debugging.
- Familiarity with Unix/Linux scripting.
4. Role: Java Tech Lead
Location: Malvern, PA - hybrid onsite
Full-stack Tech Lead (Java, Angular, AWS)
Programming Languages: Java (11, 17), Angular 10, NestJS, GraphQL, AWS Step Functions, Jest
Frameworks & Libraries: Spring Boot, Spring Data JPA, Spring Security, Hibernate
Cloud & DevOps (AWS): AWS Glue (ETL jobs), AWS Step Functions (state machines), Amazon ECS (Batch/Fargate), S3, Lambda, IAM, CloudWatch, Secrets Manager, API Gateway, Command Line Interface (CLI), Elastic Compute Cloud (EC2), CloudFormation, Route 53
Databases: PostgreSQL, DynamoDB
Tools & IDEs: IntelliJ IDEA, STS, GitHub, Postman, Bruno, Maven, Gradle, Jira, Bitbucket, Bamboo, Splunk, SonarQube, ServiceNow, Grafana, Confluence, AppDynamics
Architecture & Design: RESTful API design, microservices, event-driven architecture
Testing: JUnit, Mockito, integration testing
Monitoring & Logging: AWS CloudWatch, Splunk
CI/CD & Infrastructure: GitHub Actions, Docker, basic CloudFormation
Security: IAM policies, AWS Secrets Manager, secure API development
Agile & Collaboration: Scrum, Jira, Confluence
5. Role: Databricks Engineer
Location: Alpharetta, GA / New York, NY - onsite
We are seeking a highly skilled Senior Machine Learning Engineer with expertise in Databricks to join our innovative team. The ideal candidate will be responsible for developing, fine-tuning, and managing machine learning models in a production environment while optimizing costs and performance on the Databricks platform.
Key Responsibilities:
- Develop and implement end-to-end machine learning solutions using Databricks, leveraging its Unified Analytics Platform
- Fine-tune and optimize machine learning models deployed in production environments
- Set up and manage automation processes for model training, evaluation, and deployment on Databricks
- Optimize Databricks cluster configurations and resource utilization to manage overall platform costs efficiently
- Collaborate with cross-functional teams to integrate ML solutions into existing products and workflows
- Implement best practices for version control, testing, and documentation of ML models and pipelines
- Monitor and analyze model performance, making data-driven decisions for continuous improvement
- Stay current with the latest advancements in machine learning and Databricks technologies
Preferred Qualifications:
- Experience with Delta Lake and Databricks workflows
- Familiarity with distributed computing and big data processing
- Knowledge of DevOps practices and CI/CD pipelines
- Experience with cost optimization strategies for cloud-based ML platforms
6. Role: Snowflake Engineer
Location: Alpharetta, GA / New York, NY - onsite
We are seeking an experienced Data Engineer to join our Platform Modernization squad, focusing on Snowflake implementation and optimization. This role will be instrumental in building and maintaining our modern data infrastructure while ensuring optimal performance, security, and reliability.
Key Responsibilities
- Design, implement, and manage Snowflake data warehouses and compute resources
- Develop and maintain robust ETL/ELT pipelines using Snowflake best practices (see the sketch at the end of this role)
- Implement data security protocols and access controls within the Snowflake environment
- Perform performance tuning and optimization of queries and warehouses
- Create and maintain documentation for data processes and architectures
- Collaborate with cross-functional teams to understand data requirements and implement solutions
- Monitor and optimize warehouse costs and resource utilization
- Implement data governance policies and ensure compliance
Preferred Qualifications
- Snowflake SnowPro certification
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of data visualization tools (Tableau, Power BI, or similar)
- Experience with version control systems (Git)
- Familiarity with CI/CD practices
- Experience with data governance and security frameworks
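As an illustration of the ETL/ELT pipeline responsibility listed above, below is a minimal Node.js/TypeScript sketch using the snowflake-sdk connector; the account, credentials, warehouse, and table/column names are hypothetical assumptions, not project specifics:

```ts
import * as snowflake from "snowflake-sdk";

// Hypothetical connection details; credentials would normally come from a
// secrets store (an environment variable stands in here).
const connection = snowflake.createConnection({
  account: "my_org-my_account",
  username: "ETL_USER",
  password: process.env.SNOWFLAKE_PASSWORD ?? "",
  warehouse: "TRANSFORM_WH",
  database: "ANALYTICS",
  schema: "STAGING",
});

connection.connect((connectErr) => {
  if (connectErr) throw connectErr;

  // One ELT step: move today's staged rows into a curated table.
  connection.execute({
    sqlText: `
      INSERT INTO CURATED.ORDERS (ORDER_ID, CUSTOMER_ID, AMOUNT)
      SELECT ORDER_ID, CUSTOMER_ID, TRY_TO_NUMBER(AMOUNT)
      FROM RAW.ORDERS_STAGE
      WHERE LOAD_DATE = CURRENT_DATE()`,
    complete: (execErr, _stmt, rows) => {
      if (execErr) throw execErr;
      console.log("ELT step finished:", rows);
    },
  });
});
```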
7. Role: SAS Developer
Location: Pittsburgh, PA
Develop and Maintain SAS Programs: Create, optimize, and maintain SAS applications for data extraction, transformation, and loading.
Data Analysis & Reporting: Generate tables, listings, and figures using SAS procedures; prepare statistical reports and dashboards.
Data Management: Perform data cleansing, validation, and transformation to ensure high-quality datasets.
Automation: Develop SAS macros and reusable code to streamline processes.
Collaboration: Work closely with analysts, statisticians, and business stakeholders to gather requirements and deliver solutions.
Testing & Documentation: Conduct unit testing, maintain documentation, and adhere to coding standards.
Performance Optimization: Improve the efficiency of SAS programs and troubleshoot performance issues.
Compliance: Ensure secure coding practices and compliance with regulatory requirements.
8. Role: React Developer
Location: Chicago, IL
Description:
"• Develop micro-frontend apps and integrate with main app containers using Webpack Module Federation.
• Extend and maintain user interface component library and front-end framework
• Requirement gathering, requirement document review, test plan preparation and its review, production roll out.
• Quality improvement processes like conducting requirement and design walk through, internal, and external code reviews and other quality related procedures.
• Provide PRE/PROD technical development support whenever required for every release.
• Help & guide the offshore/onsite teams for any problems faced by them and provide solutions with the gained business/technical development/business knowledge.
• Keeping track of tasks/bugs to reduce downtime, increase productivity and communication.
• Provide the necessary support to the integration testing team to fix the errors raised by the team and delivering the correct code for retest.
"
React, Redux, React Hooks, React Router, JavaScript (ES6+), TypeScript, HTML5, SCSS, CSS3, Responsive Design, REST API, GraphQL, Material-UI, Webpack, Jest, Performance Optimization
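As a reference for the Module Federation integration mentioned in the first bullet above, here is a minimal webpack configuration sketch; the remote name, exposed module, and shared settings are illustrative assumptions rather than project specifics:

```ts
// webpack.config.ts - minimal Module Federation setup for a micro-frontend
// that exposes one component to a host container. All names and paths are
// hypothetical examples, not taken from the role description.
import webpack, { Configuration } from "webpack";

const { ModuleFederationPlugin } = webpack.container;

const config: Configuration = {
  entry: "./src/index",
  mode: "production",
  output: { publicPath: "auto" },
  plugins: [
    new ModuleFederationPlugin({
      name: "ordersApp",                        // unique name of this micro-frontend
      filename: "remoteEntry.js",               // entry file the host container loads
      exposes: {
        "./OrdersWidget": "./src/OrdersWidget", // module made available to the host
      },
      shared: {
        react: { singleton: true },             // share one React instance with the host
        "react-dom": { singleton: true },
      },
    }),
  ],
};

export default config;
```

The host container would then list this build under its own `remotes` setting (for example, `ordersApp@<remote-url>/remoteEntry.js`) and lazily import `ordersApp/OrdersWidget` at runtime.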
9. Role: ETL Developer
Location: New York
• 9–10 years of data modelling experience.
• Strong proficiency in Linux shell scripting, SQL, and stored procedures.
• Hands-on experience with MS SQL databases and utilities (BCP).
• Expertise in IBM DataStage for ETL processes.
• Knowledge of ACBS, AFS, and corporate/commercial lending systems is an additional advantage.
10. Role: Business System Analyst
Location: New York
• 7+ years of Corp/Commercial Banking experience.
• BSA with standard capabilities in functional requirements gathering.
• Very good communication skills.
• Experience with the IT SDLC, Agile, and data analysis (e.g., using SQL, profiling data).
• Experience with the Corporate, Commercial, and Institutional Lending lines of business is required; prior experience with ACBS or similar conversion/migration projects is preferred.
• QA/testing experience with user UIs, web applications, and databases (SQL experience).
• Experience building test execution plans and with test automation is an added advantage.