Job Title: Cloud Platform | Amazon Web Services (AWS)
Work Location: Pennsylvania Furnace, PA - 16865
Job Details:
· Experience: 8-10 years
· Building APIs for cost transparency
· Building APIs to access the TriZetto platform for claims on AWS
· Experience with the AWS tech stack and cloud-native Node.js development.
· Development experience required, not infrastructure support experience.
· AWS Glue, Lambda, Python-centric development alongside Node.js scripting in middleware; native SOAP/REST APIs; Facets; Apache Kafka; S3 buckets; SQS messaging; Terraform
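The stack above pairs Lambda with SQS and S3 for claims/cost-transparency APIs. A minimal sketch of such a handler, using only the standard library (the event shape follows SQS's `Records` batch format; all claim field names such as `claimId`, `allowedAmount`, and `planPaid` are hypothetical, and a real handler would persist results to S3 via boto3):

```python
import json

def handler(event, context):
    """Compute a member cost-transparency payload from a batch of SQS claim messages."""
    results = []
    # SQS invokes Lambda with a batch of messages under the "Records" key
    for record in event.get("Records", []):
        claim = json.loads(record["body"])
        results.append({
            "claimId": claim["claimId"],
            # Member cost share = allowed amount minus what the plan paid
            "memberLiability": claim["allowedAmount"] - claim["planPaid"],
        })
    return {"statusCode": 200, "body": json.dumps(results)}
```

Invoked with a one-message batch, the handler returns a 200 response whose body is a JSON list of per-claim liabilities.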
************************************************************************************************************************
Job Title: Azure Data Factory (ADF)
Work Location: Weehawken, NJ 07086
Must Have Skills:
· Azure, Big data, Hadoop
Detailed Job Description:
· Overall experience of 8-10 years, including 2-3 years in Azure.
· Should have experience in the design and development of Azure big data frameworks/tools: Azure Data Lake, Azure Data Factory, Azure Databricks.
· Knowledge of data security design and architecture on Azure (VMs, VNets, key management, encryption, etc.) is an added advantage.
· Azure DevOps, Git, CI/CD (optional)
· Provisioning and managing Azure resources
· Building applications with disaster recovery (DR) capabilities
· Creating project documentation and software architecture diagrams
· Deep practitioner-level knowledge of the overall Azure product suite
· Ability to quickly diagnose and triage production issues and recommend an appropriate path forward
· Deliver timely and effective technical assistance and troubleshooting when issues arise
· Should have experience working in an Agile environment
· Experience in the finance/investment domain is an added advantage.
· Responsible for the overall design of the enterprise-wide data/information architecture
· Support product management in driving the product roadmap
· Produce ETL design guidelines to ensure a manageable ETL infrastructure and to better manage and govern data
· Good communication skills: able to understand and clearly explain requirements and solutions, and to help document solutions, programming changes, problems, and resolutions
· Attention to detail and quality; excellent problem solving and communication skills.
· Should be able to work in an onshore/offshore team model.
************************************************************************************************************************
Job Title: Informatica PIM
Work Location: Atlanta, GA 30313
Must Have Skills
Informatica PIM
Detailed Job Description:
· Total of 10+ years of experience and excellent functional and domain knowledge of CPG (consumer packaged goods).
· At least 7 years of experience in Java technologies – J2EE framework and web technologies (JavaScript, HTML, XML, Web 2.0, J2EE development toolset).
· 5+ years of experience architecting and implementing Product Information Management (PIM) solutions using Informatica Product 360 (on-premises/on-cloud)
· Minimum 2 years' experience in technical development, configuration, and customization of Informatica Product 360.
· Minimum 3 years' experience working with Master Data Management tools such as Informatica, IBM, Reltio, Talend, Stibo, etc.
· Should have a good understanding of Product and Customer MDM
· Should have solid knowledge of PIM architecture and metadata management
· Working knowledge of Oracle, SQL Server, UNIX
· Strong PIM architecture, design, and development skills.
· At least 7 years of experience in software development life cycle.
************************************************************************************************************************
Job Title: Lead Teradata Developer
Work Location: Richardson, TX 75081
Must Have Skills:
· Teradata
· Informatica
· Unix
Nice to have skills:
· GCP
· Python
Detailed Job Description:
· Technology lead with more than 8 years of experience in Teradata and Informatica.
· The candidate should have thorough knowledge of Teradata, Informatica, and ETL concepts; working experience in the healthcare domain and knowledge of cloud concepts and GCP are an added advantage.
· Should have experience in requirement analysis and architecture design.
· Should coordinate with the onsite team for requirements gathering.
· Development and testing.
· In-depth knowledge of Agile process and principles.
· Outstanding communication and presentation skills and accountability for work.
· Should exhibit excellent organizational and time management skills.
************************************************************************************************************************
Job Title: Big Data Architect
Work Location: Chicago, IL 60601
Must Have Skills:
· Hadoop/Big Data
· Scala
· Hive
· PySpark
· Azure
Detailed Job Description:
· Required: 12+ years of experience as a lead architect with data warehouse, big data, and Hadoop implementations in an Azure environment
· Experience interacting with multiple stakeholders
· Participate in design and implementation of the analytics architecture.
· Experience in working on Hadoop Distribution, good understanding of core concepts and best practices
· Good experience in building/tuning Spark pipelines in Python/Java/Scala
· Good experience in writing complex Hive queries to drive business critical insights
· Understanding of Data Lake vs Data Warehousing concepts
· Experience with AWS; exposure to Lambda/EMR/Kinesis is good to have.
· Work with multiple teams to arrive at common decisions, understand the organization's architecture principles, and provide solutions accordingly
************************************************************************************************************************