Urgent Requirement :: Fast Moving :: Raritan, NJ - Onsite


Rohit Gupta

Jan 16, 2026, 9:46:56 AM

Hi,

Hope you are doing well.

This is Rohit Gupta from Valzo Soft Solutions.

Please go through the job descriptions below and send me your updated resume if you are interested and well qualified for these roles.

 

Role 1:

Job Title: EUS Data & Reporting Service

Location: Raritan, NJ - Onsite

Job Type: Contract

 

Role Description:

EUS Data & Reporting Service - Analytics and Reporting Implementation Specialist

 

Role Overview: The Analytics and Reporting Implementation Specialist will play a key role in setting up and migrating the End User Services (EUS) Data & Reporting platform for the Depuy Synthes divestiture. This role supports the team’s Database Architect in provisioning cloud resources, replicating data pipelines, migrating reports, and ensuring the new environment delivers accurate, actionable analytics for stakeholders.

 

Key Responsibilities

Cloud Environment Setup

  • Provision and configure Azure SQL Databases, Azure Data Factory, and Microsoft Fabric (Power BI) workspaces in the Depuy Synthes cloud environment.

  • Ensure all infrastructure mirrors the existing J&J setup, including database schema, stored procedures, and security configurations.

 

Data Pipeline Replication

  • Re-create and validate Azure Data Factory (ADF) pipelines to ingest, transform, and load data from multiple source systems (e.g., ServiceNow, Microsoft 365, Salesforce Marketing Cloud).

  • Coordinate with data owners to ensure access to all required data feeds and resolve any integration issues.

 

Report and Dashboard Migration

  • Migrate Power BI project files (PBIX) and dataflows to the new Microsoft Fabric workspace.

  • Re-point reports to the new database and validate all visuals, calculations, and dashboards for accuracy.

 

Testing and Validation

  • Run parallel operations with the legacy environment, comparing key metrics and outputs.
  • Investigate and resolve discrepancies in data or reporting.
  • Support user acceptance testing and address feedback from business users.
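
Running the legacy and new environments in parallel typically comes down to pulling the same key metrics from both and flagging anything that diverges. Here is a minimal Python sketch of that comparison step; the metric names and values are illustrative placeholders, not taken from the posting (in practice each dictionary would be populated by queries against the respective environment's database):

```python
# Sketch: compare key metrics between the legacy and new reporting environments.
# Metric values below are illustrative; real values would come from SQL queries
# against each environment.

def compare_metrics(legacy, new, rel_tol=0.001):
    """Return a list of (metric, legacy_value, new_value) discrepancies."""
    discrepancies = []
    for metric, legacy_value in legacy.items():
        new_value = new.get(metric)
        if new_value is None:
            # Metric missing entirely from the new environment.
            discrepancies.append((metric, legacy_value, None))
        elif abs(new_value - legacy_value) > rel_tol * max(abs(legacy_value), 1):
            discrepancies.append((metric, legacy_value, new_value))
    return discrepancies

legacy_metrics = {"tickets_opened": 1250, "tickets_closed": 1198, "avg_resolution_hours": 6.4}
new_metrics = {"tickets_opened": 1250, "tickets_closed": 1190, "avg_resolution_hours": 6.4}

for metric, old, new in compare_metrics(legacy_metrics, new_metrics):
    print(f"MISMATCH {metric}: legacy={old} new={new}")
    # prints: MISMATCH tickets_closed: legacy=1198 new=1190
```

The small relative tolerance absorbs harmless floating-point differences while still surfacing genuine discrepancies for investigation.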

 

Cutover and Go-Live Support

  • Coordinate the transition to the new environment, ensuring all pipelines and dashboards are operational.
  • Provide post-go-live support, monitoring system health and addressing any issues.

 

Required Skills & Experience

  • Hands-on experience with Microsoft Azure (SQL Database, Data Factory) and Power BI (including Dataflows and workspace management).
  • Strong understanding of ETL processes, data integration, and cloud-based analytics solutions.
  • Experience with data migration projects, especially in regulated or enterprise environments.
  • Familiarity with data governance, security, and identity/access management in Azure.
  • Ability to troubleshoot data pipeline and reporting issues.
  • Excellent communication and collaboration skills for working with technical and business stakeholders.

 

Preferred Qualifications

  • Experience supporting divestitures, mergers, or large-scale cloud migrations.
  • Knowledge of Microsoft Fabric, Microsoft Power BI, Azure Data Factory, and Microsoft SQL Server Management Studio.
  • Power BI certification or Azure Data Engineer certification.

 

Role 2:

Job Title: Data Intelligence Analyst

Location: Raritan, NJ - Onsite

Job Type: Contract

 

Role Description:

The Data Quality Analyst will play a critical role in ensuring the accuracy, consistency, and reliability of enterprise data across multiple platforms. This position focuses on validating and reconciling data, monitoring quality through automated checks and dashboards, and investigating discrepancies to identify root causes. The analyst will work closely with engineering and business process teams to define data quality standards, document data flows, and assess the impact of issues on reporting and analytics. Leveraging tools such as Microsoft SQL Server, Azure Data Factory, Microsoft Fabric, and Power BI, the role emphasizes continuous improvement, automation, and governance to maintain high data integrity and support informed decision-making.

 

Key Responsibilities:

  • Data Validation & Reconciliation: Regularly compare source data with transformed datasets to ensure accuracy and completeness.

  • Data Quality Monitoring: Implement checks and dashboards to track anomalies, missing values, and inconsistencies across systems.

  • Root Cause Analysis: Investigate discrepancies in reporting or system outputs and identify underlying data issues.

  • Testing Data Pipelines: Validate ETL processes and confirm that transformations preserve data integrity.

  • Define Data Quality Standards: Establish clear rules for acceptable data formats, completeness, and accuracy.

  • Collaborate with Engineering & Process Teams: Work closely with system engineers and business process owners to resolve data issues.

  • Document Data Flows & Dependencies: Maintain clear documentation of how data moves between systems and where checks are applied.

  • Perform Impact Analysis: Assess how data issues affect downstream reporting, analytics, and operational processes.

  • Develop Automated Validation Scripts: Create SQL queries or scripts to automate data checks and reconciliation tasks.

  • Continuous Improvement: Recommend enhancements to data governance, tools, and processes to prevent recurring issues.
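
The validation and reconciliation tasks above often boil down to comparing row counts and column checksums between a source table and its transformed copy. Here is a hedged Python sketch of such an automated check; it uses an in-memory SQLite database purely for illustration (table and column names are hypothetical), whereas a real script would run equivalent T-SQL against SQL Server:

```python
# Sketch: reconcile row counts and a numeric checksum between a source table
# and its transformed copy. SQLite stands in for SQL Server here; the tables
# and column names are illustrative only.
import sqlite3

def reconcile(conn, source_table, target_table, amount_col):
    """Compare row count and column sum between two tables; return mismatches."""
    issues = []
    for check, sql in [
        ("row_count", "SELECT COUNT(*) FROM {t}"),
        ("amount_sum", f"SELECT COALESCE(SUM({amount_col}), 0) FROM {{t}}"),
    ]:
        src = conn.execute(sql.format(t=source_table)).fetchone()[0]
        tgt = conn.execute(sql.format(t=target_table)).fetchone()[0]
        if src != tgt:
            issues.append((check, src, tgt))
    return issues

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0)])  # simulate a dropped row

print(reconcile(conn, "src", "tgt", "amount"))
# prints: [('row_count', 2, 1), ('amount_sum', 30.0, 10.0)]
```

Checks like these are cheap to schedule after each pipeline run, and the returned tuples feed naturally into a monitoring dashboard or alert.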

 

Desired Qualifications:

  • 5+ years of experience in BI, Data Analysis, or related roles, including experience with data integration, transformation, and visualization.

  • Strong SQL (T-SQL) skills for data validation, reconciliation, and automation in SQL Server.

  • Hands-on experience with Azure Data Factory for building and testing ETL pipelines.

  • Familiarity with Microsoft Fabric (Data Pipelines, Lakehouse, Dataflows Gen2) and its impact on data quality.

  • Proficiency in Power BI for creating dashboards to monitor anomalies and data quality metrics.

  • Ability to define and enforce data quality standards (formats, completeness, accuracy).

  • Skilled in root cause analysis and impact assessment for data issues across systems.

  • Experience with automated validation scripts (SQL; Python or PowerShell preferred).

  • Strong documentation skills for data flows, dependencies, and quality checks.

  • Knowledge of data governance principles and continuous improvement practices.

  • Excellent collaboration and communication skills to work with engineering and business teams.

  • Strong problem-solving skills and ability to troubleshoot technical issues related to GenAI, Copilot, and Microsoft Fabric.

  • Bachelor’s degree in Computer Science, Data Engineering, or related field.


Role 3:

Job Title: Data Intelligence Analyst

Location: Raritan, NJ - Onsite

Job Type: Contract

 

Role Description:

We are seeking an experienced Business Intelligence Analyst to drive the development and optimization of data tools, reporting systems, and dashboards. You will collaborate with data scientists, analysts, and engineers to deliver business-critical insights. In addition to traditional BI skills, you will leverage advanced technologies such as Generative AI (GenAI), Microsoft Copilot, and Microsoft Fabric to enhance analytics capabilities and deliver innovative solutions. Your responsibilities will include designing and maintaining scalable data pipelines, building self-service BI dashboards, ensuring data accuracy, and enhancing data accessibility for stakeholders across the company.

 

Key Responsibilities:

  • Develop and maintain BI tools, dashboards, and reporting systems, incorporating GenAI-powered analytics and automation.

  • Design and implement data queries, pipelines, and models using SQL, Python, and other tools, with integration of Microsoft Fabric for scalable data management.

  • Collaborate with cross-functional teams to gather and refine data requirements.

  • Optimize performance and reliability of data pipelines for both real-time and batch processing, leveraging Microsoft Fabric’s capabilities.

  • Ensure high data quality standards through validation, cleansing, and documentation.

  • Manage large datasets and integrate data from various sources (e.g., APIs, cloud platforms, GenAI models).

  • Design, develop, and optimize scalable data storage architectures (e.g., Snowflake, Redshift, BigQuery, Microsoft Fabric).

  • Create data visualizations that transform complex data into actionable business insights.

  • Support self-service analytics by designing intuitive and visually appealing Power BI dashboards.

  • Collaborate with business stakeholders to define strategic data requirements and improve workflow efficiency.

  • Continuously enhance the data infrastructure and tools to meet evolving business needs, including adoption of GenAI and Microsoft Fabric.
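
A typical batch step behind dashboards like these aggregates raw records into summary figures before visualization. Here is a small Python sketch of that kind of transformation; the record shape and field names are hypothetical, chosen only to illustrate the pattern:

```python
# Sketch: aggregate raw ticket records into a dashboard-ready per-team summary.
# The record fields ("team", "handle_hours") are illustrative placeholders.
from collections import defaultdict

def summarize_by_team(records):
    """Aggregate raw records into per-team counts and average handle time."""
    totals = defaultdict(lambda: {"count": 0, "hours": 0.0})
    for rec in records:
        team = totals[rec["team"]]
        team["count"] += 1
        team["hours"] += rec["handle_hours"]
    return {
        team: {"count": v["count"], "avg_hours": round(v["hours"] / v["count"], 2)}
        for team, v in totals.items()
    }

raw = [
    {"team": "EUS", "handle_hours": 2.0},
    {"team": "EUS", "handle_hours": 4.0},
    {"team": "Network", "handle_hours": 1.0},
]
print(summarize_by_team(raw))
# prints: {'EUS': {'count': 2, 'avg_hours': 3.0}, 'Network': {'count': 1, 'avg_hours': 1.0}}
```

In a production pipeline the same aggregation would usually be expressed in SQL or a Fabric dataflow; the Python version is just the clearest way to show the shape of the computation.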

 

Desired Qualifications:

  • 5+ years of experience in BI, Data Analysis, or related roles, including experience with data integration, transformation, and visualization.

  • Proficiency with BI tools (e.g., Power BI, Tableau) and data visualization best practices, including Copilot and GenAI features.

  • Experience with SQL, Python, ETL processes, and Microsoft Fabric.

  • Experience in developing and maintaining data pipelines and architectures.

  • Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and big data tools (e.g., Hadoop, Spark).

  • Solid understanding of data warehousing concepts and platforms (e.g., Snowflake, Redshift, Microsoft Fabric).

  • Familiarity with data governance, security, and compliance standards.

  • Strong problem-solving skills and ability to troubleshoot technical issues related to GenAI, Copilot, and Microsoft Fabric.

  • Bachelor’s degree in Computer Science, Data Engineering, or related field.

  • AWS Certified Data Analytics, Azure Data Engineering, Microsoft Fabric, or similar certifications.

  • Strong collaboration and communication skills, with the ability to work independently and as part of a team.

  • Detail-oriented, proactive, and self-motivated with a problem-solving mindset.



Regards,

Rohit Gupta

Valzo Soft Solutions

E: Ro...@valzosoft.com

A: 12600 Deerfield Parkway,

Suite 2123, Alpharetta, GA 30004
