Data Full Stack Engineer - Workday ERP & Databricks Remote job


ramashish kumar

May 5, 2026, 4:03:41 PM
to rama...@valzosoft.com


Hi,

Please let me know if you would be interested in exploring this opportunity, and I will be happy to share more information.


Role: Data Full Stack Engineer - Workday ERP & Databricks
Location: Remote
Duration: Long-term contract

Job Description

Must Have Experience

• Strong hands-on experience with Workday Financial Management modules including General Ledger, Accounts Payable, Accounts Receivable, Customer Invoicing, Supplier and Customer Master data

• Proven experience with Workday Reporting, including RaaS (Report-as-a-Service), custom reports, calculated fields and WQL for downstream data consumption

• Solid Workday integration experience using REST and SOAP APIs, including handling security, authentication and incremental data extracts

• 3+ years of experience building and optimizing data pipelines on Databricks using PySpark, Spark SQL and Delta Lake

• Strong SQL and Python skills with experience in enterprise-scale data modeling

• Experience working on cloud platforms, preferably Azure
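
The RaaS and incremental-extract requirements above can be sketched as follows: a minimal example of constructing a Workday custom-report URL with an incremental filter prompt. The host, tenant, report owner, report name and the Updated_From prompt are all hypothetical placeholders, not a real Workday endpoint; real values come from the tenant's report definition.

```python
from urllib.parse import urlencode


def build_raas_url(tenant, report_owner, report_name, params):
    """Construct a Workday RaaS (Report-as-a-Service) URL for a custom report.

    All names here are placeholder assumptions; a production pipeline would
    also attach authentication (e.g. OAuth or ISU credentials) to the request.
    """
    base = (
        "https://wd2-impl-services1.workday.com/ccx/service/customreport2/"
        f"{tenant}/{report_owner}/{report_name}"
    )
    # Ask for JSON output and pass through any report prompts.
    query = urlencode({**params, "format": "json"})
    return f"{base}?{query}"


# Incremental extract: filter on a prompt the report is assumed to expose
# (Updated_From) so only rows changed since the last run are returned.
url = build_raas_url(
    "acme_tenant",
    "integration_user",
    "INT_GL_Journal_Lines",
    {"Updated_From": "2026-05-01T00:00:00Z"},
)
```

In practice the watermark value (here a hard-coded timestamp) would be read from pipeline state after each successful load.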

Key Responsibilities

• Lead extraction of Workday Finance and HCM data with primary focus on Finance reporting and analytics use cases

• Design, develop and maintain Workday reports and APIs to support reliable data integrations

• Build and manage scalable ETL/ELT pipelines in Databricks, following Bronze, Silver and Gold Lakehouse patterns

• Manage Workday integration setup including Integration System Users, security domains and access controls

• Implement data quality checks, monitoring and error handling for production data pipelines

• Collaborate closely with Finance, HR and Analytics teams to deliver trusted, analytics-ready datasets
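
The Bronze/Silver/Gold responsibility above can be illustrated with a minimal sketch of one Silver-layer step: deduplicating raw extract rows so only the latest version of each record survives. Plain Python stands in for what would be a PySpark window/dedupe over Delta tables in Databricks; the field names (journal_id, last_updated, amount) are illustrative assumptions.

```python
from datetime import datetime


def bronze_to_silver(raw_rows):
    """Silver-layer cleanse: keep the most recent version of each record.

    Bronze holds raw, possibly duplicated Workday extracts; Silver keeps
    one row per business key, chosen by the last-updated timestamp.
    """
    latest = {}
    for row in raw_rows:
        key = row["journal_id"]
        ts = datetime.fromisoformat(row["last_updated"])
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, row)
    return [row for _, row in latest.values()]


# Simulated Bronze rows: J-1 appears twice, once stale and once current.
rows = [
    {"journal_id": "J-1", "last_updated": "2026-05-01T10:00:00", "amount": 100},
    {"journal_id": "J-1", "last_updated": "2026-05-02T09:00:00", "amount": 120},
    {"journal_id": "J-2", "last_updated": "2026-05-01T12:00:00", "amount": 50},
]
silver = bronze_to_silver(rows)
```

In Databricks the same logic is typically a `row_number()` window over the key ordered by the timestamp, or a Delta `MERGE` into the Silver table.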

Required Skills

• Deep understanding of Workday Finance data structures and reporting concepts

• Hands-on expertise in Workday Reporting, RaaS, WQL and integration services

• Strong Databricks skills including PySpark, Spark SQL and Delta Lake

• Data integration and ETL best practices in cloud environments

• Advanced SQL, Python and data modeling skills

• Understanding of data governance, security and access management


Regards,

Ramashish Kumar

E:   rama...@valzosoft.com


Valzo Soft Solutions LLP

A: 12600 Deerfield Parkway,
Suite 2123, Alpharetta, GA 30004


LinkedIn : linkedin.com/in/ramashishkumar86

