Please share resumes via email to Khurs...@sitwinc.com
Title :- Azure Data Engineer
Location :- Omaha, NE / Chicago, IL - Hybrid
Visa :- USC and Green Card only
Job Details:
Minimum years of experience required: 5-8 years
Certification needed: Not mandatory
Must Have Skills: Databricks, Snowflake, PySpark
Nice to Have Skills: IICS, Python
Detailed Job Description:
As a Senior Data Engineer, you will play a key role in leading the development, maintenance, and optimization of data pipelines and workflows within our Enterprise Data Platform. You’ll apply strong data engineering fundamentals along with software engineering and DevOps practices, so pipelines are built, deployed, and monitored as code. Your work will help ensure data accuracy, reliability, and accessibility, enabling teams across the organization to make informed decisions.
This position offers an opportunity to lead technical solutions, mentor engineers, and collaborate with cross-functional teams to solve complex data challenges and create impactful solutions.
Key Responsibilities:
• Lead the design, development, and maintenance of scalable data pipelines that process and integrate data from multiple sources into the Enterprise Data Platform.
• Build pipelines and workflows as code using modern engineering practices (version control, code reviews, automated testing, reusable components).
• Define and implement patterns for CI/CD for data pipelines (automated builds, tests, deployments, and environment promotion).
• Partner with data scientists, analysts, and business teams to gather requirements and translate them into robust data solutions.
• Build and optimize SQL queries and transformations to support complex business use cases and analytics needs.
• Design and manage data models; validate them with business stakeholders, data architects, and governance partners.
• Establish data quality checks, validation, and troubleshooting practices to ensure accuracy, consistency, and trust in data products.
• Monitor and optimize pipeline performance and reliability; implement observability (logging/metrics/alerts) and contribute to operational runbooks.
• Drive automation to improve efficiency, reduce manual effort, and increase repeatability of platform operations.
• Provide technical leadership through mentoring, reviews, and guidance on best practices and standards.
• Participate in Agile ceremonies to plan, estimate, and deliver work efficiently.
• Create and maintain documentation for data workflows, transformations, standards, and operational procedures.
Technical Skills:
• Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience).
• 5–8 years of experience in data engineering or a related role.
• Advanced proficiency in SQL for complex data transformation and analysis.
• Hands-on experience with cloud-based data platforms such as Databricks, Snowflake, or similar tools.
• Experience with ETL/ELT tools and frameworks (e.g., Informatica, Talend, dbt, or equivalent).
• Strong proficiency in Python and/or PySpark for data processing and pipeline development.
• Strong understanding of data modeling, database design principles, and building curated datasets for analytics and operational use cases.
• Experience with DevOps practices and Git-based development (branching strategies, pull requests, code reviews).
• Experience implementing CI/CD for data pipelines/workflows and managing deployments across environments.
• CPG domain knowledge is a plus.
• Familiarity with orchestration and workflow tools (e.g., Databricks Workflows, Airflow, or similar) is preferred.
• Familiarity with Infrastructure as Code (e.g., Terraform, CloudFormation) and/or containerization concepts is a plus.
• Strong problem-solving skills, attention to detail, and ability to troubleshoot complex issues end-to-end.
• Excellent communication skills and ability to collaborate across technical and non-technical teams.
Khursheed war
Email: Khurs...@sitwinc.com
Please share resumes via email to Khurs...@sitwinc.com
Title :- Python Developer with Postgres DBA Essentials
Location :- Austin, TX - Onsite
Visa :- Green Card and USC
Long Term Contract
Job Summary: We are looking for a Python developer with database experience, especially PostgreSQL. He/she will be responsible for developing and managing the tools, automation, and scripts necessary for DBAs to perform their daily tasks.
Key Responsibilities:
* Develop and maintain tools to streamline database management tasks and performance tuning.
* API-Driven Services: Design, build, and maintain API-driven services to automate interactions with relational databases, such as PostgreSQL, enabling efficient database operations and integrations.
* Continuous Integration/Continuous Deployment (CI/CD): Integrate database management tasks into CI/CD pipelines to ensure smooth and automated deployments.
* Monitoring and Alerting: Implement automated monitoring and alerting systems to proactively identify and address relational database issues via self-healing mechanisms.
Minimum Qualifications
* Experience building highly available software services or platform infrastructure
* Experience in designing, building, and maintaining API-driven services
* Proficient in Python or Go
* Understanding of fundamental relational database concepts
Preferred Qualifications
* Experience building and interacting with Cloud APIs for AWS and GCP
* Good understanding of PostgreSQL internals
* Experience with Java/Kotlin/C/C++ will be an advantage
* SRE experience supporting PaaS or DBaaS
Please share resumes via email to Khurs...@sitwinc.com
Title :- Java Microservices Developer
Location :- San Leandro, CA - Final round: F2F
Visa :- Green Card and USC
Job Description
We are seeking an experienced Java Microservices Developer to join our team in San Leandro, CA. The ideal candidate should have strong experience in Java development, microservices architecture, and cloud-based application development.
Required Skills
Responsibilities
Preferred Skills
Title :- .Net Engineer
Location :- Fort Mill, SC (Onsite) – (Ready for onsite final interview)
Visa :- Green Card and USC only
Industry
Financial Services / Banking / FinTech
Role Overview
We are seeking a Senior Software Engineer to build and scale mission-critical financial systems using .NET, AWS, GitHub platform, and Kafka. This role focuses on developing secure, event-driven, cloud-native applications with robust CI/CD pipelines powered by GitHub.
You’ll work on high-throughput systems such as payments, trading platforms, or risk engines, ensuring reliability, low latency, and compliance with financial regulations.
Key Responsibilities
Required Skills & Experience
Core Development
Cloud Platform
Version Control & CI/CD (GHCP)
Streaming & Messaging
DevOps & Containers
Databases
Financial Domain Experience (Preferred)
Security & Compliance
Nice to Have
Soft Skills
What Success Looks Like
Let me know if you have any local profiles for the role below. The final round of interviews is F2F in Whippany, NJ.
Title :- Product Manager
Location :- Whippany, NJ - Locals only
Visa :- H1B and USC and Green Card
Job Description:
We are looking for an experienced Product Manager with strong expertise in the banking/payments domain to drive product strategy, roadmap, and execution for digital banking solutions at Barclays.
Key Responsibilities:
Required Skills:
Nice to Have: