Role: Data Architect 14+ Needed || Location: Chicago, IL 60604 (Onsite Job)


SASI KUMAR DOVALA

unread,
Nov 7, 2025, 1:53:48 PM
to SASI KUMAR DOVALA

Role:           Data Architect (14+ Years Needed)

Skills:         Data, Data Mesh, Data Products, Snowflake, Azure, Java/Python

Experience:     14+ years

Location:       Chicago, IL 60604 (Onsite Job)

Visa:           No H1B



The skills needed are:

 

  • Senior person with at least 14 years of experience working in data
  • Strong in Data Mesh and data products
  • Strong experience in Snowflake and Azure
  • Strong in either Python or Java; able to assess and work on the framework
  • Hands-on leadership to assess and rectify issues

 



Regards,

Sasikumar



 Address: 15 Corporate Pl S, Suite #450, Piscataway Township, NJ 08854

 Email: sasi....@intellyk.com | Website: www.intellyk.com


 

DISCLAIMER: THIS IS NOT UNSOLICITED MAIL. UNDER BILL 1618 TITLE III PASSED BY THE 105TH USA CONGRESS, THIS EMAIL CANNOT BE CONSIDERED SPAM AS LONG AS WE INCLUDE OUR CONTACT INFORMATION AND AN OPTION TO BE REMOVED FROM OUR EMAILING LIST. IF YOU HAVE RECEIVED THIS MESSAGE IN ERROR, OR ARE NOT INTERESTED IN RECEIVING OUR EMAILS, PLEASE ACCEPT OUR APOLOGIES AND REPLY WITH "REMOVE" IN THE SUBJECT LINE. ALL REMOVAL REQUESTS WILL BE HONORED. WE SINCERELY APOLOGIZE FOR ANY INCONVENIENCE CAUSED.

 


 



SASI KUMAR DOVALA

unread,
Dec 8, 2025, 4:26:31 PM
to SASI KUMAR DOVALA
 
Job Title:       Performance Test Lead / Architect
Location:        Houston, TX 77002 (Onsite Job)
Job Type:        Long-Term
Visa:            No H1B
Interview Type:  Video Interview

Mandatory Skills
Develop and execute robust performance test strategies that validate the impact of changes on the performance of individual applications and the entire landscape
Collaborate with development teams, project managers/owners, and solution architects to design application tests that ensure the performance and resiliency of the application stack
Participate and provide recommendations for ensuring performance is built into the solution design
Work closely with Vendors/Partners for execution of Load Tests based on release/delivery schedule
Review performance test results and ensure all aspects of infrastructure and application responsiveness are evaluated for delivery to production
Define and implement post-mortem / root-cause analysis processes – develop improved testing scenarios based upon analysis
Design and execute Performance tests to support Agile delivery
Working experience with multiple performance testing tools (e.g., LoadRunner, JMeter, NeoLoad)
Demonstrated experience with Application Performance Management tools (e.g., Datadog, New Relic, AppDynamics, Dynatrace)
Good experience with switches and routers
Working experience in Cloud migration Performance testing
Working knowledge of distributed Cache, Load Balancers, Database systems
Ability to drive complex strategies and solutions
Experience with issues management, risk management
Defining the workload model based on analytics/logs
Be able to identify improvement areas and implement for delivery excellence
Strong problem-solving skills
Strong written and verbal communication skills
Good knowledge of any programming language
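One of the bullets above asks for defining the workload model based on analytics/logs. As an illustrative sketch only (the record shape and sample values are invented, not taken from this posting), a workload model can be derived in Python as each endpoint's traffic share, throughput, and a latency percentile:

```python
from collections import Counter

# Hypothetical parsed access-log records: (endpoint, response_ms).
# In practice these would come from analytics or web-server logs.
records = [
    ("/login", 120), ("/login", 90), ("/search", 300),
    ("/search", 450), ("/search", 280), ("/checkout", 800),
]

def workload_model(records, window_seconds):
    """Derive per-endpoint traffic share, throughput, and ~90th-percentile latency."""
    counts = Counter(ep for ep, _ in records)
    total = sum(counts.values())
    model = {}
    for ep, n in counts.items():
        latencies = sorted(ms for e, ms in records if e == ep)
        # Nearest-rank style percentile; fine for a sketch, not for production stats.
        p90 = latencies[min(len(latencies) - 1, int(0.9 * len(latencies)))]
        model[ep] = {
            "share": n / total,                  # fraction of total traffic
            "throughput_rps": n / window_seconds,  # requests per second
            "p90_ms": p90,
        }
    return model

model = workload_model(records, window_seconds=60)
print(model["/search"]["share"])  # 0.5
```

The resulting shares and rates would then be mapped onto thread groups or virtual-user profiles in whichever load tool (JMeter, LoadRunner, etc.) is in use.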
 
Good to have skills
Design and implement automated Performance Testing in the CI/CD pipeline.
Experienced in setting up Production monitoring dashboards based on business and IT goals
Experience with microservices-based architectures and containerized deployments like Docker and Kubernetes
Knowledge of application architecture concepts, including topology, protocols, components, and principles would be advantageous
 
Please share your resume with sasi....@intellyk.com

SASI KUMAR DOVALA

unread,
Dec 9, 2025, 9:27:11 AM
to SASI KUMAR DOVALA

SASI KUMAR DOVALA

unread,
Jan 7, 2026, 9:55:29 AM
to SASI KUMAR DOVALA
 
Role:           ETL Architect
Skills:         ETL/ELT Architecture, Azure, Databricks, Data Sources (Oracle, SQL Server, SAP, Salesforce), Data Modelling, Cloud Platforms
Experience:     15+ years
Location:       Chicago, IL (Locals Only)

Visa:           No H1B
 
Local candidates are strongly preferred.
 
Key Responsibilities:
 
Design and implement ETL/ELT architecture with Databricks as the enterprise Lakehouse.
Integrate data from diverse sources (RDBMS, APIs, SaaS apps, flat files, streaming platforms, cloud services) into Lakehouse. 
Define data integration best practices, including reusability, scalability, and cost optimization.
Lead and mentor ETL/ELT developers in building robust pipelines.
Establish data quality, governance, and lineage frameworks.
Collaborate with data architects, BI developers, and business stakeholders for end-to-end data delivery. 
Evaluate and implement ETL/ELT tools and automation frameworks suited for multiple source systems.
Troubleshoot integration issues and define long-term solutions.
Keep up to date with Snowflake features and emerging data integration technologies.
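One responsibility above is establishing data quality, governance, and lineage frameworks. As a hedged sketch only (the rule names and record shape are illustrative assumptions; a real implementation would typically run inside Databricks against Delta tables, e.g. via Delta Live Tables expectations), a minimal rule-based quality check might look like:

```python
# Illustrative data-quality rules; the names and record fields are invented
# for this sketch, not taken from the job description.
RULES = {
    "customer_id_not_null": lambda row: row.get("customer_id") is not None,
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def run_quality_checks(rows, rules=RULES):
    """Return a per-rule count of failing rows."""
    failures = {name: 0 for name in rules}
    for row in rows:
        for name, check in rules.items():
            if not check(row):
                failures[name] += 1
    return failures

rows = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": None, "amount": -5.0},  # fails both rules
]
print(run_quality_checks(rows))  # {'customer_id_not_null': 1, 'amount_non_negative': 1}
```

The same rule-registry shape extends naturally to lineage tagging or quarantining failing rows instead of merely counting them.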
 
Required Skills & Qualifications:
 
15+ years in IT/ETL/DWH and 10+ years in ETL/ELT architecture and development.
Strong expertise in Databricks (warehouses, streams, tasks, notebooks, data sharing).
Strong SQL and performance optimization skills.
Experience working with varied data sources: Oracle, SQL Server, SAP, Salesforce, REST APIs, flat files, cloud-native systems.
Solid understanding of data modeling (star schema, snowflake schema, data vault) and data warehousing principles.
Hands-on experience with cloud platforms (AWS/Azure/GCP) for data integration.
Strong leadership and communication skills for onsite stakeholder management.
 
Nice to Have:
 
Experience with real-time/streaming data integration (Kafka, Databricks streaming, Azure Event Hub).
Familiarity with data governance and catalog tools (Collibra, Unity Catalog).
Knowledge of big data ecosystems (Spark, Hadoop).
Exposure to BI/Analytics platforms (Power BI, Tableau).