Local only | Data Engineer | Coppell, TX | USC/GC


Anuj

Sep 5, 2025, 11:51:06 AM
to anuj....@sourceinfotech.com

Hello,

 

I hope you are doing well!

Please let me know whether you are comfortable with this position; if so, kindly share your updated resume.

 

Job Title: Data Engineer
Location: Coppell, TX   - 4 days onsite

Visa: USC, GC

Type: C2H

Local candidates only (face-to-face interview required).

 

 

We are seeking an experienced and highly skilled Data Engineer with expertise in Azure and Databricks to join our data and reporting team. In this role, you will be responsible for designing, building, and maintaining robust data architectures and pipelines in a cloud-based environment. You will collaborate with cross-functional teams to ensure data availability, performance, and quality for analytics and machine learning solutions.
LOGISTICS:

  • Candidate must physically work at the Store Support Center, located in Coppell, TX, Monday – Thursday. Friday is WFH until further notice.

THE ESSENTIALS:

  • Design, develop, and maintain scalable and robust data pipelines on Databricks.
  • Collaborate with data/business analysts to understand data requirements and deliver solutions.
  • Architect and optimize Databricks workflows and data pipelines to process, clean, and transform large datasets.
  • Design high-performance, low-latency data models that ensure optimal storage and data retrieval for analytics applications.
  • Optimize and troubleshoot existing data pipelines for performance and reliability.
  • Ensure data quality and integrity across various data sources.
  • Guide and collaborate with other data engineers to set standards, conduct code reviews, and provide solutions for enterprise initiatives.
  • Implement data security and compliance best practices.
  • Monitor data pipeline performance and conduct necessary maintenance and updates.
  • Collaborate with DevOps teams to automate data pipeline deployment and integration.
  • Data exploration and analysis: use statistical tools and techniques to explore and analyze data, identifying patterns, relationships, and trends.
  • Success in data analysis demands a balance of technical skills (data cleaning and statistical analysis) and soft skills (critical thinking and communication).
  • Familiarity with data structures, storage systems, cloud infrastructure, and other technical tools.
  • Ability to work effectively in teams of technical and non-technical individuals.
  • Ability to continuously learn, work independently, and make decisions with minimal supervision.
  • Demonstrate accountability, prioritize tasks, and consistently meet deadlines.
  • Familiarity with Agile/SCRUM practices
  • Work with cloud platforms such as AWS, Azure, or Google Cloud to integrate and manage data workflows and storage solutions.
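The pipeline responsibilities above (processing, cleaning, and transforming data while ensuring quality) can be sketched in plain Python. This is an illustrative, library-free stand-in for the PySpark transformations the role involves; the record fields (`order_id`, `amount`) are hypothetical and chosen only for the example.

```python
def clean_records(records):
    """Deduplicate on a business key, drop rows missing required fields,
    and normalize types -- a miniature version of a pipeline transform step."""
    seen = set()
    out = []
    for r in records:
        # Data-quality rule: required fields must be present.
        if r.get("order_id") is None or r.get("amount") is None:
            continue
        # Deduplicate on the business key.
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        # Normalize the amount to a float for downstream analytics.
        out.append({"order_id": r["order_id"], "amount": float(r["amount"])})
    return out
```

In a real Databricks pipeline the same rules would typically be expressed as PySpark DataFrame operations (`dropDuplicates`, `dropna`, `withColumn`) so they scale across a cluster.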

SCOPE:

  • Strong expertise in Databricks, including Spark (PySpark/Scala).
  • Hands-on experience working with cloud platforms (AWS, Azure, or Google Cloud).
  • Experience with CI/CD tools like Azure DevOps, GitHub Actions, or Terraform for infrastructure-as-code.
  • Experience in designing and developing ETL processes and data pipelines.
  • Familiarity with data modeling, data warehousing, and database design.
  • Strong understanding of data formats such as Parquet, ORC, JSON, and Avro.
  • Ability to work in an agile environment and adapt to changing requirements.
  • Strong analytical and problem-solving skills with a focus on optimizing data engineering solutions.
  • Study, analyze, and understand business requirements in the context of business intelligence.
  • Design and map data models that turn raw data into meaningful insights.
  • Identify key performance indicators aligned with business objectives.
  • Build multi-dimensional data models.
  • Develop thorough documentation of algorithms, parameters, and models.
  • Make essential technical and strategic changes to improve existing business intelligence systems.
  • Identify requirements and develop custom charts accordingly.
  • Write efficient SQL queries to support reporting and analysis.
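As an illustration of the KPI-oriented SQL querying described above, here is a minimal sketch using Python's built-in sqlite3 module; the `sales` table and its schema are assumptions made for the example, not part of the role's actual data model.

```python
import sqlite3

# In-memory database with a hypothetical sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("TX", 100.0), ("TX", 50.0), ("NJ", 75.0)],
)

# KPI: total revenue per region, highest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue FROM sales "
    "GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('TX', 150.0), ('NJ', 75.0)]
conn.close()
```

The same aggregate-and-rank pattern carries over directly to warehouse SQL dialects such as Databricks SQL.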
CREDENTIALS:

  • Bachelor’s degree in computer science, Engineering, or a related field (or equivalent experience).
  • Proven experience working with Databricks for large-scale data processing.
  • Strong proficiency in SQL and experience with relational databases.
  • 5+ years of data engineering experience
  • Experience in restaurant / gaming industry

THE GOODS:

  • Minimum 10 years of experience working on data platforms.
  • Knowledge about database management, SQL querying, data modeling, and Online Analytical Processing.
  • Additional consideration given for experience with scripting and programming languages such as Python.
  • Experience with CI/CD pipelines for data engineering tasks.
  • Ability to mentor junior team members and contribute to technical leadership.

 

Thanks & Regards,  

Anuj Tiwari

US IT RECRUITER

Source Infotech Inc.

3840 Park Avenue, Suite C-205, Edison, NJ-08820


Email: anuj....@sourceinfotech.com

 

 

 
