Job Description – GCP AI Engineer / GCP Data Engineer
Project: Clinical Policy Hub Modernization
Role: GCP AI Engineer / GCP Data Engineer
Location: Onsite - Dallas, TX
Duration: Long Term Contract
Job Summary
We are looking for a skilled GCP AI Engineer / GCP Data Engineer to support the modernization of a Clinical Policy Hub platform within the healthcare domain. The ideal candidate will have strong experience with Google Cloud Platform (GCP), AI-enabled development, and modern data engineering practices, and should be comfortable leveraging AI productivity tools such as GitHub Copilot and other AI-assisted engineering platforms to accelerate development and improve operational efficiency.
This role requires strong communication skills and the ability to collaborate effectively with technical and business stakeholders. Mid-level to senior-level professionals are preferred; junior profiles will not be considered.
Key Responsibilities
- Design, develop, and maintain scalable data engineering and AI solutions on GCP.
- Support modernization of the Clinical Policy Hub platform and related healthcare workflows.
- Build and optimize cloud-native data pipelines and AI-enabled workflows.
- Develop scalable ETL/ELT processes using GCP technologies.
- Integrate AI tools and automation capabilities into development and operational processes.
- Collaborate with architects, product owners, and healthcare business teams to gather requirements and deliver solutions.
- Develop APIs, data services, and integrations with healthcare systems and platforms.
- Ensure data quality, governance, security, and compliance standards are met.
- Participate in code reviews, architecture discussions, and deployment planning.
- Use AI-assisted development tools such as GitHub Copilot and similar platforms to improve engineering productivity.
Required Skills
- Strong hands-on experience with Google Cloud Platform (GCP).
- Experience in data engineering and cloud-based data processing solutions.
- Exposure to AI/ML solutions and AI-assisted development tools.
- Strong experience with:
  - BigQuery
  - Dataflow
  - Cloud Storage
  - Pub/Sub
  - Cloud Functions
  - Cloud Run
  - GKE/Kubernetes
- Strong SQL and Python programming skills.
- Experience building scalable ETL/ELT pipelines.
- Knowledge of modern cloud architecture and distributed systems.
- Experience with CI/CD pipelines and DevOps practices.
- Ability to work in Agile development environments.
- Excellent verbal and written communication skills.
Preferred Qualifications
- Healthcare domain experience, especially with clinical platforms or policy systems.
- Familiarity with AI tools such as GitHub Copilot, GenAI coding assistants, or intelligent automation platforms.
- Exposure to healthcare data standards such as FHIR or HL7.
- Experience with modern data lake or data warehouse architectures.
- GCP certifications are a plus.
Experience Required
- 6+ years of overall IT experience.
- 3+ years of hands-on GCP data engineering experience.
- Mid-level to senior-level candidates preferred.
- Strong communication and stakeholder management skills are mandatory.