Hi,
Hope you are doing well. If you are interested in this role, please let me know.
Role: GCP Data Architect
Location: Nashville, TN (Onsite)
Term: Contract
Overview
We are seeking an experienced Google Cloud Platform (GCP) Data Architect to design, build, and manage scalable, secure, and cost-optimized data solutions aligned with reporting needs. The role involves translating business requirements into robust technical architectures, ensuring data integrity, and enabling advanced analytics through GCP services such as BigQuery and Cloud Storage. The ideal candidate will lead strategy, design, and implementation efforts while collaborating with stakeholders to drive data-driven decision-making.
Key Responsibilities:
- Architect Scalable Data Solutions: Design and implement data warehouses, data marts, data lakes, and batch and real-time streaming pipelines using GCP-native tools.
- Data Modeling & Integration: Design and develop conformed data models (star/snowflake schemas) and ETL/ELT processes for analytics and BI tools (MicroStrategy, Looker, Power BI).
- Pipeline Development: Build scalable pipelines and automate data ingestion and transformation workflows using BigQuery, Dataflow, Dataproc/PySpark, Cloud Functions, Pub/Sub, Kafka, and Cloud Composer for orchestration.
- Security & Compliance: Implement IAM, encryption, and compliance standards (GDPR, HIPAA) with GCP security tools.
- Performance Optimization: Apply BigQuery best practices for partitioning, clustering, and BI Engine to ensure high performance and cost efficiency (see the sketch after this list).
- DevOps & Automation: Integrate CI/CD pipelines, IaC (Terraform), and containerization (Docker, Kubernetes) for deployment and scalability.
- Collaboration & Leadership: Engage with stakeholders, including leadership, project managers, BAs, engineers, QA, and platform teams; mentor teams and provide technical guidance on best practices.
- Troubleshooting: Resolve complex technical issues and support incident response.
- Healthcare Domain Expertise: Ensure compliance with healthcare regulations and stay updated on industry trends.
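To make the partitioning and clustering expectations concrete, here is a minimal, hypothetical Python sketch using the google-cloud-bigquery client; the project, dataset, table, and column names are illustrative placeholders, not specifics of this role.

# Minimal sketch: create a date-partitioned, clustered BigQuery table.
# All resource names below (my-project, analytics, claims) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumes default credentials

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.claims` (
  claim_id     STRING,
  member_id    STRING,
  service_date DATE,
  amount       NUMERIC
)
PARTITION BY service_date  -- prunes scans to the dates a query touches
CLUSTER BY member_id       -- co-locates rows on a common filter column
"""

client.query(ddl).result()  # blocks until the DDL job completes

Partitioning by date plus clustering on a frequently filtered column is the usual first lever for both query speed and cost in BigQuery.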
Required Skills & Working Experience:
- GCP Expertise: BigQuery, Cloud Storage, Dataflow (Apache Beam with Python), Dataproc/PySpark, Cloud Functions, Pub/Sub, Kafka, Cloud Composer (a minimal Beam sketch follows this list).
- Programming: Advanced SQL and Python for analytics and pipeline development.
- Performance Optimization: Experience optimizing BigQuery query performance, including partitioning, clustering, and BI Engine.
- Automation: Experience with CI/CD for data pipelines, IaC for data services, and automation of ETL/ELT processes.
- Security: Strong knowledge of IAM, encryption, and compliance frameworks.
- Architecture Design: Ability to create fault-tolerant, highly available, and cost-optimized solutions.
- Communication: Excellent ability to convey technical concepts to both technical and non-technical stakeholders.
- Domain Knowledge: Familiarity with healthcare data management and regulatory compliance.
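As a taste of the pipeline work described above, below is a minimal, hypothetical Apache Beam (Python) sketch of a streaming Pub/Sub-to-BigQuery ingest; the subscription, table, and schema are placeholders, and windowing and error handling are omitted for brevity.

# Minimal sketch: streaming ingest from Pub/Sub into BigQuery with Apache Beam.
# The subscription, table, and schema below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # add --runner=DataflowRunner to run on Dataflow

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "ParseJSON" >> beam.Map(json.loads)  # Pub/Sub messages arrive as bytes
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,member_id:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )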