Hi,
Hope you are doing great.
If the rate and role are a fit for your consultant, please share profiles for the role below, including visa status and current location.
Role: GCP Data Architect – IAM Data Modernization (13+ years of experience)
Location: Dallas, TX (4 days onsite)
Rate: $70-$75/hr on C2C
Project/Program
Identity & Access Management (IAM) Data Modernization: migration of an on-premises SQL data warehouse to a target-state Data Lake on Google Cloud Platform (GCP), enabling metrics and reporting, advanced analytics, and GenAI use cases (natural-language querying, accelerated summarization, cross-domain trend analysis). This modernization establishes a single source of truth for enterprise-wide, data-driven decision-making.
Required Skills
Data Lake Architecture & Storage
· Experience with Hadoop/HDFS architecture, distributed file systems, and data locality principles
Data Ingestion & Orchestration
· Experience building batch and streaming ingestion pipelines using GCP-native services
· Knowledge of Pub/Sub-based streaming architectures, event schema design, and versioning
· Strong understanding of incremental ingestion and CDC patterns, including idempotency and deduplication
· Hands-on experience with workflow orchestration tools (Cloud Composer / Airflow)
· Ability to design robust error handling, replay, and backfill mechanisms
Data Processing & Transformation
· Experience developing scalable batch and streaming pipelines using Dataflow (Apache Beam) and/or Spark (Dataproc)
· Strong proficiency in BigQuery SQL, including query optimization, partitioning, clustering, and cost control
· Hands-on experience with Hadoop MapReduce and ecosystem tools (Hive, Pig, Sqoop)
· Advanced Python programming skills for data engineering, including testing and maintainable code design
· Experience managing schema evolution while minimizing downstream impact
Analytics & Data Serving
· Expertise in BigQuery performance optimization and data serving patterns
· Experience building semantic layers and governed metrics for consistent analytics
· Familiarity with BI integration, access controls, and dashboard standards
· Understanding of data exposure patterns via views, APIs, or curated datasets
Data Governance, Quality & Metadata
· Experience implementing data catalogs, metadata management, and ownership models
· Understanding of data lineage for auditability and troubleshooting
· Strong focus on data quality frameworks, including validation, freshness checks, and alerting
· Experience defining and enforcing data contracts, schemas, and SLAs
· Familiarity with audit logging and compliance readiness
Cloud Platform Management
· Strong hands-on experience with Google Cloud Platform (GCP), including project setup, environment separation, billing, quotas, and cost controls
· Expertise in IAM and security best practices, including least-privilege access, service accounts, and role-based access
· Solid understanding of VPC networking, private access patterns, and secure service connectivity
· Experience with encryption and key management (KMS, CMEK) and security auditing
DevOps, Platform & Reliability
· Proven ability to build CI/CD pipelines for data and infrastructure workloads
· Experience managing secrets securely using GCP Secret Manager
· Ownership of observability, SLOs, dashboards, alerts, and runbooks
· Proficiency in logging, monitoring, and alerting for data pipelines and platform reliability
Good to Have
Security, Privacy & Compliance
· Hands-on experience implementing fine-grained access controls for BigQuery and GCS
· Experience with VPC Service Controls and data exfiltration prevention
· Knowledge of PII handling, data masking, tokenization, and audit requirements
Thanks and Regards,
T: 732-348-5000 x837 | M: 732-228-4897