BigData Engineer with GCP
Location: Dearborn, MI (onsite post-COVID)
Required Experience: 7 to 10+ years
Job Description: Big Data Engineer with GCP experience.
Must have GCP migration experience or extensive hands-on work with GCP.
Required Skills: Hadoop (HDFS), Spark, Python, Scala, and Hive
- Design, build, and operationalize large-scale enterprise data solutions.
- Hands-on experience analyzing and re-platforming on-prem data warehouses to GCP-based data platforms using GCP and third-party services.
- Experience using Cloud Spanner for relational data, Bigtable for storing large volumes of key-value pairs, and BigQuery for interactive data analysis.
- Experience using Dataflow in conjunction with Pub/Sub or Kafka to process and analyze real-time streaming data.
- Expertise in designing and building data pipelines, from ingestion to consumption, within a hybrid big data architecture.
- Experience integrating data from multiple data sources.
- Proven ability to work effectively in a fast-paced, interdisciplinary, deadline-driven environment.
- Strong problem-solving and troubleshooting skills.