Hiring Data Engineer


Abhinav Mohanty

6:42 PM
Hello Everyone,

Please share suitable profiles.

 

Please do not call; once I review a profile, I will give you a call.

 

If you are sharing any profile, please mention:

Rate –

Location –

Work Authorization –

Before submitting any candidate, please share front and back copies of their visa and their LinkedIn profile.

 

Role: Data Engineer

Location: San Jose, CA (Hybrid – 3 days a week onsite; local candidates only). Onsite interview next week.

 

NO H1B, CPT, OPT

 

Job Description:

Job Summary: Lead data architecture design and implementation for the Customer Data Store platform, focusing on optimal utilization of GCP data services. Define data models, access patterns, and migration strategies for transitioning from Oracle to cloud-native solutions.

Essential Responsibilities:

- Design and implement data models for Spanner and Bigtable based on access patterns
- Create data architecture blueprints for multi-region, globally distributed systems
- Lead the data migration strategy from Oracle to GCP, including schema conversion
- Optimize query performance and data access patterns for sub-10 ms latency
- Design consistency models and transaction boundaries for distributed data
- Implement CQRS patterns and event sourcing where appropriate
- Collaborate with application teams to define data contracts and APIs
- Establish data governance, lineage, and quality frameworks
- Prototype and validate performance characteristics of different data storage options
- Document data architecture decisions and best practices
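The responsibilities above mention CQRS and event sourcing. As a point of reference for what that entails, here is a minimal plain-Java sketch of the pattern, with an in-memory list standing in for the append-only event store (all names are illustrative, not from the actual Customer Data Store platform):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal event-sourcing sketch: current state is rebuilt by replaying an
// append-only event log rather than read from a mutable row. The command
// side validates and appends; the query side folds events into state (CQRS).
public class EventSourcingSketch {

    // An immutable domain event: the amount by which a balance changed.
    record BalanceChanged(long deltaCents) {}

    // Append-only event store (in-memory stand-in for a durable store).
    static final List<BalanceChanged> log = new ArrayList<>();

    // Command side: validate, then append an event. State is never mutated in place.
    static void deposit(long cents) {
        if (cents <= 0) throw new IllegalArgumentException("deposit must be positive");
        log.add(new BalanceChanged(cents));
    }

    static void withdraw(long cents) {
        if (cents > currentBalance()) throw new IllegalStateException("insufficient funds");
        log.add(new BalanceChanged(-cents));
    }

    // Query side: fold the event log into the current state.
    static long currentBalance() {
        return log.stream().mapToLong(BalanceChanged::deltaCents).sum();
    }

    public static void main(String[] args) {
        deposit(10_00);
        withdraw(2_50);
        System.out.println(currentBalance()); // prints 750
    }
}
```

In a production design the query side would typically read a separately maintained projection rather than folding the full log on every call.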

Required Qualifications:

- Minimum 8 years of experience in data architecture and database design
- Bachelor’s degree in Computer Science or equivalent experience
- Deep expertise in both relational and NoSQL database technologies
- Experience with distributed database systems and CAP theorem implications
- Strong SQL and data modeling skills
- Experience with database migration projects
- Understanding of data consistency, replication, and partitioning strategies

Technical Skills Required:

- Proficiency in Java, with experience in Spring Data and JPA/Hibernate
- Experience building reactive data access layers with R2DBC
- Hands-on experience with Google Cloud Spanner and Bigtable
- Knowledge of data streaming with Apache Beam/Dataflow
- Experience with CDC tools (Debezium, Striim, GoldenGate)
- Understanding of event-driven architectures with Pub/Sub or Kafka
- Expertise in query optimization and database performance tuning
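The access-pattern-driven modeling this role calls for often comes down to row-key design. One common Bigtable-style pattern, sketched in plain Java, keys rows by entity id plus a reversed timestamp so a prefix scan returns the newest data first (names are illustrative, not from any actual schema):

```java
// Sketch of a reversed-timestamp row key: larger (newer) epochMillis yields
// a smaller number, and since all realistic values keep 19 decimal digits,
// lexicographic key order matches newest-first numeric order.
public class RowKeySketch {

    static String rowKey(String customerId, long epochMillis) {
        return customerId + "#" + (Long.MAX_VALUE - epochMillis);
    }

    public static void main(String[] args) {
        String older = rowKey("cust-42", 1_000L);
        String newer = rowKey("cust-42", 2_000L);
        // Newer events sort lexicographically before older ones, so
        // "latest N rows for a customer" becomes a cheap prefix scan.
        System.out.println(newer.compareTo(older) < 0); // prints true
    }
}
```

The per-customer prefix also keeps a single customer's writes from hotspotting one tablet range the way a purely time-ordered key would.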

Preferred Qualifications:

- Hands-on experience with Google Cloud Spanner and Bigtable
- Oracle database administration and migration experience
- Experience with real-time data streaming and CDC technologies
- Knowledge of data mesh and domain-driven design principles


--
Thanks & Regards,
Abhinav
Direct - 216 435 6682