Opportunity for Data Engineer - Big Data (GCP/AWS)

Indeed Inspiring

Dec 2, 2020, 12:13:56 AM
to Job Portal - Indeed Inspiring Infotech
SALARY (₹): 5.00 - 18.00 Lacs
EXPERIENCE: 3 - 8 yrs
LOCATION: Bangalore/Bengaluru, Mumbai (All Areas)
KEY SKILLS: Hadoop, Spark, Python, GCP, AWS, SQL, Big Data, Data Engineer, Big Data Engineer
Dear Candidate,
Company Profile

Quantiphi is an award-winning Applied AI and Big Data software and services company, driven by a deep desire to solve transformational problems at the heart of businesses. Our signature approach combines
groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed.
Some company highlights:
• Quantiphi has seen 2.5x growth YoY since its inception in 2013.
• Winner of the "Machine Learning Partner of the Year" award from Google for two consecutive years - 2017 and 2018.
• Winner of the "Social Impact Partner of the Year" award from Google for 2019.
• Headquartered in Boston, with 700+ data science professionals across different offices.

Job Description
Role: Data Engineer - Big Data
Experience Level: 3 to 8 Years
Work location: Mumbai/Bangalore

Role & Responsibilities:
• Work with cloud engineers and customers to solve big data problems by developing utilities for
migration, storage, and processing on Google Cloud.
• Design and build a cloud migration strategy for cloud and on-premises applications.
• Diagnose and troubleshoot complex distributed systems problems and develop solutions with a
significant impact at massive scale.
• Build ingestion tools and processing jobs that handle several terabytes to petabytes of data per day.
• Design and develop next-gen storage and compute solutions for several large customers.
• Communicate with a wide set of teams, including Infrastructure, Network, Engineering, DevOps,
and SiteOps, as well as cloud customers.
• Build advanced tooling for automation, testing, monitoring, administration, and data operations across
multiple cloud clusters.

Required Skills:
• 4+ years of hands-on experience with data structures, distributed systems, Hadoop and Spark, and
SQL and NoSQL databases.
• Strong software development skills in at least one of: Java, C/C++, Python or Scala.
• Experience building and deploying cloud-based solutions at scale.
• Experience in developing Big Data solutions (migration, storage, processing).
• BS, MS or PhD degree in Computer Science or Engineering, and 5+ years of relevant work experience
in Big Data and cloud systems.
• Experience building and supporting large-scale systems in a production environment.

Technology Stack:
• Cloud Platforms – AWS, GCP, or Azure
• Big Data Distributions – any of Apache Hadoop, CDH, HDP, EMR, Google Dataproc, HDInsight
• Distributed Processing Frameworks – one or more of MapReduce, Apache Spark, Apache Storm, Apache Flink
• Database/Warehouse – Hive, HBase, and at least one cloud-native service
• Orchestration Frameworks – any of Airflow, Oozie, Apache NiFi, Google Dataflow
• Message/Event Solutions – any of Kafka, Kinesis, Cloud Pub/Sub
• Container Orchestration (good to have) – Kubernetes or Swarm