Big Data Developer with Oracle Database: Need Visa Copy and Photo ID with the Resume

Rajat Ahuja

Nov 20, 2018, 12:30:52 AM

Hello,

 

Hope you are doing well!

Please review the requirement below, and if you are interested, reply with your consultant's resume and contact details ASAP to ra...@net2source.com

Position: Big Data Developer with Oracle Database
Location: Quincy, MA
Duration: 6+ months

Big Data role requiring hands-on experience with Big Data ecosystem technologies such as MapReduce, Spark, Hive, Impala, Sqoop, HBase, and Kafka.

Technology: Oracle 11g, SQL, PL/SQL, Performance Tuning, Unix, Autosys, Data Warehouse Experience, Data Modeling

Mandatory Skills


• Interact with Business Analysts and the Product team to understand business and functional requirements.
• Actively participate in project scrum meetings, planning meetings, and story sizing.
• Perform data modeling and implement business rules using Oracle database objects.
• Define source-to-target data mappings and data transformation logic per business need.
• Develop data marts for various applications by integrating source data.
• Review the data models designed and developed for each module.
• Analyze Big Data, with hands-on experience in Big Data ecosystem technologies such as MapReduce, Spark, Hive, Impala, Sqoop, HBase, and Kafka.
• Hands-on development and maintenance of the Hadoop platform and its associated components for data ingestion, transformation, and processing.
• Develop and support RDBMS objects and code for data profiling, extraction, loads, and updates.
• Possess and demonstrate knowledge of data warehouse concepts, Big Data, architecture, techniques, design alternatives, and overall data warehouse strategies.
• Exposure to the Hadoop ecosystem (including HDFS, Spark, Sqoop, Flume, Hive, Impala, MapReduce, Sentry, Navigator).
• Design and implement real-time integration and data-driven customer personalization using an API-driven Big Data platform solution (SOAP, REST, OData).
• Design and develop data integration solutions (batch and real-time) to support enterprise data platforms including Hadoop, RDBMS, and NoSQL.
• Experience with Hive/Spark SQL and performance tuning.
• Experience with Hadoop data ingestion using ETL tools, specifically Informatica Big Data Edition, and Hadoop transformations (using MapReduce, Spark, Blaze).


Regards,

Rajat Ahuja

Net2Source Inc
