Hiring for Hadoop Developer at Dearborn, MI


USIT Recruiter

Nov 4, 2019, 10:29:41 AM
to abk...@sagegrouptech.com

Hi,

 

Please review the position below and, if you are comfortable with it, share your updated resume.

 

Position:  Hadoop Developer

Location:  Dearborn, MI

Duration: 12+ Months

Interview Mode: Skype hire

 

Summary:

The Hadoop Developer position will provide expertise in a wide range of technical areas, including but not limited to: Cloudera Hadoop ecosystem, Java, collaboration toolsets integration using SSO, configuration management, hardware and software configuration and tuning, software design and development, and application of new technologies and languages that are aligned with other FordDirect internal projects. Preferred past/current experience on a CRM integration assignment.

 

Job Functions:

·         Design and development of data ingestion pipelines.

·         Perform data migration and conversion activities.

·         Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.

·         Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).

·         Perform end to end automation of ETL process for various datasets that are being ingested into the big data platform.

 

Required:

1.       Java/J2EE

2.       Web Applications, Tomcat (or any equivalent app server), RESTful Services, JSON

3.       Spring, Spring Boot, Struts, Design Patterns

4.       Hadoop (Cloudera CDH), HDFS, Hive, Impala, Spark, Oozie, HBase

5.       Scala

6.       SQL

7.       Linux

 

Good to Have:

·         Google Analytics, Adobe Analytics

·         Python, Perl

·         Flume, Solr

·         Strong Database Design Skills

·         ETL Tools

·         NoSQL databases (Mongo, Couchbase, Cassandra)

·         JavaScript UI frameworks (Angular, NodeJS, Bootstrap)

·         Good understanding and working knowledge of Agile development

 

Responsibilities:

·         Document and maintain project artifacts.

·         Suggest best practices and implementation strategies using Hadoop, Java, and ETL tools.

·         Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.

·         Other duties as assigned

 

Requirements:

·         Must have a Bachelor’s degree in Computer Science or related IT discipline

·         Must have at least 5 years of IT development experience.

·         Must have strong, hands-on J2EE development experience.

·         Must have in-depth knowledge of Scala and Spark programming.

·         Must have 3+ years of relevant professional experience working with Hadoop (HBase, Hive, MapReduce, Sqoop, Flume), Java, JavaScript, .NET, SQL, Perl, Python, or an equivalent scripting language.

·         Must have experience with ETL tools

·         Must have experience integrating web services

·         Knowledge of standard software development methodologies such as Agile and Waterfall

·         Strong communication skills.

·         Must be willing to flex work hours to support application launches and to manage production outages if necessary.

Specific Knowledge, Skills and Abilities:

·         Ability to multitask with numerous projects and responsibilities

·         Experience working with JIRA and WIKI

·         Must have experience working in a fast-paced dynamic environment.

·         Must have strong analytical and problem-solving skills.

·         Must have good verbal and written communication skills.

·         Must be able and willing to participate as an individual contributor as needed.

·         Must have the ability to work the time necessary to complete projects and/or meet deadlines.

 

Skill Matrix:

Total IT Experience:

Experience with Hadoop Development:

Experience with Hadoop platform:

Experience with HBase, Hive, MapReduce, Sqoop, Flume:

Experience with Java/J2EE:

Experience with RESTful Services:

Experience with JSON:

Experience with Spring / Spring boot:

Experience with Struts:

Experience with ETL tools:

Experience with JavaScript:

Experience in integrating web services:

Experience in system integration experience in the Hadoop environment:

Experience with SCALA:

Experience with SQL:

Highest qualification:

Certification- If Any:

 

 

Thanks & Regards,

 

Abhishek Kumar | Sr. Technical Recruiter

Sage Group Technologies Inc., www.sagegroupinc.com

Direct: 732.837.2134 | Email: abk...@sagetl.com

W: 732.860.4602 X 305

3400 Highway 35, Suite # 9A, Hazlet, NJ 07730


Disclaimer: This is not spam. We are contacting you because you have either applied for a similar role with our company in the past or have your resume posted on job boards, professional groups, etc. We apologize if this email has caused any inconvenience; please disregard it if it is not relevant to you, or reply with "remove" and we will take the necessary steps to avoid any further contact.

 

 
