Job! Hadoop Admin @ Chicago, IL

Anthony Michael

Dec 4, 2015, 12:50:58 PM
to ant...@deegit.com

Hi Professionals,

I hope you are doing great!

 

My name is Anthony Michael, and I am a Sr. Recruiting Specialist at Deegit Inc. I have an open requirement for a Hadoop Admin position that we are exclusively recruiting for our direct client, and we are looking to fill it urgently. This is a long-term project with possible extension depending on the client's needs, budget, and, most importantly, your performance.

 

Please send me an updated resume that highlights the required skills listed below.

 

Below is the job description for your reference:

 

Hadoop Admin

Location : Chicago, IL

Duration : Long term

 

Mandatory Skills:

Knowledgeable in SQL Server databases

Aggressive self-starter who can operate without infrastructure support

IoT experience

 

Desired Skills:

Predictive analytics, advanced analytics

R, SAS

Big data access tools, e.g. Platfora, Datameer

 

Job Description:

1. Hadoop administration
   a. Configure DB backup and recovery processes
   b. File system (Blob & HDFS) management and monitoring
   c. Manage and review Hadoop log files; housekeeping
   d. Experience with at least one popular Hadoop distribution: Cloudera, Hortonworks, MapR
   e. Installation and configuration of Hadoop projects such as Hive, Pig, HBase, Spark, etc.
   f. Performance optimization (MapReduce jobs and others)
   g. Set up performance monitoring and tuning

2. Cloud
   a. Solid experience managing and administering a cloud platform: Azure, AWS, Bigtable, etc.
   b. Deploying big data solutions in the cloud
   c. Cluster installation and configuration
   d. Capacity estimation and configuration for elasticity
   e. Script automation for allocation, deallocation, etc.

3. Establish big data security
   a. Data at rest and in transit
   b. Access from back-end and front-end applications
   c. Row-level security and role-based security
   d. Integration with LDAP, AD, Okta

4. Ingestion
   a. Sqoop, Kafka, Storm: highly desirable
   b. ADF, Informatica Cloud, etc. are a definite plus
   c. Set up processes for historical and incremental data ingestion

5. Front-end access: configure connectivity and interfaces enabling Cognos, Tableau, Power BI, R, and SAS applications to access Hadoop big data.

6. Scheduler and runtime environment: configure YARN and Oozie for optimal resource utilization and end-to-end workflow processing.

7. Deploy and configure Neo4j (graph DB), document DBs, and emerging big data repositories

8. Big data modeling

 

Best Regards,

 

Anthony Michael | Sr. Recruiter

Deegit™ Inc | Technology Consulting

1900 E Golf Rd., Suite 925, Schaumburg, IL 60173

Phone (847) 440 2436 Ext. 341

Email Ant...@deegit.com

www.deegit.com

The information transmitted is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Any review, retransmission, dissemination or other use of, or taking of any action in reliance upon this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender and delete the material from any computer.