Scoobi job fails when compression is used

Amit Jaiswal

Sep 3, 2014, 1:19:20 PM
to scoobi...@googlegroups.com
Hi,

We have seen that when compression is used, the Hadoop job fails in Scoobi with the following error:

‘container_1403048840935_184686_01_000512] is running beyond physical memory limits. Current usage: 2.8 GB of 2.8 GB physical memory used; 4.5 GB of 5.8 GB virtual memory used. Killing container’.

The error is the same as the one discussed in a previous posting: https://groups.google.com/forum/#!topic/scoobi-users/7QJ6utJRWF8. We tried all the different Hadoop settings mentioned in that thread.

After a couple of trials, we found that this "killing container" error occurs only when compression is used; without compression, the job runs fine.
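
For reference, here is a minimal sketch of what I mean by "compression is used". This is not our production job (the pipeline is trivialised, and I am quoting the sink API from memory, so the compressWith call may differ slightly from what we actually have), but compression is enabled the same way, on the output sink:

import com.nicta.scoobi.Scoobi._
import org.apache.hadoop.io.compress.GzipCodec

object CompressedJob extends ScoobiApp {
  def run() {
    // Trivial pipeline: read text lines, transform them, write them back out.
    val lines = fromTextFile(args(0))
    val upper = lines.map(_.toUpperCase)

    // Compression is requested on the output sink. With the compressWith
    // call removed, the same job completes without containers being killed.
    persist(upper.toTextFile(args(1)).compressWith(new GzipCodec))
  }
}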

Scoobi: scoobi-0.8.5
Hadoop: hadoop-2.4.1-2.1.3.0-2

Can you please check how compression is implemented in Scoobi, and whether it can lead to memory usage beyond the configured limit?

Thanks.
Amit

Amit Jaiswal

Sep 11, 2014, 1:55:25 PM
to scoobi...@googlegroups.com
Hi,

Can somebody help with this issue?

Thanks.
Amit