Hi,
We have seen that when compression is enabled, our Hadoop job run through Scoobi fails with the following error:
‘container_1403048840935_184686_01_000512] is running beyond physical memory limits. Current usage: 2.8 GB of 2.8 GB physical memory used; 4.5 GB of 5.8 GB virtual memory used. Killing container’.
This is the same error discussed in a previous posting:
https://groups.google.com/forum/#!topic/scoobi-users/7QJ6utJRWF8 — we tried all the different Hadoop settings mentioned in that thread.
After a couple of trials, we found that this killed-container error only occurs when compression is used. Without compression the job runs fine.
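For reference, this is roughly how we turn compression on. The property names are the standard Hadoop 2.x ones; the jar and class names below are placeholders, and the exact submission command is from memory, so treat this as a sketch rather than our literal invocation:

```shell
# Hypothetical job submission; our-scoobi-job.jar and com.example.OurJob
# are placeholders for our actual artifact and main class.
hadoop jar our-scoobi-job.jar com.example.OurJob \
  -Dmapreduce.map.output.compress=true \
  -Dmapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec \
  -Dmapreduce.output.fileoutputformat.compress=true \
  -Dmapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec
```

With these properties removed (compression off), the same job completes without hitting the container memory limit.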
Scoobi: scoobi-0.8.5
Hadoop: hadoop-2.4.1-2.1.3.0-2
Could you please check how compression is handled in Scoobi, and whether it could cause memory usage to exceed the configured container limit?
Thanks.
Amit