Ah, thanks a lot. I finally built Flume from source and it works now.
I'll sum up what I did; maybe it helps somebody.
First I just enabled LZO and got the following error:
# com.cloudera.flume.handlers.debug.InsistentAppendDecorator: attempt
failed: Compression codec com.hadoop.compression.lzo.LzoCodec not
found.
This means the hadoop-lzo jar file is missing from Flume's classpath.
So I copied hadoop-lzo.jar to /usr/lib/flume/lib.
After a restart I got this error:
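Instead of copying, a symlink from the Hadoop install on the same node works just as well. A sketch, assuming CDH3-style paths (adjust HADOOP_LIB/FLUME_LIB to your install):

```shell
# Assumed CDH3-style locations -- adjust to your install.
HADOOP_LIB=${HADOOP_LIB:-/usr/lib/hadoop-0.20/lib}
FLUME_LIB=${FLUME_LIB:-/usr/lib/flume/lib}

# Link the hadoop-lzo jar into Flume's lib dir so the codec class
# ends up on Flume's classpath; skip if the dirs aren't present here.
if [ -d "$HADOOP_LIB" ] && [ -d "$FLUME_LIB" ]; then
    ln -sf "$HADOOP_LIB"/hadoop-lzo*.jar "$FLUME_LIB"/
fi
```

The symlink has the advantage that a Hadoop package upgrade keeps the two jars in sync.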
# ERROR com.hadoop.compression.lzo.GPLNativeCodeLoader: Could not load
native gpl library
# java.lang.UnsatisfiedLinkError: no gplcompression in
java.library.path
This means the native libraries are missing from Flume's java.library.path.
So I created the file /usr/lib/flume/bin/flume-env.sh
and put in the following line (the last folder depends on the system):
export JAVA_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
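For reference, the whole flume-env.sh can be that small. A sketch with the 64-bit CDH3 default directory:

```shell
# /usr/lib/flume/bin/flume-env.sh
# Tell Flume's JVM where Hadoop's native libraries live so that
# libgplcompression can be loaded. On a 32-bit system the last
# directory is Linux-i386-32 instead of Linux-amd64-64.
export JAVA_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
```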
I still got the same error message.
I am using Flume version 0.9.3+5-1~lucid-cdh3b4.
There is a patch that has to be applied to this version:
https://github.com/cloudera/flume/commit/4093a38f218f8ce024908618ec0697ba0c4b32a0
I applied it and restarted.
Then I got new error messages:
# INFO com.cloudera.flume.handlers.debug.InsistentAppendDecorator:
append attempt 3 failed, backoff (60000ms): null
# INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening
hdfs://localhost/tmp/2011-03-11/0900/keyword-log.00000347.20110311-144818299+0100.21747225607120.seq
# INFO com.cloudera.flume.handlers.debug.StubbornAppendSink:
# INFO com.cloudera.flume.handlers.rolling.RollSink: closing RollSink
'escapedCustomDfs("hdfs://localhost/tmp/%Y-%m-%d/%H00/","%{scribe.category}-%{rolltag}" )'
# WARN com.cloudera.flume.handlers.rolling.RollSink: TriggerThread
interrupted
# INFO com.cloudera.flume.handlers.rolling.RollSink: opening RollSink
'escapedCustomDfs("hdfs://localhost/tmp/%Y-%m-%d/%H00/","%{scribe.category}-%{rolltag}" )'
# INFO com.cloudera.flume.handlers.debug.InsistentOpenDecorator:
Opened MaskDecorator on try 0
# INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening
hdfs://localhost/tmp/2011-03-11/0900/keyword-log.00000019.20110311-144848294+0100.21777220544373.seq
Then I realized I needed the current development version and built Flume myself:
# git clone https://github.com/cloudera/flume.git
# cd flume
# ant
# cd /usr/lib/flume/lib
# cp /path/build/flume-0.9.3-test.jar flume-0.9.3-CDH3B4-test.jar
# cp /path/build/flume-0.9.3-core.jar flume-0.9.3-CDH3B4-core.jar
I restarted and everything was fine:
# sudo /etc/init.d/flume-master restart; sudo /etc/init.d/flume-node
restart
On 11 Mar., 01:10, NerdyNick <nerdyn...@gmail.com> wrote:
> Most likely you are missing the hadoop-lzo.jar from Flume's classpath.
> I tend to just sym link it from my hadoop install on the same node to
> /usr/lib/flume/lib.
>
> You will also need to make sure the native lib path is in
> java.library.path for flume itself. To do this just add this line to
> your flume-env.sh file in /usr/lib/flume/bin.
>
> 64Bit
> export JAVA_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
>
> 32Bit
> export JAVA_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-i386-32
>
> Do note this requires Flume 0.9.3 with the most recent patches for
> JAVA_LIBRARY_PATH. Otherwise you need to make the same change as in
> this commit (
https://github.com/cloudera/flume/commit/4093a38f218f8ce024908618ec06...)