Hi,
I am using Camus to transfer data from Kafka 0.8.2 to HDFS.
The Camus job failed with the following error:
5/11/26 20:01:51 ERROR kafka.CamusJob: Error for EtlKey [topic=deploy.web.web partition=0 leaderId= server= service= beginOffset=7893235 offset=7893236 msgSize=394 server= checksum=105963225 time=1448539196191 message.size=394]: java.io.IOException: java.lang.ClassCastException: com.linkedin.camus.etl.kafka.common.KafkaMessage cannot be cast to [B
    at com.linkedin.camus.etl.kafka.mapred.EtlRecordReader.getWrappedRecord(EtlRecordReader.java:152)
    at com.linkedin.camus.etl.kafka.mapred.EtlRecordReader.nextKeyValue(EtlRecordReader.java:292)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:533)
    at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.ClassCastException: com.linkedin.camus.etl.kafka.common.KafkaMessage cannot be cast to [B
    at com.creditease.bdp.paas.ByteArrayMessageDecoder.decode(ByteArrayMessageDecoder.java:17)
    at com.linkedin.camus.etl.kafka.mapred.EtlRecordReader.getWrappedRecord(EtlRecordReader.java:142)
    ... 12 more
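For what it's worth, my best guess at the cause: the exception suggests our decoder casts the object it receives straight to byte[], while this Camus version hands decode() a KafkaMessage wrapper instead of the raw payload. A defensive extraction might look roughly like the sketch below. This is only an illustration under that assumption; the KafkaMessage class here is a hypothetical stand-in that merely mimics a getPayload() accessor, not the real com.linkedin.camus.etl.kafka.common.KafkaMessage.

```java
import java.nio.charset.StandardCharsets;

public class DecoderSketch {
    // Hypothetical stand-in for Camus's message wrapper, assumed to
    // expose the payload bytes via getPayload().
    static class KafkaMessage {
        private final byte[] payload;
        KafkaMessage(byte[] payload) { this.payload = payload; }
        byte[] getPayload() { return payload; }
    }

    // Handle both shapes instead of casting blindly: older Camus passed
    // the raw byte[] to decode(), newer versions pass a wrapper object.
    static byte[] extractPayload(Object message) {
        if (message instanceof byte[]) {
            return (byte[]) message;                      // old-style raw payload
        }
        if (message instanceof KafkaMessage) {
            return ((KafkaMessage) message).getPayload(); // new-style wrapper
        }
        throw new IllegalArgumentException(
            "Unsupported message type: " + message.getClass().getName());
    }

    public static void main(String[] args) {
        byte[] raw = "hello".getBytes(StandardCharsets.UTF_8);
        // Both call styles yield the same payload bytes.
        System.out.println(new String(extractPayload(raw), StandardCharsets.UTF_8));
        System.out.println(new String(
            extractPayload(new KafkaMessage(raw)), StandardCharsets.UTF_8));
    }
}
```

If that guess is right, changing line 17 of our ByteArrayMessageDecoder to unwrap the payload instead of casting would avoid the ClassCastException, but I have not confirmed which decode() signature this Camus build actually uses.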
This is really confusing to me. If you need any other info, please let me know.
Thank you very much.