I am getting a ClassNotFoundException on DummySchemaRegistry.
I have included the class file on the classpath:
camus-example/target/classes/com/linkedin/camus/example/schemaregistry/DummySchemaRegistry.class
I'd appreciate it if you could point me in the right direction.
Here is the exception:
Caused by: com.linkedin.camus.coders.MessageDecoderException: java.lang.ClassNotFoundException: com.linkedin.camus.example.DummySchemaRegistry
at com.linkedin.camus.etl.kafka.coders.KafkaAvroMessageDecoder.init(KafkaAvroMessageDecoder.java:40)
at com.linkedin.camus.etl.kafka.coders.MessageDecoderFactory.createMessageDecoder(MessageDecoderFactory.java:24)
... 16 more
Caused by: java.lang.ClassNotFoundException: com.linkedin.camus.example.DummySchemaRegistry
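One thing worth noting about the trace above: it names com.linkedin.camus.example.DummySchemaRegistry, while the compiled class sits under com.linkedin.camus.example.schemaregistry. Class loading needs the exact fully-qualified name, package and all. A self-contained illustration using a JDK class (the names here are just for demonstration, nothing Camus-specific):

```java
public class FqnDemo {
    public static void main(String[] args) {
        try {
            // Exact fully-qualified name: loads fine.
            Class.forName("java.util.ArrayList");
            // Missing a package segment: the loader cannot find it.
            Class.forName("java.ArrayList");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }
    }
}
```

So besides the class file being on the classpath, the name in the config has to match the package declared in the source exactly.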
Shekar
The only jar file under target is the parent tests jar, which contains only POM metadata:
[ctippur@pppdc9prd310 camus]$ jar tf target/camus-parent-0.1.0-SNAPSHOT-tests.jar
META-INF/
META-INF/MANIFEST.MF
META-INF/maven/
META-INF/maven/com.linkedin.camus/
META-INF/maven/com.linkedin.camus/camus-parent/
META-INF/maven/com.linkedin.camus/camus-parent/pom.xml
META-INF/maven/com.linkedin.camus/camus-parent/pom.properties
- Shekar
On Monday, 16 June 2014 11:35:53 UTC-7, Roger Hoover wrote:
- Shekar
I think I got past this.
I had missed the schema registry setting in the config:
# Used by avro-based Decoders to use as their Schema Registry
kafka.message.coder.schema.registry.class=com.linkedin.camus.example.schemaregistry.DummySchemaRegistry
I get a different error now:
com.linkedin.camus.coders.MessageDecoderException: com.linkedin.camus.coders.MessageDecoderException: java.lang.InstantiationException: com.linkedin.camus.example.schemaregistry.DummySchemaRegistry
at com.linkedin.camus.etl.kafka.coders.MessageDecoderFactory.createMessageDecoder(MessageDecoderFactory.java:28)
at com.linkedin.camus.etl.kafka.mapred.EtlInputFormat.createMessageDecoder(EtlInputFormat.java:390)
at com.linkedin.camus.etl.kafka.mapred.EtlInputFormat.getSplits(EtlInputFormat.java:264)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:491)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:508)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:280)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:608)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:572)
Caused by: com.linkedin.camus.coders.MessageDecoderException: java.lang.InstantiationException: com.linkedin.camus.example.schemaregistry.DummySchemaRegistry
at com.linkedin.camus.etl.kafka.coders.KafkaAvroMessageDecoder.init(KafkaAvroMessageDecoder.java:40)
at com.linkedin.camus.etl.kafka.coders.MessageDecoderFactory.createMessageDecoder(MessageDecoderFactory.java:24)
... 16 more
Caused by: java.lang.InstantiationException: com.linkedin.camus.example.schemaregistry.DummySchemaRegistry
at java.lang.Class.newInstance(Class.java:359)
at com.linkedin.camus.etl.kafka.coders.KafkaAvroMessageDecoder.init(KafkaAvroMessageDecoder.java:31)
... 17 more
- Shekar
Here is the constructor:
public DummySchemaRegistry(Configuration conf) {
    super();
    super.register("DUMMY_LOG", DummyLog.newBuilder().build().getSchema());
    super.register("DUMMY_LOG_2", DummyLog2.newBuilder().build().getSchema());
}
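For what it's worth, the InstantiationException in the trace comes out of Class.newInstance() (KafkaAvroMessageDecoder.java:31), and newInstance() only works when the class has a public no-arg constructor; a registry whose only constructor takes a Configuration fails in exactly this way. A minimal, self-contained sketch (the classes below are illustrative stand-ins, not Camus's):

```java
public class NewInstanceDemo {
    // Stand-in for a registry whose only constructor takes a config object.
    public static class ArgOnlyRegistry {
        public ArgOnlyRegistry(Object conf) { }
    }

    // Same idea with a no-arg constructor added: newInstance() succeeds.
    public static class FixedRegistry {
        public FixedRegistry() { }
        public FixedRegistry(Object conf) { }
    }

    public static void main(String[] args) throws Exception {
        try {
            ArgOnlyRegistry.class.newInstance();
        } catch (InstantiationException e) {
            // Thrown because ArgOnlyRegistry has no nullary constructor.
            System.out.println("InstantiationException: no no-arg constructor");
        }
        System.out.println("FixedRegistry created: "
                + (FixedRegistry.class.newInstance() != null));
    }
}
```

If that matches what you're seeing, adding a no-arg constructor to DummySchemaRegistry (and doing the register() calls there or in a separate init step) should get you past the InstantiationException.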
Before it throws the exception, I see this message:
[CamusJob] - Fetching metadata from broker 10.132.62.231:9092 with client id camus for 0 topic(s) []
In my camus config file, I have
kafka.whitelist.topics=DUMMY_LOG,test
I generate a message on Kafka using:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test <<EOF
THIS IS A MESSAGE FOR test TOPIC
EOF
$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic DUMMY_LOG <<EOF
THIS IS A MESSAGE FOR DUMMY_LOG TOPIC
EOF
I am able to read this message back:
$ bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic DUMMY_LOG --from-beginning
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
THIS IS A MESSAGE FOR DUMMY_LOG TOPIC
THIS IS A MESSAGE FOR DUMMY_LOG TOPIC
I am still working on it. Are you producing messages that are Avro-encoded?
- Shekar