Hi,
On Saturday, July 13, 2013 9:29:19 AM UTC+2, mongonix wrote:
Hi Mike,
On Saturday, July 13, 2013 3:33:51 AM UTC+2, Michael Sick wrote: Mr Mongonix,
Do you still have the serializer for Avro? Does it parse and check the schema? I would like to see it, and also to see if I could create a Protocol Buffers equivalent.
I'll look around to see if I still have the sources for an Avro-based serializer somewhere. If I manage to find them, I'll post them here.
-Leo
Here is the old code for my Avro-based serializer test:
package test.hazelcast.serialization;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.reflect.ReflectData;
import org.apache.avro.reflect.ReflectDatumReader;
import org.apache.avro.reflect.ReflectDatumWriter;
import com.hazelcast.nio.serialization.ByteArraySerializer;
public class AvroSerializer implements ByteArraySerializer<Object> {

    static final ReflectData reflectData = ReflectData.get();
    // TODO: Currently this uses a fixed schema for one concrete class.
    // To make it more universal, you could maintain a run-time extensible
    // mapping from classes to their schemas.
    static final Schema schema = reflectData.getSchema(Main.class);

    final DatumWriter<Object> writer;
    final DatumReader<Object> reader;
    final ByteArrayOutputStream os;
    final DecoderFactory decoderFactory = DecoderFactory.get();
    final EncoderFactory encoderFactory = EncoderFactory.get();

    public AvroSerializer() {
        writer = new ReflectDatumWriter<Object>(schema);
        reader = new ReflectDatumReader<Object>(schema);
        os = new ByteArrayOutputStream(4096 * 4);
    }

    public int getTypeId() {
        return 122;
    }

    // Note: the shared output buffer makes this method non-thread-safe.
    // Hazelcast may invoke serializers concurrently, so a production version
    // should use a per-call (or thread-local) buffer and encoder.
    public byte[] write(Object object) throws IOException {
        os.reset();
        Encoder e = encoderFactory.binaryEncoder(os, null);
        writer.write(object, e);
        e.flush();
        return os.toByteArray();
    }

    public Object read(byte[] bytes) throws IOException {
        Decoder decoder = decoderFactory.binaryDecoder(bytes, null);
        return reader.read(null, decoder);
    }

    public void destroy() {
    }
}
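In case it helps anyone wiring this up: a custom serializer like the one above is registered through Hazelcast's serialization config. This is only a sketch against the Hazelcast 3.x API as I remember it, reusing the Main example class from the code above:

```java
import com.hazelcast.config.Config;
import com.hazelcast.config.SerializerConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class AvroSerializerSetup {
    public static void main(String[] args) {
        Config config = new Config();

        // Tell Hazelcast to use AvroSerializer for instances of Main.
        SerializerConfig serializerConfig = new SerializerConfig()
                .setImplementation(new AvroSerializer())
                .setTypeClass(Main.class);
        config.getSerializationConfig().addSerializerConfig(serializerConfig);

        HazelcastInstance hz = Hazelcast.newHazelcastInstance(config);
        // Main objects put into maps/queues will now go through AvroSerializer.
    }
}
```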
As indicated in the code comments, this serializer works only for a single user-defined class that is known statically at compile time. But it should be straightforward to extend it to support any class that is passed to it dynamically at run time.
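For illustration, such an extension could be built around a lazily populated per-class schema cache. The sketch below is my own (the SchemaCache name and deriveSchema stand-in are hypothetical); in the real serializer the value type would be org.apache.avro.Schema and the mapping function would call ReflectData.get().getSchema(clazz), but a plain String keeps the example self-contained:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SchemaCache {
    // One schema per class, computed on first use and reused afterwards.
    // In the real serializer this would map Class<?> -> org.apache.avro.Schema.
    private final Map<Class<?>, String> schemas = new ConcurrentHashMap<>();

    public String schemaFor(Class<?> clazz) {
        // computeIfAbsent makes the lazy population safe under concurrency.
        return schemas.computeIfAbsent(clazz, c -> deriveSchema(c));
    }

    // Stand-in for Avro's reflection-based schema derivation.
    private String deriveSchema(Class<?> clazz) {
        return "schema:" + clazz.getName();
    }

    public static void main(String[] args) {
        SchemaCache cache = new SchemaCache();
        String s1 = cache.schemaFor(String.class);
        String s2 = cache.schemaFor(String.class);
        System.out.println(s1.equals(s2)); // prints "true"
    }
}
```

With something like this in place, write() and read() would look up the schema of object.getClass() instead of using the static schema field.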
Please let me know if this code is of any help to you, and whether you manage to do something interesting with the combination of Hazelcast and Avro.
Regards,
Leo
P.S. BTW, the latest code in the Avro trunk contains significant speed improvements for the org.apache.avro.reflect-based serializers used in my code above (roughly 3-5 times faster than before, see https://issues.apache.org/jira/browse/AVRO-1282), so you may want to give it a try.