I'm trying to write a decoder for Kafka (bytes to case class). I have written something like the following, and it's working fine (the actual message is in protobuf):
    private[decoder] val decoder = new PBDecoder[java.util.Map[String, Any]](
      classOf[Schema.Click],
      new TypeReference[java.util.Map[String, Any]]() {})

    def fromBytes(bytes: Array[Byte]): java.util.Map[String, Any] = {
      decoder.fromBytes(bytes)
    }
The message structure is nested, something like this:
    {
      timestamp=1468389261962,
      click=281651856214611047,
      fingerprint= {
        id=,
        metadata= {
        }
      }
    }
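What I'd eventually like is a set of nested case classes mirroring this shape. A rough sketch of what I mean (the class and field names `ClickEvent`, `Fingerprint`, and `Metadata` are my own, modelling only the fields visible in the sample above, with empty/missing values as `Option`):

```scala
// Hypothetical target model mirroring the nested message shape.
// Names here are illustrative, not from an existing schema.
case class Metadata(values: Map[String, String] = Map.empty)

case class Fingerprint(
  id: Option[String],          // empty in the sample message, hence Option
  metadata: Option[Metadata])

case class ClickEvent(
  timestamp: Long,
  click: Long,
  fingerprint: Option[Fingerprint])
```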
However, considering some downstream manipulation in Spark Streaming, I'm planning to convert the same to a case class; that's how I came across Salat. What I noticed is that it is not able to deserialize nested structures, though it works fine for flat collections and simple data types. Also, some of my elements are optional, and it seems like `Option` is not working either.
Is there any way I can make it work for nested structures?
    case class Click(
      timestamp: Long,
      device_fingerprint: Option[LinkedHashMap[String, Object]],
      `type`: String)

(Note: `type` is a reserved word in Scala, so the field name needs backticks to compile.)
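In case it helps clarify what I'm after: as a fallback I could map the decoded `java.util.Map` to the case class by hand, without Salat. A minimal sketch (the helper name `toClickRow` and the class name `ClickRow` are mine; absent or non-map values become `None`, and I'm assuming the top-level keys `timestamp`, `fingerprint`, and `type` from the sample message):

```scala
import scala.collection.JavaConverters._
import scala.collection.mutable.LinkedHashMap

// Illustrative target class; `type` is backticked because it is a
// reserved word in Scala.
case class ClickRow(
  timestamp: Long,
  device_fingerprint: Option[LinkedHashMap[String, Object]],
  `type`: String)

// Hypothetical manual mapper: pull the fields out of the decoded
// java.util.Map and build the case class by hand. A missing or
// non-map "fingerprint" entry is mapped to None.
def toClickRow(m: java.util.Map[String, Any]): ClickRow = {
  val fingerprint = Option(m.get("fingerprint")).collect {
    case jm: java.util.Map[_, _] =>
      val lhm = LinkedHashMap.empty[String, Object]
      jm.asScala.foreach { case (k, v) =>
        lhm += (String.valueOf(k) -> v.asInstanceOf[Object])
      }
      lhm
  }
  ClickRow(
    timestamp = m.get("timestamp").toString.toLong,
    device_fingerprint = fingerprint,
    `type` = Option(m.get("type")).map(_.toString).getOrElse(""))
}
```

This is obviously more boilerplate than I'd like, which is why I was hoping Salat (or something similar) could handle the nesting for me.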