Hey all,
I've gotten Kafka Connect (3.3.1) processing some JSON messages and writing them out to S3 nicely (hat-tip to some useful posts by rmoff on the specific envelope format needed for JSON messages). The issue is that the data being transmitted contains DECIMAL(18,10) fields, and I can't tell whether that's supported by the JSON converter. The following works:
{
  "type": "double",
  "optional": false,
  "field": "price"
}
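For context, the full envelope I'm producing looks roughly like this (the payload value is made up and the struct name is just illustrative):

{
  "schema": {
    "type": "struct",
    "name": "trade",
    "optional": false,
    "fields": [
      { "type": "double", "optional": false, "field": "price" }
    ]
  },
  "payload": {
    "price": 123.4567890123
  }
}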
However, there's a loss of fidelity here. The following doesn't work:
{
  "type": "decimal",
  "optional": false,
  "field": "price"
}
I get an error about that being an invalid field type when my converter kicks in. I've looked through the code
here and only see floats. I'm surely missing something and would much appreciate some help from folks here on whether my use case (decimal data types) is supported by Kafka Connect 3.3.1.
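From skimming the source, my best guess (unverified; the "10" scale just mirrors my DECIMAL(18,10) columns) is that decimals are represented as a logical type over bytes, along these lines:

{
  "type": "bytes",
  "name": "org.apache.kafka.connect.data.Decimal",
  "version": 1,
  "parameters": { "scale": "10" },
  "optional": false,
  "field": "price"
}

If that's right, I assume the payload value would have to be the base64-encoded unscaled bytes of the decimal rather than a plain number. Can anyone confirm? My connector config is shown below.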
{
  "name": "foo",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "bar",
    "s3.region": "<NIP>",
    "s3.part.size": "26214400",
    "flush.size": "3000000",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "schema.generator.class": "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
    "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
    "schema.compatibility": "NONE",
    "path.format": "'year'=YYYY/'month'=MM/'day'=dd/'hour'=HH",
    "locale": "en_US",
    "timezone": "UTC",
    "name": "s3-sink"
  }
}
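For completeness, I'm registering the connector with the standard REST call (the file name and Connect host/port below are just placeholders for my setup):

curl -X POST -H "Content-Type: application/json" \
  --data @s3-sink.json \
  http://localhost:8083/connectors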