Hi :)
It's true, we don't have a built-in sink for JDBC. In terms of sinks/sources we focus mainly on Kafka: our vision is that Kafka should be the main way of integrating Flink with the outside world, and we want to leave integration with other systems to tools like Kafka Connect or NiFi.
However, it's entirely possible to create a JDBC sink with a custom SinkFactory - probably the best way is to build it on top of the Flink JDBC connector. If you want to create one, the biggest question is what parameters to show the user in the Designer. E.g. for the Kafka-Avro sink we have two flavours: one lets the user write a single expression containing the whole Avro message, the other generates a form with the Avro message fields, based on the schema.
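To give you an idea of the Flink side, here is a minimal sketch of writing a stream to a database with the flink-connector-jdbc API (JdbcSink). The OrderEvent case class, the table, and the connection details are made up for illustration - you'd replace them with your own:

```scala
import java.sql.PreparedStatement

import org.apache.flink.connector.jdbc.{JdbcConnectionOptions, JdbcExecutionOptions, JdbcSink, JdbcStatementBuilder}
import org.apache.flink.streaming.api.scala._

// Hypothetical event type, just for the example.
case class OrderEvent(id: Long, amount: java.math.BigDecimal)

object JdbcSinkSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val orders: DataStream[OrderEvent] =
      env.fromElements(OrderEvent(1L, new java.math.BigDecimal("9.99")))

    // JdbcSink.sink returns a SinkFunction; the statement builder
    // fills one PreparedStatement per stream element.
    orders.addSink(
      JdbcSink.sink[OrderEvent](
        "INSERT INTO orders (id, amount) VALUES (?, ?)",
        new JdbcStatementBuilder[OrderEvent] {
          override def accept(stmt: PreparedStatement, e: OrderEvent): Unit = {
            stmt.setLong(1, e.id)
            stmt.setBigDecimal(2, e.amount)
          }
        },
        JdbcExecutionOptions.builder()
          .withBatchSize(100)        // flush every 100 rows...
          .withBatchIntervalMs(1000) // ...or at least once a second
          .build(),
        new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
          .withUrl("jdbc:postgresql://localhost:5432/demo") // assumed connection details
          .withDriverName("org.postgresql.Driver")
          .withUsername("demo")
          .withPassword("demo")
          .build()
      )
    )

    env.execute("jdbc-sink-sketch")
  }
}
```

Wrapping something like this in a SinkFactory is then mostly a matter of deciding which of those pieces (the SQL, per-field expressions, connection config) become parameters in the Designer.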
Hope it helps - if you have questions, feel free to ask.
thanks,
maciek