JDBC Sink implementation


Александр Сергеенко

Nov 2, 2021, 12:08:01 PM
to Nussknacker
Hi! 

I'm testing Nussknacker to check whether it fits our needs, and currently I'm struggling with a process that ends with an insertion into PostgreSQL.
As far as I can see, Nussknacker doesn't provide a JDBC sink implementation. Am I right?
Is it possible to implement a custom sink using the plugin mechanism?

Great thanks!

maciek.p...@gmail.com

Nov 3, 2021, 2:35:55 AM
to Nussknacker
Hi :)

It's true, we don't have a built-in JDBC sink. In terms of sinks/sources we focus mainly on Kafka. Our vision is that Kafka should be the main way of integrating Flink with the outside world, and we want to leave integration with other systems to tools like Kafka Connect or NiFi.
However, it's entirely possible to create a JDBC sink with a custom SinkFactory - probably the best way is to use the Flink JDBC connector. If you want to create one, the biggest question is: what parameters should be shown to the user in the Designer? E.g. for the Kafka-Avro sink we have two flavours - one lets you write an expression containing the whole Avro message, the other generates a form with the Avro message's fields, based on a schema.
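[Editor's note] For reference, the Flink side of such a sink could look roughly like the sketch below. The `JdbcSink`, `JdbcConnectionOptions` and `JdbcExecutionOptions` calls come from Flink's `flink-connector-jdbc` module; everything else (the `Event` record, `PostgresJdbcSink` object, table and column names) is a hypothetical outline, and the Nussknacker `SinkFactory` wiring is left out - check the Nussknacker developer documentation for the exact extension API of your version.

```scala
// Sketch only: the JdbcSink API below is Flink's flink-connector-jdbc,
// but the surrounding names (Event, PostgresJdbcSink, table/columns)
// are assumptions made up for illustration.
import java.sql.PreparedStatement

import org.apache.flink.connector.jdbc.{
  JdbcConnectionOptions, JdbcExecutionOptions, JdbcSink, JdbcStatementBuilder
}
import org.apache.flink.streaming.api.functions.sink.SinkFunction

// Example record written to PostgreSQL (assumed shape).
final case class Event(id: Long, payload: String)

object PostgresJdbcSink {

  // Builds a Flink SinkFunction that batches INSERTs into PostgreSQL.
  def create(jdbcUrl: String, user: String, password: String): SinkFunction[Event] =
    JdbcSink.sink[Event](
      "INSERT INTO events (id, payload) VALUES (?, ?)",
      new JdbcStatementBuilder[Event] {
        override def accept(ps: PreparedStatement, e: Event): Unit = {
          ps.setLong(1, e.id)
          ps.setString(2, e.payload)
        }
      },
      JdbcExecutionOptions.builder()
        .withBatchSize(1000)       // flush every 1000 rows...
        .withBatchIntervalMs(200)  // ...or every 200 ms, whichever comes first
        .withMaxRetries(3)
        .build(),
      new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
        .withUrl(jdbcUrl)
        .withDriverName("org.postgresql.Driver")
        .withUsername(user)
        .withPassword(password)
        .build()
    )
}
```

A custom `SinkFactory` would then expose the connection parameters in the Designer and attach this `SinkFunction` to the stream. Note that `JdbcSink.sink` gives at-least-once semantics; newer versions of the connector also have an exactly-once variant built on XA transactions.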

Hope it helps - if you have questions, feel free to ask.
thanks,
maciek

Alexander Sergeenko

Nov 3, 2021, 5:48:58 AM
to Nussknacker
Hi Maciek :)

I really appreciate your goodwill, thanks!
The vision is now clear to me. I agree that Kafka is the best way to achieve a so-called kappa architecture with Apache Flink, but our current business goal is to eliminate any external tools and solve the problem using Flink exclusively.
Thanks for your help!

Alexander
