Hi all,
I'm a heavy user of protobuf in several roles now, and a common theme is recreating the publish/subscribe pattern, as well as RPC, with protobufs. Our current use case is publishing protobuf messages to Google Pub/Sub, but we've also used or experimented with Kafka, NATS and SQS.
The benefits are of course parsing speed, typing and forward compatibility, but one large downside is that it's possible to accidentally subscribe to the wrong type on a topic, or to publish the wrong type. At the moment the consumer of a topic provides the type and we reflect on that type when consuming.
This can look like the following (in Go):
c.On(pubsub.HandlerOptions{
	Topic: "name_of_topic",
	Name:  "update_order_in_bigquery",
	Handler: func(ctx context.Context, o *order.Order, _ *pubsub.Msg) error {
		return publishToBQ(o)
	},
})
As stated before, if the topic is incorrectly named or the type doesn't match the topic, subscribing fails. Publishing is worse: you can publish a message of one type to a topic carrying another, and that message will never be processed.
Ideally we'd like to type both the publishers and the subscribers and generate the code, much like gRPC does for service definitions in protobuf.
I propose something like the following:
message Order {
  string id = 1;
}

service Orders {
  rpc Get(GetRequest) returns (GetResponse);
  rpc Create(CreateRequest) returns (CreateResponse);
  rpc Update(UpdateRequest) returns (UpdateResponse);
  rpc List(ListRequest) returns (ListResponse);
  rpc Subscribe(SubscribeRequest) returns (stream SubscribeResponse);
}

publisher Orders {
  topic Created(Order);
  topic Updated(Order);
}
Any thoughts or stories of how others are handling this would be great!
Happy to help contribute too, especially in the Go, JS and TypeScript domains.