Strategy for distributing protobuf APIs


Mario Steinhoff

Aug 26, 2016, 10:51:47 AM8/26/16
to grp...@googlegroups.com
Hey everyone,

I have two Java services A and B. A provides RPC services via gRPC and
contains a proto file with service and message definitions. Now in B,
I want to call services from A.

Currently, I am not distributing the proto file; instead, I compile the
proto file into Java classes and create a JAR file in the service A
build. The JAR file gets published to my internal artifact repository,
but without the grpc-stub and grpc-protobuf transitive dependencies
(compileOnly). The calling service includes gRPC and the JAR file with
the compiled classes, and then creates a ManagedChannel and a stub for
the xxxGrpc service.
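[For illustration, the setup described here might look roughly like this in service A's build.gradle. This is a hypothetical sketch — plugin versions, artifact coordinates, and repository names are assumptions, not Mario's actual build file.]

```groovy
// build.gradle of service A (hypothetical sketch).
// Generated Java classes are compiled and packaged into a JAR, but
// grpc-stub and grpc-protobuf are compileOnly, so they are NOT exported
// as transitive dependencies of the published artifact.
plugins {
    id 'java-library'
    id 'maven-publish'
    id 'com.google.protobuf' version '0.9.4'   // version is an assumption
}

dependencies {
    compileOnly 'io.grpc:grpc-stub:1.0.0'      // not exposed to consumers
    compileOnly 'io.grpc:grpc-protobuf:1.0.0'
}

publishing {
    publications {
        maven(MavenPublication) {
            from components.java   // JAR contains only the compiled proto classes
        }
    }
}
```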

It works for now, but is this a good idea? Would it be possible to
update gRPC in the calling service, or might that break APIs and force
me to recompile the JAR file in the providing service?

Another way I could think of is to put the proto file into a JAR file
and add the JAR file to dependencies in all calling services. Then I'd
have to compile protobuf classes in each calling service but the gRPC
versions in both services are fully independent.
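[The proto-JAR alternative can be sketched with the protobuf-gradle-plugin, which can extract .proto files from a dependency and run code generation locally in each calling service. Artifact names and versions below are hypothetical.]

```groovy
// build.gradle of a calling service (hypothetical sketch).
// The 'protobuf' configuration pulls .proto files out of the dependency
// JAR and compiles them with this project's own protoc/gRPC versions,
// keeping gRPC versions in each service fully independent.
plugins {
    id 'java'
    id 'com.google.protobuf' version '0.9.4'
}

dependencies {
    protobuf 'com.example:service-a-protos:1.0'   // JAR containing only .proto files
    implementation 'io.grpc:grpc-stub:1.0.0'      // this service's own gRPC version
    implementation 'io.grpc:grpc-protobuf:1.0.0'
}

protobuf {
    protoc { artifact = 'com.google.protobuf:protoc:3.0.0' }
    plugins {
        grpc { artifact = 'io.grpc:protoc-gen-grpc-java:1.0.0' }
    }
    generateProtoTasks {
        all()*.plugins { grpc {} }   // run the gRPC codegen plugin for all protos
    }
}
```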

What's the recommended way of doing this?

Mario

Paul Johnston

Aug 26, 2016, 11:10:13 AM8/26/16
to grpc.io, steinho...@gmail.com
My opinion, YMMV...

The service abstraction is already captured by the proto3 file A.proto, so I would distribute either:

1. no implementation (just A.proto), or
2. everything (A.jar with transitive deps).

If you just distribute the proto file (my preferred choice), your client B can generate the stub implementation for its environment.

You may be able to get away with your current setup, but it sounds brittle.

Nicolas Noble

Aug 26, 2016, 11:39:34 AM8/26/16
to Paul Johnston, steinho...@gmail.com, grp...@googlegroups.com

The gRPC API is stable now since 1.0, so distributing generated sources, while not recommended, should work. But you should consider distributing the proto alongside anyway. Maybe someone wants to communicate with your service in a language other than Java?



Eric Anderson

Sep 17, 2016, 1:52:47 AM9/17/16
to Mario Steinhoff, grpc-io
On Fri, Aug 26, 2016 at 7:51 AM, Mario Steinhoff <steinho...@gmail.com> wrote:
Currently, I am not distributing the proto file; instead, I compile the
proto file into Java classes and create a JAR file in the service A
build. The JAR file gets published to my internal artifact repository,
but without the grpc-stub and grpc-protobuf transitive dependencies
(compileOnly). The calling service includes gRPC and the JAR file with
the compiled classes, and then creates a ManagedChannel and a stub for
the xxxGrpc service.

That's fine and a supported use-case. I would include real deps (not just compileOnly) on grpc-stub and grpc-protobuf, but to each his own. As long as service A is using the same grpc version or newer there shouldn't be issues. Note that if your protos are reusable in other services/consumers you should ship at least .protos so that your consumers can make their own protos that include yours (this uses a protobuf dependency in Gradle).
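[With Gradle's java-library plugin, "real deps" would mean declaring gRPC on the api configuration so that consumers inherit it transitively. A minimal sketch, assuming the java-library plugin is applied — not Eric's literal build file:]

```groovy
// In service A's build.gradle: export gRPC as a real (transitive)
// dependency of the published JAR, instead of compileOnly.
dependencies {
    api 'io.grpc:grpc-stub:1.0.0'       // consumers get this on their compile classpath
    api 'io.grpc:grpc-protobuf:1.0.0'
}
```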

Another way I could think of is to put the proto file into a JAR file
and add the JAR file to dependencies in all calling services. Then I'd
have to compile protobuf classes in each calling service but the gRPC
versions in both services are fully independent.

That's also fine and a supported use-case. I sort of expect more people to do this (at least Maven and Gradle users, because of the plugins), just because it is probably a bit easier for them.

Shipping protos works better when dealing with multiple languages. Shipping generated code for many languages takes some effort, and I'd really only expect it for large shops providing many services to many consumers (... like Google :) ).

You can also include both the generated code and the .protos in the same JAR and let the consumer decide whether they want to run protoc. This also makes developing with the Gradle plugin quite natural, since any compile dependency naturally works for both Java and .proto (protos in the project can include dependencies, but the dependencies aren't re-codegen'd). This is what we will be doing at Google for our public artifacts.
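[Packaging the .protos alongside the generated classes can be sketched by adding the proto source directory to the JAR's resources. The source layout below is an assumption; the protobuf-gradle-plugin may also handle this for you depending on configuration.]

```groovy
// In service A's build.gradle: ship .proto files inside the same JAR
// as the generated Java classes, so consumers can choose whether to
// use the precompiled classes or extract the protos and re-run protoc.
jar {
    from('src/main/proto') {
        into 'proto'   // conventional path for bundled proto sources
    }
}
```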

Mario Steinhoff

Oct 5, 2016, 7:52:43 AM10/5/16
to Eric Anderson, grpc-io
Hey, thanks everyone for your feedback. :)

I decided to move all .proto files from the service projects into a
separate API project. The API project is language-agnostic but could
be used for distributing language-specific generated code in the
future, if this ever becomes a requirement.

The .proto files are copied from the API project into each individual
service project for now. In the future I can replace manual copying
with jar dependencies including only the .proto files. I still need to
figure out the details of how to manage changes in the .proto APIs
without breaking backwards compatibility, but protobuf seems to make
this easy.
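[Protobuf's compatibility rules do make additive changes safe: a new field with a previously unused tag number can be added without breaking old readers, which simply ignore it, while reusing or renumbering existing tags breaks the wire format. A hypothetical example:]

```protobuf
// Backwards-compatible evolution: field 3 is new and uses a fresh tag
// number. Old clients that don't know about it ignore it on the wire.
// Never reuse or renumber tags 1 or 2 once they have been released.
message GetUserRequest {
  string user_id = 1;
  int32 page_size = 2;
  string locale = 3;   // added in a later API revision
}
```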

Distributing generated code does not make sense for us for now, as we
are a very small shop and only use Java. Thus the language-specific
code is also generated in every service project. This allows me to
make all service builds fully independent from each other, and I can
also update grpc-java independently in each service.