Hello everyone :-)
The company I'm working with embraced gRPC a while ago (micro-service architecture). The gRPC implementation we use is the official Java one, and here's what our build process looks like:
- Each micro-service has its own git repository.
- If micro-service A's protobuf files depend on micro-service B's protobuf files in order to compile, then when building micro-service A, a Gradle plug-in reaches out and grabs micro-service B's protobuf files.
- Once all the dependencies exist, the same Gradle plug-in uses protoc to generate the gRPC stubs and compile micro-service A (roughly the Gradle setup sketched after this list). Additional steps, like creating a Docker image and deploying the service, also happen.
- Because some of our UI services use a REST API, along with compiling the stubs and the service we also use the gRPC gateway to generate a REST API gateway and Swagger JSON files, and deploy those separately.
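For context, here is roughly what the per-service Gradle side of this looks like. It's a minimal sketch assuming the standard protobuf Gradle plug-in (com.google.protobuf); the version numbers are placeholders, and the step that pulls the other services' proto files is left out:

```
// build.gradle.kts for micro-service A -- minimal sketch, versions are placeholders,
// and the step that fetches micro-service B's .proto files from its repo is omitted
import com.google.protobuf.gradle.id

plugins {
    java
    id("com.google.protobuf") version "0.9.4"
}

repositories { mavenCentral() }

dependencies {
    implementation("io.grpc:grpc-protobuf:1.60.0")
    implementation("io.grpc:grpc-stub:1.60.0")
}

protobuf {
    // protoc itself and the Java gRPC codegen plugin are resolved as Maven artifacts
    protoc { artifact = "com.google.protobuf:protoc:3.25.1" }
    plugins {
        id("grpc") { artifact = "io.grpc:protoc-gen-grpc-java:1.60.0" }
    }
    generateProtoTasks {
        // attach the grpc plugin to every generateProto task so service stubs are emitted too
        all().forEach { task ->
            task.plugins { id("grpc") }
        }
    }
}
```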
This has worked well, but it suffers from two problems:
1. Each build requires the project to reach out to external projects in order to fetch the latest protobuf files, and this can take time.
2. The protobuf code gets generated over and over. It would be better to have a JAR already published for each micro-service, so micro-service A could just consume micro-service B's JAR (a sketch of that idea follows this list).
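To illustrate point 2, this is roughly what I have in mind. Everything here is made up for the sake of the example: the coordinates, the repository URL, and the "service-b-proto" module name. It assumes each service's generated stubs live in their own Gradle module that gets published to an internal Maven repository:

```
// build.gradle.kts of a hypothetical "service-b-proto" stub module -- sketch only
plugins {
    java
    `maven-publish`
    id("com.google.protobuf") version "0.9.4"   // protobuf/grpc generation configured as in the earlier sketch
}

group = "com.example.grpc"   // placeholder coordinates
version = "1.2.0"

publishing {
    publications {
        // publish the JAR that contains B's generated message and stub classes
        create<MavenPublication>("stubs") {
            from(components["java"])
        }
    }
    repositories {
        maven {
            name = "internal"
            url = uri("https://nexus.example.com/repository/maven-releases") // placeholder URL
        }
    }
}
```

Micro-service A would then declare a normal dependency like `implementation("com.example.grpc:service-b-proto:1.2.0")` instead of fetching and re-generating B's proto files on every build.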
Also, lately more people have been embracing gRPC, and this includes more languages like Python and Go, so a broader build process is needed to support a multitude of languages. Following in Google's footsteps and using the "googleapis" repository as a guideline, we decided to have a single repo that will host all of the company's protobuf files, so building will now happen in a single place rather than in every project.

What needs to be done now is to implement a unified solution that builds the generated protobuf code in multiple languages, publishes the artifacts (packaged when possible, e.g. as a JAR), and builds a gRPC gateway plus the Swagger files for each service. Here are two approaches:
1. Create a basic, non-language-specific Makefile (or a shell script of sorts). It would probably just visit each directory (one directory per service), invoke protoc a couple of times (once per language), and create packages where possible, along with the gateway and Swagger files (a rough sketch of such a driver appears after this list). I could even call the Gradle plug-in directly when building the Java artifacts, but that's a bit hacky.
2. Use Bazel: googleapis uses bazel-tools and rules_go to create artifacts in Java and Go. But I couldn't find a plug-in that handles creating the gRPC gateway or the Swagger files. I did find another repository with a couple of Bazel rules called "rules_proto" (https://github.com/stackb/rules_proto) that does have support for Swagger and the gRPC gateway, but I wasn't able to make it work out of the box (I haven't fully debugged what went wrong yet). So Bazel is an option, but it doesn't feel like a mature solution, since it requires tailoring a specific setup and it's not streamlined yet (for example, there's no way to create a single JAR with a dependency tree between services, only a JAR per micro-service).
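To make option 1 (the Makefile/driver script) a bit more concrete, here's a rough sketch of the kind of driver I have in mind. I've written it as a Kotlin script just to spell out the steps; a Makefile or shell script would do the same job. The proto/<service> layout, the output paths, and the assumption that protoc and the per-language codegen plugins are on the PATH are all mine:

```
// gen_all.main.kts -- rough sketch of the option-1 driver; layout and paths are assumptions
import java.io.File

val protoRoot = File("proto")

// protoc output flags per target language; the gRPC codegen plugins
// (protoc-gen-grpc-java, protoc-gen-go-grpc, ...) are left out to keep the sketch short
val languages: Map<String, (File) -> List<String>> = mapOf(
    "java"   to { out -> listOf("--java_out=${out.path}") },
    "go"     to { out -> listOf("--go_out=${out.path}") },
    "python" to { out -> listOf("--python_out=${out.path}") },
)

fun exec(cmd: List<String>) {
    val exit = ProcessBuilder(cmd).inheritIO().start().waitFor()
    check(exit == 0) { "command failed: ${cmd.joinToString(" ")}" }
}

// one directory per service, as in the monorepo layout described above
protoRoot.listFiles { f -> f.isDirectory }?.forEach { serviceDir ->
    val protos = serviceDir.walkTopDown()
        .filter { it.extension == "proto" }
        .map { it.path }
        .toList()
    if (protos.isEmpty()) return@forEach

    languages.forEach { (lang, flags) ->
        val outDir = File("gen/$lang/${serviceDir.name}").apply { mkdirs() }
        // one protoc run per language, with the whole repo root on the include path
        exec(listOf("protoc", "-I", protoRoot.path) + flags(outDir) + protos)
    }
    // packaging (e.g. the Java JAR), grpc-gateway and Swagger generation would follow here,
    // which is exactly the part I'd rather not hand-roll per language
}
```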
So before I invest more time writing an external Makefile, or tweaking/writing Bazel rules, I figured I'd come and ask here, because I'm probably not the only one trying to do something like this.
Thank you!