Async response computation for HTTP2 Handlers

Ragnar Rova

Apr 18, 2023, 10:33:06 AM
to Netty discussions
Hello,

I am trying to implement an HTTP/2 server in Netty; only h2c is needed, no TLS. The code is based on the sample found in io.netty.example.http2.helloworld.server. Right now I am struggling with how to correctly send back a response asynchronously after a request has been received in an Http2FrameListener. See my current implementation at:


It is a small hello-world server. For GET requests, I read the request in Http2FrameListener.onHeadersRead and simulate that the response is computed asynchronously on some thread pool other than the Netty event loop; when the response CompletableFuture completes, I want to send the response back in the correct context. I tried to execute the sending of the response on the same executor as the context's event loop, but the implementation I linked above is obviously not correct: sometimes a correct response is sent back to the client, and sometimes only headers are sent back but no data frame. I assume this is because I am using Http2FrameListener and the context in the wrong way, and the frame encoder encounters some race condition or similar. In the code, if the request path is /graphql I try this async response handling, but on other paths like / the response is sent back directly, and that works fine.
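Roughly, the dispatch in onHeadersRead looks like this (a simplified sketch rather than the exact linked code; GraphResponse, requestExecutor and computeGraphResponse stand in for my own types, thread pool and response computation):

override fun onHeadersRead(
    ctx: ChannelHandlerContext,
    streamId: Int,
    headers: Http2Headers,
    padding: Int,
    endOfStream: Boolean
) {
    if ("/graphql" == headers.path()?.toString()) {
        // Simulate computing the response on a thread pool other than the Netty event loop.
        val graphResponseFuture: CompletableFuture<GraphResponse> =
            CompletableFuture.supplyAsync({ computeGraphResponse(headers) }, requestExecutor)
        // The whenCompleteAsync block further down hangs off this future and
        // writes the response back once it completes.
    }
}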

What would a better implementation look like if I still want to execute the computation of the response on another thread pool?

To send responses, I use:

private fun sendResponse(
    ctx: ChannelHandlerContext,
    streamId: Int,
    payload: ByteBuf,
    status: HttpResponseStatus = HttpResponseStatus.OK
) {
    val headers = DefaultHttp2Headers().status(status.codeAsText())
    encoder().writeHeaders(ctx, streamId, headers, 0, false, ctx.newPromise())
    encoder().writeData(ctx, streamId, payload, 0, true, ctx.newPromise())
    ctx.flush()
}
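(encoder() here is the Http2ConnectionEncoder from the Http2ConnectionHandler my handler extends, as in the hello-world sample: the headers frame is written with endStream = false, the final data frame with endStream = true, and everything is flushed in one go.)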

In Http2FrameListener.onHeadersRead I have a graphResponseFuture, the future for my response, and when it completes I send the response like this, passing ctx.channel().eventLoop() as the executor so that the writes happen on the channel's event loop (see the GitHub link above for the full code):

graphResponseFuture.whenCompleteAsync({ graphResponse, throwable ->
    if (throwable != null) {
        log.error("Error processing data", throwable)
        val errorPayload = ctx.alloc().buffer().writeBytes("Internal server error".toByteArray())
        sendResponse(ctx, streamId, errorPayload, HttpResponseStatus.INTERNAL_SERVER_ERROR)
    } else {
        val responseBuffer = ctx.alloc().buffer()
        serializeGraphResponseToByteBuf(graphResponse, responseBuffer)
        sendResponse(ctx, streamId, responseBuffer)
    }
}, ctx.channel().eventLoop())

When testing, I can reproduce the problem by repeatedly issuing requests like this:

curl -v --http2-prior-knowledge "http://localhost:9093/graphql"

but I never see issues with

curl -v --http2-prior-knowledge "http://localhost:9093/"

The difference is that for the second request the response is sent back synchronously while the request handler is being invoked.
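For comparison, the synchronous path for / is roughly this (again a simplified sketch, not the exact linked code):

// Synchronous case: still on the event loop inside the frame listener callback,
// so headers and data are written and flushed before the callback returns.
if ("/" == headers.path()?.toString()) {
    val payload = ctx.alloc().buffer().writeBytes("Hello World".toByteArray())
    sendResponse(ctx, streamId, payload)
}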

James Sager

Apr 19, 2023, 7:11:53 AM
to ne...@googlegroups.com
Question: Is anyone having trouble with the new Eclipse?

I think it has to do with JRE 17 not being happy with Netty?

When I compile my project, it says:

Error: Could not find or load main class netty.DiscardServer

Caused by: java.lang.ClassNotFoundException: netty.DiscardServer


