JSON-RPC 2.0 specification prevents me from streaming batch requests and responses


Sergej Zagursky

Sep 3, 2015, 6:45:26 AM9/3/15
to JSON-RPC
Hi!

I'm going to implement a JSON-RPC 2.0 server capable of massive streaming, and I found that the specification is rather restrictive in that regard.

In theory I can read requests from the HTTP input stream one by one and feed them to the request executor. But the specification says that if the incoming JSON is not valid at all, I should return a single error. This is enforced by the following example (from the spec):
--> [
  {"jsonrpc": "2.0", "method": "sum", "params": [1,2,4], "id": "1"},
  {"jsonrpc": "2.0", "method"
]
<-- {"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}

At the time I parse a single request from the batch, I don't know whether the rest of the requests will be well-formed. So according to the specification I have to read the whole batch into memory, and only after that can I process the individual requests. This is exactly what I'm trying to avoid.

What I'd like to see is:
--> [
  {"jsonrpc": "2.0", "method": "sum", "params": [1,2,4], "id": "1"},
  {"jsonrpc": "2.0", "method"
]
<-- [
{"jsonrpc": "2.0", "result": 7, "id": "1"},
{"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}
]

Wouldn't it be better if the specification gave me a bit more freedom in the case of a malformed batch?

Roland Koebler

Sep 3, 2015, 7:05:57 AM9/3/15
to json...@googlegroups.com
Hi,

> What I'd like to see is:
>
> --> [
> {"jsonrpc": "2.0", "method": "sum", "params": [1,2,4], "id": "1"},
> {"jsonrpc": "2.0", "method"
> ]
> <-- [
> {"jsonrpc": "2.0", "result": 7, "id": "1"},
> {"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}
> ]
the problem is that it's difficult to determine where the error in the
JSON-RPC (batch) request occurred, and whether the 1st line really is
the complete, error-free 1st request. What would you do, e.g., in the following
case?

--> [
{"jsonrpc": "2.0", "id": "1", "method": "sum", "params": [1,2,4,]}
5]}

In addition, most JSON-decoders are non-streaming, so they couldn't
parse the request at all, since it is invalid JSON. So the best
choice here is to reject the whole batch and report that there was
an error. Trying to guess where the error might have been and trying to
partially correct it would make things much more complicated
and could result in unexpected errors.

> Wouldn't it be better if the specification gave me a bit more freedom in the
> case of a malformed batch?
I would suggest something different: do not use batches, but simply stream the
JSON-RPC requests.

As I said, most JSON-decoders are non-streaming and can only decode a
complete, valid JSON string. But if you have a streaming JSON-decoder (or a
JSON-splitter), you don't need batches at all -- you can simply send several
requests over the same connection:

--> {"jsonrpc": "2.0", "method": "sum", "params": [1,2,4], "id": "1"}
--> {"jsonrpc": "2.0", "method"

<-- {"jsonrpc": "2.0", "result": 7, "id": "1"},
<-- {"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}


best regards
Roland

Sergej Zagursky

Sep 3, 2015, 1:48:32 PM9/3/15
to JSON-RPC
I very much dislike "autocorrection" of requests. This is definitely not what I want.

So in the following case the very first request is malformed because of the trailing comma before the ].

--> [
  {"jsonrpc": "2.0", "id": "1", "method": "sum", "params": [1,2,4,]}
  5]}
> I would suggest something different: do not use batches, but simply stream the
> JSON-RPC requests.

I like your suggestion. But I doubt that any JSON-RPC client library in the wild actually supports this.
Still, it seems to be OK for my case. Thank you!

Nevertheless, I still think the JSON-RPC spec should not restrict streaming of batches.

Matt (MPCM)

Sep 3, 2015, 2:01:07 PM9/3/15
to JSON-RPC
Making the streaming parser work off of the (JSON) Text Sequences spec might be an interesting approach for segmenting out bad request payloads.

http://datatracker.ietf.org/doc/rfc7464/
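
For reference, the framing in RFC 7464 is minimal: each JSON text is
preceded by an ASCII record separator (0x1E) and followed by a line feed.
A tiny Python sketch (the helper name frame is hypothetical):

import json

RS, LF = b"\x1e", b"\x0a"   # record separator and line feed (RFC 7464)

def frame(obj):
    # Encode one JSON-RPC message as a JSON text sequence record.
    return RS + json.dumps(obj).encode("utf-8") + LF

print(frame({"jsonrpc": "2.0", "method": "sum", "params": [1, 2, 4], "id": "1"}))
# b'\x1e{"jsonrpc": "2.0", "method": "sum", "params": [1, 2, 4], "id": "1"}\n'

A reader that hits a bad record can simply skip ahead to the next 0x1E and
continue with the following record.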

A batch is very much a package of potential request objects. Streaming may look similar, but the goal there is usually not to consume requests as a package, as Roland mentioned.

Please post back about what you are building, since streaming parsers and processors haven't gotten much attention thus far, AFAIK.

Roland Koebler

Sep 4, 2015, 7:05:57 AM9/4/15
to JSON-RPC
Hi,

On Thu, Sep 03, 2015 at 11:01:06AM -0700, Matt (MPCM) wrote:
> Making the streaming parser work off of the (JSON) Text Sequences spec
> might be an interesting approach for segmenting out bad request
> payloads.
>
> http://datatracker.ietf.org/doc/rfc7464/
yes, this looks interesting.
Maybe we should add this as a note to the JSON-RPC spec, keep it in mind for
the next JSON-RPC specification, or add a separate document (similar to the
JSON-RPC-over-HTTP document).


The alternative is to use a JSON-splitter, like the one I wrote many years
ago (http://www.simple-is-better.org/json-rpc/jsonsplit.py). But the
above json-seq is superior, since it's simpler and allows recovery
after defective or truncated JSON parts.
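
To make the recovery property concrete, here is a small, self-contained
sketch in Python (the handler logic is purely illustrative, not from any
existing library) that frames the two requests from the original post as a
JSON text sequence, the second one truncated, and answers each record on
its own:

import json

RS, LF = b"\x1e", b"\x0a"

PARSE_ERROR = {"jsonrpc": "2.0",
               "error": {"code": -32700, "message": "Parse error"},
               "id": None}

def respond(record):
    # Turn one raw json-seq record into one JSON-RPC response object.
    try:
        req = json.loads(record)
    except ValueError:
        return PARSE_ERROR          # only this record is lost
    if req.get("method") == "sum":  # illustrative handler
        return {"jsonrpc": "2.0", "result": sum(req["params"]), "id": req["id"]}
    return {"jsonrpc": "2.0",
            "error": {"code": -32601, "message": "Method not found"},
            "id": req.get("id")}

# A complete request followed by a truncated one, as in the original post.
wire = (RS + b'{"jsonrpc": "2.0", "method": "sum", "params": [1,2,4], "id": "1"}' + LF +
        RS + b'{"jsonrpc": "2.0", "method"' + LF)

for record in wire.split(RS):
    if record:
        print(json.dumps(respond(record)))
# {"jsonrpc": "2.0", "result": 7, "id": "1"}
# {"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}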


So, I think json-seq/RFC 7464 is the best way to handle streaming
JSON-RPC. It's simpler than both streaming JSON-decoders and JSON-splitters,
it's more robust than e.g. JSON-RPC over netstrings, and it should
work quite well.


best regards
Roland
