If I understand correctly, pipelining would essentially be “send n GETs in quick succession over the same HTTP connection, then read their responses back in the same order”, which would be perfectly fine.
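For what it’s worth, here’s a rough Python sketch of that idea (the local server and `/1`, `/2` paths are invented for the demo): both GETs are written to the socket before either response is read, and the responses come back in request order on the one connection.

```python
import socket
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # keep-alive, so one connection serves both GETs

    def do_GET(self):
        payload = f"item:{self.path}".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence request logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Pipelining: write both requests before reading either response.
sock = socket.create_connection(server.server_address, timeout=5)
for path in ("/1", "/2"):
    sock.sendall(f"GET {path} HTTP/1.1\r\nHost: localhost\r\n\r\n".encode())

# Responses come back in request order on the same connection.
data = b""
while data.count(b"item:") < 2:
    data += sock.recv(4096)
sock.close()
server.shutdown()
print(data.count(b"200 OK"))  # → 2
```

(In the wild you’d lean on a client library rather than raw sockets, and note that many servers and proxies handle HTTP/1.1 pipelining poorly, which is part of why HTTP/2 multiplexing replaced it.)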
In terms of POST, the way I’ve tackled this in the past (and am recommending in an API I’m designing at the moment) is to use a parameter called `id[]` (with the `[]` signifying that multiple values are allowed). That way, a bunch of smarter web frameworks automagically treat the `id` (or `id[]`, depending on your framework) request parameter as an array, and even if yours doesn’t, getting at all the values shouldn’t be horrendously hard. (Though some servers, by default, may discard all values save the last.) It also means that, if a GET fails with a 413/414, the client should (for a given value of should, of course) be able to retry with a POST, dumping all the parameters into an `application/x-www-form-urlencoded` body.
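As a concrete illustration in Python (the stdlib’s `parse_qs` standing in here for whatever your framework does), repeated keys collect into a list, and the same encoded string parses identically whether it arrived as a query string or as a POST body:

```python
from urllib.parse import parse_qs, urlencode

# Repeated keys are collected into a list. Note that parse_qs keeps
# the literal key "id[]"; frameworks like Rails or PHP strip the
# brackets and expose the values under "id" instead.
params = parse_qs("id[]=1&id[]=2&id[]=3")
print(params["id[]"])  # → ['1', '2', '3']

# The same parameters encoded for an application/x-www-form-urlencoded
# POST body. The brackets get percent-encoded on the wire, but parsing
# the body recovers identical values.
body = urlencode([("id[]", v) for v in ("1", "2", "3")])
print(parse_qs(body) == params)  # → True
```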
I’d also be wary of anything that relies on GET for super-long queries, simply because you can’t guarantee what intermediate proxies and such will do to your query string.
Ultimately, pipelining and POST are the safer of the options, and the friendliest to your own systems and your consumers’ out of the box. If you want to be really nice to your consumers, support both.
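The GET-then-POST fallback I mentioned can be sketched like so (Python stdlib only; the toy server, which rejects every GET with a 414, and the `/items` path are invented for the demo): try the GET first, and on a 413/414 resend the exact same encoded parameters as the POST body.

```python
import threading
import http.client
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.parse import urlencode, parse_qs

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Stand-in for a server/proxy with a URI length limit:
        # reject every GET with 414 URI Too Long.
        self.send_error(414)

    def do_POST(self):
        length = int(self.headers["Content-Length"])
        ids = parse_qs(self.rfile.read(length).decode()).get("id[]", [])
        reply = str(len(ids)).encode()  # report how many ids arrived
        self.send_response(200)
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):  # silence request logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

# A batch of ids, encoded once and reused for both attempts.
params = urlencode([("id[]", str(i)) for i in range(300)])

conn = http.client.HTTPConnection(host, port)
conn.request("GET", "/items?" + params)
status = conn.getresponse().status  # 414 from this server

if status in (413, 414):
    # Retry: the same encoded parameters, this time as the POST body.
    conn = http.client.HTTPConnection(host, port)
    conn.request("POST", "/items", body=params,
                 headers={"Content-Type": "application/x-www-form-urlencoded"})
    resp = conn.getresponse()
    body_text = resp.read().decode()

print(status, resp.status, body_text)  # → 414 200 300
server.shutdown()
```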