> This works, but the downside is that each {...} of bytes has to be pulled into memory. And the function that is called is already designed to receive an io.Reader and parse the VERY large inner blob in an efficient manner.
Is the inner blob decoder actually using a json.Decoder, as shown in your example func secondDecoder()? If so, the simplest and most efficient answer is to create a persistent json.Decoder which wraps the underlying io.Reader directly, and just keep calling w2.Decode(&v). It will happily consume the stream, one object at a time.
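A minimal sketch of that approach (the record type and the literal input are just placeholders for whatever your real stream and types are):

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

// record stands in for whatever type your inner decoder actually produces.
type record struct {
	ID   int    `json:"id"`
	Name string `json:"name"`
}

func main() {
	// Stand-in for the real stream: a sequence of JSON objects.
	r := strings.NewReader(`{"id": 1, "name": "a"} {"id": 2, "name": "b"}`)

	dec := json.NewDecoder(r) // created once, wrapping the underlying io.Reader
	for {
		var v record
		err := dec.Decode(&v) // consumes exactly one object per call
		if err == io.EOF {
			break
		}
		if err != nil {
			panic(err)
		}
		fmt.Printf("%+v\n", v)
	}
}
```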
If that's not possible for some reason, then it sounds like you want to break the outer stream at outer object boundaries, i.e. { ... }, without fully parsing it. You can do that with json.RawMessage; a rough sketch is below.
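Here strings.NewReader stands in for whatever your real outer reader is:

```go
package main

import (
	"encoding/json"
	"io"
	"log"
	"strings"
)

func main() {
	// Stand-in for the real outer stream: pretty-printed objects back to back.
	outer := strings.NewReader(`{
  "id": 1
}
{
  "id": 2
}`)

	dec := json.NewDecoder(outer)
	for {
		// json.RawMessage captures the bytes of one complete {...} value
		// without unmarshalling it into a struct (the decoder still scans
		// the text to find where each object ends).
		var raw json.RawMessage
		err := dec.Decode(&raw)
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		log.Printf("got one object, %d bytes", len(raw))
	}
}
```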
However, you've still read each object as a stream of bytes into memory, and you've still done some of the work of parsing the JSON to find the start and end of each object. You can turn it back into an io.Reader by wrapping it in bytes.NewReader (or bytes.NewBuffer), if that's what the inner parser requires. But if each object is large, and you really need to avoid reading it into memory at all, then you'd need some sort of rewindable stream.
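For example, assuming the inner function looks something like func secondDecoder(r io.Reader) error (the name comes from your question; the exact signature is a guess), the loop body above could hand each object off like this:

```go
// Inside the decode loop from the previous sketch: turn the buffered
// object back into a stream for the inner parser.
if err := secondDecoder(bytes.NewReader(raw)); err != nil { // or bytes.NewBuffer(raw)
	log.Fatal(err)
}
```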
Another approach is to stop the source generating pretty-printed JSON, and make it generate JSON Lines (one object per line) instead. It sounds like you're unable to change the source, but you might be able to un-pretty-print the JSON with an external tool (jq can probably do this; jq -c . emits each object on a single line). Then you could make a custom io.Reader which returns data up to a newline, reports EOF there, and then gives you a fresh reader for the next line; a sketch of such a reader is below.
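Something along these lines (purely a sketch: byte-at-a-time reads keep it simple, and it assumes each line holds exactly one JSON object, as JSON Lines guarantees):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

// lineReader exposes an underlying stream one line at a time: Read reports
// io.EOF at each newline, and Next skips to the start of the following line.
type lineReader struct {
	br   *bufio.Reader
	done bool // true once the current line's newline has been consumed
}

func newLineReader(r io.Reader) *lineReader {
	return &lineReader{br: bufio.NewReader(r)}
}

func (l *lineReader) Read(p []byte) (int, error) {
	if len(p) == 0 {
		return 0, nil
	}
	if l.done {
		return 0, io.EOF
	}
	b, err := l.br.ReadByte()
	if err != nil {
		return 0, err // real EOF (or error) from the underlying stream
	}
	if b == '\n' {
		l.done = true
		return 0, io.EOF // end of this line's "virtual" stream
	}
	p[0] = b
	return 1, nil
}

// Next discards whatever is left of the current line (typically just the
// newline the inner parser didn't consume) and re-arms the reader.
func (l *lineReader) Next() error {
	if !l.done {
		if _, err := l.br.ReadString('\n'); err != nil {
			return err
		}
	}
	l.done = false
	return nil
}

func main() {
	src := strings.NewReader("{\"id\":1}\n{\"id\":2}\n")
	lr := newLineReader(src)
	for {
		var v map[string]any
		// Each line gets a fresh inner decoder that sees a bounded stream.
		if err := json.NewDecoder(lr).Decode(&v); err != nil {
			break // io.EOF once the underlying stream is exhausted
		}
		fmt.Println(v)
		if err := lr.Next(); err != nil {
			break
		}
	}
}
```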
But this is all very complicated, when keeping the inner Decoder around from object to object is a simple solution to the problem that you described. Is there some other constraint which prevents you from doing this?