Yeah, I understand where JSON has come from, but it has also proved to
be a very versatile data storage format and one that I would like to use
more generally in our domain, which is notorious for format
proliferation [1].
I should point out that in most cases this 'problem' is not an issue for
me, since objects with private fields are small enough not to result in
large allocations, but I am wondering about the design aspects to see
whether it is worth reconsidering for Go 2.
There is no in-principle reason not to store very large data sets in
JSON if one accepts that the same can be done with XML (and people do -
both in my field and elsewhere - e.g. Wikipedia).
There's a clear difference between the implementations of JSON encoding
and gob encoding in Go: handing an encoded slice of bytes to the gob
encoder is sensible there in a way that it is not for JSON. That is,
there is no reason that I can see to prevent the json.Encoder from
handing an io.Writer to an EncodeJSON method, much as is done for xml
(yes, this is not strictly true, but it is certainly close enough).
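To make the comparison concrete, here is a minimal sketch. Point shows
the current allocate-and-return contract; WriterMarshaler and
StreamPoint are hypothetical names I have made up, not part of
encoding/json, and only illustrate the writer-based alternative I mean.

package main

import (
	"fmt"
	"io"
	"os"
)

// Current contract: the value must allocate and return its encoding.
type Point struct{ X, Y int }

func (p Point) MarshalJSON() ([]byte, error) {
	return []byte(fmt.Sprintf(`{"x":%d,"y":%d}`, p.X, p.Y)), nil
}

// Hypothetical contract: the value writes its encoding directly, so the
// encoder could hand it the destination io.Writer pretty much as is.
type WriterMarshaler interface {
	MarshalJSON(w io.Writer) error
}

type StreamPoint struct{ X, Y int }

func (p StreamPoint) MarshalJSON(w io.Writer) error {
	_, err := fmt.Fprintf(w, `{"x":%d,"y":%d}`+"\n", p.X, p.Y)
	return err
}

func main() {
	b, _ := Point{1, 2}.MarshalJSON()
	os.Stdout.Write(append(b, '\n'))
	var m WriterMarshaler = StreamPoint{3, 4}
	m.MarshalJSON(os.Stdout)
}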
I don't see how the interaction between Marshal and Encoder impacts the
signature of MarshalJSON. If the signature were changed to
MarshalJSON(io.Writer) error there would be minimal change to the
package, mainly in {,addr}MarshalerEncoder [2], but also in compact,
which flows on to one other place (Compact), with trivial changes either
to shim (wrap the provided []byte with a bytes.Reader) or to alter the
signature (change it to Compact(io.Writer, io.Reader) error).
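For concreteness, a rough sketch of the shim option, assuming a
reader-based variant existed; compactReaders and compactBytes are
made-up names, and the compaction itself is stubbed out.

package main

import (
	"bytes"
	"io"
	"os"
)

// Hypothetical reader-based compaction. The body here just copies src
// to dst; real compaction would strip insignificant whitespace while
// respecting string contents. Only the signature matters for this sketch.
func compactReaders(dst io.Writer, src io.Reader) error {
	_, err := io.Copy(dst, src)
	return err
}

// The shim: existing []byte-based call sites keep working by wrapping
// the provided bytes in a bytes.Reader.
func compactBytes(dst *bytes.Buffer, src []byte) error {
	return compactReaders(dst, bytes.NewReader(src))
}

func main() {
	var buf bytes.Buffer
	if err := compactBytes(&buf, []byte(`{"a": 1}`)); err != nil {
		panic(err)
	}
	buf.WriteTo(os.Stdout)
}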
Finally, yeah. No. XML is not an option.
[1] http://www.biostars.org/p/7126/#7136
[2] http://golang.org/src/pkg/encoding/json/encode.go?s=12503:13250#L416
--
Omnes mundum facimus.
Dan Kortschak <
dan.ko...@adelaide.edu.au>
F9B3 3810 C4DD E214 347C B8DA D879 B7A7 EECC 5A40
10C7 EEF4 A467 89C9 CA00 70DF C18F 3421 A744 607C