On 01/03/14 01:10, Jon Cooper wrote:
> I'm parsing some rather wide JSON objects (~300 fields) which contain
> many attributes that derive from nullable SQL numeric types.
>
> Each of these attributes may take on values in the range [0, <max>] or null.
>
> I do not control the input JSON.
>
> What I would like to do is parse these into struct fields of primitive
> type, coercing JSON null to a magic value per golang type.
>
> For example, given a struct like:
>
> type Record struct {
>     A int
>     B float64
> }
> Parsing JSON like:
>
> {"A": null, "B": null}
>
> Should yield:
>
> {-1, -1.0}
>
> Options that I've considered:
>
> * typedef each numeric type to my own type, use custom UnmarshalJSON to
> do coercion, have to cast all over the place later
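For what it's worth, the custom-type option is less verbose than it
sounds. A minimal sketch, assuming -1 as the magic value (NullInt is a
made-up name; a NullFloat64 would look the same with float64 swapped in):

package main

import (
    "encoding/json"
    "fmt"
)

type NullInt int

// encoding/json calls UnmarshalJSON even for a JSON null, so the
// coercion can happen here.
func (n *NullInt) UnmarshalJSON(b []byte) error {
    if string(b) == "null" {
        *n = -1
        return nil
    }
    return json.Unmarshal(b, (*int)(n))
}

type Record struct {
    A NullInt
}

func main() {
    var r Record
    if err := json.Unmarshal([]byte(`{"A": null}`), &r); err != nil {
        panic(err)
    }
    fmt.Println(int(r.A)) // -1, and that conversion is the cost you mention
}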
> * json.Decoder with UseNumber() set, decode into map[string]interface{},
> manually type-assert and copy each field into instance of struct
Instead of manually copying each field, you could update the values in
the map[string]interface{}, encode it back to JSON, then decode that
into your struct. It might be slow, but it's relatively straightforward.
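Something like this, assuming -1 works as the magic value for every
field (UseNumber keeps the real numbers intact across the round trip):

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
)

type Record struct {
    A int
    B float64
}

func main() {
    input := []byte(`{"A": null, "B": null}`)

    // Decode into a generic map, keeping numbers as json.Number
    // so nothing gets mangled on the round trip.
    dec := json.NewDecoder(bytes.NewReader(input))
    dec.UseNumber()
    var m map[string]interface{}
    if err := dec.Decode(&m); err != nil {
        panic(err)
    }

    // Swap every null for the magic value.
    for k, v := range m {
        if v == nil {
            m[k] = -1
        }
    }

    // Encode back to JSON and decode into the real struct.
    fixed, err := json.Marshal(m)
    if err != nil {
        panic(err)
    }
    var r Record
    if err := json.Unmarshal(fixed, &r); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", r) // {A:-1 B:-1}
}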
> * several increasingly janky riffs using reflection
Did you consider using

type Record struct {
    A *int
    B *float64
}

Then you can find the nil values and update them afterwards?
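i.e. something along these lines, again assuming -1 as the magic value:

package main

import (
    "encoding/json"
    "fmt"
)

type Record struct {
    A *int
    B *float64
}

func main() {
    var r Record
    if err := json.Unmarshal([]byte(`{"A": null, "B": null}`), &r); err != nil {
        panic(err)
    }
    // A JSON null (or a missing key) leaves the pointer nil;
    // patch in the magic values afterwards.
    if r.A == nil {
        a := -1
        r.A = &a
    }
    if r.B == nil {
        b := -1.0
        r.B = &b
    }
    fmt.Println(*r.A, *r.B) // -1 -1
}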
> I'd really appreciate any ideas. Have been unable to come up with an
> approach that isn't fug. Perhaps the low-hanging fruit is to modify the
> encoding/json package, but it would be nice to find an idiomatic-ish
> approach.
There was a discussion here quite recently about setting default values
for json decoding. Unfortunately my search foo is too low to find it at
the moment!
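The gist, as I remember it: Unmarshal leaves non-pointer fields
untouched when it sees a JSON null, so you can pre-fill the defaults
before decoding. A minimal sketch:

package main

import (
    "encoding/json"
    "fmt"
)

type Record struct {
    A int
    B float64
}

func main() {
    // Pre-fill the magic values; Unmarshal treats JSON null as a
    // no-op for non-pointer fields, so these survive the decode.
    r := Record{A: -1, B: -1}
    if err := json.Unmarshal([]byte(`{"A": null, "B": null}`), &r); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", r) // {A:-1 B:-1}
}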
--
Nick Craig-Wood <ni...@craig-wood.com> -- http://www.craig-wood.com/nick