Parsing difficult JSON

Tobias Klausmann

2023/09/14 9:36:12
To: golan...@googlegroups.com
Hi!

I am trying to write a Prometheus exporter for stats queried from the Kea
DHCP server. Unfortunately, the JSON is structured very badly if I want
to use the base library JSON Unmarshal functionality:

{
  "arguments": {
    "cumulative-assigned-addresses": [ [ 1, "2023-09-13 12:08:09.597483" ], ... ],
    "pkt4-decline-received": [ [ 0, "2023-09-13 08:01:35.964113" ], ... ],
    "subnet[1].assigned-addresses": [ [ 1, "2023-09-13 08:01:36.014127" ], ... ],
    "subnet[1].cumulative-assigned-addresses": [ [ 0, "2023-09-13 08:01:36.014006" ], ... ],
    "subnet[1].declined-addresses": [ [ 0, "2023-09-13 08:01:36.014069" ], ... ],
    "subnet[1].reclaimed-declined-addresses": [ [ 0, "2023-09-13 08:01:36.014074" ], ... ],
    "subnet[1].reclaimed-leases": [ [ 0, "2023-09-13 08:01:36.014080" ], ... ],
    "subnet[1].total-addresses": [ [ 15, "2023-09-13 08:01:36.013997" ], ... ],
    "subnet[1].v4-reservation-conflicts": [ [ 0, "2023-09-13 08:01:36.014010" ], ... ],
    "subnet[2].assigned-addresses": [ [ 4, "2023-09-14 13:32:20.906085" ], ... ],
    "subnet[2].cumulative-assigned-addresses": [ [ 4, "2023-09-14 13:32:20.906090" ], ... ],
    "subnet[2].declined-addresses": [ [ 0, "2023-09-13 08:01:36.014088" ], ... ],
    "subnet[2].reclaimed-declined-addresses": [ [ 0, "2023-09-13 08:01:36.014096" ], ... ],
    "subnet[2].reclaimed-leases": [ [ 3, "2023-09-14 00:08:10.270122" ], ... ],
    "subnet[2].total-addresses": [ [ 223, "2023-09-13 08:01:36.014015" ], ... ],
    "subnet[2].v4-reservation-conflicts": [ [ 0, "2023-09-13 08:01:36.014025" ], ... ],
    "subnet[3].assigned-addresses": [ [ 1, "2023-09-13 08:01:36.014135" ], ... ]
    ... rest of subnet[3] and more subnets
  },
  "result": 0
}


The int, timestamp lists are already Not Great, but I can deal with
those.

The problem is the series of subnet[x] fields. They vary depending on
how many subnets the server serves, and nothing in the JSON indicates
how many there are. And even if it did: getting the stdlib JSON
Unmarshaler to actually pick them up (without hardcoded struct tags)
seems impossible, short of essentially writing my own JSON Unmarshaler
from scratch.

So I have three questions:

1. Am I missing some wildcard-ish functionality where I can tell the
stdlib JSON Unmarshaler to just make a slice out of all the JSON
elements that fit a pattern?
2. Is there a Golang JSON library that is better suited to dealing with
this?
3. What other options do I have (besides "use another
language/exporter", "make upstream produce better JSON" and "write
your own parser")?

Best & TIA,
Tobias

Peter Galbavy

2023/09/14 9:48:27
To: golang-nuts
You will need to create a custom type and unmarshal method. Plenty of examples if you search for "golang custom json unmarshal".

As I've only had to implement it for a couple of simple types, I can't offer my own valid example.
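
For the inner [ value, "timestamp" ] pairs, a custom unmarshaler along the lines Peter describes might look something like this; the Sample type, its field names and the time layout are my own assumptions here, not anything defined by Kea or a library:

package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// Sample is a hypothetical name for one [ value, "timestamp" ] pair from
// the Kea statistics output.
type Sample struct {
	Value     float64
	Timestamp time.Time
}

// UnmarshalJSON decodes a two-element JSON array into Value and Timestamp.
func (s *Sample) UnmarshalJSON(b []byte) error {
	var raw [2]json.RawMessage
	if err := json.Unmarshal(b, &raw); err != nil {
		return err
	}
	if err := json.Unmarshal(raw[0], &s.Value); err != nil {
		return err
	}
	var ts string
	if err := json.Unmarshal(raw[1], &ts); err != nil {
		return err
	}
	// Layout guessed from the sample timestamps (microsecond precision).
	t, err := time.Parse("2006-01-02 15:04:05.000000", ts)
	if err != nil {
		return err
	}
	s.Timestamp = t
	return nil
}

func main() {
	var series []Sample
	err := json.Unmarshal([]byte(`[[1, "2023-09-13 12:08:09.597483"]]`), &series)
	fmt.Println(series, err)
}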

Tobias Klausmann

2023/09/14 11:18:09
To: golan...@googlegroups.com
Hi!

On Thu, 14 Sep 2023, Peter Galbavy wrote:
> You will need to create a custom type and unmarshal method. Plenty of
> examples if you search for "golang custom json unmarshal".
>
> As I've only had to implement it for a couple of simple types, I can't offer
> my own valid example.

I am aware of the ability to make custom unmarshalers, but in this
particular case, I would have to make an unmarshaler for basically the
whole JSON - since I can't map all variants of foo[n].something to one
type/unmarshaler: the stdlib JSON functionality using struct tags does
not allow for wildcards/patterns, as far as I can tell, and I don't know
what range N might be in, so I can't hardcode it.

Best,
Tobias

Brian Candler

2023/09/14 11:45:39
To: golang-nuts
Could you just unmarshal it to a map[string]IntTSList, and then walk it and build whatever structure you want?

If you want to pick certain fixed fields into a struct and separate out the leftovers to be parsed as "subnet[X].Y", then see
https://stackoverflow.com/questions/33436730/unmarshal-json-with-some-known-and-some-unknown-field-names
The last answer points to some other JSON libraries which can handle this part for you - but you'd still have to process the "subnet[X].Y" keys.
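
Roughly what that map-based approach could look like; keaStats, IntTSList (borrowing Brian's placeholder name) and the subnet-key regexp are my own guesses, not names from Kea or any library:

package main

import (
	"encoding/json"
	"fmt"
	"regexp"
)

// IntTSList stands in for the per-metric value: a list of
// [ value, "timestamp" ] pairs, decoded loosely here.
type IntTSList [][]any

type keaStats struct {
	Arguments map[string]IntTSList `json:"arguments"`
	Result    int                  `json:"result"`
}

// subnetKey matches keys of the form "subnet[N].metric-name".
var subnetKey = regexp.MustCompile(`^subnet\[(\d+)\]\.(.+)$`)

func main() {
	data := []byte(`{
	  "arguments": {
	    "pkt4-decline-received": [[0, "2023-09-13 08:01:35.964113"]],
	    "subnet[2].assigned-addresses": [[4, "2023-09-14 13:32:20.906085"]]
	  },
	  "result": 0
	}`)

	var stats keaStats
	if err := json.Unmarshal(data, &stats); err != nil {
		panic(err)
	}
	// Walk the map and split subnet-scoped keys from global ones.
	for key, samples := range stats.Arguments {
		if m := subnetKey.FindStringSubmatch(key); m != nil {
			fmt.Printf("subnet=%s metric=%s samples=%v\n", m[1], m[2], samples)
		} else {
			fmt.Printf("global metric=%s samples=%v\n", key, samples)
		}
	}
}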

Tobias Klausmann

2023/09/14 13:05:52
To: golan...@googlegroups.com
Hi!

On Thu, 14 Sep 2023, Brian Candler wrote:
> Could you just unmarshal it to a map[string]IntTSList, and then walk it and
> build whatever structure you want?

I will try and make that work tomorrow, thanks for the hint!

> If you want to pick certain fixed fields into a struct and separate out the
> leftovers to be parsed as "subnet[X].Y", then see
> https://stackoverflow.com/questions/33436730/unmarshal-json-with-some-known-and-some-unknown-field-names
> The last answer points to some other JSON libraries which can handle this
> part for you - but you'd still have to process the "subnet[X].Y" keys.

That looks like it might work as an alternative. I suspected that _some_
JSON lib out there would allow for "unknown at compile time" fields
(without resorting to [][]interface{} --- which leads to doing all the
parsing myself). So thanks for helping me find a blade of hay in a
needlestack :)

Best,
Tobias

Jim Idle

2023/09/14 23:02:08
To: golan...@googlegroups.com
You can also unmarshal in stages to Raw and then unmarshal the next piece based on what you have so far. I presume that you cannot change the format? 
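
A rough sketch of that staged decoding, using json.RawMessage so each value is only decoded once you know what its key is; the field and variable names here are invented for illustration:

package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	data := []byte(`{
	  "arguments": {
	    "subnet[1].total-addresses": [[15, "2023-09-13 08:01:36.013997"]]
	  },
	  "result": 0
	}`)

	// Stage 1: decode only the envelope; the values stay as raw JSON.
	var envelope struct {
		Arguments map[string]json.RawMessage `json:"arguments"`
		Result    int                        `json:"result"`
	}
	if err := json.Unmarshal(data, &envelope); err != nil {
		panic(err)
	}

	// Stage 2: decide per key how to decode the deferred value.
	for key, raw := range envelope.Arguments {
		if strings.HasPrefix(key, "subnet[") {
			var pairs [][2]any
			if err := json.Unmarshal(raw, &pairs); err != nil {
				panic(err)
			}
			fmt.Println(key, "->", pairs)
		}
	}
}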

Tim Casey

2023/09/15 0:45:29
To: golang-nuts

I solved this in some bizarro way.  Maybe what is needed is a json parser which is allowed to be multiple types.  Or, another way to think about parsing which is easier on the type strictness, but still strongly typed.  Like a duck-typed struct of some sort.  map[string]duck, where duck is allowed to be map[string]duck.
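
For what it's worth, decoding into the empty interface already gives something close to that duck map: objects come back as map[string]any, arrays as []any, nested to any depth (a quick stdlib-only sketch, nothing thread-specific):

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Objects decode to map[string]any, arrays to []any, numbers to
	// float64 and strings to string, nested arbitrarily.
	var v any
	if err := json.Unmarshal([]byte(`{"a": {"b": [[1, "x"]]}}`), &v); err != nil {
		panic(err)
	}
	fmt.Printf("%#v\n", v)
}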
