type ObjectHandler interface {
	Set(key string, value interface{})
}

// UseObjectHandler creates an object that takes the place of
// map[string]interface{} for free-form JSON.
func (d *Decoder) UseObjectHandler(newObj func() ObjectHandler)
I am parsing a large JSON document.
The data in the source document is laid out in a fairly organized way, such that it makes sense to read it: related fields are clustered together, and so on.
The way that I render this data makes the most sense to the end user if it is organized in the same order as it appears in the JSON source data.
Currently, the data is being rendered in a confusing way because the keys are jumbled by the map. Even worse, they are not the same from one run to another. Simply being able to get the original key order information from the decoder would make all the difference.
I know we are still talking API philosophy, but I had some thoughts about a more natural API implementation than I originally suggested. I'll go ahead and throw it out there.
type Member struct {
	Key   string
	Value interface{}
}

type OrderedObject []Member

func (*Decoder) UseOrderedObject()
If UseOrderedObject is called on the Decoder, then the Decoder will return an OrderedObject rather than a map[string]interface{} when each JSON object is decoded.
Similarly, when the Encoder marshals an OrderedObject, it would write out the JSON object preserving order in the output stream.
This is not a question of whether the current implementation conforms to the spec (it does). I'm also not arguing that the JSON encoder/decoder should preserve order in every case, but rather that there should be some way for the developer to determine the original order and write out an object in a predictable order.
There are many cases where JSON is used as a (semi-)human-readable config format, for example. Having the file scramble itself every time you pass it through the decoder/encoder is not user-friendly.
In my case, I am provided with arbitrarily complex JSON data, which I don't have control over. I'm not displaying that information to the end user directly; I transform it and display it back to the end user in a form whose structure is similar to the original JSON (i.e. ordering, clustering, naming). The original JSON has a logical order (guaranteed or not), and the end user benefits by seeing their output ordered in the same general way.
I don't think this is an uncommon scenario.
- Imagine if you were writing a configuration file editor (possibly a web page), where the end user could use a friendly interface to edit fields that were stored in an underlying JSON file. If the order of the fields were randomly changing during the process, that would be pretty unworkable as a solution.
- Another case: say someone has serialized data from a database table as an array of objects, where each record is an object and each field is rendered as a JSON key/value pair. If you are displaying this data to the end user, showing them records whose columns appear in randomly different orders would be a real problem.
Is using JSON as a config file a good decision in the first place?
* Can you change from JSON to something else? Maybe XML or EDN is better suited for this?
* Can you change the structure of the JSON? If the order is important, use a JSON array.
If you really have no other option, i.e. changing the structure/format, I would suggest forking the json package into json2/json-ordered or something. There are also tons of JSON packages - maybe one of them already solves your problem: http://godoc.org/?q=json; although I didn't notice one.
Alternatively, there is a pending issue for a json Tokenizer (https://github.com/golang/go/issues/6050). With a Tokenizer, implementing your use case would become easier.
import (
	"bytes"
	"encoding/json"
)

// D is an ordered list of name/value pairs, in the spirit of bson.D.
type D []struct {
	Name  string
	Value interface{}
}

// MarshalJSON writes d as a JSON object, preserving element order.
func (d D) MarshalJSON() ([]byte, error) {
	if d == nil {
		return []byte("null"), nil
	}
	var b bytes.Buffer
	b.WriteByte('{')
	for i, v := range d {
		if i > 0 {
			b.WriteByte(',')
		}
		// marshal key
		key, err := json.Marshal(v.Name)
		if err != nil {
			return nil, err
		}
		b.Write(key)
		b.WriteByte(':')
		// marshal value
		val, err := json.Marshal(v.Value)
		if err != nil {
			return nil, err
		}
		b.Write(val)
	}
	b.WriteByte('}')
	return b.Bytes(), nil
}
Ignoring the fact that this is a terrible feature of MongoDB, this is a large part of why gopkg.in/mgo.v2/bson provides the bson.D type - it can be used to provide keys in an arbitrary order.
[{"Name":"find","Value":"myColl"},{"Name":"filter","Value":[{"Name":"category","Value":"cafe"}]}]
If you're going to talk to MongoDB in Go, I'd strongly suggest using the above package rather than producing commands for the mongodb shell (strictly speaking, the shell doesn't even produce/accept JSON, as it has extra types for its values).
You could probably use the Decoder type if you really needed to do this.
Still, though, I'm surprised that so many functions in the json package are internal; it would be pretty easy to extend the code if some of that functionality were public.