There are two big problems with this:
- JSON has no concept of references; it's a purely algebraic
structure. As a consequence, you can't serialize graph structures with
it. So you need to come up with a special key/value pair that denotes a
reference, and that convention is entirely your own design (see the
sketch after this list).
- if that behavior were the default, it would mean that e.g. returning
a user on Facebook would return all of his 4000 friends. Which is a rather
dumb thing to do if all you want is a user's primary attributes, don't
you agree?
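To illustrate, here is a tiny sketch of such a home-grown convention
(the "$ref" key and the user object with its id/name/friends attributes
are all made up for the example):

def encode_user(user, seen=None):
    """Turn a user into plain dicts, emitting an ad-hoc reference marker
    instead of recursing into a friend we have already serialized."""
    seen = set() if seen is None else seen
    if user.id in seen:
        # our own invented convention: "this is a reference, not an object"
        return {"$ref": "user/%s" % user.id}
    seen.add(user.id)
    return {
        "id": user.id,
        "name": user.name,
        "friends": [encode_user(f, seen) for f in user.friends],
    }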
So I'm afraid this can't be implemented the way you wish. I'm not sure
how this is dealt with in TG2, but in TG1 it was very easy to declare a
special rendering for specific objects, based on rule dispatch. You
should try and do that.
Diez
But the __json__ method is exactly such a specification. And both the
__json__ method and any declarative approach you suggest (which
would be *very* hard to implement for us, because it's
SQLAlchemy/SQLObject that we'd need to change) are IMHO too limited anyway,
because there might be occasions where I need different JSON
representations for the same classes.
So instead, I suggest you familiarize yourself with the simplejson API.
There you can declare custom serializers, and if you want, write one
that knows how to fully traverse an SA object and its relations.
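A rough sketch of what I mean (purely illustrative; the
_sa_instance_state check is just a crude way to spot a mapped instance,
and only attributes that are already loaded end up in __dict__):

import simplejson

class SAEncoder(simplejson.JSONEncoder):
    """Sketch: serialize a mapped object's loaded attributes, letting
    simplejson recurse into related objects it finds in the values."""

    def default(self, obj):
        if hasattr(obj, '_sa_instance_state'):   # crude "is this mapped?" check
            props = {}
            for key, value in obj.__dict__.items():
                if key.startswith('_sa_'):       # skip SA's internal bookkeeping
                    continue
                props[key] = value
            return props
        return simplejson.JSONEncoder.default(self, obj)

# usage: simplejson.dumps(some_mapped_object, cls=SAEncoder)
# beware: with cyclic relations this recurses forever -- exactly the
# graph problem described above.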
This could very well be a recipe in the docs - but I don't think it's
justified to include it in the core, as the need for customization is
so high that everybody will have to write their own JSON encoders anyway.
Diez
we have 2 choices:
1. don't use SA relationships, so that the default encoding mechanism of
turbojson works
2. write a customized "__json__"
I think the 1st choice is not acceptable in most cases, but the 2nd
one is also annoying when we have more than 40 tables.
Pulling "everything in" is bad, and putting in too little is also bad.
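To make the trade-off concrete, a hand-written __json__ usually looks
something like this (class and columns invented for the example), and
repeating it for 40+ tables is the annoying part:

from sqlalchemy import Column, Integer, Unicode
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Article(Base):
    __tablename__ = 'articles'
    id = Column(Integer, primary_key=True)
    title = Column(Unicode(200))

    def __json__(self):
        # hand-picked attributes only; nothing gets pulled in implicitly
        return {'id': self.id, 'title': self.title}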
__json__ was invented for these cases. Now could you please explain
why 40 tables = 40 __json__'s? Sorry to say this, but if you have
back-and-forth relationships to everything, then there is something
wrong with your DB design rather than with the json layer; all those
joins will be painful and your network payloads (the size of the JSON
data) will be huge.
that isn't really true. Almost every json string will be smaller than
the XML equivalent. The only exception is some deformation (which, by
the way, somewhat breaks the idea of XML) that some protocols do; for
example, the webservices defined by WSGI have an "optimization" where
you can extract a chunk of the XML and replace it with a reference, then
put that chunk further down in the XML stream. That saves space because
the block appears only once in the stream and you have several
references to it. That in itself has two major problems:
1- It breaks the concept that XML is human readable, because it breaks
the flow, and more importantly
2- It means your data model is broken! Why do you want to store the
same information twice in the same stream??
> Also, I think you need
> Elixir installed on top of SQLAlchemy to handle M:N relationships in
> the traditional fashion. Then maybe it'd be simpler.
huh? have you used SA declarative? I think you are confusing active
record (what you call traditional) with data mapper (SA's way).
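Just to show what I mean, plain declarative SA handles M:N without
Elixir; a minimal sketch (tables and classes invented here; relation()
is spelled relationship() in newer SA):

from sqlalchemy import Column, ForeignKey, Integer, Table, Unicode
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relation

Base = declarative_base()

# a plain association table -- no mapped class and no Elixir required
user_group = Table('user_group', Base.metadata,
    Column('user_id', Integer, ForeignKey('users.id')),
    Column('group_id', Integer, ForeignKey('groups.id')))

class Group(Base):
    __tablename__ = 'groups'
    id = Column(Integer, primary_key=True)
    name = Column(Unicode(50))

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(Unicode(50))
    groups = relation(Group, secondary=user_group, backref='users')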
Your solution for this problem could work with the
"when" decorator, but I don't see it as the best solution. Which
criteria will you use to strip all unwanted attributes from all
objects? It seems to me like a very big set of hasattr checks that will
return false on all objects that don't have the attribute and true only
on the few that do.
Wouldn't it be better to subclass the serializer and do things from
there? Again, what would the criteria be? Will anyone come up with an
example where this is needed?
Umm, this is interesting: AssociationProxy is an extension that doesn't
follow the _sa_ convention. And I agree this is a rather simple patch.
In fact it should be as simple as adding a new function to
http://svn.turbogears.org/projects/TurboJson/trunk/turbojson/jsonify.py
Will you work on a patch + test for it? I don't have any models around
that use that extension.
> My model only had one class that used an association proxy, but if you had a
> bunch of classes that had a reasonably finite set of objects in an
> association proxy (say <10 keywords each), I think that's a use case that
> justifies a global way to do it for a given project, such as the when
> decorator. And if they were all called "keywords", then it's only one
> hasattr to check.
> I hadn't thought about subclassing the whole serializer. I think that's a
> little bit more intimidating, but like I said, I'm pretty new to TG, so
> I'm still trying to absorb it all.
> In my case, subclassing would have made sense, because I wanted to take out
> something (_AssociationProxy_* keys) and I couldn't figure out how to do
> that in a when
> decorator without basically overriding the whole process anyway.
In TurboJson this would be done with a simple jsonify.when-decorated
function that magically gets called by TurboJson, as explained
here: http://docs.turbogears.org/1.0/JsonifyDecorator
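From memory, the TG1-era pattern looked roughly like this (the User
class is made up; see the page above for the real docs):

from turbojson.jsonify import jsonify

@jsonify.when('isinstance(obj, User)')   # RuleDispatch string condition
def jsonify_user(obj):
    # return exactly the dict you want sent over the wire for this type
    return dict(id=obj.id, name=obj.name)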
Now, as I said, that has been removed (although an equivalent
implementation could be put in place for 2.1; the question is, do we
want it?). The proper way will be to subclass the Encoder here and (for
now) monkey patch L39: _instance = GenericJSON(), to put in your
modified instance. Suggestions on how to make that better are welcome.
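Concretely, something along these lines (a sketch only -- the import
path and attribute names are taken from the jsonify.py file linked
below and may change; the timedelta handling is just an example
addition):

import datetime
from tg import jsonify as tg_jsonify

class MyJSON(tg_jsonify.GenericJSON):
    """Stock encoder plus handling for one extra type."""
    def default(self, obj):
        if isinstance(obj, datetime.timedelta):
            # serialize durations as a number of seconds
            return obj.days * 86400 + obj.seconds
        return super(MyJSON, self).default(obj)

# the "monkey patch L39" part: swap in our instance for the module-level one
tg_jsonify._instance = MyJSON()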
So, in summary, I believe subclassing GenericJSON + telling TG to use
your version is WAY simpler than having someone go try to understand
generic functions and then implement one.
Back to the specific case: I really believe this should be patched into
tg2.1 and TurboJson, as AssociationProxy needs to be filtered (at least
by default) just like every other SA relation.
>
> I think the decorator syntax is nice and clean, and the best examples are in
> TurboJson itself: datetime, decimal and explicit. Presumably those are
> handled in TG 2.1 as well, so hopefully there's a nice way to subclass and
> override similar types of objects, such as a duration or a currency.
yes indeed, the functionality is 99% equivalent; the only thing missing
is the "when". The problem is that the old implementation used generic
functions, which created a ton of dependencies and did not bring
in a lot. I know it sounds like a lot, but pretty much all of TurboJson
was replaced by one 50-line file,
http://hg.turbogears.org/tg-21/src/tip/tg/jsonify.py, which is
equivalent and I'm pretty sure is faster.
>
> Umm, this is interesting: AssociationProxy is an extension that doesn't
> follow the _sa_ convention. And I agree this is a rather simple patch.
> In fact it should be as simple as adding a new function to
> http://svn.turbogears.org/projects/TurboJson/trunk/turbojson/jsonify.py
> Will you work on a patch + test for it? I don't have any models around
> that use that extension.
Yeah, I should be able to do that next week. Though looking at 2.1 (or
2.0 for that matter), I don't think it can be done in a function; it's
a one-line change to the existing conditional:
if not key.startswith('_sa_') and not key.startswith('_AssociationProxy_'):
Are there instructions to get a source environment for 2.1 anywhere,
or is it just as simple as replacing Subversion with Mercurial in
the 2.0 docs?
> Now, as I said, that has been removed (although an equivalent
> implementation could be put in place for 2.1; the question is, do we
> want it?). The proper way will be to subclass the Encoder here and (for
> now) monkey patch L39: _instance = GenericJSON(), to put in your
> modified instance. Suggestions on how to make that better are welcome.
I'm not familiar with what "monkey patch L39" references. As far as
making it easier to subclass, I think some additional callback to
deal with relationships would be extremely helpful, and would save the
inheritor from having to reimplement the entire is_saobject(obj)
condition, including the filtering out of private SA attributes, which
may be subject to change. Just implementing your own default won't do
it, because the key is skipped before the value. Actually, what about
moving the startswith('_sa_') check into its own function that could
then be overridden? That would allow derived classes to exclude at the
field or relationship level, using the key, while still calling
through to the superclass that implements the private-key
behavior. Is that what you meant by a new function above? If so, I
guess it just took me a little while to get there :)
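Something like this is what I have in mind (a pure sketch -- the
include_key name and the is_saobject check here are invented/simplified;
only the startswith('_sa_') test mirrors what the current code does):

from simplejson import JSONEncoder

def is_saobject(obj):
    # simplified stand-in for the real check in jsonify.py
    return hasattr(obj, '_sa_class_manager')

class GenericJSON(JSONEncoder):
    def include_key(self, obj, key):
        """Overridable hook: decide which attributes get serialized.
        The base class only hides SA's internal bookkeeping."""
        return not key.startswith('_sa_')

    def default(self, obj):
        if is_saobject(obj):
            return dict((key, getattr(obj, key))
                        for key in obj.__dict__
                        if self.include_key(obj, key))
        return JSONEncoder.default(self, obj)

class MyJSON(GenericJSON):
    def include_key(self, obj, key):
        # additionally drop AssociationProxy bookkeeping, then defer to base
        if key.startswith('_AssociationProxy_'):
            return False
        return GenericJSON.include_key(self, obj, key)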