Hello
> The first reason has to do with application size. Consider the fact that for
> every type you want to serialize across the wire through RPC, the GWT
> compiler must generate that serialization code.
I don't quite understand this explanation.
I would expect that each serializable type would have only one
serializer and one deserializer generated. Types composed of other
types would then reuse the de/serializers of those types. There would
be one universal method for serializing any (serializable) object to a
stream, and one universal method for deserializing a stream back into
the object it contains; both would dispatch to the generated
serializers described above. I don't think this would produce more JS
than is generated now: we need to serialize the primitive types, their
object counterparts, strings, arrays, lists, maps, and sets (i.e. the
various collections), and then only the types that implement
Serializable. And I think users make a type implement Serializable
only if they want it to be passed over RPC.
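To make the scheme I have in mind concrete, here is a minimal sketch, under my own assumptions (the names Serializer, REGISTRY, and serialize are illustrative, not real GWT internals): one serializer per type in a registry, with a composite serializer such as the list one reusing the element serializers through a single universal entry point.

```java
import java.util.*;

// Sketch of the idea above: one generated serializer per type, held in
// a registry; composite serializers reuse element serializers through
// one universal dispatch method. Names are hypothetical, not GWT APIs.
public class SerializerSketch {
    interface Serializer<T> {
        String write(T value);  // serialize one value to its stream form
    }

    private static final Map<Class<?>, Serializer<?>> REGISTRY = new HashMap<>();

    static <T> void register(Class<T> type, Serializer<T> s) {
        REGISTRY.put(type, s);
    }

    // The one universal method: dispatch on the runtime type.
    @SuppressWarnings("unchecked")
    public static String serialize(Object value) {
        Serializer<Object> s = (Serializer<Object>) REGISTRY.get(value.getClass());
        if (s == null)
            throw new IllegalArgumentException("not serializable: " + value.getClass());
        return s.write(value);
    }

    static {
        register(Integer.class, v -> "I:" + v);
        register(String.class, v -> "S:" + v);
        // The list serializer adds no per-element code of its own; it
        // reuses whatever element serializers are already registered.
        register(ArrayList.class, list -> {
            StringBuilder sb = new StringBuilder("L[");
            for (Object e : (ArrayList<?>) list) sb.append(serialize(e)).append(';');
            return sb.append(']').toString();
        });
    }

    public static void main(String[] args) {
        System.out.println(serialize(new ArrayList<>(List.of(42, "hi"))));
        // prints: L[I:42;S:hi;]
    }
}
```

The point is that adding a new composite type costs one small serializer, not a fresh copy of the serialization code for every contained type.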
> The second reason has to do with application security. If your application
> could serialize any Object subclass, that opens up a major security
> vulnerability in that an evildoer could exploit the serialization code and
> inject a type that could essentially be deserialized into any type that
> extends Object.
Once I accept Object on the server side, I must be able to handle any
type. That is the responsibility of the application, not of the
serialization layer. If I test whether the object is an int, double,
String, or Date, and throw IllegalArgumentException otherwise, I see
no security problem if someone injects a FooObject. Or am I missing
something? Do you have an example where a service expecting Object
could be "hacked" by injecting a different object?
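Here is the kind of defensive check I mean, as a minimal sketch (the handler name and return value are hypothetical): a service method that accepts Object validates the runtime type itself and rejects everything else.

```java
import java.util.Date;

// Sketch of the server-side guard described above: the application,
// not the serialization layer, whitelists the acceptable types.
public class TypeGuard {
    public static String handle(Object payload) {
        if (payload instanceof Integer || payload instanceof Double
                || payload instanceof String || payload instanceof Date) {
            return "ok: " + payload.getClass().getSimpleName();
        }
        // An injected FooObject lands here instead of being acted on.
        throw new IllegalArgumentException(
                "unexpected type: " + payload.getClass().getName());
    }

    public static void main(String[] args) {
        System.out.println(handle(42));       // prints: ok: Integer
        System.out.println(handle("hello"));  // prints: ok: String
    }
}
```

With a guard like this, an unexpected type is only ever inspected and rejected, which is why I don't yet see where the exploit would come in.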
Thanks in advance for any explanations.