In our application, some of the Direct requests may be lengthy, and
sometimes they even time out. If a single call in the HTTP batch times
out, the whole batch aborts. The client then retries the request (the
whole batch), which may time out again, even though some of the
individual Direct calls may have already succeeded -- so those calls
end up executing multiple times.
It occurs to me that the DoPost method could start an execution thread
for each Direct transaction. This would allow the method calls to run
in parallel. It could also wait for some timeout value, and return the
responses for any of the requests that succeeded. The response for the
ones that didn't complete could be null or contain some error code. I
have started a conversation thread in the Ext.Direct forum to see how
I can handle these batched replies on the client side. Worse comes to
worse, each callback could handle the retry.
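
Just to make the idea concrete, here is a rough sketch of the parallel
part. This is not ExtDirectHandler's real API; DirectRequest and
executeSingleCall are made-up stand-ins for whatever DoPost uses
internally to represent and dispatch one transaction.

using System;
using System.Linq;
using System.Threading.Tasks;

// Made-up stand-in for one decoded transaction of the batched POST body.
public class DirectRequest { public int Tid; public string Action; public string Method; public object[] Data; }

public static class ParallelBatch
{
    // Runs every transaction of the batch on the thread pool and waits at most
    // 'budget' for the slowest one, then hands the tasks back so the caller
    // can see which ones actually finished in time.
    public static Task<object>[] Run(DirectRequest[] batch, TimeSpan budget,
                                     Func<DirectRequest, object> executeSingleCall)
    {
        var tasks = batch.Select(req => Task.Run(() => executeSingleCall(req))).ToArray();
        try
        {
            Task.WaitAll(tasks, budget); // returns early once the budget expires
        }
        catch (AggregateException)
        {
            // A failing call should become an error entry in its own response,
            // not abort the whole HTTP request.
        }
        return tasks;
    }
}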
Is this something that seems valuable? It would help us immensely with
performance and with some errors caused by repeated calls.
http://www.sencha.com/forum/showthread.php?160579-Accessing-Batched-Calls
The problem we're having is that, for example, we could have 3 calls
that get batched, 2 of which are very small and quick, and one that is
very lengthy. Let's say that the long one is the second one. The first
call completes, but the second one takes the whole HTTP request beyond
the timeout. The JavaScript then retries the whole request (all 3
calls), even though the first one was successful. Now the server is
still working through the original request (it may eventually finish
the long call and the last small one), while another thread starts
executing all 3 calls again.
What I'm thinking is that, after starting one execution thread for
each call, the response array could be updated by each of the threads.
At some point, the server could return the partial reply, with result
data in the first and third elements of the array and null (or some
other indication) in the second one. The JavaScript side could decide
to retry only the empty/failed call (the second one).
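
Continuing the sketch from my previous message (again with made-up
types, reusing the DirectRequest placeholder; these are not
ExtDirectHandler's real response classes), the partial reply could be
assembled by walking the task array and leaving null wherever a call
has not finished:

using System.Linq;
using System.Threading.Tasks;

// Made-up response element; the real handler would serialize its own type.
public class DirectResponse { public int Tid; public bool Completed; public object Result; }

public static class PartialReply
{
    public static DirectResponse[] Build(DirectRequest[] batch, Task<object>[] tasks)
    {
        return batch.Select((req, i) =>
        {
            bool done = tasks[i].Status == TaskStatus.RanToCompletion;
            return new DirectResponse
            {
                Tid = req.Tid,
                Completed = done,
                // Only calls that finished within the budget carry result data;
                // the rest stay null so the client can retry just those tids.
                Result = done ? tasks[i].Result : null
            };
        }).ToArray();
    }
}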
I suggested a timeout so that ExtDirectHandler would have a way to
decide when to send the reply. It would have to be somewhat shorter
than the client-side request timeout, and this delta could also be
configurable.
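
For example, if the client-side request timeout is 30 seconds, the
handler could stop waiting a little earlier so the partial reply still
makes it back in time. The key name and default below are purely
illustrative, not an existing ExtDirectHandler setting:

using System;
using System.Configuration;

public static class BudgetConfig
{
    // Illustrative only: read the configurable margin (the "delta")
    // from appSettings and subtract it from the client's request timeout.
    public static TimeSpan GetBudget(TimeSpan clientRequestTimeout)
    {
        // Hypothetical key name; defaults to 5 seconds if not configured.
        string raw = ConfigurationManager.AppSettings["ExtDirect.TimeoutMarginSeconds"];
        int marginSeconds = string.IsNullOrEmpty(raw) ? 5 : int.Parse(raw);
        return clientRequestTimeout - TimeSpan.FromSeconds(marginSeconds);
    }
}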
In our case, we usually have several calls batched together, maybe 6
to 10, most of which would return very quickly, but they all keep
failing because of one call that pushes the whole request past the
timeout. Taken alone, that long call might not exceed the timeout
period. If the calls were run in parallel, chances are none of them
would time out. We've also tried increasing the timeout, but then the
UI becomes sluggish.
On Dec 8, 12:04 pm, Gian Marco Gherardi <gianmarco.ghera...@gmail.com>
wrote:
> > the DoPost method could start an execution thread
> > for each Direct transaction. This would allow the method calls to run
> > in parallel
>
> Yes, executing batched requests in different threads can boost performance.
> The implementation is not trivial, but it can be done.
>
> > It could also wait for some timeout value
>
> In my opinion, ExtDirectHandler should not handle timeouts, because this
> relates to each direct method's implementation, so every method should
> handle its own timeout if needed.
>
> > and return the responses for any of the requests that succeeded
>
> I don't know if I've understood correctly, but there can be only a single
> HTTP response for every HTTP request, so if multiple direct method calls are
> bundled in a single HTTP request, all the responses have to be returned
> together in the same HTTP response.
>
> > The response for the ones that didn't complete could be null or contain
> > some error code
>
> This is how it currently works: if you batch 3 calls and one of those calls
> results in an error, the response will bundle 2 success responses and 1
> error response.
>
> > I have started a conversation thread in the Ext.Direct forum to see
> > how I can handle these batched replies on the client side.
>
> Interesting, can you point me to the thread?
>
> Gian Marco Gherardi
> http://gianmarco.gherardi.me