
Interface between main thread and cache & socket transport service threads


Brian Smith

Jan 13, 2012, 5:33:24 PM
to dev-tech-network
Bug 717761 [1] boils down to the fact that in some cases we are blocking the main thread on I/O that is happening on the cache thread, by having the main thread wait to acquire a lock held by the cache thread.
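
To make the failure mode concrete, here is a minimal C++ sketch (not actual Necko code; the names and numbers are made up) contrasting the two patterns: the main thread blocking on a lock the cache thread holds across disk I/O, versus handing the request off and getting the answer back through a callback.

#include <chrono>
#include <functional>
#include <iostream>
#include <mutex>
#include <thread>

// A mutex the cache thread holds across (slow) disk I/O.
std::mutex gCacheLock;

// Pattern 1 (the problem): the caller blocks until the cache thread drops the
// lock, so main-thread responsiveness depends on cache-thread disk I/O.
int GetCacheSizeSync() {
  std::lock_guard<std::mutex> lock(gCacheLock);
  return 42;  // stand-in for the real answer
}

// Pattern 2 (the goal): the caller hands over a callback and returns at once;
// the answer is computed off the main thread and delivered later. (Real code
// would re-dispatch the callback to the main thread's event queue.)
std::thread GetCacheSizeAsync(std::function<void(int)> callback) {
  return std::thread([callback] {
    std::lock_guard<std::mutex> lock(gCacheLock);
    callback(42);
  });
}

int main() {
  // Simulate the cache thread holding the lock across a 200 ms disk read.
  std::thread cacheThread([] {
    std::lock_guard<std::mutex> lock(gCacheLock);
    std::this_thread::sleep_for(std::chrono::milliseconds(200));
  });

  // Pattern 1 would stall right here for up to the full 200 ms:
  //   int size = GetCacheSizeSync();

  // Pattern 2: request the answer and keep processing other events.
  std::thread worker = GetCacheSizeAsync(
      [](int size) { std::cout << "cache size (async): " << size << "\n"; });

  cacheThread.join();
  worker.join();
  return 0;
}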

Apparently, the cache still exposes a few public functions that are synchronous. Bug 695399 [2] is about removing those functions. I believe we have to get rid of all of these synchronous APIs in order to handle bug 660749 [3], because AFAICT any solution for bug 660749 will potentially make a cache lookup (for HTTPS pages) require a certificate validation, which may in turn require network I/O (for OCSP) and/or disk I/O--even if the cached resource is in the *memory* cache. Currently, we apparently assume that anything in the memory cache can safely be accessed synchronously, but that would no longer be true--especially since we don't have a solid plan for using the memory cache for anything other than no-store entries, and that isn't a case worth optimizing for.
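
As a rough sketch of what retiring one of these synchronous entry points could look like (hypothetical types and function names, not the real cache interfaces): the caller stops expecting an immediate answer and supplies a listener instead, which also leaves room to insert an asynchronous certificate-validation step before the entry is handed out.

#include <iostream>
#include <string>

// Hypothetical cache entry; a stand-in for whatever the real cache hands out.
struct CacheEntry {
  std::string key;
  std::string body;
};

// Before (the kind of API bug 695399 wants to remove):
//   CacheEntry* OpenCacheEntrySync(const std::string& key);  // blocks on I/O
//
// After: the caller supplies a listener and returns immediately, which leaves
// room for the cache to do disk I/O -- and, for HTTPS entries, asynchronous
// certificate validation (OCSP network I/O) -- before answering.
class CacheEntryListener {
 public:
  virtual ~CacheEntryListener() = default;
  virtual void OnEntryAvailable(CacheEntry* entry) = 0;  // entry may be null
};

// Toy implementation: the real thing would dispatch to the cache thread and
// call the listener back from there (or via the main thread's event queue).
void AsyncOpenCacheEntry(const std::string& key, CacheEntryListener* listener) {
  static CacheEntry entry;
  entry.key = key;
  entry.body = "cached bytes";
  listener->OnEntryAvailable(&entry);
}

// Example caller: the continuation that used to follow the synchronous call
// moves into the listener.
class PageLoad : public CacheEntryListener {
 public:
  void Start(const std::string& key) { AsyncOpenCacheEntry(key, this); }
  void OnEntryAvailable(CacheEntry* entry) override {
    if (entry) {
      std::cout << "serving " << entry->key << " from cache\n";
    } else {
      std::cout << "cache miss; going to the network\n";
    }
  }
};

int main() {
  PageLoad load;
  load.Start("https://example.com/script.js");
  return 0;
}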

(Having said that, we definitely haven't decided that we're only going to use the memory cache for no-store entries. However, bug 665707 makes me think that perhaps Necko shouldn't be the thing maintaining memory caches; rather, we should have some higher-level non-network component, layered on top of Necko, managing any memory caches in content-dependent formats.)

As I mentioned in bug 717761 [1], it doesn't seem right to me that our cache operations bounce between the main thread, the cache thread, and the socket transport thread so much, potentially spending a lot of time during each "bounce" waiting for unrelated events in the target thread's event queue to complete. In some cases it is effectively like this:

Main Thread               Cache Thread              Socket Transport Thread
-------------             ----------------          -----------------------
Post event to cache       unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      Process that event
unrelated processing      disk I/O for event
unrelated processing      Post event to main
unrelated processing
process event from cache
Ask cache something       unrelated I/O
wait                      unrelated I/O
wait                      unrelated I/O
wait                      unrelated I/O
wait                      compute answer
get answer from cache
Ask cache something       unrelated I/O
wait                      unrelated I/O
wait                      unrelated I/O
wait                      compute answer
get answer from cache
Ask cache something       unrelated I/O
wait                      unrelated I/O
wait                      unrelated I/O
wait                      compute answer
get answer from cache
compute entry freshness
ask network to
  revalidate entry                                  unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send request
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send response to main thread
unrelated processing
process 304 response
ask cache for cached
  response body           unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      return response body
unrelated processing
unrelated processing
process response body


It seems to me that we should make an incremental improvement so that it looks more like this (a rough sketch of this step follows the diagram):


Main Thread               Cache Thread              Socket Transport Thread
-------------             ----------------          -----------------------
Post event to cache       unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      Process that event
unrelated processing      disk I/O for event
unrelated processing      compute entry freshness
unrelated processing      ask network to
unrelated processing        revalidate entry        unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send request
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send response to main thread
unrelated processing
process 304 response
ask cache for cached
  response body           unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      unrelated I/O
unrelated processing      return response body
unrelated processing
unrelated processing
process response body
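
A rough sketch of the step shown above (again with made-up names, not the real channel/cache code): the freshness decision runs on the cache thread, inside the same event that did the disk I/O, so only the final decision crosses threads instead of several small question/answer round trips.

#include <ctime>
#include <functional>

// Hypothetical view of what the cache thread knows about an entry after its
// disk I/O has completed.
struct EntryMetadata {
  std::time_t expirationTime;
  std::time_t lastValidated;
  bool        mustRevalidate;
};

enum class CacheDecision {
  kServeFromCache,   // entry is fresh: read the body and hand it to the caller
  kRevalidate,       // entry is stale: ask the network for a 304/200
  kMiss              // no usable entry: go to the network unconditionally
};

// Before: the main thread asked the cache thread several small questions
// (expiration time, validators, ...) and computed this itself, paying one
// cross-thread round trip per question.
//
// After: the whole decision runs on the cache thread, inside the same event
// that did the disk I/O, and only the outcome crosses threads.
CacheDecision DecideOnCacheThread(const EntryMetadata* meta, std::time_t now) {
  if (!meta) {
    return CacheDecision::kMiss;
  }
  if (meta->mustRevalidate || now >= meta->expirationTime) {
    return CacheDecision::kRevalidate;
  }
  return CacheDecision::kServeFromCache;
}

// The cache thread's event handler then takes the next step directly: either
// start reading the body, or hand the revalidation request straight to the
// socket transport thread, without bouncing through the main thread.
void OnCacheEventCompleted(const EntryMetadata* meta,
                           std::function<void()> readBody,
                           std::function<void()> startRevalidation,
                           std::function<void()> startNetworkLoad) {
  switch (DecideOnCacheThread(meta, std::time(nullptr))) {
    case CacheDecision::kServeFromCache: readBody(); break;
    case CacheDecision::kRevalidate:     startRevalidation(); break;
    case CacheDecision::kMiss:           startNetworkLoad(); break;
  }
}

int main() {
  EntryMetadata meta{std::time(nullptr) + 3600, std::time(nullptr), false};
  OnCacheEventCompleted(&meta,
                        [] { /* read cached body */ },
                        [] { /* send conditional request to the network */ },
                        [] { /* unconditional network load */ });
  return 0;
}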


And, in parallel with that effort, work on de-serializing I/O in the cache and/or prioritizing the cache's I/O (e.g., so that I/O for loading resources currently visible in the active tab of the active window can jump ahead in the I/O queue), so that it ends up more like this (a sketch of such a priority queue follows the diagram):


Main Thread               Cache Thread              Socket Transport Thread
-------------             ----------------          -----------------------
Post event to cache       unrelated I/O
unrelated processing      Process that event
unrelated processing      disk I/O for event
unrelated processing      compute entry freshness
unrelated processing      ask network to
unrelated processing        revalidate entry        unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send request
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send response to main thread
unrelated processing
process 304 response
ask cache for cached
  response body           unrelated I/O
unrelated processing      return response body
unrelated processing
unrelated processing
process response body
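
To illustrate the "jump ahead in the I/O queue" part (purely illustrative; this is not how the cache's I/O scheduling is actually structured today), something like a priority queue of pending cache I/O would do it:

#include <cstdint>
#include <functional>
#include <queue>
#include <vector>

// Hypothetical priority levels for pending cache I/O.
enum class IoPriority : std::uint8_t {
  kVisibleInActiveTab = 0,  // highest: the user is looking at it right now
  kActiveTabPrefetch  = 1,
  kBackgroundTab      = 2,
  kMaintenance        = 3   // eviction, dooming, etc.
};

struct PendingIo {
  IoPriority priority;
  std::uint64_t sequence;     // preserves FIFO order within the same priority
  std::function<void()> run;  // the actual read/write work
};

struct MoreUrgentFirst {
  bool operator()(const PendingIo& a, const PendingIo& b) const {
    if (a.priority != b.priority) return a.priority > b.priority;
    return a.sequence > b.sequence;  // older requests first within a level
  }
};

class CacheIoQueue {
 public:
  void Dispatch(IoPriority priority, std::function<void()> work) {
    mQueue.push(PendingIo{priority, mNextSequence++, std::move(work)});
  }
  // Called on the cache I/O thread: always runs the most urgent pending work,
  // so a load for the visible page jumps ahead of background I/O that was
  // queued earlier.
  bool RunOne() {
    if (mQueue.empty()) return false;
    PendingIo next = mQueue.top();
    mQueue.pop();
    next.run();
    return true;
  }
 private:
  std::priority_queue<PendingIo, std::vector<PendingIo>, MoreUrgentFirst> mQueue;
  std::uint64_t mNextSequence = 0;
};

int main() {
  CacheIoQueue queue;
  queue.Dispatch(IoPriority::kBackgroundTab,      [] { /* background read */ });
  queue.Dispatch(IoPriority::kVisibleInActiveTab, [] { /* visible-page read */ });
  // The visible-page read runs first even though it was queued second.
  while (queue.RunOne()) {}
  return 0;
}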


And then, finally, we could remove the main-thread interaction in the middle (a sketch of delivering the revalidation response to the cache thread follows the diagram):


Main Thread               Cache Thread              Socket Transport Thread
-------------             ----------------          -----------------------
Post event to cache       unrelated I/O
unrelated processing      Process that event
unrelated processing      disk I/O for event
unrelated processing      compute entry freshness
unrelated processing      ask network to
unrelated processing        revalidate entry        unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send request
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                unrelated socket I/O
unrelated processing                                send response to cache thread
unrelated processing      unrelated I/O
unrelated processing      Process 304 response
unrelated processing      read cached body
unrelated processing      return cached body to main thread
unrelated processing
unrelated processing
process response body
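
A sketch of that last step (hypothetical event-target and request types, not the real interfaces): the revalidation request names the cache thread as the target for its response, so the 304 is processed and the cached body read without the main thread in the middle; the main thread only sees the finished body.

#include <functional>
#include <queue>
#include <string>

// A minimal stand-in for a thread's event queue: Dispatch() enqueues work that
// the owning thread runs later. (The real code would use the existing event
// target machinery; this only shows *where* the response gets delivered.)
class EventTarget {
 public:
  void Dispatch(std::function<void()> event) { mEvents.push(std::move(event)); }
  void DrainForTesting() {
    while (!mEvents.empty()) {
      std::function<void()> event = std::move(mEvents.front());
      mEvents.pop();
      event();
    }
  }
 private:
  std::queue<std::function<void()>> mEvents;
};

struct RevalidationRequest {
  std::string url;
  // The key change: the response is delivered to whichever target the request
  // names. Today that is effectively the main thread; the proposal makes it
  // the cache thread.
  EventTarget* responseTarget;
  std::function<void(int statusCode)> onResponse;
};

// Socket transport side: post the response to the requested target; no
// main-thread hop is required.
void DeliverResponse(const RevalidationRequest& request, int statusCode) {
  request.responseTarget->Dispatch(
      [request, statusCode] { request.onResponse(statusCode); });
}

int main() {
  EventTarget cacheThread;  // pretend event queues for two threads
  EventTarget mainThread;

  RevalidationRequest request{
      "https://example.com/style.css", &cacheThread,
      [&mainThread](int status) {
        if (status == 304) {
          // Runs on the cache thread: read the cached body right here, then
          // make the single hop back to the main thread with the finished data.
          mainThread.Dispatch([] { /* process response body on main thread */ });
        }
      }};

  DeliverResponse(request, 304);  // as if called on the socket transport thread
  cacheThread.DrainForTesting();  // cache thread handles the 304, reads the body
  mainThread.DrainForTesting();   // main thread sees only the completed result
  return 0;
}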


When we don't need to revalidate an entry, this would collapse to:


Main Thread               Cache Thread
-------------             ----------------
Post event to cache       unrelated I/O
unrelated processing      Process that event
unrelated processing      disk I/O for event
unrelated processing      compute entry freshness
unrelated processing      read cached body
unrelated processing      return cached body to main thread
unrelated processing
unrelated processing
process response body


Thoughts?

- Brian

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=717761#c1
[2] https://bugzilla.mozilla.org/show_bug.cgi?id=695399
[3] https://bugzilla.mozilla.org/show_bug.cgi?id=660749

Jason Duell

Jan 14, 2012, 12:49:01 AM
to dev-tech...@lists.mozilla.org
On 01/13/2012 02:33 PM, Brian Smith wrote:
> As I mentioned in bug 717761 [1], it doesn't seem right to me that our cache operations bounce between the main thread, the cache thread, and the socket transport thread so much
> Thoughts?

Good diagrams. It certainly seems like we could eliminate some overhead
and latency by streamlining as you suggest.

Jason

Boris Zbarsky

Jan 14, 2012, 1:12:14 AM
On 1/13/12 5:33 PM, Brian Smith wrote:
> Thoughts?

Yes!

I particularly like the change to not have to go through the main thread
event loop for cache stuff. Right now having to go through there for
OnCacheEntryAvailable before we can even start reading from the cache
means that even in the no-validation-needed case we have to process at
least two events from the cache on the main thread before we can start
getting any data. If the main thread is at all busy (e.g. some sort of
JS animation running), each trip through the main event loop can easily
be on the order of 10 ms... We've run into that issue before, in fact.

The proposed event flow looks much much better.

-Boris

Christian Biesinger

Jan 15, 2012, 11:24:00 PM
to dev-tech...@lists.mozilla.org
On 1/13/2012 14:33, Brian Smith wrote:
> Thoughts?

I think that's a very good idea. I'm a little concerned about all the
assumptions in the current code about which thread it runs on, but that
can probably be worked out.

-christian