Cache document structure to improve speed


open768

Nov 4, 2010, 12:17:58 PM11/4/10
to Make the Web Faster
I've invented a client-side template solution that works on all
browsers today.
It's under the working title of Meob: http://www.paglis.co.uk/meob

Web servers typically have a handful of standard page layouts that
they use to display their content. They spend CPU cycles putting
content into the layouts and churning out complete HTML.

Meob moves the responsibility of merging content with structure to
the client browser. This means the document structure can be treated
like a static resource and cached, so across multiple visits to the
same page (e.g. a news page) the web server no longer has to serve
the structure every time, or spend the CPU cycles merging content
with structure.
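As a rough illustration of the idea (hypothetical names, not Meob's actual API), merging a cached template with a small content payload on the client can be as simple as a placeholder substitution:

```javascript
// Minimal sketch of client-side template merging (hypothetical;
// not Meob's actual implementation). The template is a static,
// cacheable resource; only the small content payload changes
// from page to page.
function mergeTemplate(template, content) {
  // Replace each {{key}} placeholder with the matching content value;
  // leave unknown placeholders untouched.
  return template.replace(/\{\{(\w+)\}\}/g, function (match, key) {
    return key in content ? content[key] : match;
  });
}

// The server ships the template once; subsequent pages only need
// the content object.
const template = "<h1>{{headline}}</h1><p>{{body}}</p>";
const content = { headline: "Hello", body: "World" };
console.log(mergeTemplate(template, content));
// -> <h1>Hello</h1><p>World</p>
```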

Currently Meob works on Internet Explorer, Firefox, Opera, Opera
Mini, Safari, Chrome and Minimo, and on most operating systems
including Android, MeeGo, Windows and Apple (iPad, iPhone). In fact,
the index page of Meob is written in Meob.

I haven't got the facilities to run any tests, but I would love to
know whether Meob could make the web faster.

Other reasons I wrote Meob were to:
* make the web RESTful: every web page becomes an API
* separate the web page workflow: designers work independently from
application coders
* rapidly change the appearance of web pages without changing code,
by changing the template reference
* give control of the appearance back to the user, by selecting a
local template (e.g. to restyle the Google home page as an MS home
page; not implemented)

Sunil









Sajal Kayan

Nov 4, 2010, 2:46:51 PM11/4/10
to make-the-...@googlegroups.com
Looks interesting, but the downside to this approach is that the pages
won't be seen by non-JavaScript browsers (including search engine bots)...

Good approach if you don't care about search rankings...

open768

Nov 5, 2010, 7:48:53 AM11/5/10
to Make the Web Faster
Hi Sajal,

Thanks for the critique; yes, happy to accept that. It would be no
different from, say, a site written in Flash.

A couple of approaches:
* Update the search engines. A long shot, but if the big G were to
use this I'm sure there would be a solution ;-)
* Sites put indexable pages as their index page, i.e. don't use Meob
for landing pages.
* I plan to develop Meob further to meet accessibility guidelines by
detecting non-JavaScript browsers and bots and redirecting them to a
server-side page that performs the merge transparently to the user.
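That detect-and-redirect fallback could be sketched along these lines (the function name and the User-Agent pattern list are illustrative assumptions, not part of Meob, and a real list would need to be far more complete):

```javascript
// Hedged sketch of the proposed fallback: decide from the User-Agent
// whether a request likely comes from a bot or non-JavaScript client,
// in which case the server would perform the template merge itself.
// The pattern list is illustrative only, not exhaustive.
function needsServerSideMerge(userAgent) {
  const botPatterns = /googlebot|bingbot|slurp|crawler|spider/i;
  return botPatterns.test(userAgent || "");
}

console.log(needsServerSideMerge("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true
console.log(needsServerSideMerge("Mozilla/5.0 (Windows NT 10.0) Chrome/90")); // false
```

In practice User-Agent sniffing is unreliable, which is why this is framed as a fallback rather than the primary path.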

Sunil

Dan Johansson

Nov 5, 2010, 10:50:36 AM11/5/10
to make-the-...@googlegroups.com
Sites written in Flash do pose the same problem. Not all Flash
developers are aware of that, but those who are often make the
content available in some sort of static form as well.

I think the main benefit would be in applications rather than
websites. Some web applications already do this in their own way,
Gmail for example. In that scenario, indexing is often not a
necessary consideration.

--
You received this message because you are subscribed to the Google Groups "Make the Web Faster" group.
To post to this group, send email to make-the-...@googlegroups.com.
To unsubscribe from this group, send email to make-the-web-fa...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/make-the-web-faster?hl=en.


Jonathan Klein

Nov 5, 2010, 10:56:38 AM11/5/10
to make-the-...@googlegroups.com
This is a clever idea, but I definitely have some questions about it.  For one thing, you are "moving the responsibility of merging the content with the structure to the client browser", which is a pretty hostile and unknown environment.  You are asking the client to spend more CPU cycles laying out the page, when they might have fewer resources available than the server.  In addition you are making the client download a substantial amount of JavaScript to render the page, and by its very nature this JavaScript has to be blocking (unless you want a nasty flash of unstyled content). 

Saving CPU on the server side is a noble goal, but in reality the time that the server spends putting content into layouts and churning out HTML is very very small.  Servers are a heck of a lot better at string manipulation than browsers are. 

Like Dan alluded to, this would probably be used for web applications where this kind of thing is already happening.  Browsers make Ajax calls and bring back JSON encoded strings which are then manipulated on the client side into the layout.  This is efficient because you transmit fewer bytes across the wire (and it is what allows things like Google Instant to perform well).  So for dynamic applications this approach makes a lot of sense, but if you are generating something like a news page which is mainly static I think you would be much better served to crunch the HTML on the server. 

Just some thoughts, let me know what you think.

-Jonathan




open768

Nov 5, 2010, 4:12:18 PM11/5/10
to Make the Web Faster
Thanks Jonathan, my comments in situ :)

On Nov 5, 2:56 pm, Jonathan Klein <jonathan.n.kl...@gmail.com> wrote:
> This is a clever idea, but I definitely have some questions about it.  For
> one thing, you are "moving the responsibility of merging the content with
> the structure to the client browser", which is a pretty hostile and unknown
> environment.  

I like clever! I wouldn't agree with "unknown", as Meob does work
today on all major browsers on all major operating systems. I even
bought myself Android and Mac devices to prove it.

>You are asking the client to spend more CPU cycles laying out
> the page, when they might have fewer resources available than the server.
Yes, true, though the same could be said of new browser features or
changes to HTML. I've not noticed an appreciable increase in page
render time, as the speed is usually IO-bound, i.e. it takes longer
to download than it takes to process.

> In addition you are making the client download a substantial amount of
> JavaScript to render the page, and by its very nature this JavaScript has to
> be blocking (unless you want a nasty flash of unstyled content).
I wouldn't say 18K is substantial, and it can be further optimised by
removing debugging information; I think I could get it down to 9K of
JavaScript. It would also get cached on the client, so it's a
one-time hit. Sites already employ JavaScript extensively, so I don't
follow the additional-burden argument. Replacing the JavaScript with
a browser plugin would surely address any additional burden.

Ultimately I would like to see the logic embedded in the HTML
standards, for example a src attribute on container tags:
src="~myvar" would put the contents of the JavaScript variable myvar
into the tag.
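Until such an attribute exists, the proposed behaviour could be emulated in today's JavaScript. A minimal sketch, using plain objects in place of DOM elements so the logic stands alone (all names here are hypothetical):

```javascript
// Emulating the proposed src="~myvar" container attribute
// (hypothetical; not a standard). Each element whose src value
// starts with "~" is filled from the matching variable.
// Plain objects stand in for DOM elements; real code would walk
// document nodes instead.
function fillContainers(elements, vars) {
  elements.forEach(function (el) {
    if (el.src && el.src.charAt(0) === "~") {
      const name = el.src.slice(1); // strip the leading "~"
      if (name in vars) el.text = vars[name];
    }
  });
  return elements;
}

const els = [{ src: "~myvar", text: "" }, { src: "logo.png", text: "" }];
fillContainers(els, { myvar: "hello" });
console.log(els[0].text); // "hello"
console.log(els[1].text); // "" (ordinary src values are left alone)
```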

> Saving CPU on the server side is a noble goal, but in reality the time that
> the server spends putting content into layouts and churning out HTML is very
> very small.  Servers are a heck of a lot better at string manipulation than
> browsers are.

The cost is incremental. For low-throughput web servers, yes, I agree
totally. But when talking about hundreds or thousands of requests per
second, every improvement helps. It would be good to get some
empirical data, but I don't have test facilities at those scales.

Maybe CPU isn't the right way to sell this invention. My experience
is that busy servers tend to get stuck with blocking IO and waiting
threads. Less IO means a faster web.

Say a person goes to a site and views 10 pages with the same layout,
and for argument's sake each page is 20K of HTML, of which 1K is
content and the rest is structure. The first hit would be 29K and
each subsequent visit 1K: total IO of 38K versus 200K. Multiply by
200 transactions per second and the saving in IO is substantial.
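The figures in that example can be checked directly (the 29K first hit is taken from the post as the one-off cost of structure plus library plus content):

```javascript
// Back-of-envelope bandwidth comparison from the example above:
// 10 page views, 20 KB of server-rendered HTML each, of which
// 1 KB is content; the first client-side hit costs 29 KB.
const pages = 10;
const pageKB = 20;       // full server-rendered page
const contentKB = 1;     // content-only payload on repeat visits
const firstHitKB = 29;   // structure + library + content, once

const serverSide = pages * pageKB;                       // 200 KB
const clientSide = firstHitKB + (pages - 1) * contentKB; //  38 KB
console.log(serverSide, clientSide);
```

At 200 such sessions per second the gap between 200 KB and 38 KB per session is what the post is pointing at.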

The main benefit from my perspective is separating content from
structure and gaining the flexibility to change the layout with very
little effort. Having been the lead coder for a high-throughput site,
it always amazed me that marketing had to come to a web coder to make
UI changes. There will always be a degree of coupling between UI and
logic, but Meob adds a whole lot of flexibility.


>
> Like Dan alluded to, this would probably be used for web applications where
> this kind of thing is already happening.  Browsers make Ajax calls and bring
> back JSON encoded strings which are then manipulated on the client side into
> the layout.  This is efficient because you transmit fewer bytes across the
> wire (and it is what allows things like Google Instant to perform well).  So
> for dynamic applications this approach makes a lot of sense,

thanks

> but if you are
> generating something like a news page which is mainly static I think you
> would be much better served to crunch the HTML on the server.

Perhaps, perhaps not; it depends on what you want to do. It has
advantages and disadvantages, like any technology.


>
> Just some thoughts, let me know what you think.
>
> -Jonathan
>
Thanks for the feedback.

I'd rather have this discussion on the basis of empirical data, but
consider this a proof of concept.

Sunil

open768

Nov 6, 2010, 5:00:57 AM11/6/10
to Make the Web Faster
Thanks Dan :) Yes, the main benefit would be for dynamic applications.

Jonathan Klein

Nov 7, 2010, 9:26:41 AM11/7/10
to make-the-...@googlegroups.com
I guess when I said unknown I meant more along the lines of hardware as opposed to browser, I just hesitate to shove more work onto the client when they could be using a pretty underpowered machine.  

I agree that 18K isn't huge, and reducing it to 9K would definitely be a big improvement.  At that point I think you are right that the incremental cost to download the JS is very small.  

Regarding CPU my main point was that you likely wouldn't see a huge improvement even at pretty high load, but you are right that there are other big benefits even without taking CPU into account.  Separating content from structure and shipping less HTML over the network are both big wins. 

Like you say it would be nice to test this at scale and see how it performs.  Have you used WebPagetest and looked at CPU usage for pages using MEOB and pages that aren't?  It would be interesting to see what sort of effect it has on the client CPU to see if making the browser do all of the parsing slows things down.  

Overall I think for some applications MEOB could provide a big speed boost, but for others it might actually be better to do the work on the server side.  As you say it has pros and cons like everything, but I definitely think that on specific kinds of websites you could see a huge reduction in bandwidth use along with an easier design process.  

I'm excited to see where you go with it.

Cheers,

Jonathan




open768

Nov 11, 2010, 3:17:25 PM11/11/10
to Make the Web Faster
Jonathan, thanks for your kind response. I'll take a look at
WebPagetest.
Cheers

Sunil


open768

Feb 19, 2011, 6:05:13 AM2/19/11
to make-the-...@googlegroups.com
MeOB is now on SourceForge: https://sourceforge.net/projects/fabiola/