When Alex was answering my moderator question at the All About Polymer gathering at the SFHTML5 event, he mentioned that components are (or can be) cached locally. That made me think of two things.

First, this is fantastic: instead of using a backend to generate the same structure of markup over and over again and send it across the internet, you can just emit a few tags and push far fewer bits overall on your site. That's great for overall performance, especially on mobile networks.

However, that led me to another thought: what about sites sharing the same components? Developers are being urged to make their components generic so anyone can easily pick them up and integrate them. Would it be beneficial to provide a CDN for components to be hosted from? Sites that use the same components without modifying them could then save resources by using the same CDN-hosted copy. That not only cuts the data going across networks but also reduces the overall cache size on clients.
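To make that concrete, here's a rough sketch of what I mean, using HTML Imports; cdn.example.com and the fancy-button component are just placeholders:

  <!-- Every site imports the same shared copy, so the browser can reuse its cached response. -->
  <link rel="import" href="https://cdn.example.com/components/fancy-button/1.0.0/fancy-button.html">

  <!-- Instead of each site serving (and each client re-downloading) its own copy: -->
  <link rel="import" href="/components/fancy-button/fancy-button.html">

  <!-- Once imported, the page only has to emit the tag, not the full markup it expands to. -->
  <fancy-button label="Sign up"></fancy-button>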
Does anyone have any thoughts on the benefits and drawbacks of a Web Components CDN? Or is there an earlier discussion of this that I missed in my searching?

Thanks,
-Garbee
I just had an interesting idea from you mentioning "new specs"... What if a new attribute were introduced, something like "remote", along the lines of the Subresource Integrity specification draft [1]? You could declare the remote attribute on an import: if the remote source is already cached, use that; if it isn't, try to pull it from there; and if a connection can't be established, fall back to your locally hosted version. That adds some extra time to the request when the CDN can't be reached, but it would allow shared resources with a fallback in a simple way. It could possibly even be coupled with an integrity check to verify the bits are right. A rough sketch of what that markup might look like is below.
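To be clear, nothing like the remote attribute exists today; this is just the markup I'm imagining, with placeholder URLs and a made-up hash value:

  <!-- Hypothetical: href is the site's own copy, remote points at the shared CDN copy. -->
  <!-- The browser would prefer a cached or reachable remote copy and fall back to href. -->
  <link rel="import"
        href="/components/fancy-button/fancy-button.html"
        remote="https://cdn.example.com/components/fancy-button/1.0.0/fancy-button.html"
        integrity="sha256-PLACEHOLDER">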
Yeah, the whole "canonical" CDN point is pretty moot in my opinion, because people will disagree with how things are done or just find a reason to make their own. Further, even if you have a single CDN, splitting an item across versions still fragments the cache base. However, something that was talked about was a community code-review kind of system. It would also be cool if there were a community repository like cdnjs (or we could even talk with those folks about expanding cdnjs to allow Web Components) where anyone can submit their component in a structured way for sharing. Of course, this takes money, and someone needs to stand it up for it to operate in the first place, which means proving its worth.

We can never fully eliminate duplicate cached items. However, something that could be aimed for is a "most canonical" CDN (like Google's AJAX CDN seems to be for jQuery) that tries to serve as many sites as possible. That way developers can give clients the best possible performance while using less disk space. If a user visits 300 sites that use the same web component, having it cached from one CDN (even in 4 versions) is better than 300 different cached copies from each location taking up space.

Your point about hashing the resource is interesting, but I'm worried about the "I don't care where it came from" part. We know it is possible to craft inputs that cause a hash collision in some scenarios, so who is to say someone doesn't publish malicious content on their site just to collide with other people's components? So maybe we need someone with deeper hash expertise to weigh in on that kind of attack. A sketch of the simpler "verify the bits" idea is below.
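For what it's worth, here's a rough sketch of the kind of "verify the bits" check I have in mind, using fetch() and the Web Crypto API in a browser that supports them; the URLs, the pinned digest, and importComponent() are all placeholders:

  <script>
  // Sketch: fetch a component from a shared CDN, check its SHA-256 digest against
  // a hash the page author pinned, and fall back to the local copy if the CDN is
  // unreachable or the bytes don't match. (Naively fetches twice, for simplicity.)
  var EXPECTED_SHA256 = 'placeholder-hex-digest';
  var CDN_URL = 'https://cdn.example.com/components/fancy-button/1.0.0/fancy-button.html';
  var LOCAL_URL = '/components/fancy-button/fancy-button.html';

  function hexDigest(buffer) {
    // Hash the raw bytes and render the digest as lowercase hex.
    return crypto.subtle.digest('SHA-256', buffer).then(function (hash) {
      return Array.prototype.map.call(new Uint8Array(hash), function (b) {
        return ('0' + b.toString(16)).slice(-2);
      }).join('');
    });
  }

  function importComponent(url) {
    // Placeholder loader: add an HTML Import pointing at whichever copy we trust.
    var link = document.createElement('link');
    link.rel = 'import';
    link.href = url;
    document.head.appendChild(link);
  }

  fetch(CDN_URL)
    .then(function (response) { return response.arrayBuffer(); })
    .then(hexDigest)
    .then(function (digest) {
      // Only use the CDN copy if the bytes match the pinned digest.
      importComponent(digest === EXPECTED_SHA256 ? CDN_URL : LOCAL_URL);
    })
    .catch(function () {
      // CDN unreachable: fall back to the locally hosted copy.
      importComponent(LOCAL_URL);
    });
  </script>

Note that this only checks against a digest the page author already trusts; it doesn't answer the collision worry above, which is really about treating a hash alone as an identity across origins.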