
Yeah -- for all that people are worried about efficiency gains, I'm kind of doubtful that most end-users, even on slow connections, will even notice that caches are restricted to within domains.

I suspect that websites that are conscious of loading times are already testing performance with nothing cached. And websites that aren't conscious of loading times are probably using bundling techniques that would already make cross-site caches useless. In either case, I'm having a hard time believing that loading jQuery is the reason anyone's website is slow.

There are theoretical schemes that could allow us to share libraries between sites without having the same privacy impacts, but I'm not sure it's even worth the effort of proposing them.



I'm not sure how technically feasible this is, but for people on connections slow or low-bandwidth enough that this change causes a noticeable drawback, I believe there is a better solution:

An extension that keeps widely used versions of libraries preloaded, along with a small database of CDN URLs, so that it can serve the preloaded libraries instead of the CDN ones when possible. It could also do things like collapse foobar-latest onto foobar-X.Y.Z (when X.Y.Z == latest) and could force-load a different version with security patches. I.e., it would act kind of like a Linux package manager for a limited set of common libraries.
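A minimal sketch of the URL-collapsing step such an extension might perform. Everything here is illustrative: the CDN host, the URL pattern, and the library/version tables are assumptions, not any real extension's data.

```python
import re

# Hypothetical local store: library name -> {version: local path}.
LOCAL_LIBS = {
    "jquery": {"3.6.0": "/libs/jquery-3.6.0.min.js"},
}
# Pinned versions that "-latest" aliases collapse onto (assumed mapping).
LATEST = {"jquery": "3.6.0"}

# Assumed CDN URL shape, e.g. https://cdn.example.com/jquery-3.6.0.min.js
CDN_PATTERN = re.compile(
    r"https://cdn\.example\.com/(?P<name>[\w-]+)-(?P<ver>latest|\d+\.\d+\.\d+)\.min\.js"
)

def resolve(url):
    """Return a local path for a recognized CDN URL, or None to let the
    request fall through to the network."""
    m = CDN_PATTERN.fullmatch(url)
    if not m:
        return None
    name, ver = m.group("name"), m.group("ver")
    if ver == "latest":
        # Collapse foobar-latest onto the pinned foobar-X.Y.Z.
        ver = LATEST.get(name)
    return LOCAL_LIBS.get(name, {}).get(ver)
```

Both the versioned URL and its `-latest` alias would then resolve to the same preloaded file, while unknown libraries fall through to the CDN unchanged.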


Decentraleyes does exactly this.


Check out LocalCDN for a fork with actively-updated CDNs.



