This is really cool, but I'm not sure I'd use it because of the plugin issue. Usually, I don't just use jQuery - I use jQuery, plus a few jQuery plugins, plus jQuery UI, plus some custom JavaScript files built on top of them. To deploy them, I concatenate them all together in dependency order, minify the whole thing, and gzip it. That saves HTTP requests, and multiple files written to the same coding standards tend to compress very well. (jQuery + Dimensions + Dropshadow + Corner = 20504 bytes when compressed together, vs. 21919 bytes when compressed separately.)
It looks like with this, I'd need to request JQuery off Google's infrastructure and then all the other stuff off my own, so it's multiple requests where one used to do, and you don't get the benefit of compressing them together.
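Roughly, the two setups I'm comparing would look something like this (the local file names are made up for illustration):

<!-- everything concatenated in dependency order, minified and gzipped: one request -->
<script type="text/javascript" src="/js/site-combined.min.js"></script>

<!-- jQuery from Google plus a separate local bundle: two requests, compressed separately -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
<script type="text/javascript" src="/js/plugins-and-site.min.js"></script>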
I tend to agree, and I didn't make any claims about the number of requests. But data transfer really isn't all that slow: 1415 bytes takes only about 10 milliseconds on a DSL connection (and 100 ms on dial-up), which is insignificant compared to the other factors involved.
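For what it's worth, the rough arithmetic behind that figure (assuming a DSL line on the order of 1 Mbit/s): 1415 bytes * 8 = 11,320 bits, and 11,320 bits / ~1,000,000 bits per second ≈ 11 ms of transfer time, ignoring latency.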
Basically, he suggests that every script tag have an optional attribute called 'hash'. Whenever the browser downloads a script, it computes the hash and caches the script. For any further requests that specify that hash, the browser can use the cached copy instead of downloading a new one. The main benefit here is that everyone can continue hosting their own scripts, yet still take advantage of caching.
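As a sketch (the attribute name comes from the proposal; the exact syntax and digest format are my guess, and the digest itself is a placeholder):

<script type="text/javascript" src="/js/jquery-1.2.6.min.js" hash="sha1:..."></script>

Any page on any site that specified the same hash would then get the cached copy, no matter which URL it was originally downloaded from.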
Brendan Eich (creator of JS) proposes a different solution:
In your script tags, you would specify both a local version (using the src attribute) and a canonical version (using a 'shared' attribute).
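Something along these lines (again, the syntax is only my guess at what the proposal means):

<script type="text/javascript" src="/js/jquery-1.2.6.min.js" shared="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>

Presumably the browser would reuse a cached copy keyed on the canonical 'shared' URL when it has one, and fall back to fetching the local src otherwise.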
Brendan's concern about the hash solution is the poisoned message attack (http://th.informatik.uni-mannheim.de/People/lucks/HashCollis...). However, I'm not sure that applies here. I believe that you need to be able to generate both documents in order to easily find a collision. Anyone else know if that's true?
I still think I'd prefer to host them myself. When I used YUI style sheets hosted on Yahoo's servers, they always took longer to load than local files. Also, Google ads are always the slowest item on a page to load. Until these libraries are embedded in the browser, I don't see much benefit.
In the video they say that they used many of Steve Souders' techniques to speed up the delivery of the files, since most web servers aren't properly optimized out of the box.
Looks like Google missed the part in Souders' High Performance Web Sites where he talked about using ETags to speed up sites. :P
curl -I http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js
HTTP/1.1 200 OK
Last-Modified: Mon, 26 May 2008 18:45:05 GMT
Content-Type: application/x-javascript
Expires: Wed, 27 May 2009 17:39:22 GMT
Date: Tue, 27 May 2008 17:39:22 GMT
Cache-Control: public, max-age=31536000
Content-Length: 55740
Server: GFE/1.3
Oops, you are right. For some reason I thought using the If-None-Match header was the "right" way of doing this; really it just adds flexibility to how you cache stuff.
"If you're not taking advantage of the flexible validation model that ETags provide, it's better to just remove the ETag altogether. The Last-Modified header validates based on the component's timestamp. And removing the ETag reduces the size of the HTTP headers in both the response and subsequent requests."
It's a neat idea, but I can't imagine using it. Do you really, really trust Google with your business? What happens when Google decides to "fix" something in one of those frameworks in a way the upstream developers disagree with? Even worse, what happens when your app is irreparably broken because of something that happened at Google? Worse still, what happens when your app somehow becomes reliant on Google's version of the framework(s), and you don't realize it until it would be too expensive to unhook yourself from Google?
There's nothing new about depending on external parties - if your site hosts ads (or uses an external stats service) you're already running code hosted elsewhere, so you should probably be comfortable linking through to Google (I trust them a lot more than most ad networks). The question is always "do I trust this provider not to screw me over?" - Google's developer network stuff HAS to be trustworthy, or they'll lose overnight the hearts and minds they've been cultivating.
As for your app becoming reliant on Google's version of the framework, you can always download the JS file they've been serving and host it yourself. You can't get locked in that way.
"As for your app becoming reliant on Google's version of the framework, you can always download the JS file they've been serving and host it yourself. You can't get locked in that way."
This misses the point. I'm not concerned about where the file comes from. If breakage occurs with regard to where the download comes from, a few customers get annoyed while you fix it (or even better, you implemented caching and customers never know the difference). That's a normal and expected maintenance issue that you plan for.
What I'm really talking about is the content of the file. If Google fixes bugs or adds features and you don't realize it until your app has become dependent on Google-specific changes, it's not an easy thing to fix. This is a different problem than getting screwed over by ad providers.
Aah I understand where you're coming from. I would be amazed if Google made changes to the libraries they are serving up (since it would undermine the entire concept of the hosting service).
It would definitely seem to go against the motto, right? But you need look no further than Microsoft (and other software vendors whose entire strategy is based on lock-in) to assess the probabilities for yourself.
The scripts look like they're organized by version, e.g. ajax.googleapis.com/ajax/libs/prototype/1.6.0.2/prototype.js, so Google can't really do any 'fixes' if they want to stick to the original authors' versions. Being afraid of having to 'depend' on this isn't a big problem; it's just a text file, and you can create a constant in your code for the URL string so that you can change it easily. Script.aculo.us and other libraries are pretty big, so having them load from Google would help a little.
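A minimal sketch of the constant-for-the-URL idea (the variable name is made up):

<script type="text/javascript">
  // switch between Google's copy and a self-hosted copy by changing one line
  var JQUERY_SRC = "http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js";
  // var JQUERY_SRC = "/js/jquery-1.2.6.min.js";
  document.write('<script type="text/javascript" src="' + JQUERY_SRC + '"><\/script>');
</script>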
Maybe one possible solution to my concerns would be to keep a table of SHA-1 sums of the versions you want to depend on and check them periodically (randomly select 1 out of every 1000 visits to receive checking code that reports back to your backend).
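A rough sketch of how the sampling might work; since the browser can't read the bytes of a cross-domain script, in this version the sampled visit only triggers the check, and the actual fetch-and-compare against the SHA-1 table happens on your own backend (the endpoint name is made up):

<script type="text/javascript">
  // roughly 1 in 1000 visits pings the backend, which then fetches Google's copy
  // server-side, hashes it, and compares the result against the stored SHA-1 for that version
  if (Math.random() < 0.001) {
    var beacon = new Image();
    beacon.src = '/verify-cdn?lib=jquery&v=1.2.6&r=' + Math.random();
  }
</script>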
I just have an instinctive wariness of this kind of dependency, even if the cost of avoidance is a performance penalty.