DerekAllard.com

jQuery hosted on Google and some implications for developers

I finally made the long-discussed flip over to jQuery.  It took me about 4 and a half minutes.  It should have been a 30-second process, but I had a few lightbox image relationships named incompatibly.  Anyhow, all fixed up.  I also decided to implement the much-talked-about Google-hosted Javascript libraries.  In a nutshell, Google is hosting some popular Javascript libraries.  The idea is that if enough people are using the hosted libraries, then there’s a good chance that your visitor has already locally cached the files, and your page will load faster (or at least give the illusion of it).

As a handy extra, they take care of compressing and minifying for you, and are committed to keeping a library online permanently after it is hosted.
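
For the curious, using the hosted copy is as simple as pointing a script tag at ajax.googleapis.com (you’ll see that further down), or, if I’m reading Google’s docs right, using their loader instead.  A minimal sketch, with the version number picked just for illustration:

<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
// ask Google's loader for jQuery 1.2.6; it serves up the minified file
google.load("jquery", "1.2.6");
</script>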

That said, I’m not sure how long I’ll keep it.  There are a few things that I think every responsible webmaster has to think about first.  Personally, I would only use it as part of an informed company strategy (I could see a savings on a big site like ExpressionEngine.com in terms of bandwidth and perceived load time).  But there are still some downsides I just haven’t fully reconciled yet.  Let me address the three most relevant ones that I see.

What if Google goes down?

Google?  Not likely, but this is probably the most valid argument I can think of.  I suppose if one were really concerned about it, they could use:

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
<script type="text/javascript">
if (typeof jQuery === 'undefined')
{
    // jQuery obviously hasn't been loaded... let's get on that shall we...
    // (testing with typeof avoids a ReferenceError if the Google copy never arrived)
    document.write('<script type="text/javascript" src="/js/jquery.min.js"><\/script>');
    // oh, and don't flame me for document.write... it's just an example
}
</script>

But the truth is that I’m not going to bother doing that here.  Without javascript DerekAllard.com runs just fine anyhow.

By allowing Google to put Javascript on your page, you are allowing them to override or rewrite your content by manipulating your DOM

This is true, and of course is a risk with any remotely loaded script or content (think of all those Google Ads… now there is a hacker’s wet-dream).  I already allow Google Analytics here, so it’s a risk I’ve already thought about, and am prepared to take.

What about privacy?  Google is collecting a lot of data.

Also valid, of course.  Firstly, I’m already giving them access to that data via analytics, but the broader question, I suppose, is what they do with all the data they collect.  For now, I’m comfortable with their privacy policies and published data collection protocols, but I admit to being uncomfortable with any one agency having all that data.

Also, a lot of sites were already pointing to Google Code to load the Javascript anyways, so this almost just formalizes something many people were doing anyhow.

Now, for frameworks like CodeIgniter, I see another major benefit, albeit on a pretty selfish level.  As has been discussed, CodeIgniter is getting a jQuery library added in.  One of the really pain-in-the-ass things about building this library has been trying to figure out the most elegant way to include the library source files.  Adding the whole jQuery library to the CI download isn’t really a good option; not everyone will want it, and every dev has different preferences about where they keep site assets anyhow.  With the advent of this API, we could conceivably use Google as the default source, and allow devs to override that with a config or initialization setting.  Let me just caution now that this is a totally “off the top of my head” thought, and I’m not saying CodeIgniter necessarily will do this, only that it’s something worth exploring.
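
To make that a little more concrete, here is a purely hypothetical sketch of what such a setting might boil down to in the rendered page.  The local path is just for illustration, and none of this reflects any actual CodeIgniter decision:

<!-- default: the helper points at the Google-hosted copy -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
<!-- with a (hypothetical) source-path override in the config, the same helper would emit -->
<script type="text/javascript" src="/js/jquery.min.js"></script>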

I haven’t fully worked through the implications in my head yet (those of you who know me know that that might take a long time… I’m not the smartest guy), but the experiment has started.  Google has the potential to do something really neat here.  In a perfect world, properly implemented, file size will become nearly a non-issue when selecting a javascript library, and bundled apps could rely on a robust, stable delivery for scripts.  I’m not naive enough to think that’ll ever fully come to pass, but the goal is admirable.

Comments

Michael Wales wrote on

It’s definitely nice to see Google take this step and formalize the situation; like you said, “a lot of people were doing this anyways.”

There are numerous concerns with doing so - but I see this “cloud computing” really taking off. A lot of the bigger companies are making available assets and resources the little man just did not have 3-4 years ago.

Look at Amazon with all of their amazing services - it doesn’t get much better than that. Instant scalability without having to worry about how you’re going to get the new server through the door? I’m in…

Yahoo has been offering their bandwidth for a while now - in loading the YUI files from their servers. Google’s simply taken this a step further by supporting a lot more than YUI.

Alexandru Plugaru wrote on

A centralised javascript repository for popular libraries from Google demonstrates once again that these guys know what they are doing and that they are in touch with dev people.
As always, Google rocks.

I think that creating a plugin/helper for CI that allows loading different Javascript libraries from Google is a good idea.

Pål Degerstrøm wrote on

I am an avid user of the YUI library in a lot of my client projects, and see two clear benefits to serving files from Yahoo!‘s - and now Google’s - CDNs (Content Delivery Networks):

1) Upgrades: It is much easier for me to upgrade client sites to the latest version. No files to upload, just change the parameter that determines the version to be downloaded.

2) Caching: The bigger the files get, the more benefit you’ll get from using a CDN. Users will download and cache the file from Google/Yahoo!, and not download the same version of jQuery (or one of the other libraries) from each site they visit.

And completely unscientific: Perhaps sites will appear faster when files are downloaded from different servers, once you consider the browser limitations in simultaneous downloads/connections to a single server.

Steve wrote on

Would this approach be any better than setting far future expiration dates and gzipping the files yourself? I guess if a majority of your visitors have the file cached from their visits to other sites there may be some difference but how much could it be? A few ms of load time? The downfall to me (and one of the reasons I don’t do this myself) is that accessing a file from another website, no matter how small, requires an additional DNS lookup on pageload which would actually have the opposite effect. It typically takes 20-120 milliseconds for DNS to lookup the IP address for a given hostname. ( ref: http://developer.yahoo.com/performance/rules.html#external )

For instance, I cleared my cache & cookies etc. and reloaded this page. According to my Firebug plugin, the 17KB file jquery.min.js took 200ms to load from ajax.googleapis.com. Loading the same 17KB file from my own website took 120ms.

So although it’s a nice idea in theory, I think I’d prefer to have the files on my own webspace.

Michael Wales wrote on

Here’s one of the good things about the Google idea that I had only mildly thought about earlier.

With all of the dark fiber Google has been purchasing, the positioning of server farms in key locations around the country, even rumors of staging portable server farms in tractor-trailers - utilizing Google’s infrastructure in this manner could vastly speed up your site’s loading time.

Mark van der Walle wrote on

While I think it is a grand idea to use this new centralized Javascript library server, I just thought of a few things that might not make it so great.

The projects we do often use a lot of Javascript: mostly jQuery with a few plugins and some custom code. We always minify and combine the Javascript prior to putting it into production, which obviously avoids requests. If we load jQuery from Google while minifying the rest from our own site, we introduce an extra request. Even though good HTTP caching schemes might prevent that request altogether, it feels best to just load one somewhat bigger file from our own servers.

Still, some tests need to be done before one can actually say anything about speeding up load times.

Michael Wales wrote on

The loading-time benefit isn’t necessarily about the user on your site - it’s about speeding up that user’s loading times on all sites.

By using the version hosted at Google, the user has that cached. Now, whenever they visit any site using the Google-hosted copy of the library, they already have it cached.

So, sure, you could achieve similar loading times by gzipping and setting longer caching periods on your site itself. But think about when your user hits another site that does the same thing - they are duplicating effort.

Google’s contribution is more about doing the greater good for the Internet as a whole than worrying about your own site’s load times.

Mark van der Walle wrote on

I know it is something for the greater good and it sounds good. I was just wondering what it would mean for load times across multiple sites.

I have done some tests and it seems that there is a minor increase in speed when using the Google APIs. On the whole, I guess that as more and more sites start using it, there is a big chance that the library is already cached. The benefit of this will become more profound when a lot of sites use it.

I have already made a proposal of using this in upcoming projects.

Derek wrote on

I actually tested this.  Cached files will always give the illusion of loading faster (whether served from my site or served from Google) over an uncached library.  If they are both cached, the difference seems negligible to me (regular “noise” makes it hard to be definitive, but in any event, both are very fast), but Mark you are right that if uncached it is quicker to gzip and serve the compressed file locally.

Again though, I think we’re all saying the same thing here - the idea is that hopefully your user will already have it cached… what we’re trying to do is increase the likelihood that it’s already been cached.

Great comments, thanks.

Mark Scrimshire wrote on

This may prove to be of more use for mobile users, particularly those not on 3G or faster networks. Using a cached library from Google may provide faster load times over slower networks - offsetting the downside of requiring a DNS lookup.

Debug wrote on

I think Google has already collected all the personal data it wanted.

Dom wrote on

We’ve chatted about this here at the office. Current arguments against (which appear to mirror some of the comments above):

1. We can bundle everything into one .js file, which lends itself to better compression during download.

2. There’s only an apparent speedup if the user happens to have those libraries cached from some other site that uses Google’s service (and what sites are those?). Maybe in time.

3. If someone has *certain* anti-virus software it can prevent loading 3rd party scripts = your site is massively broken.

4. Google pulls the same crap they did with google_analytics so the JS library load fails = your site dies if you aren’t keeping on top of Google changes.

omed habib wrote on

What about development environments that don’t have access to the internet? I know that sounds stupid, but sometimes people enjoy working on their laptop at a park, on the train ride to/from work, etc.

Derek wrote on

I don’t think it sounds stupid, but obviously a PHP developer is going to have some expertise here and know what is right or not for them.  In those cases, a hosted library simply wouldn’t be appropriate.