One of the factors that dictates browsing speed is the time it takes to do a DNS lookup – that is, convert a domain name such as google.com into an IP address such as 184.108.40.206. Most people use the DNS servers operated by their ISP. Usually this is fine, but ISP DNS servers can be unreliable, and they’re not always the fastest choice.
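To get a feel for what a lookup costs, you can time one yourself. The sketch below uses Python's standard library, which queries the system's configured resolver (the standard library can't target a specific DNS server, so this measures whatever your machine is already using); the hostname is just an example:

```python
import socket
import time

def lookup_time_ms(hostname):
    """Time a single name resolution via the system's configured resolver."""
    start = time.perf_counter()
    socket.getaddrinfo(hostname, None)  # raises socket.gaierror on failure
    return (time.perf_counter() - start) * 1000

# "localhost" resolves without touching the network; substitute a real
# domain such as "google.com" to exercise your actual DNS server. Note
# that repeat lookups are often answered from a local cache and will be
# much faster than the first.
print(f"resolved in {lookup_time_ms('localhost'):.1f} ms")
```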
There are many free public DNS services, such as OpenDNS and search giant Google’s Public DNS, but it’s difficult to know which one is best for you. Enter NameBench, a free cross-platform tool which tests a raft of public DNS services using either your browser history or a list of top domains. Once the tests are complete, you receive a summary of the results including suggested primary, secondary and tertiary servers:
So if you’ve got a few minutes to spare, why not see if you can shave a few milliseconds from your page load times?
Google offer a standalone installer for the Windows build of Google Chrome, as opposed to the standard download, which is actually just a small stub application that connects to Google’s servers to download and install the browser proper.
The offline installer is handy if you have a number of machines on which to install or update Chrome, but unfortunately Google haven’t updated it recently, so you end up with version 220.127.116.11 rather than the latest all-singing, all-dancing, extension-supporting version 18.104.22.168.
You can of course update to 22.214.171.124 from the About screen, but this defeats the purpose of using the standalone installer in the first place, and you may be unlucky enough to be on a corporate network which breaks the in-browser upgrade functionality.
By using Fiddler2 to monitor the activity of the stub installer, I was able to establish that it connects to the following google.com URL to download the latest build:
I noticed today that the Google logo shown at the top of all search results is actually a composite image, sliced up through clever use of CSS positioning:
At first, I thought of this as nothing more than a neat trick, but then I began to think about why Google might have decided to use this technique to their advantage.
Whenever a client browser requests a page, it will also make a request for each of the images (and other media) embedded in the page. Once an image has been downloaded, it is usually cached client-side to conserve bandwidth and improve performance on subsequent loads. For example, the RSS logo at the top of my blog will be downloaded from my server on your first visit, but as you move through the site, future references to the file will be fulfilled from your browser’s cache.
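That caching behaviour is driven by HTTP response headers. A server wanting an image cached for, say, a day might send something along these lines (the values here are purely illustrative):

```
HTTP/1.1 200 OK
Content-Type: image/png
Cache-Control: public, max-age=86400
```

On later visits within that window, the browser can reuse its local copy without contacting the server at all.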
Google isn’t particularly image-heavy, but a typical results page could contain five or more ‘sprites’ or graphical elements. By squeezing them into a single file, users’ web browsers need only make two requests (one for the page itself and one for the composite image) instead of six or more.
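The technique is usually called CSS spriting: the composite is set as a background image and shifted with `background-position` so that only the desired slice shows through a fixed-size element. A minimal sketch (the class names, file name, and offsets are invented for illustration):

```css
/* One composite image containing several 16x16 icons stacked vertically */
.icon {
    background-image: url("sprites.png");
    background-repeat: no-repeat;
    width: 16px;
    height: 16px;
    display: inline-block;
}

/* Shift the composite so the correct slice is visible for each icon */
.icon-rss    { background-position: 0 0; }
.icon-search { background-position: 0 -16px; }
.icon-logo   { background-position: 0 -32px; }
```

However many `.icon-*` elements appear on the page, the browser fetches `sprites.png` only once.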
This might sound trivial, but considering that Google serve billions of result pages to millions of different visitors every day, the cumulative saving in bandwidth and server resources is likely to add up to quite a figure.
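To put a rough number on it, here is a back-of-envelope calculation. Every figure below is hypothetical (real traffic volumes and per-request overheads are unknown), and it deliberately ignores caching, but it shows how quickly small per-page savings compound:

```python
# All figures are hypothetical, purely for illustration.
pages_per_day = 1_000_000_000    # assumed daily result pages served
requests_saved = 4               # e.g. 5 separate images collapsed into 1 sprite
overhead_per_request = 500       # assumed bytes of HTTP request/response overhead

bytes_saved = pages_per_day * requests_saved * overhead_per_request
print(f"~{bytes_saved / 1e12:.1f} TB of request overhead avoided per day")
```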
If you operate a moderately high-traffic site, it might be worth considering similar tactics. The only other site I’ve noticed using CSS image slicing in this way is the now-defunct Cdiscount UK site, for its pricing images.