Are Slow Load-Times Killing Your Conversion Rate – Part 2: Top Ten Ways To Improve Site Speed

Top Ten Ways To Improve Site Speed

As we saw in Part One of our series on website optimization, a slow site can seriously impact user experience. Anything over 3 seconds is going to irritate users, and irritated users lead to lost sales, reduced conversions, and higher bounce rates. As an example, figures released by Walmart show a strong correlation between load time and conversion rate, with a precipitous drop as load times climb from one to three seconds.

Slow load times can also affect your position in the SERPs. Google use site speed as one of their signals for determining ranking. It has a much smaller effect than relevant content, but any decent SEO will tell you that, all else being equal, a quicker-loading site can give you a bump over your competitors.

With that in mind, we’re going to take a look at 10 best practices that website owners can implement on their sites to make sure that their users have the best possible experience.

Metrics

Before doing anything, website developers and designers need to gather information: first, to objectively determine how fast their website actually is, and second, to monitor the effectiveness of any optimization methods they implement. Those of you who are using Google Analytics can use it to monitor site speed, and Google also offers a free tool that will analyze a site and suggest improvements, as does Yahoo! with YSlow. For more precise information, including testing with different browsers, bandwidths, and locations, WebPageTest is excellent.
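If you just want a quick, rough number straight from the browser, the Navigation Timing API exposes the timestamps the browser records while loading a page. The snippet below is a minimal sketch, not a substitute for the tools above; it simply logs the total load time to the console.

    <script>
      // Minimal sketch: log total page load time using the Navigation Timing API.
      window.addEventListener('load', function () {
        // loadEventEnd is only filled in once the load event has finished,
        // so read the timings on the next tick.
        setTimeout(function () {
          var t = window.performance.timing;
          console.log('Page load time: ' + (t.loadEventEnd - t.navigationStart) + ' ms');
        }, 0);
      });
    </script>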

Reduce Round Trip Time

Round-trip time (RTT) is the amount of time that elapses between a browser making a request and the data being delivered, and it is a major factor in site speed. RTT can vary enormously depending on the location of servers and the quality of the network they are part of. A number of the suggestions below relate to reducing RTT. The key principle to keep in mind is that every time a browser has to request a resource from a new server, it adds to the time the page takes to load. This principle is somewhat complicated by the benefits of parallelizing resource loading, which we'll discuss below, but ideally a site shouldn't be retrieving fewer than six resources per server. If a site is requesting 10 resources from 5 different servers, a lot of time is going to be lost in extra DNS lookups and HTTP requests.

Reduce DNS Lookups

A significant chunk of the RTT is caused by Domain Name System lookups. Having resources spread across different servers results in extra DNS requests, which, depending on factors like caching and the proximity of the server, can increase page load times.

The solution is to keep resources on fewer servers to reduce the number of superfluous DNS lookups. Additionally, using subdomains like resources.example.com causes extra DNS lookups, whereas URL paths like example.com/resources do not. Developers should also avoid including links that lead to redirects.
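As a simple illustration (the file names here are hypothetical), the first pair of tags below can trigger two separate DNS lookups, while the second pair is covered by one:

    <!-- Two hostnames: each can trigger its own DNS lookup -->
    <script src="http://scripts.example.com/app.js"></script>
    <link rel="stylesheet" href="http://styles.example.com/main.css">

    <!-- One hostname: a single DNS lookup covers both resources -->
    <script src="http://example.com/resources/app.js"></script>
    <link rel="stylesheet" href="http://example.com/resources/main.css">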

Again, exactly how a website arranges its hosts will depend on striking the best balance between reducing DNS lookups and enabling parallel downloads.

Parallelize Downloads

Web browsers are capable of downloading multiple resources at the same time, but various aspects of page architecture can prevent them from doing so. For example, browsers will stop downloading from all hosts while they wait for an external JavaScript file to download. If developers place their scripts after their CSS files, the CSS can be downloaded in parallel rather than waiting behind JavaScript that isn't needed to display the initial page. Where possible, developers should put their scripts at the bottom of the page so that everything else loads first.
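In practice that usually looks something like the sketch below (the file names are just placeholders): stylesheets in the head, scripts just before the closing body tag.

    <head>
      <!-- Stylesheets go in the head so the page can start rendering early -->
      <link rel="stylesheet" href="/css/main.css">
    </head>
    <body>
      <!-- ...page content... -->

      <!-- Scripts go last so they don't block the download of other resources -->
      <script src="/js/app.js"></script>
    </body>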

Browsers limit the number of parallel downloads from each host, so this is one case where spreading resources across hosts can improve page speed. Whether the extra RTT incurred by using multiple hosts is worthwhile depends on the number of resources a given page requires. This has to be optimized page by page; there is no one-size-fits-all solution.

Google have an excellent article explaining this in more detail with examples.

Use CSS Sprites

Because every image on a page incurs its own HTTP request, a site with lots of images wastes a lot of time sending requests and setting up downloads. CSS sprites are a way of combining many images into a single file and then using CSS to select which area of that file to display. That means as many images as a developer likes can be downloaded with a single HTTP request. There's an article that goes into detail about CSS sprites on CSS Tricks.
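A bare-bones example (with made-up file and class names) looks like this: every icon shares one sprite sheet, and each class simply shifts its own region of the image into view with background-position.

    /* One sprite sheet holding several icons, fetched with a single HTTP request */
    .icon {
      background-image: url("/images/sprite.png");
      background-repeat: no-repeat;
      width: 32px;
      height: 32px;
    }

    /* Each icon selects its own 32x32 region of the sprite */
    .icon-search { background-position: 0 0; }
    .icon-cart   { background-position: -32px 0; }
    .icon-user   { background-position: -64px 0; }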

Optimize Images

There are a number of ways to squeeze down image file sizes without reducing quality. The Smush.it tool from Yahoo! will help developers strip extra bytes from their images.

Use Minification and Compression

We’ve all been taught that keeping code readable is a virtue, but minification, the removal of unnecessary characters and comments from CSS and JavaScript, can reduce script sizes significantly. This won’t have much of an effect on a simple site, but if a site has complex and lengthy scripts, big savings can be made by minifying them.

Modern browsers and servers support compression of HTML and other text files, most often with gzip. Compressing large files on the fly can put excessive load on the server and the client's processor, and images shouldn't be compressed this way (they are already compressed), but for most small-to-medium text files the benefits are substantial.
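On Apache, for instance, a few lines in .htaccess are usually enough to switch gzip on for text-based responses. This is a sketch rather than a complete configuration, and it assumes mod_deflate is available on the server:

    <IfModule mod_deflate.c>
      # Compress text-based responses; images are left alone as they are already compressed
      AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml application/javascript
    </IfModule>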

Keep an Eye on Embeds

Embeds like analytics scripts, tracking scripts, and social media widgets are not always designed and written as well as one might hope, and including too many of them in a page can significantly slow down loading, as this Ghostery study shows. Embeds are a necessary part of the modern web, but loading a site up with too many of them will tarnish the user experience.

Make Proper Use of Caching

Browsers pay attention to caching directives from servers and will keep files in their cache for longer than usual if told to do so. This won't affect loading speed for first-time visitors to a site, but it will make repeat visits load much more quickly. Longer cache times can be applied to any rarely changed files, including images and scripts.

To do this, developers will need access to their server's .htaccess file to set the Expires headers (assuming they are using Apache). Resources that never change can be set to never expire from the cache, and other resources can be given appropriate future expiration times. Have a look at this article for detailed examples.
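As a rough sketch (assuming Apache with mod_expires enabled), a .htaccess along these lines gives long lifetimes to rarely changed files and a shorter default to everything else; the exact times are just examples and should be tuned to how often your files actually change.

    <IfModule mod_expires.c>
      ExpiresActive On
      # Images and static assets that rarely change can be cached for a long time
      ExpiresByType image/png "access plus 1 year"
      ExpiresByType image/jpeg "access plus 1 year"
      ExpiresByType text/css "access plus 1 month"
      ExpiresByType application/javascript "access plus 1 month"
      # Everything else falls back to a shorter lifetime
      ExpiresDefault "access plus 1 day"
    </IfModule>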

Use mod_pagespeed

Another one for Apache users who have root access to their servers. mod_pagespeed is an Apache module from Google which "rewrites resources using filters that implement web performance best practices." It's capable of applying a number of the optimization methods mentioned above.
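Once the module is installed, turning it on and choosing filters takes only a few lines of Apache configuration. The filters below are just two examples of what's available, picked because they mirror techniques covered above:

    <IfModule pagespeed_module>
      ModPagespeed on
      # Example filters: combine CSS files and extend cache lifetimes
      ModPagespeedEnableFilters combine_css,extend_cache
    </IfModule>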

If you found this useful, or want to add some tips of your own, feel free to leave a comment below, or share it with your friends. 
