Over the past week, many SEO blogs have reported that Google is now using page load time as a factor in its ranking algorithm. I wanted to follow up and explain specifically how to optimise for visitors and for robots, because there is a distinct difference between the two.
Optimising for robots
When search robots crawl your site, they don’t download all the external files your HTML references, such as images, scripts, and stylesheets. Below are a few easy ways to speed up how quickly your HTML is served to robots.
Optimise your database queries
Inefficient joins and missing or misused indexes can dramatically slow page load times. If you run a CMS or other database-driven site, logging the time taken to serve each template for a few days will show you which pages are served the slowest, so you can focus on the worst offenders first.
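As a sketch of that logging approach (the template name and render function are hypothetical), a simple decorator can record how long each template takes to serve:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("template-times")

def timed(template_name):
    # Hypothetical CMS hook: wraps a render function and logs its duration,
    # so the slowest templates stand out after a few days of traffic.
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed_ms = (time.perf_counter() - start) * 1000
            log.info("%s served in %.1f ms", template_name, elapsed_ms)
            return result
        return wrapper
    return decorator

@timed("article.html")
def render_article():
    time.sleep(0.02)  # stand-in for database queries and templating work
    return "<html>...</html>"

render_article()
```

Grepping the resulting log for the largest timings tells you exactly which templates (and therefore which queries) to optimise first.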
Ensure scripting and stylesheets are in external files
Keeping your scripts and stylesheets in external files is important for minimising the amount of markup Google has to download, index, and cache. Inline scripts and styles also typically sit inside the <head> tags, which pushes your unique content further down the HTML document.
Use GZIP compression
This will use more server processor time, but if you are serving a lot of content in each HTML document, it can deliver huge savings in HTML file size.
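To get a feel for the saving, here is a quick sketch using Python's standard gzip module on some illustrative, repetitive markup (real pages with lots of HTML boilerplate commonly shrink by well over half):

```python
import gzip

# Illustrative HTML: repeated markup compresses extremely well,
# which is exactly the pattern database-driven pages tend to have.
html = (b"<html><body>"
        + b"<div class='post'><h2>Title</h2><p>Some article text.</p></div>" * 200
        + b"</body></html>")

compressed = gzip.compress(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes gzipped")
```

In practice you would enable compression in your web server rather than in application code, but the size difference is the same.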
Optimising for visitors
All of the above also helps visitors, but there are a few extra things you can do to improve the user experience:
Local web hosting
Be sure to host your site in the country where most of your visitors reside. This reduces network latency, making pages load faster, which also benefits your SEO.
Client side caching
Setting longer cache periods for images and for external scripts and stylesheets can significantly improve the user’s experience. Whether it’s a first-time visitor browsing multiple pages or a returning visitor, you can often save them from re-downloading hundreds of kilobytes per page view.
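As a minimal sketch of what those longer cache periods look like at the HTTP level (the one-year lifetime is a hypothetical policy, and the header-building function is just for illustration):

```python
import time
from wsgiref.handlers import format_date_time

# Hypothetical policy: let browsers cache static assets (images, CSS, JS)
# for a year, since they change rarely.
ONE_YEAR = 365 * 24 * 3600

def static_cache_headers(now=None):
    # Returns the response headers that tell the browser it may reuse
    # its local copy instead of re-downloading the file.
    now = time.time() if now is None else now
    return [
        ("Cache-Control", f"public, max-age={ONE_YEAR}"),
        ("Expires", format_date_time(now + ONE_YEAR)),
    ]

for name, value in static_cache_headers():
    print(f"{name}: {value}")
```

Most web servers can add these headers for you via configuration; the point is simply that long max-age values turn repeat page views into near-instant loads.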
Consolidating external scripts & stylesheets
You can take externalising scripts and stylesheets one step further by consolidating all your CSS into a single file, and all your scripts into another. The browser then makes one HTTP request per file type instead of several per HTML document.
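A build step for this can be as simple as concatenation. Here is a sketch (the filenames and CSS contents are hypothetical stand-ins for your real stylesheets):

```python
# Combine per-purpose stylesheets into one file, so a page needs a
# single CSS request instead of three.
stylesheets = {
    "reset.css": "body { margin: 0; padding: 0; }",
    "layout.css": "#content { width: 960px; margin: 0 auto; }",
    "theme.css": "a { color: #0066cc; }",
}

# Keep a comment marking where each original file starts, for debugging.
combined = "\n".join(
    f"/* {name} */\n{css}" for name, css in stylesheets.items()
)

with open("site.css", "w") as f:
    f.write(combined)

print(f"Wrote site.css ({len(combined)} bytes, {len(stylesheets)} files merged)")
```

The same approach works for JavaScript, as long as you concatenate the files in dependency order.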