How to improve my website's load time? Part 1

The 80/20 rule usually applies to websites: roughly 20% of the total time is what our server takes to respond with the page, and 80% is the time the browser needs to load the whole page (images, CSS, JavaScript, rendering). Let's analyze where all that valuable time is "lost" and how we can intervene.

Minimizing HTTP requests

These days a modern webpage, even one generated by a high-quality CMS, uses a lot of images, JavaScript and cascading style sheets (CSS). Combined with the fact that most browsers open only a few connections per domain, all of these lead to a notable delay.

One way to make our site faster is to reduce HTTP requests by combining files. For the graphics and images used in a site, one way to achieve this is CSS Sprites: all icons and backgrounds are combined into one single image, and with the background-image and background-position CSS properties we display only the desired segment of it.
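A minimal sketch of the sprite technique, assuming a hypothetical icons.png that stacks a 16x16 "home" icon on top of a 16x16 "search" icon:

```css
.icon {
    width: 16px;
    height: 16px;
    background-image: url("icons.png"); /* one HTTP request covers every icon */
}
.icon-home {
    background-position: 0 0;      /* show the top segment of the sprite */
}
.icon-search {
    background-position: 0 -16px;  /* shift the sprite up to show the next segment */
}
```

However many icons the page uses, the browser downloads icons.png once and each element simply crops a different segment of it.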

The above technique also applies to CSS and JavaScript, with an additional benefit: minification. For example, instead of serving separate CSS files (layout.css, style.css, print.css etc.), we can use either tools available on the Internet or plugins for our CMS. Not only can we serve a single CSS or JavaScript file for the whole site, but we also benefit from minification, which removes the extra whitespace that makes the code easier for a developer to read but is useless to a browser.
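To illustrate what minification removes, here is a small before-and-after sketch (the rule itself is just an example):

```css
/* Before minification — readable for the developer: */
body {
    margin: 0;
    padding: 0;
}

/* After minification — the same rule, fewer bytes on the wire: */
body{margin:0;padding:0}
```

The browser interprets both forms identically; only the transferred size changes.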

Another way is to serve images, CSS and JavaScript files (static files) from a Content Delivery Network (CDN), or, in a simpler approach, from a "different domain". Remember, a browser opens only about 2-4 connections per domain; by using a few subdomains like images1.mydomain.com, images2.mydomain.com, images3.mydomain.com and images4.mydomain.com, we instantly allow the browser to download our images in parallel over far more than 2-4 connections, since it handles each subdomain as a different domain.
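In markup, spreading images across the hypothetical subdomains above looks like this:

```html
<!-- The browser treats each subdomain as a separate domain, so it opens
     extra parallel connections instead of queueing every image behind
     the same 2-4 connections. -->
<img src="http://images1.mydomain.com/header.jpg" alt="Header">
<img src="http://images2.mydomain.com/photo1.jpg" alt="Photo 1">
<img src="http://images3.mydomain.com/photo2.jpg" alt="Photo 2">
<img src="http://images4.mydomain.com/icons.png"  alt="Icons">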

Compression

Reason tells us that the smaller the files, the faster a site will load. That is true, but only if we have first made certain, with the previous steps, that there aren't unreasonably many of them. When we want to make a site load faster, we usually start with images, which are as a rule the biggest part of a page.

Compressing images can be achieved in many ways, which does not necessarily mean the correct way is always picked. There are many algorithms, both lossless (.gif, .png) and lossy (.jpg): the former do not lose any of the original information, while the latter sacrifice a little quality for a smaller final file size. When used properly, though, nearly all users are unable to spot any difference between the original and the compressed image. The general rule is to use .jpg for photographs, people, landscapes, etc. and .png for graphs, drawings, etc.

We already mentioned CSS & JavaScript compression through minification. What remains is compressing the whole page, which can be achieved by enabling gzip compression on our web server (Apache, Nginx). This leads to a reduction of about 70%, as a page consists mostly of text, and text compression performs very well.
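In Nginx, for example, enabling gzip is a few directives in the configuration; the type list below is an illustrative choice, not an exhaustive one (HTML responses are compressed by default once gzip is on):

```nginx
gzip            on;
gzip_types      text/css application/javascript application/json text/plain;
gzip_min_length 256;   # very small responses are not worth compressing
```

Apache offers the equivalent through mod_deflate.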

Caching

Have you ever noticed that a web page takes a while to load the first time, but gets faster and faster as you keep browsing? The fastest option for a browser, of course, is not to download an image, CSS or JavaScript file it has already downloaded. These are the famous "static files", those that change rarely, if ever. Using caching techniques we tell the user's browser to keep the same file for at least a week, a month, six months (as long as we consider safe) if it hasn't changed on the server.
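As a sketch, in Nginx this can be a single location block for static files; the one-month lifetime and the extension list are example choices, use whatever you consider safe:

```nginx
location ~* \.(css|js|png|jpg|gif)$ {
    expires    1M;                      # sets Expires and Cache-Control: max-age
    add_header Cache-Control "public";  # allow shared caches to keep it too
}
```

Apache offers the same behavior through mod_expires.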

This article covers, of course, only the tip of the iceberg of a very important chapter of the Web: optimizing a website so it loads faster. We will explain each technique in detail in future articles. Stay tuned!