
Setting up Gzip on your website


One of the biggest performance impacts on websites is that they are often located far away from the user. It's actually really amazing that they are as quick and responsive as they are, when you consider they are often delivered from the other side of the world! Network slowness comes down to two key points of contention: latency (how long it takes to send messages back and forth between the web browser and the web server) and bandwidth (how quickly you can transfer large volumes of data). Gzipping content optimises your bandwidth usage by compressing data on the server, sending it compressed, and having the web browser uncompress it at the other side, so the amount of data transferred is as small as it needs to be without losing any of it. The reason this works so well is that large parts of websites are made up of text, including the code of HTML, CSS and JavaScript. Text data can be gzipped very efficiently: on average 70%-90% of the network traffic for text resources can be saved by zipping.
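You can see both of these properties - the large savings on repetitive text, and the fact that nothing is lost - with a quick sketch on the command line using the standard gzip tool (the sample file and its contents here are made up purely for illustration):

```shell
# Build a sample "HTML" file out of repetitive markup, the kind of
# content that compresses extremely well
tmpdir=$(mktemp -d)
for i in $(seq 1 200); do
  echo '<p class="example">Some fairly repetitive HTML content</p>'
done > "$tmpdir/sample.html"

orig=$(wc -c < "$tmpdir/sample.html")
gzip -9 -c "$tmpdir/sample.html" > "$tmpdir/sample.html.gz"
comp=$(wc -c < "$tmpdir/sample.html.gz")
echo "original: $orig bytes, gzipped: $comp bytes"

# Decompressing gives back exactly the original bytes - compression is lossless
gunzip -c "$tmpdir/sample.html.gz" | cmp -s - "$tmpdir/sample.html" && echo "lossless"
```

Real pages won't be quite this repetitive, but text-heavy resources routinely see savings in the 70%-90% range mentioned above.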

Having a web server zip up content might sound like a pretty big overhead, and in the past that was indeed an issue, but with the performance improvements of modern CPUs, all but the busiest websites will struggle to notice any negative impact. Even if you do run one of those large websites, there are ways to mitigate the impact, including server-side caching and pre-zipping content. 81% of websites compress text resources and you should too unless you have a very good reason not to - especially with the huge rise in mobile usage, where network bandwidth is one of the main limiting factors.

How to set it up

How you set it up depends on your web server. For Apache you need to include mod_deflate:

LoadModule deflate_module modules/mod_deflate.so

and then define which files you do, and don't, want to compress. This could be done with config like the below:

<IfModule mod_deflate.c>
    #Only compress specific content
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript application/x-javascript application/javascript application/json application/x-font-ttf application/vnd.ms-fontobject image/x-icon

    #Alternatively compress everything by uncommenting the next line
    #SetOutputFilter DEFLATE
    #but don't compress content which is already compressed including images and woff and woff2 fonts
    SetEnvIfNoCase Request_URI \.(gif|jpe?g|png|swf|woff|woff2) no-gzip dont-vary

    # Make sure proxies don't deliver the wrong content
    Header append Vary User-Agent env=!dont-vary
</IfModule>

Special note about fonts

Fonts come in various formats, including .eot (only used by Microsoft Internet Explorer and Microsoft Edge), .ttf (used by older non-Microsoft browsers), .woff (used by all modern browsers) and .woff2 (used only by newer browsers). Of these, only .woff and .woff2 are already compressed, so they do not benefit from gzipping again. The others should be compressed. This is done by including the "application/x-font-ttf" and "application/vnd.ms-fontobject" mime-types as shown in the config above, or alternatively by including the file extensions .eot and .ttf if you have gzip configured that way.
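A quick command-line sketch shows why gzipping already-compressed files is pointless. Random bytes behave much like already-compressed font or image data (neither has redundancy left to squeeze out), so gzip can only add its own header overhead - the file name here is just a stand-in:

```shell
# Random bytes act as a stand-in for already-compressed data like .woff2
tmpdir=$(mktemp -d)
head -c 10000 /dev/urandom > "$tmpdir/fake.woff2"

orig=$(wc -c < "$tmpdir/fake.woff2")
# Even at maximum compression, gzip cannot shrink incompressible data;
# the output ends up slightly larger than the input
comp=$(gzip -9 -c "$tmpdir/fake.woff2" | wc -c)
echo "original: $orig bytes, gzipped: $comp bytes"
```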


Every modern browser (going right back to Internet Explorer 5.5) supports gzipped content. Web browsers send an Accept-Encoding header to say when they can handle gzipped content, so web servers will not send gzipped content in the unlikely event that a browser does not support it. Very old browsers (e.g. Netscape Navigator 4) used to have some issues with gzipped content, so you might see some extra config to handle those in some web server setups, but those browsers are long gone and that config is no longer necessary. All modern web servers also support gzipped content.

Some old corporate proxies used to not support gzip, meaning all the workers using office PCs behind the proxy would not get gzipped content even if their browsers supported it, but again the Accept-Encoding header will ensure compression is only used where supported, if that is still an issue.

You should also use gzip, rather than the equivalent deflate method of compressing content, as there were some implementation differences for deflate. For those of you using Apache, the mod_deflate module uses gzip despite its name, and has replaced the old mod_gzip module.
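If you ever need to tell the two apart, gzip streams are easy to recognise: gzip wraps the raw deflate data in a header and checksum, and the stream always starts with the magic bytes 1f 8b, which raw deflate lacks. A one-line check:

```shell
# A gzip stream always begins with the magic bytes 1f 8b
printf 'hello' | gzip -c | od -An -tx1 -N2 | tr -d ' \n'   # prints 1f8b
```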

The Downsides

Gzipping is standard practice now and you really should be doing it. The main downside is the increased CPU cost on both the server and the client side, but that is unlikely to be an issue for most people. On the client side the performance gains are definitely worthwhile: the main web browsers with any CPU limitation are on mobiles, and those are much more network bound than CPU bound for internet browsing. On the server side, providing you are running modern hardware, the performance impact should not be noticeable. The most recent in-depth investigation of the performance impact of gzip is from 2008, and it suggested you only need to be concerned at around 50 hits a second on 100kb files - and hardware has moved on a lot in the last 7 years, so it's likely even that is no longer an issue.

As mentioned in the introduction above, in the unlikely event that you do experience performance issues using gzip, there are options to improve this including pre-zipping or caching zipped files.
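As a sketch of the pre-zipping approach, something like the following can be run at deploy time to zip text assets up front (the public directory and file names here are just examples; your web server then needs config to serve the .gz file when the browser accepts gzip, rather than compressing on every request):

```shell
# Pre-compress static text assets at deploy time, keeping the originals
# for browsers that don't accept gzip (-k requires a reasonably recent gzip)
site=$(mktemp -d)
mkdir -p "$site/public"
echo 'body { color: #333; }' > "$site/public/styles.css"

find "$site/public" -type f \
  \( -name '*.css' -o -name '*.js' -o -name '*.html' \) \
  -exec gzip -k -9 {} \;

# Both styles.css and styles.css.gz now exist side by side
ls "$site/public"
```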


Implementing gzip on your web server is an easy performance gain for your users, and will help reduce your own bandwidth costs. Support is near universal, and the CPU impact of gzipping content should not be an issue for all but the busiest sites. Once set up, there is very little need to change the configuration unless a new file type comes along that does or doesn't need compressing - and even then, as when woff2 came along, compressing something you shouldn't is a bit of a waste but will still work fine for both server and browser. 81% of websites use gzip compression on text files, and there's really no good reason why that shouldn't be nearer 100%.

Update - June 2017

A new compression format on the block is Brotli, which seems to offer similar speed to gzip but with higher compression ratios. It is rapidly gaining support.


