
Web Performance is a Journey, Not a Destination

Mehdi Daoudi




Compression: Making the Big Smaller and Faster (Part 2)

In the last blog, we discussed the different methods of compression and how they work. In this post, we are going to compare Brotli with other compression algorithms.

Brotli Compression

Per RFC 7932: Brotli is a lossless compressed data format that compresses data using a combination of the LZ77 algorithm and Huffman coding, with efficiency comparable to the best currently available general-purpose compression methods.

Maintained by Google, Brotli reduces bandwidth consumption and helps content load faster.

When it was developed in 2015, Brotli was used to compress fonts in WOFF2 (Web Open Font Format 2). Over time, its popularity increased, and it is now also used for compressing and decompressing general textual data. Brotli has a direct effect on page load times by reducing page size and, ultimately, the bandwidth consumed.

This compression algorithm is now supported by many browsers as a new Accept-Encoding scheme that can be used to compress static web page assets.

Brotli: Browser Support

[Image: browser support table for Brotli]

How to check if a Browser supports Brotli

Most browsers that support Brotli compression advertise their ability to accept Brotli-compressed resources in the Accept-Encoding request header.

The Accept-Encoding Request HTTP header advertises which content encoding the browser understands. Using content negotiation, the server selects one of the proposals and informs the client of its choice with the Content-Encoding response header.

Here is a screenshot of HTTP Request-Response Headers captured from an instant test run for a website that supports Brotli compression.

The instant test was run using Chrome version 53.

[Image: HTTP request and response headers captured from the instant test]

If you observe the Request Headers section, you will notice that the browser sent an HTTP Request advertising the Encoding schemes it supports.

accept-encoding: gzip, deflate, sdch, br

The Response Headers section clearly shows that the Web Server supports Brotli compression.

content-encoding: br
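The negotiation described above can be sketched as a small server-side helper. This is a hypothetical illustration (not from the original post): given the client's Accept-Encoding header, the server picks the best encoding it also supports, preferring Brotli.

```python
# Minimal sketch of server-side content-encoding negotiation.
# SERVER_SUPPORTED and choose_encoding are illustrative names, not a real API.

SERVER_SUPPORTED = ["br", "gzip", "deflate"]  # server preference: Brotli first

def choose_encoding(accept_encoding: str) -> str:
    """Return the Content-Encoding to use, or 'identity' if nothing matches."""
    # Parse "gzip, deflate, sdch, br" into tokens, ignoring any
    # ";q=..." quality parameters for simplicity.
    offered = {part.split(";")[0].strip() for part in accept_encoding.split(",")}
    for encoding in SERVER_SUPPORTED:
        if encoding in offered:
            return encoding
    return "identity"

print(choose_encoding("gzip, deflate, sdch, br"))  # -> br
print(choose_encoding("gzip, deflate"))            # -> gzip
```

A real server would also honor the q-values in the header; this sketch only shows the basic pick-the-first-mutually-supported-scheme idea.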

Google turned on support for Brotli with the release of Chrome 51 in May 2016. Firefox had added support for Brotli as early as September 2015.

Since Brotli, like Gzip, is a compression algorithm, let's have a look at how each fares when tested against CSS, JS, and JPEG resources.

So, let the comparisons begin. First, we plot the compression ratio of Brotli versus Gzip.

Compression Ratio

Compression ratio is calculated using the formula: uncompressed size / compressed size. If a 10 MB uncompressed file compresses to 2 MB, the compression ratio is 10/2 = 5. The higher the compression ratio, the better the compression method.
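The formula is easy to see in action with Python's built-in gzip module (Brotli needs a third-party package, so Gzip stands in here as a sketch):

```python
import gzip

# Compute the compression ratio (uncompressed size / compressed size)
# for a highly repetitive payload, which compresses very well.
original = b"web performance matters " * 1000
compressed = gzip.compress(original, compresslevel=6)  # Gzip's default level

ratio = len(original) / len(compressed)
print(f"uncompressed: {len(original)} bytes")
print(f"compressed:   {len(compressed)} bytes")
print(f"ratio:        {ratio:.1f}")  # higher ratio = better compression
```

An already-compressed format such as JPEG would show a ratio near 1, which is exactly what the chart below reports.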

[Chart: compression ratio, Brotli vs. Gzip, for CSS, JavaScript, and JPEG]

In the chart above, we see Brotli reporting a better compression ratio for the CSS and JavaScript files. Gzip and Brotli reported the same compression ratio (compression ratio = 1) for a JPEG file of 4.23 MB.

Decompression Speed

A very important factor when benchmarking compression algorithms is decompression speed. How quickly browsers can decompress the compressed content at their end determines how practical the algorithm is in the real world.

[Chart: decompression speed, Brotli vs. Gzip, for CSS, JavaScript, and JPEG]

When it comes to decompression speed as well, Brotli performed better than Gzip, reporting faster decompression times for the same JS, CSS, and JPEG files.
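A rough sketch of how such a decompression timing can be taken, again using the stdlib gzip module as a stand-in; a real benchmark like the post's would use larger, realistic payloads and repeated runs:

```python
import gzip
import timeit

# Compress a JavaScript-like payload once, then time only the decompression.
source = b"console.log('hello, world');\n" * 5000
payload = gzip.compress(source)

# Run the decompression 100 times and report the total wall-clock time.
seconds = timeit.timeit(lambda: gzip.decompress(payload), number=100)
print(f"100 decompressions took {seconds * 1000:.1f} ms")
```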

Compression Speed

Compression speed is an indicator of how quickly the algorithm can compress content. The less time it takes to compress, the better the algorithm.

[Chart: compression speed, Brotli vs. Gzip, for CSS, JavaScript, and JPEG]

Looking at the compression ratio and decompression time metrics, we see that Brotli outperformed Gzip for all the resource types tested (JPEG image, CSS, and JavaScript). However, the compression speed results show that Brotli took much longer than Gzip to compress content. So, is this a problem?

I would say no. It is not advisable to judge or benchmark compression algorithms simply by comparing the bytes saved. Here’s why:

  • Gzip offers nine quality levels of compression. Lower levels such as one or two compress extremely fast but yield smaller file savings.
  • A quality level of nine improves compression quality and yields greater file savings, but compression is slower.
  • Gzip offers nine quality levels, whereas Brotli offers 11. The higher the quality level, the better the compression and the slower the compression speed.
  • Gzip defaults to quality level six, which balances speed with efficient compression. Brotli, on the other hand, defaults to its maximum quality level of 11, which delivers excellent compression quality but is slower. Please note that the tests were run at the default quality settings for both Gzip and Brotli.
  • It is recommended to tweak the quality level of whichever algorithm you use (Gzip or Brotli) to suit your requirements and achieve efficient results.
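The speed-versus-savings trade-off described above can be demonstrated with zlib's nine levels from the Python standard library; Brotli's 0-11 scale behaves analogously:

```python
import time
import zlib

# Compare compressed size and elapsed time across quality levels 1, 6, and 9
# for the same compressible payload.
data = b"Brotli and Gzip trade compression time for byte savings. " * 2000

for level in (1, 6, 9):
    start = time.perf_counter()
    out = zlib.compress(data, level)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"level {level}: {len(out):>6} bytes in {elapsed_ms:.2f} ms")
```

Higher levels should produce the same or smaller output at the cost of more CPU time, which is why a server compressing dynamic content on the fly typically picks a middle level.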

Future of Brotli

  • Gzip has been around for the past 19 years, whereas Brotli was introduced only a couple of years ago. Brotli may not be as big a game changer as the features HTTP/2 has to offer, but it is a game changer nonetheless. With websites getting more complex each passing day, and the number of items loaded to serve a webpage increasing, every second and every byte saved makes a difference to the overall end-user experience.
  • With the default compression setting of 11, Brotli may not be a good pick when it comes to compressing dynamic content on the fly. These settings, however, can be tweaked to achieve a balance between compression speed and compression quality.
  • Brotli is supported by all major browsers, including Google Chrome, Mozilla Firefox, Edge, and Opera.
  • According to Google: “The AMP cache, which stores slimmed-down web pages on Google’s servers, now uses Google’s Brotli compression algorithm to reduce document size by 10 percent in supported web browsers, and compresses images 50 percent more efficiently without affecting quality.” This was announced at Google I/O 2017. AMP, or Accelerated Mobile Pages, is a Google project. You can see the keynote here: https://youtu.be/BGyF5Uh3w1M
  • Companies like Microsoft and LinkedIn have already started to experiment with Brotli, and this is a positive step toward the overall development and adoption of Brotli as an efficient compression algorithm. The future of this algorithm, named after a Swiss bakery product, looks bright.

**All tests comparing the performance of Gzip and Brotli were run using “R” on a computer with the following specifications:

Processor: Intel(R) Core(TM) i7-6600U CPU @ 2.60GHz, 2808 Mhz, 2 Core(s), 4 Logical Processor(s); RAM: 8.00 GB

The post Compression: Making the Big Smaller and Faster (Part 2) appeared first on Catchpoint's Blog - Web Performance Monitoring.

