Author: Ben Bozzay
In general, website load speed suffers when a browser must perform unnecessary work. The cornerstone of this course is to find ways to reduce the burden placed on the browser so that a webpage becomes usable as soon as possible.
Although websites are slow for the same general reason, optimization should not use a "one size fits all" approach.
Website owners might not notice load-speed issues themselves because of file caching, fast internet connections, and high-end computer hardware.
In my career, the worst offender I saw was an enterprise website that forced users to download over 200 MB of images. Company stakeholders didn't notice because of caching and because they viewed the website on a 1 Gbps connection. They only realized there was an issue when analytics showed a high bounce rate for the page.
Caching is an optimization technique that benefits users who frequent a specific website. Typically when a page is visited in the browser, the page resources are downloaded from a server and saved to the computer's disk.
141 ms download time.
The overall network request (connection + download) time depends on the user's internet connection, the file size, and other factors related to communicating with the server.
On subsequent page loads, if the same resource is used on another page (like main.js), then your browser serves the disk's "cached" copy of the file instead of downloading it again from the server.
Download time reduced from 141 ms to 7 ms for a cached file.
Retrieving a file from disk is significantly faster than downloading from the server, so these subsequent page loads can be really fast.
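The caching behavior described above can be sketched in code. This is an illustrative simulation, not a real browser API: the `diskCache` map stands in for the browser's disk cache, and the download counter shows that a repeated request never goes back to the server.

```typescript
// Hypothetical sketch of browser-style caching. Names and structures
// here are illustrative, not real browser internals.
type Resource = { url: string; body: string };

const diskCache = new Map<string, Resource>();
let downloads = 0; // counts simulated trips to the server

// Pretend "server" that we only want to hit on a cache miss.
function downloadFromServer(url: string): Resource {
  downloads++;
  return { url, body: `contents of ${url}` };
}

function fetchWithCache(url: string): Resource {
  const cached = diskCache.get(url);
  if (cached) return cached; // cache hit: no network request
  const fresh = downloadFromServer(url); // cache miss: download and save
  diskCache.set(url, fresh);
  return fresh;
}

fetchWithCache("main.js"); // first page load: downloaded from the "server"
fetchWithCache("main.js"); // subsequent page load: served from the cache
console.log(downloads); // 1
```

Only the first request pays the download cost; every later page that reuses main.js gets the near-instant cached copy.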
As the size of the CSS or JS bundles increases, the first page load time increases with it. However, due to caching, subsequent page loads don't slow down significantly. This can create an optimization blind spot.
Caching provides amazing performance improvements for subsequent page loads, but not for users who have never visited the website before.
A common optimization approach is to combine (bundle), compress, and cache resources. This combination approach is sometimes enough for optimization, but often leads developers into scaling issues.
Unbundled & Uncompressed:
// INPUT
src/page1_stylesheet.css
src/page2_stylesheet.css
src/page3_stylesheet.css

// OUTPUT
dist/page1_stylesheet.css
dist/page2_stylesheet.css
dist/page3_stylesheet.css
Bundled and Compressed:
// INPUT
src/page1_stylesheet.css
src/page2_stylesheet.css
src/page3_stylesheet.css

// OUTPUT
dist/main.min.css
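At its core, bundling is just combining many source files into one output file so the page makes a single request. The sketch below shows that idea with plain file concatenation; real bundlers do far more (minification, dependency resolution), and the temp-directory paths here are illustrative.

```typescript
// Minimal sketch of the "bundle" step: concatenate three per-page
// stylesheets into one main.min.css. Real bundlers also minify and
// resolve dependencies; this only shows the combining step.
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

const src = fs.mkdtempSync(path.join(os.tmpdir(), "bundle-"));
const inputs = [
  "page1_stylesheet.css",
  "page2_stylesheet.css",
  "page3_stylesheet.css",
];

// Create the three per-page stylesheets (stand-ins for real source files).
for (const name of inputs) {
  fs.writeFileSync(path.join(src, name), `/* ${name} */\n.page { color: red; }\n`);
}

// "Bundle": read each input and concatenate into a single output file.
const bundled = inputs
  .map((name) => fs.readFileSync(path.join(src, name), "utf8"))
  .join("");
fs.writeFileSync(path.join(src, "main.min.css"), bundled);

console.log(fs.readFileSync(path.join(src, "main.min.css"), "utf8").includes("page3")); // true
```

One file now carries all three stylesheets, so the browser makes one network request instead of three.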
Network requests spend time communicating with the server before any data is downloaded. Here, an extra 101 ms is spent on server communication for a single network request.
When this process occurs over several resources, the cumulative time can really add up. By reducing these network requests from 3 to 1 through bundling, we could potentially save around 200 ms in server communication time!
Bundling is completely fine for small websites, but usually does not scale for large websites (enterprise especially).
As the complexity of a website increases, the number of unique resources (such as page templates) increases. At this point, bundling is no longer practical.
Compression alone does not guarantee good performance, especially if a page contains 2,000 compressed images that the user won't actually see but that the browser downloads anyway. A mobile user's data plan still suffers, and the overall page load time is still excessive.
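One common fix for the unseen-images problem is lazy loading: only download images near the viewport and defer the rest. The sketch below simulates that decision with made-up pixel offsets; in a real page you would typically rely on the browser's `loading="lazy"` attribute or an IntersectionObserver rather than hand-rolling this.

```typescript
// Illustrative lazy-loading decision: which images are close enough to
// the viewport to be worth downloading now? The layout numbers are
// invented for the sake of the example.
type Img = { src: string; top: number }; // top = pixel offset on the page

const viewportHeight = 800;
const preloadMargin = 200; // start loading slightly before an image scrolls in

function imagesToLoad(images: Img[], scrollY: number): string[] {
  const limit = scrollY + viewportHeight + preloadMargin;
  return images.filter((img) => img.top < limit).map((img) => img.src);
}

// 2000 images spaced 500px apart; the user is at the top of the page.
const images: Img[] = Array.from({ length: 2000 }, (_, i) => ({
  src: `img${i}.jpg`,
  top: i * 500,
}));

console.log(imagesToLoad(images, 0).length); // 2 — instead of all 2000
```

Instead of paying for 2,000 downloads up front, the page fetches a handful of images and loads the rest only as the user scrolls toward them.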
Optimization is not a one-size-fits-all approach. Some optimization practices are not a good fit depending on the project.
Serving resources that aren't needed to render a page causes the browser to perform unnecessary work, and that involves much more than the download time of a resource. Resources are not just downloaded; they are also parsed. Even if a 500 KB stylesheet is compressed to 100 KB, the CSS parser still evaluates the unused CSS rules.
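To make that parsing cost concrete, here is a toy sketch (a crude string match, not a real CSS parser) that counts how many rules a parser would have to walk versus how many the page actually uses. The stylesheet contents and selector names are invented for the example.

```typescript
// Toy illustration: the parser touches every rule in the stylesheet,
// even though the page only uses half of them.
const stylesheet = `
.header { color: blue; }
.footer { color: gray; }
.unused-widget { color: red; }
.unused-modal { display: none; }
`;

// Selectors that actually appear on the page (hypothetical).
const selectorsOnPage = new Set([".header", ".footer"]);

// Crude rule extraction: anything of the form "selector { ... }".
const rules = stylesheet.match(/[^{}]+\{[^}]*\}/g) ?? [];
const used = rules.filter((r) => selectorsOnPage.has(r.trim().split(" ")[0]));

console.log(rules.length); // 4 rules parsed
console.log(used.length);  // 2 rules actually used
```

Half of the parse work here is wasted, and that ratio gets far worse on real sites where one shared bundle serves many different page templates.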
Optimization requires weighing all of these factors. A small blog could likely get by with the bundle + caching approach, while an enterprise website needs something more precise.
An effective approach to optimization involves selectively loading critical, compressed resources to reduce the overall work the browser has to perform.