Jason Deacon | Team: Web Development | Tags: Technology, Performance, Featured

Performance review: profiling a website's speed and performance


Recently we were asked to create a report on the performance of one of our clients' websites. We used a number of tools to analyse and report on the various aspects of the site to determine the dominant factors contributing to its slow page load times. We followed the logical structure of the website (or rather, of any website) and drew conclusions based on analysis and data. Where those conclusions identified inefficient parts of the site, we gave specific recommendations designed to fix the issues identified.

The list of conclusions and recommendations below is based on the actual report given to our client, and indicates the sort of effort we take to ensure that our clients' websites are as fast as they can be.



Tests were performed on the client's live site (client-side tests) as well as on a version running on a developer's machine (stress testing and code profiling).

The testing and analysis included the following items:

  • Server side database performance
  • Server side code performance
  • Network latency and bandwidth performance
  • Client-side asset efficiency (file size etc)
  • Client-side code performance

Based on our analysis it is clear that the client-side elements (images, JavaScript, tracking systems) are the primary cause of the excessively high page load times, and our recommendations below reflect our findings. I have also included a set of conclusions covering the elements listed above which are not contributing to the slow page loads (or are not contributing in any meaningful way) and therefore have no recommendations for improvement at this time.


Conclusion 1 – Server side database performance

Through detailed code profiling we were able to ascertain that the number of database calls for pages such as the homepage and product detail pages does not exceed what we consider to be a reasonable number. It never exceeded 13 database calls per page, and these are all made by the CMS we use, Umbraco, to fetch content as part of any standard page. The slowest measurable page execution observed during testing (3.1 seconds, itself an outlier) spent 100ms performing database queries, which is just 3.2% of the total page execution time.
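As a quick check of those figures:

```python
# Share of the slowest observed page execution spent in the database.
total_ms = 3100  # slowest measured page execution (an outlier in itself)
db_ms = 100      # time spent performing database queries on that page

db_share = db_ms / total_ms
print(f"{db_share:.1%}")  # → 3.2%
```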

These findings clearly show that database performance is not the cause of the 8+ second page load times we are seeing.


Conclusion 2 – Server side code performance

Multiple stress tests were performed on the server-side code (locally at Wiliam, not against the live site) to ascertain the performance of the server-side code when under stress. Our tests used 20 concurrent virtual users each hitting 6 different pages in sequence, tested with a random 0-1 second wait time and also with no wait time. No wait time means that each virtual user will request the next page in the list as soon as the current one finishes. This configuration usually results in complete resource utilisation during the test and is a good indicator of just how the server side code will perform under such conditions.
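A test profile along these lines can be sketched in Python. This is a simplified sketch, not the tool we actually used: the page list is a placeholder, and fetch_page stands in for a real HTTP request to the site under test.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

# Placeholder list standing in for the 6 pages used in the real test.
PAGES = ["/", "/products", "/products/1", "/about", "/contact", "/search"]

def fetch_page(path):
    """Stand-in for a real HTTP request; returns (path, elapsed seconds)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server-side execution time
    return path, time.perf_counter() - start

def virtual_user(wait_range):
    """One virtual user: request each page in sequence, optionally pausing."""
    timings = []
    for page in PAGES:
        timings.append(fetch_page(page))
        if wait_range:
            time.sleep(random.uniform(*wait_range))  # random "think time"
    return timings

def run_test(users=20, wait_range=(0.0, 1.0)):
    """Run all virtual users concurrently and collect per-page timings.

    wait_range=None is the zero-wait profile: each user requests the next
    page as soon as the current one finishes.
    """
    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(virtual_user, wait_range) for _ in range(users)]
        return [t for f in futures for t in f.result()]
```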

The idea behind these tests is to ensure that the server-side code is not significantly adding to the performance degradation being observed on the live site.

The results show that over time, page execution speed remains extremely stable, increasing from ~50ms under typical load to ~260ms only at complete resource utilisation on the computer running the tests (in this case, a 4-core CPU), caused by the zero wait time of the second testing profile.

First run, staggered 20 virtual users, 0-1 random wait time, homepage:

Performance Graph 1

Second run, staggered 20 virtual users, no wait time, homepage:

Performance Graph 2

The results of these tests confirm that the server-side code is not contributing to the overall page load time in any way that is noticeable to the end user at this point in time.


Conclusion 3 - Network latency and bandwidth performance

Network latency and bandwidth limitations can impact page loads in different ways, but both will appear to the end user as content coming through “slowly”. Network latency affects the time it takes for the site to receive a request from the user's browser for a piece of content (image, JavaScript, CSS, HTML), and if this is large (>250ms) it will have a noticeable impact on the experience.

Likewise, bandwidth limitations will result in the user taking longer to download a specific piece of content, for example it may take 10 seconds to download a 250KB image instead of 1 second, depending on bandwidth.
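Those example numbers imply the following effective throughput:

```python
# Effective bandwidth implied by the 250KB example above.
size_kb = 250
size_megabits = size_kb * 8 / 1000  # 250 KB is roughly 2 Mbit

for seconds in (1, 10):
    rate = size_megabits / seconds
    print(f"{seconds:>2}s download -> {rate:.1f} Mbit/s")
# prints:
#  1s download -> 2.0 Mbit/s
# 10s download -> 0.2 Mbit/s
```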

Network latency and bandwidth are difficult to analyse and diagnose with respect to consistent page load impact, due to the way that network infrastructure is designed and built, as well as the protocols the data conforms to in order to reach its destination. For example, a user may experience 100ms latency to a particular website on one day, and then 10ms latency the next day, without anything changing on the user's premises or the website's infrastructure.

Given these limitations, I decided to test network latency and bandwidth combined, from two separate locations inside Australia, and to base our conclusions on the average of those two results.

The test was the time taken to download a non-cached image which is 118KB, and happens to be the largest single image on the page.

Location     Average download time (ms)
Location 1   72.5
Location 2   463.75

Despite these two locations having very comparable basic stats (pings to the website within 10ms of each other, and downstream bandwidth within 1Mbit/s of each other), the times to download the same file were quite different.


Conclusion 4 - Client-side asset efficiency

Having generated reports on the client-side assets, it is evident that both the high number of files and some quite large files are impeding the browser's ability to download all content in a timely manner.

There are approximately 50 images on the homepage which contribute roughly 2.5MB (69%) to the overall weight of the page.

Other static assets such as JavaScript files (jQuery et al.), while contributing to overall page weight, are not the primary concern, as the majority of the weight is in images as indicated above.

Recommendations 1, 4, 5 and 6 detail our recommendations on how to address these issues and improve load times by mitigating how much static content the browser has to download upon a cold (non-cached) page load.


Conclusion 5 - Client-side code performance

Through detailed page load reporting tools we were able to confirm that a significant portion of the page load process was being dedicated to loading and executing third-party tracking scripts. You can see in the image below that the static content loading for the website finishes at approximately the 4 second mark, after which the third-party tracking content is loaded and executed for the next ~3.5 seconds. For a ~7.5 second page load time, ~3.5 seconds is an extremely large share (46%) for something which is not critical to the users of the site.

Performance Graph 3

As a result, Recommendation 2 addresses this conclusion in a limited capacity given that the implementation/operation of these third party scripts is outside of our control.



The following is a set of recommendations we have come up with to reduce total page load time and thereby improve the user experience on the website. These recommendations are largely technical in nature and therefore require Wiliam to implement whichever solutions are agreed upon. They are listed from highest impact to lowest.


Recommendation 1

It is evident that, due to the large number of static content files that pages on the website contain (especially the homepage), browsers are stalling the download of static content such as images and scripts because they have reached their internal concurrent connection limit for the primary website domain.

Our recommendation is to utilise different hosts (domain names) to spread the asset loading out and get around the concurrent connection limits of browsers. These will need to be top-level domains, not subdomains. For example, you could initially use two domains named like:



These domains would be configured to point to the same IP as the main site (or optionally to a different server down the track, in order to scale if needed). They would be configured in IIS to serve static content only, with all other features (cookies, sessions, etc.) turned off so that IIS does not execute any ASP.NET pipeline code when serving the static content. Caching parameters can also be looked at during this step, such as setting a far-future expiry date and providing an ETag, but a caching strategy must be put in place to avoid permanently caching content that will actually change (CSS files, product images, etc.).
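As an illustrative sketch only (the exact values would be decided as part of the caching strategy), a far-future expiry for static content can be set in the static sites' web.config:

```xml
<!-- Illustrative only: serve static content with a one-year max-age. -->
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge"
                 cacheControlMaxAge="365.00:00:00" />
  </staticContent>
</system.webServer>
```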

When the pages are rendered on the server for each request, image paths will be constructed to round-robin the CDN domains, spreading requests evenly among them and maximising the number of concurrent connections the browser can use to download content as quickly as possible.
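A minimal sketch of that path construction (hostnames are placeholders). Note that this sketch uses a stable hash of the path rather than a strict per-request round-robin: the spread across hosts is still roughly even, but a given asset always maps to the same host, so the browser's cached copy stays valid across page views.

```python
import zlib

# Placeholder static-content hosts standing in for the real CDN domains.
STATIC_HOSTS = ["static1.example.com", "static2.example.com"]

def asset_url(path):
    """Map an asset path to one of the static-content hosts.

    crc32 gives a cheap, deterministic spread over the host list; using the
    path as the key keeps each asset pinned to one host for cacheability.
    """
    host = STATIC_HOSTS[zlib.crc32(path.encode()) % len(STATIC_HOSTS)]
    return f"https://{host}{path}"
```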


Recommendation 2

From an analysis of the loading patterns of the homepage, it is evident that the inclusion of many third-party JavaScript files on the page is dramatically increasing the page load time. These scripts must first be downloaded from remote servers whose performance cannot be guaranteed, and they may stall the browser from downloading static content due to the concurrent connection limit mentioned in Recommendation 1.

A load of the page with JavaScript disabled (and therefore no post-page-load JavaScript activity) saw a page load time (including assets) of 1.31 seconds. The difference is very large. Of course, the site was built using JavaScript, so 1.31s is unrealistic to achieve just by disabling the third-party scripts.

Our recommendation is to remove any extraneous client-side tracking systems and keep only the most useful one (typically Google) in order to improve the user experience.


Recommendation 3

Further analysis revealed that the execution of multiple JavaScript files (outside the tracking scripts mentioned in Recommendation 2) is in fact blocking the page load process, which is increasing page load times. By utilising async/deferred JavaScript execution, these scripts could execute while the page continues to load, which would decrease page load times. Work would need to be done to ensure that making these scripts execute in parallel (or after page load) would not impact site functionality where certain page features require scripts to have run by the time their section of the page is loaded.
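As a sketch of the distinction (the helper and the file name are illustrative, not part of the site's code):

```python
def script_tag(src, mode=None):
    """Render a script tag; mode is None, "async" or "defer".

    - async: download in parallel and execute as soon as fetched, in no
      guaranteed order - suits independent scripts such as trackers.
    - defer: download in parallel but execute in document order after the
      page has been parsed - suits scripts that depend on the DOM or on
      each other.
    """
    attr = f" {mode}" if mode in ("async", "defer") else ""
    return f'<script src="{src}"{attr}></script>'

print(script_tag("/scripts/carousel.js", "defer"))
# → <script src="/scripts/carousel.js" defer></script>
```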


Recommendation 4

Many images have been identified as having room to be optimised to reduce their file size while retaining the original image quality (the process is lossless). Our recommendation here would be to manually process the images before they are uploaded to the site via the CMS. Since the image upload function is part of the CMS platform we use (Umbraco), customising it to automatically optimise images as they are uploaded is not something we are able to do.

Optionally, there may be a way to process the images periodically once they are uploaded to the server, but this could cause caching problems in relation to Recommendation 1, where we take an aggressive caching policy for all static content.

Using fully optimised images could see a reduction of up to 59% in image file sizes, which will in turn reduce the amount of time needed to load the page.
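Combining this with the figures from Conclusion 4 (and noting that the 69% and 59% figures are approximate), the effect on total homepage weight would be roughly:

```python
# Rough combined effect of lossless image optimisation on page weight.
image_mb = 2.5             # current image weight on the homepage
page_mb = image_mb / 0.69  # images are ~69% of total page weight (~3.6MB)

saved_mb = image_mb * 0.59  # up to 59% reduction from optimisation
new_page_mb = page_mb - saved_mb

print(f"{page_mb:.1f}MB -> {new_page_mb:.1f}MB")  # prints "3.6MB -> 2.1MB"
```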


Recommendation 5

Another recommendation is to modify the behaviour of the homepage and category pages to use “lazy loading” for images, meaning that images are loaded in as the user scrolls down the page. This will not only significantly decrease the number of images the user's browser downloads on page load (thus reducing page load times), but also has the added benefit of reducing server bandwidth in the case that the user does not scroll down the page.
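A minimal sketch of the server-side markup change, assuming a small client-side script swaps data-src into src as each image approaches the viewport (the class name and placeholder path are illustrative):

```python
def lazy_img(src, alt=""):
    """Render an image tag suitable for lazy loading.

    The real URL goes in data-src while src points at a tiny placeholder,
    so the browser downloads nothing heavy until client-side code promotes
    data-src to src as the element scrolls into view.
    """
    placeholder = "/images/blank.gif"  # illustrative 1x1 placeholder
    return (f'<img src="{placeholder}" data-src="{src}" '
            f'alt="{alt}" class="lazy">')
```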

This would be a technical change that a developer would need to spend time implementing.


Recommendation 6

The final recommendation, which arguably has the same impact as Recommendation 5, is to configure the site to simply show less content on the homepage. Instead of showing 38 products (which means 38 image downloads), we could limit it to 8, for example. This operates much like Recommendation 5 in that it reduces the static content required to complete a page load, and therefore decreases page load times.