About the Author

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within the Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi.

How we improved our Web Site Performance Rank from D to A

Timed with our new product launch, we also updated our corporate website. Not only did we update the content, we also applied some of the Best Practices that Alois and I have been talking about over the past 12 months. When we introduced the new Performance Report in the dynaTrace AJAX Edition 2.0 Beta, we got feedback from several community members who tested this feature against our website. They showed us that our site didn't do all that well in the ranking: the Rank we had back then was somewhere between C and D. So the first thing I did after the new site launched was to look at the Performance Report of its main landing pages to see whether our improvements made a difference:

dynaTrace AJAX Performance Report giving us good grades on the new web site


This looks much better than what we had before. As mentioned, we improved the site by implementing some of the recommendations described in our Best Practices on Web Site Performance Optimization. In this blog I want to give you a quick overview of what we have done:

Using Sprites

The first Best Practice we follow is to merge images in order to reduce roundtrips. This technique is called CSS Sprites. We use it for certain menu elements, as you can see in the following image:

We use CSS Sprites for some of our graphic menu elements


Using sprites reduces the number of resources a user has to download from our site and therefore reduces the roundtrips from the browser to the server. Especially for users with high latency, or with browsers that only open 2 physical connections per domain, this change greatly improves page load time.
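As a minimal sketch of the technique, a sprite combines several icons into one image and uses background-position to show the right slice. The file name, class names, and coordinates below are hypothetical, not the actual markup of our site:

```css
/* Hypothetical sprite: menu-sprite.png contains three 32x32 icons stacked vertically. */
.nav-icon {
    width: 32px;
    height: 32px;
    background-image: url("/images/menu-sprite.png"); /* one download instead of three */
    background-repeat: no-repeat;
}
.nav-icon.home     { background-position: 0 0; }      /* first slice  */
.nav-icon.products { background-position: 0 -32px; }  /* second slice */
.nav-icon.support  { background-position: 0 -64px; }  /* third slice  */
```

Each element still looks like an individual icon to the user, but the browser fetches a single file.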

Using Image Domains

The next thing we did was to use Domain Sharding. Domain Sharding means that you host your resources on multiple domains instead of just your www.mycompany.com domain. Best Practices suggest using one or more unique domains for images, perhaps another domain for CSS and JavaScript files, and a separate one for dynamic calls via AJAX. How far you take this obviously depends on how many resources you really have and whether it makes sense. We decided to use a separate domain for images that are shared across all pages. The following image shows that the browser can download images from both www.dynatrace.com and s3.dynatrace.com:

A separate Image Domain allows the browser to download more resources in parallel


Depending on how many images you have on your site, this change can yield a significant performance improvement. We do not have too many images, but using a separate domain still saves us several hundred milliseconds.
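In markup, sharding simply means referencing shared resources from the second host name. The image paths below are hypothetical; the point is that older browsers, which open only 2 connections per host, can now download from both hosts in parallel:

```html
<!-- Page served from www.dynatrace.com; shared images come from the
     separate image domain s3.dynatrace.com. A second host name doubles
     the number of parallel downloads in 2-connection browsers. -->
<img src="http://www.dynatrace.com/images/page-teaser.png" alt="Teaser">
<img src="http://s3.dynatrace.com/images/shared-logo.png" alt="Logo">
<img src="http://s3.dynatrace.com/images/shared-nav-bg.png" alt="">
```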

Optimized JavaScript Menus

We used to use Superfish, a jQuery plugin, for our dynamic JavaScript menus. As described in my blog post on the performance impact of dynamic JavaScript menus, we also ran into minor performance problems with larger menu structures. It was nothing severe, but we saw users with older browsers spending up to 1s initializing the JavaScript menu. We switched to a different menu implementation and gained another several hundred milliseconds.
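A lightweight drop-down along these lines can be driven by jQuery and the hoverIntent plugin, which delays the hover handler until the mouse actually settles over an item. This is only a hedged sketch with hypothetical selectors and timings, not the code we actually shipped:

```javascript
// Hypothetical sketch of a jQuery/hoverIntent drop-down menu.
// hoverIntent(handlerIn, handlerOut) fires only when mouse movement slows,
// avoiding needless show/hide work while the user sweeps across the menu bar.
$(function () {
    $("#nav > li").hoverIntent(
        function () { $(this).children("ul").stop(true, true).fadeIn(150); },
        function () { $(this).children("ul").stop(true, true).fadeOut(150); }
    );
});
```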

Leverage Browser Caching

Content that doesn’t change frequently can be cached on the user’s desktop. HTTP even lets us specify how long these objects may be cached before the browser should re-request them. We now cache almost every resource on the page with an expiration time of 10 years. In case we really need to change a resource, such as an image or JavaScript file, we change the name of the file in order to force the browser to download it again.

The following image shows a request to one of our JavaScript files. You can see that we embed version information in the filename itself; when we release a new version, we give the file a new name, forcing the browser to download it. You can also see that through the Expires header we allow the browser to cache the file for the next 10 years:

Browser Caching speeds up Web Site Performance for re-visiting users


Leveraging the browser cache not only speeds up web site performance for revisiting users; it also eliminates unnecessary roundtrips and therefore reduces the load on the server and network.
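As a sketch, a far-future Expires header can be configured like this in Apache with mod_expires (assuming Apache is the web server; the exact types and durations are illustrative):

```apache
# Hypothetical Apache configuration using mod_expires.
# Static resources get a far-future expiration; a changed resource is
# published under a new, versioned file name (e.g. menu_v2.js) so the
# browser is forced to download it again despite the long cache lifetime.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 10 years"
    ExpiresByType text/css               "access plus 10 years"
    ExpiresByType application/javascript "access plus 10 years"
</IfModule>
```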

Content Delivery Network

For larger multimedia files, e.g., Flash content, we decided to use a content delivery network (CDN) in order to speed up the download of these larger resources for users who are not close to our web server. CDNs are great because they can be much closer to the end user than your servers and are therefore faster at delivering static content that doesn’t require interaction with your application server.

Using a Content Delivery Network for Flash Content


There are many CDN providers out there that allow you to serve your static content faster to your remote users, again speeding up web site performance.

Reduced SSL Connections

Due to a coding issue we ended up referencing certain content through SSL even though it was not necessary. In this particular case it was content embedded from a 3rd-party provider: a single JavaScript file included from salesforce.com via HTTPS. The cost of establishing an SSL connection for one single resource is rather high, and in our case it was unnecessary for users who are not logged into the site. Once logged in, we switch all traffic to SSL, but for anonymous users the SSL download is not needed.

Fixing this include so that it is downloaded via HTTP on non-secure pages saved us 500ms on average. The time was mainly saved in the Connect Time, which also includes the SSL handshake.

Overhead with SSL Handshake for a single resource was rather high


Implementing the fix with protocol-relative URLs saved almost 600ms on our public non-secure pages. It is therefore worth double-checking all embedded resources on both the secure and non-secure pages of your site.
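A protocol-relative URL lets the browser inherit the scheme of the embedding page, so anonymous HTTP pages skip the SSL handshake while logged-in HTTPS pages still load the same script securely. The script path below is hypothetical, shown only to illustrate the pattern:

```html
<!-- Before (hypothetical path): forces an SSL handshake even on plain-HTTP pages -->
<script src="https://www.salesforce.com/example/widget.js"></script>

<!-- After: the protocol-relative URL uses http: or https: to match the page itself -->
<script src="//www.salesforce.com/example/widget.js"></script>
```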

Conclusion

All the measures we have taken were rather simple but had a great positive impact on page load time. For further reading I recommend the blog on Top 10 Client-Side Performance Problems. Once you have worked through your site and fixed the main problems, it is time to continuously analyze your pages to make sure that changes don’t negatively impact the end user. For that, you can find more on Automate Performance Analysis in Test and CI as well as End-User Experience Management.

Comments

  1. Thanks for the tips. Going to explore these for my website too.

  2. Great tips..

  3. great info will help me alot

  4. Excellent tips

  5. What menu implementation did you switch to?

  6. @Lisa: it’s a homegrown jquery (and hoverIntent) driven menu.
    Inspired from this one: http://www.sohtanaka.com/web-design/examples/mega-dropdowns/
