About the Author

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within the Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi.

Top 8 Performance Problems on Top 50 Retail Sites before Black Friday

The busiest online shopping season is about to start, and it’s time for a quick check on whether the top shopping sites are prepared for the big consumer rush or whether they are likely to fail because their pages do not adhere to Web performance best practices.

The holiday rush seems to start ramping up earlier each year: the Gomez Website Performance Pulse showed an average degradation in performance satisfaction of 13% among the top 50 retail sites in the past 24 hours when compared to a typical non-peak period. For some, like the website below, the impact began a week ago. It’s important to understand not only whether there is a slowdown, but also why – is it the Internet or my app?

Nobody wants their site to respond with a page like that as users will most likely be frustrated and run off to your competition

Before we provide a deep-dive analysis on Cyber Monday of the pages that ran into problems, we want to look at the Top 50 Retail Pages and see whether they are prepared for the upcoming weekend. We will be using the free SpeedOfTheWeb Service as well as deep-dive dynaTrace Browser Diagnostics. We compiled the Top Problems we saw along the 5 Parts of the Web Delivery Chain on the analyzed pages today, 3 days before Black Friday 2011. These top problems have the potential to turn into actual outages once these pages are pounded by millions of Christmas shoppers.

High Level Results of one of the Top Retail Sites with Red and Yellow Indicators on all Parts of the Web Delivery Chain


User Experience: Optimizing Initial Document Delivery and Page Time

The actual User Experience can be evaluated by 3 key metrics: First Impression, OnLoad and Fully Loaded. Looking at one of the Top Retail sites we see values that are far above the average for the Shopping Industry (calculated on a day-to-day basis based on URLs of the Alexa Shopping Index):

4.1 seconds until the user gets the first visual indication of the page and a total of 19.2 seconds to fully load the page


Problem #1: Too many Redirects result in delayed First Impression

The browser can’t render any content until there is content to render. From entering the initial URL until the browser can render content, several things happen: resolving DNS, establishing connections, following every HTTP redirect, and downloading the HTML content. One thing that can be seen on some of the retail sites is the excessive use of HTTP Redirects. Here is an example: from entering the initial URL until the initial HTML document could be retrieved, the browser had to follow 4 redirects, taking 1.3 seconds:

4 HTTP Redirects from entering the initial URL until the browser can download the initial HTML Document

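To make that cost concrete, here is a minimal JavaScript sketch that sums the time a browser loses in a redirect chain before the first byte of real HTML arrives. The hop data is made up purely to mirror the 4-redirect, 1.3-second example above:

```javascript
// Illustrative only: sum the latency of every 3xx hop that precedes
// the final 200 response carrying the actual HTML document.
function redirectOverheadMs(hops) {
  return hops
    .filter(function (hop) { return hop.status >= 300 && hop.status < 400; })
    .reduce(function (total, hop) { return total + hop.durationMs; }, 0);
}

// Hypothetical redirect chain (URLs and timings are invented):
var hops = [
  { url: "http://shop.example.com/",         status: 301, durationMs: 320 },
  { url: "http://www.shop.example.com/",     status: 302, durationMs: 410 },
  { url: "https://www.shop.example.com/",    status: 301, durationMs: 280 },
  { url: "https://www.shop.example.com/us",  status: 302, durationMs: 290 },
  { url: "https://www.shop.example.com/us/", status: 200, durationMs: 650 }
];

console.log(redirectOverheadMs(hops)); // 1300 - ms lost before any HTML downloads
```

Every hop in the chain costs a full network round trip (plus, potentially, an extra DNS lookup and connection setup when the host changes), which is why collapsing redirects is usually one of the cheapest First Impression wins.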
Problem #2: Web 2.0 / JavaScript impacting onLoad and blocking the browser

The Timeline view of one of the top retail sites makes it clear that JavaScript – the enabler of dynamic and interactive Web 2.0 applications – does not necessarily improve User Experience. It can actually hurt it by blocking the browser for several seconds before the user can interact with the site:

Many network resources from a long list of source domains as well as problematic JavaScript code impacts User Experience

The following problems can be seen on the page shown in the timeline above:

  • All JavaScript Files and certain CSS Files are loaded before any images. This delays First Impression Time as the browser has to parse and execute JavaScript files before downloading and painting images
  • One particular JavaScript block takes up to 15 seconds to apply dynamic styles to specific DOM Elements
  • Most of the 3rd Party content is loaded late – which is actually a good practice
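A common remedy for the first issue is to keep critical CSS in the head but load scripts last (or mark them `defer`), so the browser can download and paint images before parsing JavaScript. A minimal sketch of what that markup can look like (file names are illustrative, not from the analyzed sites):

```html
<html>
  <head>
    <!-- critical CSS stays in the head so the page styles correctly -->
    <link rel="stylesheet" href="styles.css">
  </head>
  <body>
    <!-- content and images can download and paint first -->
    <img src="hero.jpg" alt="Holiday sale banner">

    <!-- scripts load last so they do not block First Impression Time -->
    <script src="jquery.js" defer></script>
    <script src="site.js" defer></script>
  </body>
</html>
```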

Browser: JavaScript

As already seen in the previous Timeline View, JavaScript can have a huge impact on User Experience when it performs badly. The problem is that most JavaScript actually performs well on the desktops of Web developers. Developers tend to have the latest browser version, have blocked certain JavaScript sources (Google Ads, any type of Analytics, etc.) and may not test against the full-blown web site. The analysis in this blog was done on a laptop running Internet Explorer 8 on Windows 7, which can be considered an average consumer machine.

Here are two common JavaScript problem patterns we have seen on the analyzed pages:

Problem #3: Complex CSS Selectors failing on IE8

On multiple pages we can see complex CSS Selectors such as the following that take a long time to execute:

Complex jQuery CSS Lookups taking a very long time to execute on certain browsers

Why is this so slow? It is because of a problem in IE8's querySelectorAll method. The latest versions of JavaScript helper libraries (such as jQuery, Prototype, YUI, etc.) take advantage of querySelectorAll and simply forward the CSS Selector to this method. Some of the more complex CSS Lookups, however, cause querySelectorAll to fail. The fallback mechanism of these helper libraries is to iterate through the whole DOM in case querySelectorAll throws an exception. The following screenshot shows exactly what happens in the above example:

Problem in IEs querySelectorAll stays unnoticed due to empty catch block. Fallback implementation iterates through whole DOM

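The pattern can be sketched as follows. This is an illustrative stand-in, not the actual jQuery/Sizzle source, and the fake element at the bottom simulates IE8's failing native lookup so the code runs outside a browser:

```javascript
// Sketch of the fallback pattern used by selector engines such as Sizzle.
function select(root, selector) {
  if (root.querySelectorAll) {
    try {
      return root.querySelectorAll(selector); // fast native path
    } catch (e) {
      // IE8 throws on selectors it does not support. The empty catch
      // swallows the error, so the slow path below runs without any
      // visible warning - exactly what the screenshot above shows.
    }
  }
  // Fallback: walk every DOM node in JavaScript. On a large page this
  // is what turns a single lookup into a multi-second operation.
  return root.walkAllNodes(selector);
}

// Simulated IE8-style element whose native lookup always fails:
var fakeRoot = {
  querySelectorAll: function () { throw new Error("unsupported selector"); },
  walkAllNodes: function (selector) { return ["div#menu"]; } // stand-in result
};

console.log(select(fakeRoot, "ul.menu > li:not(.open) a")); // falls back to the slow walk
```

The practical takeaway matches the screenshot: the slow path is silent, so only profiling (or testing on the browsers your customers actually use) reveals it.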
Problem #4: 3rd Party Plugins such as Superfish

One plugin that we found several times was Superfish. We actually blogged two years ago about how this can lead to severe client-side performance problems – check out the blog from back then: Performance Analysis of dynamic JavaScript menus. One example this year is the following Superfish call that takes 400ms to build the dynamic menu:

jQuery Plugins such as Superfish can take several hundred milliseconds and block the browser while doing the work

Content: Size and Caching

In order to display a page the browser has to load its content: the initial HTML Document, images, JavaScript and CSS files. Users that come back to the same site later in the holiday season do not necessarily need to download the full content again but can access already cached static content from the local machine. Two problems related to loading content are the actual size as well as the utilization of caching:

Problem #5: Large Content leads to long load times

Too much and too-large content is a problem across most of the sites analyzed. The following is an example of a site with 2MB of total page size, where 1.5MB was JavaScript alone:

This site has 2MB in size which is far above the Industry average of 568kb

Even on high-speed Internet connections a size that is 4 times the industry average is not optimal. When accessing pages of that size from a slow connection – maybe from a mobile device – loading all content to display the site can take a very long time leading to frustration of the end user.
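A rough back-of-the-envelope calculation (assuming bandwidth is the only limiting factor, ignoring latency and parallel downloads) shows why 2MB hurts so much more on a slow link:

```javascript
// Naive transfer-time estimate: bytes over a link of a given speed.
// This ignores round trips, connection setup and parallelism, so it is
// a lower bound on the real load time.
function downloadSeconds(bytes, kilobitsPerSecond) {
  return (bytes * 8) / (kilobitsPerSecond * 1000);
}

var pageBytes = 2 * 1024 * 1024; // the 2MB page from the report above

console.log(downloadSeconds(pageBytes, 10000).toFixed(1)); // ~1.7s on a 10 Mbit/s broadband line
console.log(downloadSeconds(pageBytes, 1000).toFixed(1));  // ~16.8s on a 1 Mbit/s mobile link
```

The same arithmetic applied to the 568kb industry average gives well under a second on broadband, which is why trimming page weight pays off disproportionately for mobile shoppers.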

Problem #6: Browser Caching not leveraged

Browser-Side Caching is an option web sites have to cache mostly static content in the browser to improve page load time for revisiting users. Many of the tested retail sites show hardly any cached objects. The following report shows one example where caching basically wasn’t used at all:

Client Side Caching is basically not used at all on this page. Caching would improve page load time for revisiting users

For more information check out our Best Practices on Browser Caching.
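As one possible starting point, enabling far-future expiration for static content can be as simple as the following Apache mod_expires fragment. Lifetimes and MIME types here are illustrative, and equivalent Cache-Control headers can be set on any web server:

```apache
# Illustrative .htaccess fragment - requires mod_expires to be enabled
ExpiresActive On

# Static content rarely changes during the season: let browsers cache it
ExpiresByType image/png                "access plus 1 month"
ExpiresByType image/jpeg               "access plus 1 month"
ExpiresByType text/css                 "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"

# The HTML document itself stays fresh so new content shows up immediately
ExpiresByType text/html                "access plus 0 seconds"
```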

Network: Too many resources and slow DNS Lookups

Analyzing the network characteristics of the resources that get downloaded from a page can give a good indication whether resources are distributed optimally across the domains they get downloaded from.

Problem #7: Wait and DNS Time

The following is a table showing resources downloaded per domain. The interesting numbers are highlighted. There seems to be a clear problem with a DNS Lookup to one of the 3rd party content domains. It is also clear that most of the content comes from a single domain which leads to long wait times as the browser can’t download all of them in parallel:

Too many resources lead to wait times. Too many external domains add up on DNS and Connect Time. It’s important to identify the bottlenecks and find the optimal distribution

Reducing resources is a general best practice which will lower the number of roundtrips and also wait time per domain. Checking on DNS and Connection times – especially with 3rd party domains – allows you to speed up these network related timings.
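A back-of-the-envelope sketch of the trade-off (assumptions: IE8 opens at most 6 parallel connections per host, and all resources take roughly the same time to download):

```javascript
// Estimate how many sequential "rounds" of downloads a domain needs when
// the browser can only fetch a limited number of resources in parallel.
function downloadRounds(resourceCount, parallelConnections) {
  return Math.ceil(resourceCount / parallelConnections);
}

// Hypothetical numbers: 120 resources on one domain vs. split over 3 domains.
console.log(downloadRounds(120, 6));     // 20 rounds - long wait times on a single host
console.log(downloadRounds(120 / 3, 6)); // 7 rounds per domain, fetched in parallel
```

The catch, visible in the table above, is that each extra domain adds its own DNS lookup and connection setup, so spreading content over more hosts only helps up to a point, and a slow third-party DNS server can wipe out the gain entirely.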

Server: Too many Server-Requests and Long Server Processing Time

Besides serving static JavaScript, CSS and image files, eCommerce sites have dynamic content that gets delivered by application servers.

Problem #8: Too much Server Side Processing Time

Looking at the Server Request Report gives us an indication of how much time is spent on the Web servers as well as the application servers to deliver the dynamic content.

Server Processing Time and the number of dynamic requests impact highly dynamic pages as we can find them on eCommerce sites

Long server-side processing time can have multiple reasons. Check out the latest blog on the Impact of 3rd Party Service Calls on your eBusiness as well as our other server-side related articles on the dynaTrace Blog for more information on common application performance problems.

Waiting for Black Friday and Cyber Monday

This analysis shows that, with only several days until the busiest online shopping season of the year starts, most of the top eCommerce sites out there still have the potential to improve their websites so that they do not deliver a frustrating User Experience once shoppers go online and actually want to spend money. We will do another deep-dive blog next week and analyze some of the top retail pages in more detail, providing technical as well as business impact analysis.

Comments

  1. Host your static files on CDN networks, they will be faster than most hosting providers.

  2. Hey Andi – great rundown… here’s #9 for you: you are conducting load testing outside the firewall (which is a good thing) in preparation for the big launch, but the testers didn’t randomize their test data and reused the same login and product for massive load tests, which results in a persistent cart object with 100,000+ items (which is a bad thing).

    Take a guess at what happens in the appserver performance when you have a single cart with that many items.

    Bad testing practice is also a problem around the holidays.

    -mt

  3. Good analysis.
    But your own page (just checked in my Chrome) loads >1.5Mb and the browser had to do 93 requests. =)

    • That’s correct – we could definitely do a much better job of applying the Best Practices on our web site :-)
      Fortunately we are not an eCommerce site like the pages that I’ve analyzed :-) I think what’s really interesting is to compare these sites against each other and against the industry average. SpeedOfTheWeb.Org calculates the industry metrics by analyzing sites from Alexa.
