About the Author

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within the Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi.

Get involved in the dynaTrace AJAX Edition 2.0 Beta Program

Velocity was probably the best stage for this announcement, and it was great that we got a slot in the Lightning Demos to show what's new from dynaTrace.

Velocity Video - Presentation of dynaTrace AJAX Edition 2.0

The dynaTrace Labs team has worked hard on the next version of the dynaTrace AJAX Edition. Since its inception, the dynaTrace AJAX Edition has received a lot of great and positive feedback, especially for the depth of data it collects: all network traffic, full JavaScript traces, full DOM traces, XmlHttpRequest (XHR) traces, and deep insight into the browser's rendering engine. And all of that for Internet Explorer 6, 7 and 8, which still happens to be the number 1 browser in usage, but not the number 1 browser when it comes to good tooling support for developers and especially performance engineers.

What is new in dynaTrace AJAX Edition 2.0 Beta 1?

Even though we got a lot of requests to support Firefox, we saw a bigger need to make the data more easily accessible, provide more automatic data analysis, and extend the performance metrics we already provide. This does not mean we are not working on Firefox support (we in fact are), but the real pain of our growing user base that we want to solve first is making it easier to make sense of the collected data.

Automatic Performance Analysis based on dynaTrace, Google and Yahoo Best Practices

You may have seen my recent blog posts where I analyzed sites such as FIFA World Cup 2010, the Winter Olympics in Vancouver 2010 and the Masters golf tournament. I did these analyses based on Best Practices from Yahoo, Google, Steve Souders and our own dynaTrace Best Practices that we have collected over time. We are in the fortunate situation of interacting with many big web shops around the world, helping them solve their performance problems using both the dynaTrace AJAX Edition for their client-side problems and dynaTrace continuous APM for their server-side problems. The feedback we collect in these interactions has moved into our Best Practices and heavily influenced this next version of the AJAX Edition.

I’ve recently been asked how long it took me to write the FIFA blog and whether I could do the same type of analysis for other web sites too. It took me a good day to analyze the data and then write the blog. The goal of dynaTrace AJAX Edition 2.0 is to eliminate this time and produce the same results on Key Performance Indicators, Usage of Browser Caching, Reducing and Optimizing Network Roundtrips, Optimizing Application-Server-Side Transactions and JavaScript/AJAX Performance automatically. This allows everyone out there to analyze their web sites in a fraction of the time it took me: that’s a great performance improvement :-)

The following illustration shows the new Performance Report, the successor of our previous Summary View. It automatically opens when you double-click a stored or live session in the dynaTrace AJAX Edition Cockpit:

Performance Report showing Key Performance Indicators, Ranks and Remarks for every page

The Performance Report analyzes every single URL in the dynaTrace AJAX Session based on Key Performance Indicators and grades the page in 5 major performance areas: Web Site Performance, Browser Caching, Network Roundtrips, Application-Server-Side Transactions and JavaScript/AJAX Performance.

The links on these 5 performance areas lead you to our dynaTrace Best Practice documents on Web Site Performance Optimization. The documents explain our criteria for fast web sites and give a detailed explanation and an example calculation of the underlying page-ranking system. The ranking itself is aligned with tools such as YSlow and PageSpeed: a rank from 100 (best) to 0 (worst) corresponding to a Grade from A (best) to F (worst).

The difference from YSlow and PageSpeed, however, is that the overall page rank is mainly impacted by the actual page-load times (Time to First Impression, Time to onLoad and Time to Fully Loaded). We know that these values heavily depend on your local connectivity to the web site you are testing, but in the end it is the speed of the page as experienced by the end user that determines whether users stay on the page. We also look beyond the activity up to the point where the page is fully loaded: we look at all activity on the page, including any user interaction that triggered JavaScript/XHR, downloaded additional content or manipulated the DOM. The grading is aligned with YSlow and PageSpeed. For that reason your grades are likely to get worse on single-URL applications with heavy user interaction, as we, for example, negatively grade the number of network requests if it exceeds a certain threshold. Therefore, as also stated in the Best Practice documents, you have to handle the grades with care when dealing with highly interactive pages.
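To make the ranking model more concrete, here is a minimal TypeScript sketch of how load times and a request-count threshold could feed a 100-to-0 rank that maps onto A-to-F grades. The bands, weights and thresholds below are illustrative assumptions on my part; the actual formulas are documented in the Best Practice documents.

```typescript
// Illustrative sketch only: the real formulas live in the dynaTrace
// Best Practice documents. All thresholds and weights are assumptions.

type Grade = "A" | "B" | "C" | "D" | "E" | "F";

interface PageMetrics {
  timeToOnLoadMs: number;   // Time to onLoad
  networkRequests: number;  // total resource requests on the page
}

// Start at a perfect rank of 100 and deduct points for slow load
// times and for request counts above an assumed threshold.
function computeRank(m: PageMetrics): number {
  let rank = 100;

  // Assumed load-time bands: under 2s is "good", under 5s "acceptable".
  if (m.timeToOnLoadMs > 5000) rank -= 40;
  else if (m.timeToOnLoadMs > 2000) rank -= 20;

  // Assumed penalty: 1 point per request beyond 40, capped at 30.
  const threshold = 40;
  if (m.networkRequests > threshold) {
    rank -= Math.min(30, m.networkRequests - threshold);
  }

  return Math.max(0, rank);
}

// YSlow-style banding: 100..0 maps onto grades A (best) to F (worst).
function toGrade(rank: number): Grade {
  if (rank >= 90) return "A";
  if (rank >= 80) return "B";
  if (rank >= 70) return "C";
  if (rank >= 60) return "D";
  if (rank >= 50) return "E";
  return "F";
}

console.log(toGrade(computeRank({ timeToOnLoadMs: 3400, networkRequests: 55 })));
// -> "D" (100 - 20 - 15 = 65)
```

This also illustrates the caveat above: a highly interactive single-URL application accumulates network requests over its lifetime, so its request-count penalty grows even if each individual interaction is fast.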

Extended Key Performance Indicators

Besides Key Performance Indicators (KPIs) such as Page Load Time and Total Page Size, we feel there is much more to consider when looking at the speed of a web site. The time it takes to give the user the first visual impression and the time it takes to fully load the page define whether users actually stay on your page and whether you generate revenue. dynaTrace’s unique capability in Internet Explorer gives us insight into rendering activities as well as JavaScript/AJAX activities, which allows us to provide all these metrics for you.

The KPI tab on the Performance Report shows detailed metrics centered around the end-user performance experience, such as Time to First Impression, Time to onLoad, Time to Fully Loaded, Server vs. Client vs. Interactive Time, and the network breakdown into DNS, Connect, Transfer and Wait. The following screenshot shows the KPI tab with all the new metrics to analyze:

KPI Tab shows detailed performance metrics for every page

A description of how we calculate most of these KPIs can be found in our Best Practice on Web Site Performance Optimization. Certain KPIs are color-coded based on whether they violate our thresholds for being good, acceptable or not acceptable. These thresholds are also explained in the Best Practices paper. There are some interesting KPIs I want to mention here, as they are probably new to most of you (a sketch of how a few of them could be calculated follows the list):

  • First Impression: this is the time from entering the URL until the user actually gets a visual indication of the page. We take advantage of capturing rendering activity: this is the time of the first Drawing event.
  • Fully Loaded: a page is not always fully loaded when the browser triggers the onLoad or DocumentComplete event. onLoad event handlers are often used to delay-load additional components or to modify the DOM. This time includes all these onLoad activities, such as executing JavaScript handlers and network downloads.
  • On Server and On Client: these two measures indicate how much of the time until the page is fully loaded is spent on server activity (network downloads) vs. client activity (JavaScript processing). The ultimate goal is to bring down the Fully Loaded time. Knowing the split between client and server time until that point helps you focus your performance efforts on the right side of the application stack.
  • Avg. Interactive: it is good when a web site loads fast, but we also need to ensure that users get a fast interactive response when exploring dynamic menus or navigating other features on the same page. JavaScript made truly interactive web sites possible through mouse and keyboard event handlers. This measure tells you the average execution time of all mouse and keyboard event handlers after the page has fully loaded, i.e., how fast or slow it is for your users to interact with the page.
  • Avg. Wait: browsers use a limited number of physical connections per domain to download resources such as images, CSS and JavaScript files. The more resources that have to be downloaded from the same domain, the longer a single resource has to wait for a physical connection to become available. This value tells you how long a resource waits on average before being downloaded. Reducing the number of network resources or spreading them across multiple domains helps bring this number down.
  • Single Resource Domains: domains that serve only a single resource (image, JS, CSS) pay a high penalty in DNS and Connect time. dynaTrace AJAX Edition lists how many resources are delivered per domain and what the individual network times (DNS, Connect, Wait, Server, Transfer) are.
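As promised above, here is a minimal TypeScript sketch of how a few of these KPIs could be derived from a captured activity timeline. The event model (type names and fields) is hypothetical and exists only to illustrate the definitions; the real calculations are described in the Best Practice papers.

```typescript
// Hypothetical event model for a captured browser session.
interface BrowserEvent {
  type: "draw" | "network" | "javascript" | "mouse" | "keyboard";
  startMs: number;     // relative to entering the URL
  durationMs: number;
  waitMs?: number;     // network only: time waiting for a free connection
}

// First Impression: timestamp of the first drawing event.
function firstImpression(events: BrowserEvent[]): number {
  const draws = events.filter(e => e.type === "draw");
  return draws.length ? Math.min(...draws.map(e => e.startMs)) : NaN;
}

// On Server vs. On Client: split of time until Fully Loaded between
// network downloads and JavaScript execution.
function serverClientSplit(events: BrowserEvent[], fullyLoadedMs: number) {
  const upTo = events.filter(e => e.startMs < fullyLoadedMs);
  const sum = (type: BrowserEvent["type"]) =>
    upTo.filter(e => e.type === type).reduce((t, e) => t + e.durationMs, 0);
  return { onServerMs: sum("network"), onClientMs: sum("javascript") };
}

// Avg. Interactive: average duration of mouse/keyboard handlers that
// ran after the page was fully loaded.
function avgInteractive(events: BrowserEvent[], fullyLoadedMs: number): number {
  const handlers = events.filter(
    e => (e.type === "mouse" || e.type === "keyboard") && e.startMs >= fullyLoadedMs
  );
  if (handlers.length === 0) return 0;
  return handlers.reduce((t, e) => t + e.durationMs, 0) / handlers.length;
}

// Avg. Wait: average time resources waited for a physical connection.
function avgWait(events: BrowserEvent[]): number {
  const net = events.filter(e => e.type === "network");
  if (net.length === 0) return 0;
  return net.reduce((t, e) => t + (e.waitMs ?? 0), 0) / net.length;
}
```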

We believe these metrics will help you get a better understanding of the performance characteristics of your individual web pages.

Integration with ShowSlow.com

If you don’t know ShowSlow.com yet, you should check it out. Sergey Chernyshev built this performance repository solution and hosts a public service instance at http://www.showslow.com. Tools like YSlow and PageSpeed already integrate with ShowSlow: their performance analysis results can be uploaded to a ShowSlow instance, allowing you to compare your web site’s performance ranks over time. ShowSlow also lists the Alexa Top 100 sites, which makes it easy to compare the biggest sites to one another as well as to compare your own sites against them.

dynaTrace AJAX Edition 2.0 Beta 1 now provides the same integration by allowing you to upload your performance ranks to the public ShowSlow.com instance. Beta 1 only supports the public instance, but this is likely to become configurable for the final release so that you can upload your results to an internal ShowSlow instance as well.

There are two ways of uploading the data:

a) Click on the “Upload your results to showslow.com” link on the Summary Tab

b) Use the context menu on the list of URLs at the top of the Performance Report

To avoid accidental uploads to the public ShowSlow.com instance, you are prompted to confirm the upload when clicking the link or using the context menu.

Once the data is uploaded you will see a new column called dynaTrace Rank next to the YSlow and PageSpeed columns. When clicking through to the details you will see that we upload the following metrics: Overall Page Rank, Rank for Browser Caching, Rank for Network Resources, Rank for Server-Side Activity, Rank for JavaScript/AJAX and Total Page Size.
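For illustration only, you can picture the upload as a simple beacon carrying exactly those metrics. Everything below (the endpoint path, the field names and the use of JSON) is an assumption to show the shape of the data, not the actual dynaTrace/ShowSlow protocol.

```typescript
// Hypothetical beacon shape mirroring the uploaded metrics listed above.
interface DynaTraceBeacon {
  url: string;            // the analyzed page
  rank: number;           // Overall Page Rank (0-100)
  cacheRank: number;      // Rank for Browser Caching
  networkRank: number;    // Rank for Network Resources
  serverRank: number;     // Rank for Server-Side Activity
  javascriptRank: number; // Rank for JavaScript/AJAX
  pageSizeBytes: number;  // Total Page Size
}

// POST the beacon to a ShowSlow instance (endpoint path is assumed).
async function uploadBeacon(instance: string, beacon: DynaTraceBeacon): Promise<void> {
  const response = await fetch(`${instance}/beacon/dynatrace/`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(beacon),
  });
  if (!response.ok) {
    throw new Error(`Upload failed: ${response.status}`);
  }
}
```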

Join the Community

This is dynaTrace AJAX Edition 2.0 Beta 1 – a big milestone for us and hopefully also for you :-) – so get your hands on it and download it here.

The development efforts have largely been driven by feedback from our dynaTrace AJAX Edition Community. In order to move forward we need more feedback on our Best Practices, especially on the calculation of our KPIs and the ranking. Please read through all 5 Best Practice documents: Best Practice on Web Site Performance Optimization, Best Practice on Browser Caching, Best Practice on Network Roundtrips, Best Practices on Application-Server-Side Transactions and Best Practices on JavaScript/AJAX Performance. All these documents contain a detailed description of how we calculate the KPIs and how we grade the pages. Please post your comments and feedback on the following Community Discussion Thread: Feedback on dynaTrace AJAX Edition 2.0 Beta 1.

Complete Walk-through Webinar

In order to give you detailed insight into the new features of dynaTrace AJAX Edition 2.0 Beta 1, we are hosting a webinar with a detailed walk-through of the new KPIs, the Best Practices and the Performance Report. Sign up for this free webinar, which will take place on Wednesday, June 30th at 12 PM EST (9 AM PST, 6 PM CET). The webinar will also be recorded and made available shortly after.

ENJOY IT!!!


Comments

  1. Looking forward to the recorded webinar.
