Why you can’t compare cross-browser execution times of Selenium Tests
I am currently working on a blog post where I am going to explain how to do cross-browser testing and performance analysis using dynaTrace. Before I blog that How-To, I want to share one thing I noticed when executing my tests in Internet Explorer 8 and Firefox 3.6: test execution times are very different, but not because one browser is slower than the other. It is because Selenium uses different synchronization mechanisms in IE and FF, e.g., to wait for a page to be loaded.
I ran a simple test against the latest 2.0 Beta build of Selenium. The test executes the following steps:
- Opens the homepage of my test application
- Enters username and password and clicks the Submit button
- Clicks through three different pages via standard links
Execution Times over Time
I executed this test multiple times. In every test iteration I first run the test on Internet Explorer 8 and then on Firefox 3.6, using the dynaTrace AJAX Premium Extensions to track performance of all these test executions. The Test Automation View allows me to compare different metrics, over time and also across the two browsers. The following screenshot shows the execution time of the test scenario for the last 6 test iterations in both IE and FF. The green line represents IE, with an execution time that is about 1.5s higher than Firefox’s.
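A minimal sketch of how such per-iteration timings could be collected and compared, assuming a `run_test` callable per browser (the stand-in functions below simulate the scenario; a real harness would call Selenium against IE and FF):

```python
import time
from statistics import mean

def time_iterations(run_test, iterations=6):
    """Run the scenario repeatedly and record wall-clock time per iteration."""
    times = []
    for _ in range(iterations):
        start = time.perf_counter()
        run_test()
        times.append(time.perf_counter() - start)
    return times

# Stand-ins for the real browser runs; the sleep durations are arbitrary
# and only exist to make this sketch self-contained.
def run_in_ie():
    time.sleep(0.003)

def run_in_ff():
    time.sleep(0.001)

ie_times = time_iterations(run_in_ie)
ff_times = time_iterations(run_in_ff)
print("IE mean: %.3fs, FF mean: %.3fs, delta: %.3fs"
      % (mean(ie_times), mean(ff_times), mean(ie_times) - mean(ff_times)))
```

A chart like the one in the screenshot is essentially these per-iteration numbers plotted over time, one line per browser.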
So, looking at the execution time alone would lead me to assume that IE is simply much slower than Firefox. But that is actually not the case.
Detailed Timeline Analysis
What this really tells us is that comparing raw test execution times across browsers is not a good idea when we want to compare performance. Selenium has different ways to wait for certain events to happen, and depending on the browser this synchronization overhead can have a significant impact on execution time.
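A toy illustration of the effect (not Selenium’s actual implementation): suppose both browsers finish loading the page after the same 0.12s, but the driver’s synchronization polls for completion at different granularities. The measured time then reflects the polling interval, not the page:

```python
import time

def wait_until(condition, poll_interval):
    """Simulated Selenium-style synchronization: poll until the condition holds,
    returning the wall-clock time the wait appeared to take."""
    start = time.perf_counter()
    while not condition():
        time.sleep(poll_interval)
    return time.perf_counter() - start

def make_page_load(load_time):
    """Return a condition that becomes true after load_time seconds."""
    deadline = time.perf_counter() + load_time
    return lambda: time.perf_counter() >= deadline

# Identical 0.12s "page load", measured through two different wait mechanisms:
fine_poll = wait_until(make_page_load(0.12), poll_interval=0.01)
coarse_poll = wait_until(make_page_load(0.12), poll_interval=0.5)

print("fine-grained wait measured %.2fs, coarse-grained wait measured %.2fs"
      % (fine_poll, coarse_poll))
```

The coarse-grained mechanism reports roughly 0.5s for the very same 0.12s page load, so a browser driven through it looks slower even though the browser itself did nothing slower.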
Does this mean we can’t do performance tests with Selenium?