From time to time I run my benchmark tests, to see whether technical improvements have materialized into tangible performance benefits. It has been a while since I last ran these, but with the general availability of IP15.1/14.7 I reckoned it would be a great moment to take a step back and compare apples again. The primary browsers are Chrome (R43) and IE11 (11.0.9600.17832, Update 11.0.20, which includes the June 2015 cumulative update).
At this stage I compared three apples:
- Patchset 14.4 (March 2015)
- Patchset 15.0 (May 2015)
- Patchset 15.1/14.7 (June 2015)
IP15.1 and 14.7 are technically built on the same code base. That means that all delivered code in these patch sets is 100% identical. But still, these are two different applications, right? Right. Because of the different repositories, different seed data and items like additional themes, the functionality differs. Performance-wise, though, it would make little sense to test them both, so I don't. To keep the comparison fair, I use the same Aurora theme throughout (since the Synergy theme does not exist in Innovation Pack 2014).
As a testing strategy I use my set of four views, ranging from moderately complex to very complex. Even the simplest view is a parent-child view: the parent is a form applet with 99 (!) controls, and the child is a list applet with about 80 (!) list columns. The most complex view has four form applets with 99 controls each, and four list applets with 80 list columns each. I would not recommend building anything like this for production use, but as a basis for comparison it works just fine.
To measure the time it takes to build a view, I use a measurement framework which basically times the lapse between the "preload" and "postload" events. The time between these two events covers the majority of the time spent in the Open UI framework, although it does exclude CSS processing. I always take the measurements on an unloaded system, and I keep an eye on the task manager to ensure the CPU shows low levels of activity. To get accurate figures, I take a number of samples; looking at the standard deviation tells me whether the measurements contain potential outliers.
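The approach above can be sketched as follows. This is a minimal illustration, not the actual framework: `registerHook` is a hypothetical stand-in for however your framework exposes the "preload" and "postload" events, and the statistics helpers are generic.

```javascript
// Collect one sample per view load: the elapsed time between the
// "preload" and "postload" events. registerHook is a hypothetical
// stand-in for the framework's event registration mechanism.
function makeTimer(registerHook, samples) {
  let start = 0;
  registerHook("preload", () => { start = performance.now(); });
  registerHook("postload", () => { samples.push(performance.now() - start); });
}

// Mean over the collected samples.
function mean(xs) {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

// Sample standard deviation; a large value relative to the mean hints
// at outliers (e.g. background CPU activity during a measurement).
function stddev(xs) {
  const m = mean(xs);
  return Math.sqrt(xs.reduce((a, x) => a + (x - m) * (x - m), 0) / (xs.length - 1));
}
```

With a handful of samples per view, comparing the means across patch sets while checking the standard deviation is what keeps one noisy run from skewing the conclusions.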
So what are the results? First of all, the difference between IE11 and Chrome remains roughly a factor of two, and this factor has been quite stable over the past year. All the effort MS has put into IE11 has not moved it much closer to its rival.
That is what the graph tries to tell. For example: IE11 consumes 209% of the time Chrome does for the most complex view on IP15.0.
That said, setting all measurements side by side gives a grasp of the evolution. There is a clear downward trend from 14.4 through 15.0 to 15.1. Interestingly, there is a big improvement for IE11 on 15.1/14.7. I took these measurements a couple of times to be sure, but indeed, there is quite a significant improvement for IE. Hooray!
This all demonstrates the effort Oracle development is putting into materializing performance improvements. This really is not a simple task, since the Open UI framework relies heavily on jQuery, and there are many jQuery patterns (e.g. repeated jQuery selectors) which can be extremely costly. But smaller things matter too, like "for" loops, which tend to be some 20% slower than "do-while" loops. Imagine loops over large objects or arrays; that can add up (and it does).
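To make those two cost patterns concrete, here is an illustrative sketch. The selector part is shown as comments only (it assumes jQuery and a made-up CSS class, so it is not runnable here); the loop part shows that a "for" loop and a "do-while" loop do identical work, so swapping one for the other is purely a performance consideration.

```javascript
// Costly pattern: re-running a jQuery selector on every iteration.
// Caching the result once is the usual fix. (Illustrative only --
// "$" would be jQuery in the browser, ".siebui-ctrl" is a made-up class.)
//
//   for (let i = 0; i < n; i++) { $(".siebui-ctrl").css(...); }  // slow
//
//   const $ctrls = $(".siebui-ctrl");                            // cached once
//   for (let i = 0; i < n; i++) { $ctrls.css(...); }             // faster

// The same summation written in both loop styles; results are identical.
function sumFor(arr) {
  let total = 0;
  for (let i = 0; i < arr.length; i++) total += arr[i];
  return total;
}

function sumDoWhile(arr) {
  let total = 0;
  let i = 0;
  if (arr.length > 0) {
    do { total += arr[i]; i++; } while (i < arr.length);
  }
  return total;
}
```

Whether the do-while variant actually wins, and by how much, depends on the browser's JavaScript engine; the point is that in loops over large arrays even small per-iteration costs multiply.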
Here are the results.
Enjoy the weekend or your holidays!