It all began the day three performance testers were given the task of migrating a client's performance testing activities, spanning multiple applications, to an open source tool. This covered architecture-level migration, resource knowledge transition and testing-operation transition from a licensed tool (LoadRunner) to an open source one. Yes, you guessed it right: I was one of those three.
We started with the age-old practice: understanding the migration requirements and finding the tool on the market that could serve them all. We studied the different applications that needed to be tested, their architectures, the mode of communication between clients and servers and, most important of all, the technical expertise available.
Channeling our inner statisticians, we created a complex matrix that captured the following details for the different open source tools on the market:
- Mode of replay
  - For a command-prompt based tool: the interface involved and the expertise required / available for it
  - For a script-based tool: the scripting language involved and the expertise required / available for it
- Infrastructure required
- Ease of scripting / coding & test execution
- Stability of the tool-software and feedback on forums
- User load the tool can support (the ‘concurrent users’ figure, in business speak)
- Ease of migration of scripts and other project items associated with LoadRunner to the scripts (and other project items) on the new tool. This would have avoided a lot of rework.
- Popularity of the tool, availability of source code and frequency of upgrades
- Accuracy of results
- Different counters available as a part of results
- Flexibility in data manipulation
- Flexibility in using user defined functions
- Team’s skillset and ease of learning a new tool / technology
- The IT applications to be tested, and whether the tool could meet their protocol requirements
After creating the matrix, we shortlisted the tools that met our criteria: OpenSTA, Pylot and JMeter. The next step was to approach these tools as first-time users, try them against our in-house web applications and run a feasibility study on them. Removing Pylot from the list was easy: we struggled with an unfamiliar language (Python, in this case) and found the tool inefficient for ‘https’ based applications (a must-have for the SSO authentication page).
The activity that cost us the most thought and time was comparing OpenSTA and JMeter. Both provided replay for web based applications, supported tests of more than 500 concurrent users (with no blockages or issues during the run), and offered excellent data manipulation, dynamic data management, webpage text checks and impressive test result statistics.
An initial thought was to set up the infrastructure for both and let the performance testers select their own tool, so each could work on the one they favoured. Over time we found JMeter gaining popularity in the project: most of the performance testers in the team were using it, and nobody preferred OpenSTA. On further investigation we found –
- JMeter was based on simple record-customize-playback concepts. For basic features like correlation and content checks, testers were not required to write their own functions; with OpenSTA, by contrast, one had to write custom functions for these
- JMeter, being Java based, was easier for the team to learn (the team was well acquainted with Java and OOP concepts)
- JMeter was more robust than OpenSTA (fewer stack-up issues observed at run-time)
- JMeter offered ease in test data maintenance and script issue resolution
- It also provided detailed graphical results and features to export the graphs
- JMeter has a large global user base, so upgrades arrive quickly and issues get resolved fast on the technical forums
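To make the correlation and content-check ideas above concrete, here is a minimal Java sketch of what such checks boil down to (the `csrf` token name and the response text are hypothetical examples). JMeter provides the equivalent out of the box via its Regular Expression Extractor and Response Assertion components, whereas in OpenSTA you would hand-roll this logic yourself:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CorrelationSketch {

    // Correlation: extract a dynamic value (here, a hypothetical CSRF token)
    // from a server response so it can be replayed in the next request.
    static String extractToken(String responseBody) {
        Matcher m = Pattern.compile("name=\"csrf\" value=\"([^\"]+)\"")
                           .matcher(responseBody);
        return m.find() ? m.group(1) : null;
    }

    // Content check: verify that an expected text appears in the response.
    static boolean contentCheck(String responseBody, String expected) {
        return responseBody.contains(expected);
    }

    public static void main(String[] args) {
        String response = "<input name=\"csrf\" value=\"a1b2c3\"><h1>Welcome back</h1>";
        System.out.println(extractToken(response));            // a1b2c3
        System.out.println(contentCheck(response, "Welcome")); // true
    }
}
```

In JMeter these two steps are configuration, not code, which is exactly why the testers did not need to write their own functions.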
The team gradually migrated to the JMeter architecture. Not only did we make huge savings in licensing costs, we also observed the following:
- Testers started reporting bugs in JMeter’s Bugzilla (its defect management tool) and contributing enhancements by writing code
- Testers started conducting / attending JMeter technology conferences and also started training other performance testers in the market
- Knowledge transfer: Staff started resolving other performance testers’ issues over the forums and started gaining confidence and expertise over the tool
- Staff published papers on performance test concepts, advanced statistical concepts to be used in test result analysis and JMeter as a tool
All this happened because the team started thinking, and the thinking started because no vendor support was available. No wonder they say –
“Necessity is the mother of invention”
As a team lead I continued my activities: I learned and gained experience on more performance test tools (LoadRunner, SilkPerformer, Rational Performance Tester, etc.), started giving free advice to projects, sponsors, testers and developers on selecting the best performance test tool for their work, and started speaking at open source conferences, guiding performance testers in their projects and resolving their queries.
I am making one such attempt here:
If you are a performance test architect thinking about moving to an open source tool like JMeter, the following points are worth considering –
Advantages of working with JMeter –
- As mentioned above, the most important advantage of working with any open source tool is that the team does a lot of thinking and research. Testers have to find their own ways to resolve issues and get the job done. Based on a recent paper published by a recognized IT consulting firm, software engineers working on open source technologies are conceptually 2.75 times stronger than those working on licensed tools.
- Working on one open source tool may introduce other open source technologies in the project. So a small initiative may lead to a major operational change in a period of 2-5 years. A small transition can eventually save millions in terms of licensing costs.
- If you work extensively on web protocols, then JMeter is THE thing for you.
- JMeter is easy for newbies to learn.
- JMeter supports regular expressions, so string handling is comparatively easy.
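As an illustration of that regular-expression support, a JMeter test plan (.jmx file) can declare a Regular Expression Extractor directly in configuration. The fragment below is an illustrative sketch with a hypothetical session-id pattern; the extracted value becomes available to later samplers as `${sessionId}`:

```xml
<!-- Illustrative JMeter Regular Expression Extractor;
     the regex and variable name are hypothetical examples -->
<RegexExtractor guiclass="RegexExtractorGui" testclass="RegexExtractor"
                testname="Extract session id" enabled="true">
  <stringProp name="RegexExtractor.refname">sessionId</stringProp>
  <stringProp name="RegexExtractor.regex">jsessionid=([A-F0-9]+)</stringProp>
  <stringProp name="RegexExtractor.template">$1$</stringProp>
  <stringProp name="RegexExtractor.default">NOT_FOUND</stringProp>
  <stringProp name="RegexExtractor.match_number">1</stringProp>
</RegexExtractor>
```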
Advantages of working with LoadRunner –
- If you need fast resolution of tool / scripting issues, you need support, and HP does a great job maintaining LoadRunner and managing its issues.
- If you are working on legacy systems with a large base of existing scripts, the licensing cost advantage may not be significant compared with the re-scripting effort required.
- The record-replay features that LoadRunner provides are unmatched by any open source tool on the market. LoadRunner is very stable and robust in its replay and run-time features.
- If you have to work on many types of IT applications across your projects, dealing with different protocols, then sticking to one tool (LoadRunner, which supports many protocols) may be a good idea.
- Based on a recent study, the efficiency of software professionals working on licensed tools is higher than that of those working on open source tools.
- The analysis and monitoring features provided by LoadRunner are still market-leading.