“Performance testing is only as accurate as the model you simulate – time invested in the requirements is time well spent” – Jason
In terms of duration, most of the time on a performance testing engagement is spent executing tests – yet not nearly enough emphasis is put on investigating and confirming requirements. When given a set of requirements I always ask what the figures are based on, ask to see the evidence, and then investigate its accuracy. This can be extremely time consuming – you spend a lot of time chasing small but important pieces of information, particularly if the team is geographically distributed. Here the DBA or system administrator is your friend: you can gather actual live statistics and match them against your requirements, or use them as the basis for your requirements. Why invest all this time confirming requirements? Because your performance testing can only be as accurate as your requirements model. Roughly 80% of the effort goes into scripting, executing and re-executing – if a requirement is stated incorrectly, that time is wasted. Spend and invest time checking and confirming requirements – here are a few reasons why.
Incorrect Requirements cause:
- Exponential Impact: A small mistake at the requirements stage can cause an exponential amount of wasted downstream effort. Around 80% of the time is spent in the execution phase – which is driven by the requirements stage.
- Inaccurate model: You are not modelling the live system – so when it goes live it may not perform as expected, and may even crash, because you simulated a model built on inaccurate requirements. It doesn’t matter that you modelled exactly what you were given – performance testing will still be perceived as ineffective.
- Resource Drag: If a performance requirement is stated too high, and performance testing reveals the system cannot handle it, resources from other areas of the project are pulled in to solve the issue. Code is profiled, changed, rebuilt and re-tested – this can add weeks to a project as well as dragging resources away from other areas. On a few occasions I have dug into the detail of the underlying requirement and found it to be completely inaccurate. Technically this isn’t the responsibility of the Performance Tester – but it helps to be aware of it. If you find this type of situation occurring, investigate and authenticate the requirements driving the extra work.
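The live-statistics cross-check mentioned earlier – matching a stated figure against what production actually does – can be sketched in a few lines. Everything here is a hypothetical illustration: the log format, the `/checkout` endpoint and the stated figure are assumptions, not taken from any specific system.

```python
# Hypothetical sketch: cross-check a stated throughput requirement against
# access-log evidence. Log format, endpoint and figures are all assumed.
from collections import Counter
from datetime import datetime

STATED_PEAK_PER_HOUR = 5000  # the requirement as given by the business

# Sample log lines in an assumed "<ISO timestamp> <method> <path>" format.
log_lines = [
    "2023-04-01T09:12:44 POST /checkout",
    "2023-04-01T09:31:02 POST /checkout",
    "2023-04-01T10:05:19 POST /checkout",
]

def peak_hourly_rate(lines):
    """Count transactions per clock hour and return the busiest hour's total."""
    per_hour = Counter()
    for line in lines:
        timestamp = line.split()[0]
        hour = datetime.fromisoformat(timestamp).strftime("%Y-%m-%d %H:00")
        per_hour[hour] += 1
    busiest_hour, count = per_hour.most_common(1)[0]
    return busiest_hour, count

hour, observed = peak_hourly_rate(log_lines)
print(f"Busiest hour: {hour} with {observed} transactions")
print(f"Stated requirement ({STATED_PEAK_PER_HOUR}/hour) is "
      f"{STATED_PEAK_PER_HOUR / max(observed, 1):.0f}x the observed peak")
```

If the stated figure turns out to be orders of magnitude above anything the live system has ever seen, that is exactly the kind of inflated requirement worth challenging before weeks of tuning effort are spent chasing it.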
The ability to define, agree and validate requirements is really what makes a performance test effective.
Each step of the Performance Requirements Phase should be documented and agreed with stakeholders – this forms the basis of a contract between your performance requirements and your results. It also helps you tackle questions like these at the final reporting stage:
- “Why has this been performance tested?”
- “Why didn’t we performance test this?”
- “What is the purpose of these performance tests, and what risks do they address?”
- “I didn’t ask for this to be performance tested – I asked for that.”
- “I don’t understand what I am being asked to sign off.”
I’ve been surprised at how many places I’ve visited where performance tests have been executed with no real documented sign-off or tie-in to the business objectives. In part it’s usually down to the pressure of timelines – performance testers just want to get on and write scripts to performance test the system. Writing up requirements and business flows and arranging meetings saps valuable time – and it’s an easy task to let slip, or to hope someone else, such as the Project Manager, picks up. Instead, what can happen is that some sort of informal agreement is reached with stakeholders and scripting begins, with no documented backup. I’ve found this to be dangerous and potentially damaging at the end of a phase.
Simple agreements or statements of what you intend to test should always be drawn up and distributed. This focuses stakeholders’ attention – once it is written down and they are asked to agree, they pay more attention. Some stakeholders will get back to you; you may find you have misunderstood the requirements and they will educate you (this is always a good thing). Other stakeholders will not get back to you, contribute or sign off – which is fine, because having something is better than having nothing, and at least you have given them the opportunity. Drawing up these artefacts is professional and always beneficial – I’ve also found them to be an effective defence against the statements above.