I’ve been working with JMeter a lot lately, and after speaking with people at conferences and the like, I thought it would be good to share what I’ve learned along the way about building, structuring and executing JMeter test plans.
One of the first things I like to do when performance testing is establish in reasonable detail what the Application Simulation Model (ASM) will entail. You can create an ASM visually using something like UCML. This is a good starting point for performance testers looking to structure their test planning efforts.
UCML™ is a set of symbols that can be used to create visual system usage models and depict associated parameters. When applied to performance testing, these symbols can serve to represent the workload distributions, operational profiles, pivot tables, matrices, and Markov chains that performance testers often employ to determine what activities are to be included in a test and with what frequency they’ll occur.
More often than not though, I create an ASM using a spreadsheet format as I find this easiest to communicate with people from the business teams. If people are interested I will share some of the spreadsheet templates. The following screenshot gives you an idea of what I’m talking about:
The pertinent points for an ASM are that it should:

- define the user groups and their quantities, e.g. “we have 3 user groups with a total of 150 users; 80% of them are helpdesk support, 10% are team leads and 10% are management”
- define the target transaction rates and mix, e.g. “we’re aiming for around 240 trans/hour, of which 50% are searches, 25% are updates and 25% are reports”
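The arithmetic behind those targets is trivial but worth making explicit, since the per-transaction figures feed directly into the test plan later. A quick sketch in Python (the transaction names and figures are just the illustrative ones from the examples above):

```python
# Hypothetical ASM figures taken from the examples above.
total_tph = 240  # overall target, transactions per hour
mix = {"search": 0.50, "update": 0.25, "report": 0.25}

# Per-transaction hourly targets implied by the mix.
targets = {name: total_tph * pct for name, pct in mix.items()}
print(targets)  # {'search': 120.0, 'update': 60.0, 'report': 60.0}
```

Those per-transaction numbers are what you’ll sanity-check your test results against once the plan is running.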
Read on to discover how to turn ASMs into working JMeter test plans.
One of the things I’m aiming for is to use relative random distributions of transactions within a Loop Controller, simulating the transaction distributions seen in real life. The following screenshot shows a Loop Controller with three child Throughput Controllers (renamed to 20%, 50% and 30%). Each Throughput Controller then has a Transaction Controller, which is the parent container for all subsequent requests that make up that transaction (there are typically tens or hundreds of requests here, depending on the complexity of the web application). Note I have also renamed the Transaction Controllers to something more representative of the transactions being simulated.
The Loop Controller effectively controls the number of iterations; I often set it to run forever and shut the threads down manually during test runs. I also use a Constant Throughput Timer, which alleviates the need to set iteration counts and pacing between iterations, something LoadRunner users will be familiar with. With a Constant Throughput Timer you simply set the targeted transaction rate in samples/minute (in this case measured against the collective throughput of the Transaction Controllers), and JMeter will self-throttle the load to try to match that rate. If you want more granular control, you can make Constant Throughput Timers child elements of selected Transaction Controllers, thus applying different transaction rates to different transactions. The benefit of using Throughput Controllers with Percent Executions specified is that transactions will be randomly distributed according to those percentages down to each subsequent Transaction Controller. The following screenshot demonstrates the settings used for the 20% Throughput Controller:
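One wrinkle worth noting: the Constant Throughput Timer takes its target in samples per minute, while business stakeholders usually quote rates per hour. Using the 240 trans/hour figure from the ASM example, the conversion is trivial (nothing JMeter-specific here, just the arithmetic):

```python
# Convert a business-facing hourly rate into the samples/minute value
# that the Constant Throughput Timer expects.
trans_per_hour = 240  # illustrative figure from the ASM example
samples_per_minute = trans_per_hour / 60.0
print(samples_per_minute)  # 4.0
```

So for this ASM you’d enter 4.0 as the timer’s target throughput, applied across all Transaction Controllers collectively.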
This means that if I specified 100 iterations in my Loop Controller, then 20 transactions will be distributed to the 20% Throughput Controller. This is done in a relatively random fashion; it does not process transactions in a fixed sequence, which is good for realistic simulations.
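To get a feel for what that distribution looks like over many iterations, here’s a rough Python simulation. This is only an illustration of the long-run behaviour: JMeter’s Percent Executions mode does its own internal accounting of executions rather than a simple random draw, but the resulting mix converges on the same percentages. The controller names are hypothetical:

```python
import random
from collections import Counter

# Each pass through the Loop Controller lands on one Transaction
# Controller, weighted by its parent Throughput Controller's percentage.
random.seed(42)  # fixed seed so the simulation is repeatable
weights = {"Transaction A": 20, "Transaction B": 50, "Transaction C": 30}
names, pcts = zip(*weights.items())

# Simulate 1000 loop iterations.
counts = Counter(random.choices(names, weights=pcts, k=1000))
for name in names:
    print(name, counts[name])  # roughly 200 / 500 / 300
```

Run it a few times with different seeds and you’ll see the counts wobble around the target percentages, which is exactly the “relatively random” behaviour described above.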
Once you’ve established your transaction groups with percentage distributions, you can create your user groups using the Thread Group element. In my example I’m using two user groups (Group A and Group B), each with differing user quantities. The purpose of these is simply to pigeonhole my user groups and quantities as detailed in the corresponding ASM. Each user group runs in parallel and is subject to the startup (ramp-up) and loop count settings for the threads (users) created. The following screenshot shows some typical settings:
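For reference, the ramp-up setting spreads thread start times evenly across the ramp-up period: with N threads and R seconds of ramp-up, a new thread starts roughly every R/N seconds, the first one immediately. A small sketch with illustrative numbers (10 users, 50-second ramp-up):

```python
# Thread start offsets implied by a Thread Group's ramp-up period.
threads = 10          # illustrative user count
ramp_up_seconds = 50  # illustrative ramp-up period

interval = ramp_up_seconds / threads
start_offsets = [i * interval for i in range(threads)]
print(start_offsets[0], start_offsets[-1])  # 0.0 45.0
```

So the last of the 10 users starts 45 seconds in, and all users are running by the end of the 50-second ramp-up.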
You can see two user groups in there, along with Loop Controllers and Throughput Controllers. You can also see all the sample requests under each of the Transaction Controllers. In my example I’m using HTTP Request samplers, but these could just as easily be web service, FTP or some other form of JMeter sampler. I often start with this template in mind, then record and edit transactions in the WorkBench before importing them back into my Transaction Controllers.
Basically, JMeter runs through the test plan from top to bottom. For this reason I keep configuration elements such as HTTP Cookie Managers, user parameters and reporting listeners towards the top of the test plan so they get initialized first. Then, as it hits the Thread Groups, it launches them according to the settings specified. You can also see in the screenshot some results from the View Results Tree listener for Thread Group A; essentially this demonstrates the randomized distribution of Transaction Controllers according to their parent Throughput Controllers.
In coming posts I’ll walk through some other features of JMeter that help you control thread execution, and delve deeper into the reporting functionality that comes with JMeter. You can download a copy of the JMeter test plan referred to in this post here.