Overview
A load test scenario consists of Web performance tests or unit tests. A scenario is the container within a load test in which the user specifies the load pattern, test mix, browser mix, and network mix. Scenarios are important because they give the user the flexibility to configure test characteristics, allowing the simulation of complex, realistic workloads.
Load Pattern – It specifies the number of virtual users active during a load test and the rate at which new users ramp up. For example: step, constant, and goal-based.
Test Mix – The test mix is the selection of Web performance tests that are contained within the scenario and the distribution of those tests within the scenario.
Browser Mix – It simulates virtual users accessing a Web site through a variety of Web browsers.
Network Mix – It simulates virtual users accessing a Web site through a variety of network connections, with options that include LAN, cable modem, and others.
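Taken together, these four components define one scenario. As an illustration only (the field names below are assumptions, not NetStorm's actual scenario format), a scenario definition can be sketched as:

```python
# Illustrative sketch of a load test scenario definition.
# All field names here are hypothetical, not NetStorm's real format.
scenario = {
    # Load pattern: how many virtual users, and how they ramp up.
    "load_pattern": {"type": "step", "start_users": 10,
                     "step_users": 10, "step_interval_s": 60},
    # Test mix: which scripts run, and what fraction of users runs each.
    "test_mix": {"browse_catalog": 0.6, "checkout": 0.3, "search": 0.1},
    # Browser mix: distribution of emulated browsers.
    "browser_mix": {"Chrome": 0.7, "Firefox": 0.2, "Edge": 0.1},
    # Network mix: distribution of emulated connection types.
    "network_mix": {"LAN": 0.5, "Cable": 0.3, "3G": 0.2},
}

# Each mix must cover 100% of the virtual users.
for mix in ("test_mix", "browser_mix", "network_mix"):
    assert abs(sum(scenario[mix].values()) - 1.0) < 1e-9, mix
```

The point of the sketch is that the load pattern controls *how many* users run, while the three mixes control *what* those users do and *how* they connect.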
Plan – Load Test Scenario
Load Test Planning Overview
For any type of testing, the first essential step toward success is a well-defined test plan. When performing non-functional testing, especially a load test (also referred to as a performance test), the test plan becomes even more important than usual. For high-traffic websites, for example, the most important challenge for QA is to determine via a load test whether the website is ready for peak traffic; the goal of the load test should be to either:
- Validate that the website can handle requests from high volumes of users, or
- Identify the breakpoints and bottlenecks where the current infrastructure fails.
The Load Test Plan helps the user perform load testing in a way that allows them to:
- Build test scenarios that accurately emulate the working environment
- Form a clear picture of the resources required for testing
- Define success criteria in measurable terms
Load Testing Objectives
A load test plan should be based on clearly defined testing objectives, such as:
- Determine if the application complies with contracts, regulations, and service level agreements (SLAs).
- Detect bottlenecks to be tuned.
- Assist the development team in determining the performance characteristics for various configuration options.
- Provide input data for scalability and capacity-planning efforts.
The following table lists common application testing objectives that a performance test tool such as NetStorm helps address in a load test.
| Objectives | Definition |
| --- | --- |
| Measuring end-user response time | To measure the duration of completing a business process |
| Defining optimal hardware configuration | To define the hardware configuration that provides the best performance |
| Checking reliability | To determine how hard/long the system can work without errors or failures |
| Checking hardware or software upgrades | To check whether and how an upgrade affects performance or reliability |
| Evaluating new products | To identify the server hardware or software to be chosen |
| Measuring system capacity | To measure the load the system can handle without significant performance degradation |
| Identifying bottlenecks | To identify the elements that slow down the response time |
Table 1: Common Application Testing Objectives
Measuring End-User Response Time
To measure how much time a user takes to perform a business process and receive a response from the server. For example, the user may want to verify that, while the system operates under normal load conditions, end users receive responses to all requests within 20 seconds.
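To make the 20-second objective concrete, a check against collected response times could be expressed as follows. This is a minimal sketch; the sample data is invented:

```python
# Hypothetical per-request response times (seconds) under normal load.
response_times = [2.1, 5.4, 18.9, 3.2, 7.7, 19.5, 1.0]
SLA_SECONDS = 20.0  # objective: every request answered within 20 s

# The objective passes only if no request exceeds the SLA.
violations = [t for t in response_times if t > SLA_SECONDS]
print(f"max response time: {max(response_times):.1f}s, "
      f"violations: {len(violations)}")
assert not violations, "end-user response time objective not met"
```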
Defining Optimal Hardware Configuration
To check how web performance is affected by various system configurations, such as memory, CPU speed, cache, adaptors, and modems. Once the user understands the system architecture and has checked the application response time, they can measure the application response under different system configurations to find out which settings achieve the desired performance levels. For example, the user can set up three different server configurations and run the same tests on each configuration to determine performance variations.
Checking Reliability
To identify how stable the system is when heavy or continuous workloads are applied. The user can use a performance test tool such as NetStorm to create stress on the system, forcing it to handle extended activity in a compressed time period to simulate the kind of activity a system would normally experience over a longer period of time.
Checking Hardware or Software Upgrades
The user can perform regression testing to compare a new release of hardware/software to an older one, checking how the upgrade affects response time and reliability. Application regression testing does not check the new features of an upgrade; it only compares the efficiency and reliability of the new release with the older one.
Evaluating New Products
The user can evaluate individual products and subsystems during the planning and design stage of a product’s life cycle by running tests. For example, the user can select the server hardware or the database package based on evaluation tests.
Measuring System Capacity
To measure system capacity and find out how much excess load the system can handle without performance degradation. To check system capacity, the user can compare performance under load on the existing system and identify where significant response-time degradation begins to occur.
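One simple way to locate where significant degradation begins is to compare the response time at each load step against a low-load baseline. The sketch below uses invented sample data and a hypothetical "doubling" threshold:

```python
# (load in users, average response time in seconds) from hypothetical runs.
results = [(50, 1.0), (100, 1.1), (200, 1.3), (400, 2.9), (800, 9.5)]
baseline = results[0][1]  # response time at the lightest load
THRESHOLD = 2.0           # "significant" = response time doubles vs. baseline

capacity = None
for load, resp_time in results:
    if resp_time > baseline * THRESHOLD:
        break             # degradation begins at this load step
    capacity = load
print(f"capacity without significant degradation: ~{capacity} users")
```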
Identifying Bottlenecks
The user configures the monitoring components of a performance test tool such as NetStorm to determine bottlenecks in the system and identify the elements causing performance degradation, for example, file locking, resource contention, and network overload. NetStorm can be used in conjunction with network and machine monitoring tools to create load and measure performance at different points in the system.
Plan a Load Test
This section describes the key points on how to plan a load test.
- Analyze the application thoroughly
- List down and define the load testing objectives
- Plan the implementation of a performance test tool such as NetStorm
Analyze the Application/System
This section describes the key things on how to go ahead to analyze the application under test as a part of load test planning.
- Identify the application/system components
- Describe the application/system configuration
- Define the system/application usage
- Analyze the work flow and task distribution
Define the Load Test Objectives
This section describes the key points on the basis of which the user can define the objectives of load tests.
- Finalize the general objectives
- State the objectives in a measurable term
- Decide when to test
Plan the NetStorm Implementation
This section describes the key points on how and why to plan the implementation of NetStorm.
- Define the scope of performance measurements
- Define the virtual user activities
- Select virtual users
- Choose testing hardware/software
Design – Load Test Scenario
A scenario is the overall definition of a test. A scenario defines the events that occur during each testing session. It defines and controls the number of users to emulate, the actions that they perform, and the machines on which they run their emulations.
A scenario has one or more scenario groups. A scenario consists of groups of Virtual Users (VUsers), which emulate human users interacting with the application by executing a test script. A Virtual User is a software simulation of a real-world user, and a session represents what it does.
Once the scenario is created, NetStorm saves the scenario details in a scenario file.
NetStorm has scenario profiles to keep default configurations specific to a subproject in a common place. This replaces the existing file that was used as the default configuration file for the Controller.
Scenario Types
User can design a scenario in one of the following types:
- Fix Concurrent Users (FCU): The number of concurrent users accessing the SUT is fixed.
- Fix Session Rate (FSR): The session rate (per minute) to which the SUT is exposed is fixed.
- Mixed Mode: This mode is a combination of FCU and FSR.
- Fix Mean Users: The number of concurrent users varies around a mean. This differs from Fix Concurrent Users, where the number of concurrent users is always fixed.
- Fix URL Hit Rate (Hits/Min): Target URL hit rate at SUT is fixed.
- Fix Page Hit Rate (Pages/Min): Target Page hit rate on the SUT is fixed.
- Fix Transaction Rate (Transactions/Min): Target Transaction rate is fixed.
- Meet Specified Service Level Agreements (SLA)
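FCU and FSR are related quantities: for a steady workload, Little's Law gives concurrent users ≈ session rate × average session duration. This back-of-the-envelope helper (not part of NetStorm) converts between the two modes:

```python
def sessions_per_minute(concurrent_users: float,
                        avg_session_minutes: float) -> float:
    """Little's Law: arrival rate = concurrency / time in system."""
    return concurrent_users / avg_session_minutes

def users_for_rate(sessions_per_min: float,
                   avg_session_minutes: float) -> float:
    """Inverse: concurrency needed to sustain a given session rate."""
    return sessions_per_min * avg_session_minutes

# 300 concurrent users with 5-minute sessions sustain ~60 sessions/min.
print(sessions_per_minute(300, 5))  # 60.0
print(users_for_rate(60, 5))        # 300
```

This is only an approximation for steady state; during ramp up or ramp down the two quantities diverge.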
Schedule Types
Schedule Type has two options:
- Simple: A simple schedule has only four phases (ramp up, stabilize, duration, and ramp down) and, optionally, a start phase. In ramp up and ramp down, all users or the entire session rate is ramped up or down, so there is no need to specify the number of users or the session rate.
- Advanced: An advanced schedule is a more complex, user-defined schedule that can have any number of phases. In each ramp-up phase, the user specifies the number of users or the session rate to be ramped up. In each ramp-down phase, the user specifies the number of users (or All) or the session rate (or All) to be ramped down.
Note: Scenario Schedule phases are described in detail later in this section.
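An advanced schedule can be modeled as an ordered list of phases, each of which changes the target user count. The phase names and fields below are illustrative only, not NetStorm's schema:

```python
# Hypothetical advanced schedule with multiple ramp-up phases.
phases = [
    ("start",     {"users": 0}),
    ("ramp_up",   {"add_users": 50,  "duration_s": 300}),
    ("stabilize", {"duration_s": 60}),
    ("ramp_up",   {"add_users": 100, "duration_s": 600}),
    ("duration",  {"duration_s": 1800}),
    ("ramp_down", {"remove_users": "All"}),
]

users = 0
for name, phase in phases:
    if name == "start":
        users = phase["users"]
    elif name == "ramp_up":
        users += phase["add_users"]
    elif name == "ramp_down":
        removed = phase["remove_users"]
        users -= users if removed == "All" else removed
    print(f"after {name:9s}: {users} users")
# The schedule ends at 0 users after ramping down "All".
```

A simple schedule is the special case with exactly one ramp-up and one ramp-down phase, both applying to all users.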
Schedule By
There are two types of ‘Schedule By’:
Scenario: In “Schedule by Scenario”, all phases (start, ramp up, stabilize, duration, and ramp down) are applied to the scenario as a whole. A scenario can contain multiple groups; in this mode, all of them are combined and executed as part of the whole scenario.
Group: In “Schedule by Group”, all phases (start, ramp up, stabilize, duration, and ramp down) are executed for each group separately.
Based on the Schedule Type and Schedule By settings, user has the following 4 types of scenarios:
- Simple Scenario
- Simple Group
- Advanced Scenario
- Advanced Group
Note:
In ‘Schedule by Scenario’, if the user wants to pause the test while any Vuser is in the ramp-up or ramp-down phase, the following messages are displayed:
Pausing test run for ‘<time>/<indefinite_time>’ at <time> …
“Please wait for some time as some users are in initializing stage and it will take time to pause”
In addition, if no Vuser is in the ramp-up or ramp-down phase, only the following message is displayed:
Pausing test run for ‘<time>/<indefinite_time>’ at <time> …
Distribute Users across Groups
Users can be distributed across groups in the following modes:
- By Percentage: Users in the groups are distributed by percentage. For example, Group A has 80% of the users, Group B has 15%, and Group C has 5%.
- By Numbers: Users in the groups are distributed by numbers. For example, Group A has 80 users and Group B has 20 users.
- Auto Mode: This mode is available only for advanced scheduling; users are distributed equally according to the schedule. It is applicable to “Schedule by Scenario” only.
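When distributing by percentage, the per-group counts must still sum to the scenario total even when the split does not divide evenly. The following largest-remainder sketch is one way to handle this; it is not NetStorm's actual algorithm:

```python
def distribute_by_percentage(total_users: int, percents: dict) -> dict:
    """Split total_users across groups; users left over from rounding
    go to the groups with the largest fractional shares first."""
    raw = {g: total_users * p / 100 for g, p in percents.items()}
    counts = {g: int(v) for g, v in raw.items()}
    leftover = total_users - sum(counts.values())
    # Hand out the remaining users by largest fractional part.
    for g in sorted(raw, key=lambda g: raw[g] - counts[g], reverse=True):
        if leftover == 0:
            break
        counts[g] += 1
        leftover -= 1
    return counts

print(distribute_by_percentage(101, {"A": 80, "B": 15, "C": 5}))
```

With 101 users and an 80/15/5 split, plain truncation yields only 100 users; the largest-remainder step assigns the leftover user to Group A.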
Creating/Setting Scenario Profile
This section describes how to create a scenario profile and configure it for use in other scenarios. If the user selects that scenario profile while creating a scenario, its settings are reflected in the created scenario. There is no need to perform the settings again and again; the user just needs to select the scenario profile in which the required settings have already been made.
The user needs to follow the steps below to create a scenario profile:
- Login to LOAD TEST. On the left pane, click Scenarios icon, and then click Scenario Profiles.
Figure: Scenario Profile Selection
- The Scenario Profile window is displayed.
Figure: Scenario Profile Window
- In the Scenario Profile window, click the Add button. The Create Scenario Profile dialog box is displayed.
Figure: Create Scenario Profile Window
- Specify the Project Name and SubProject Name, enter the Profile Name, and click Next. The scenario profile is created and added to the Scenario Profile list, and the Schedule Settings window is displayed.
Figure: Scenario Profile Page
- The sections for setting different options (Global settings, Group based settings, etc.) are displayed. Here, the user can set up the scenario profile and save the settings for later use in scenarios.
Run – Load Test Scenario
Running Scenarios Overview
A scenario run consists of the following stages:
- Start of Run: When the user instructs NetStorm to begin the scenario run, the Controller checks the scenario configuration information, invokes the applications selected to run with the scenario, and then distributes each Vuser script to its designated load generator. When the Vusers are ready, they start running their scripts. As the scenario starts, the user can watch the Vusers gradually start running in the Scenario Groups pane.
- During Run: During the scenario run, the user can see a synopsis of the running scenario in the Scenario Status pane and can drill down to see which Vuser actions are causing problems in the application. The Dashboard GUI’s online graphs display performance data collected by the monitors; the user can use this information to isolate potential problem areas in the system.
- End of Run: The scenario ends when all the Vusers have completed their scripts, when the duration runs out, or when it is terminated. At the conclusion of the test run, the Scenario Status pane shows the Down status. This indicates that the Vusers have stopped running.
To Run a Scenario
This task describes how to run a scenario.
- Open an existing scenario or create a new one.
- Prepare to run the scenario. Before running a scenario, provide details for the scenario and the other run-time related configurable settings.
- To run the scenario, click the Start button corresponding to a scenario.
- Once it is clicked, a new window, Start Scenario, is displayed, where the user has to provide the Test Name and click the Start Scenario button.
Note: While starting a test from the scenario, the test name must start with an alphabetic character. Allowed characters are: alphanumeric, hyphen, full stop, underscore, and angular brackets. If the user enters any other special character, an error message is displayed.
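The naming rule in the note can be expressed as a regular expression. This is an illustrative re-implementation of the rule as stated, not NetStorm's actual validation code:

```python
import re

# Must start with a letter; then letters, digits, hyphen, full stop,
# underscore, or angular brackets are allowed (per the note above).
TEST_NAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9._<>-]*$")

def is_valid_test_name(name: str) -> bool:
    return bool(TEST_NAME_RE.match(name))

print(is_valid_test_name("LoadTest_v1.2"))  # True
print(is_valid_test_name("1stTest"))        # False: starts with a digit
print(is_valid_test_name("test$run"))       # False: '$' is not allowed
```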
Run/Stop Scenario
This section describes the key points on how to plan a load test.
- Analyze the application thoroughly
- List down and define the load testing objectives
- Plan the implementation of NetStorm
Second Level Authorization
To avoid unwanted hits on the server, the user (with the required privileges) first needs to set a 6-digit authorization key (via Admin > Second Level Authorization), which is then used when starting a test on that machine and when saving the scenario schedule settings.
- On starting a test, the user is required to provide the authorization key. If the key matches the one set by the privileged user, the test starts and the ‘Test Initialization Screen’ is displayed; otherwise, an error message is displayed.
- Similarly, on saving the scenario schedule settings, the user is required to provide the authorization key. If the key matches the one set by the privileged user, the user is allowed to save the scenario schedule settings; otherwise, an error message is displayed.
Note: When a new scenario is created, if the user starts the test without saving the scenario, an alert pop-up is displayed stating “Test cannot be started because scenario is not saved.” After saving the scenario, the user can start the test. This behavior applies only to a new scenario, not to an already created scenario (i.e., a scenario in edit mode).
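For illustration, the key check at test start amounts to comparing the entered key with the stored one; a constant-time comparison avoids timing leaks. Everything below is a hypothetical sketch, not NetStorm's internal key handling:

```python
import hmac

STORED_KEY = "482913"  # hypothetical 6-digit key set by the privileged user

def authorize(entered_key: str) -> bool:
    """Compare keys in constant time before allowing the test to start."""
    return hmac.compare_digest(entered_key, STORED_KEY)

print(authorize("482913"))  # True: the test is allowed to start
print(authorize("000000"))  # False: an error message would be shown
```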
Setting Authorization Key
1. Go to Admin > Second Level Authorization.
Figure: Second Level Authorization
2. Select the check box and provide the current and new password. Re-enter the password and click Save. This saves the authorization key and a confirmation message is displayed.
Note: The user is prompted for the key when starting a test or applying scheduling from the Scenario Schedule window.
Starting a Test
1. To start a test, the user clicks the Start button, provides the test name, and clicks the Start Scenario button. The user is then prompted to provide the activation key.
Figure: Authentication Domain
2. If the activation key matches, the test starts and the Test Initialization window is displayed.
Figure: Test Initialization Window
Analysis – Load Test Scenario
Analysing Process Overview
For any type of testing, the first essential step toward success is a well-defined test plan. When performing non-functional testing, especially a load test (also referred to as a performance test), the test plan becomes even more important than usual. For high-traffic websites, for example, the most important challenge for QA is to determine via a load test whether the website is ready for peak traffic; the goal of the load test should be to either:
- Validate that the website can handle requests from high volumes of users, or
- Identify the breakpoints and bottlenecks where the current infrastructure fails.
The Load Test Plan helps in performing load testing in a way that allows the user to:
- Build test scenarios that accurately emulate the working environment
- Form a clear picture of the resources required for testing
- Define success criteria in measurable terms
Objectives – Scenario Run Analysis
A load test plan should be based on clearly defined testing objectives, such as:
- Determine if the application complies with contracts, regulations, and service level agreements (SLAs).
- Detect bottlenecks to be tuned.
- Assist the development team in determining the performance characteristics for various configuration options.
- Provide input data for scalability and capacity-planning efforts.
On-line mode Scenario Analysis
This section describes the key points on how to plan a load test.
- Analyze the application thoroughly
- List down and define the load testing objectives
- Plan the implementation of NetStorm
Off-line mode Scenario Analysis
This section describes the key things on how to go ahead to analyze the application under test as a part of load test planning.
- Identify the application/system components
- Describe the application/system configuration
- Define the system/application usage
- Analyze the work flow and task distribution