Analytics

The following sections are available within this menu:

  • Compare
    • Enable Compare
    • Disable Compare
    • Update Compare

Compare

The Compare option is used to compare the values of two test runs/sessions. Using the compare feature of the Dashboard, the currently running test run/session can be compared with an existing baseline test run/session. Graphs can be compared in both online and offline modes. In online mode, two graphs are displayed: one static baseline graph and one graph of the currently running test run/session. Compare is allowed only for single graphs. The user can set a baseline test run/session number from the existing sessions, and can later change or remove it.

Note

In NetStorm, the user can compare both test runs and sessions of a test run, whereas in NetDiagnostics, the user can compare only different sessions of a test run.

Key Points in Comparing Test Runs/Sessions

The following are key points to keep in mind when comparing test runs/sessions:

  • The current test run/session can be compared with other test runs/sessions respectively.
  • Different instances of the current test run/session can be compared.
  • The user can compare the same or different instances of different test runs/sessions.
  • The current test run/session is always included in the comparison; there is no need to select it in the compare window.
  • If a comparison is applied and the user changes the time from Graph Time or View by Phase, the data changes in all graph panels for the current test run/session only.

Enable Compare

To compare test runs/sessions, follow the steps below:

1. Go to Analytics > Compare > Enable Compare. The Compare Settings window is displayed.

2. In the Advance Settings section, specify whether to include the current test run/session using the Include Current Test Run or Include Current Session check box, respectively. The current test run/session is included by default.

3. Enter the Measurement Name in the specified box. The Measurement Name is a unique name assigned to the test run measurement; it acts as an alias for one compare setting.

A measurement name can be a maximum of 25 characters. All characters are allowed except ‘|’, which is used as a separator between fields. In addition, the name cannot be duplicated.

4. Select the time duration from Preset. On selecting Custom, two options are displayed: Absolute and Elapsed. For Absolute, enter the Start Date, Start Time, End Date, and End Time. For Elapsed, enter the start time and end time. The specified date/time should be within the range of the current session.

5. Next, select the Color by clicking on it. By default, a color is defined for each compare setting, but the user can change the color of any compare setting.

6. Then, click Add. This button adds the values of the compare settings from the input fields to the table. The Delete button, on the right side, is used to delete the data added for a particular session.

7. Click the Apply button. After the comparison is applied, the graphs are displayed.
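The measurement-name rules in step 3 (a 25-character limit, no ‘|’ character, and no duplicates) can be sketched as a small validation routine. This is an illustrative sketch only; the function name and return shape are hypothetical and not part of the product.

```python
def validate_measurement_name(name, existing_names):
    """Check a compare-setting measurement name (illustrative sketch).

    Rules from the documentation: at most 25 characters, the '|'
    character is reserved as a field separator, and the name must
    not duplicate an existing measurement name.
    """
    if not name:
        return False, "Measurement name cannot be empty"
    if len(name) > 25:
        return False, "Measurement name can be a maximum of 25 characters"
    if "|" in name:
        return False, "'|' is not allowed; it is used as a field separator"
    if name in existing_names:
        return False, "Measurement name cannot be duplicated"
    return True, "OK"

# Example checks
print(validate_measurement_name("Baseline_TR1045", {"Run A"}))  # accepted
print(validate_measurement_name("Bad|Name", set()))             # rejected: '|'
```

A name failing any rule is rejected with a reason, mirroring the alert behaviour a UI would typically show.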

Operations on Compared Graphs

The following operations can be performed on compared graphs in the Dashboard UI:

  • Zoom: The user can apply zoom on the compared graph panel.
  • Drag single graph: If the user drags a single graph onto a compared graph, the graph is compared if it is present in both test runs. If the graph is not present in the baseline test run, an alert message is displayed: “Graph is not present in Baseline Test Run. So cannot perform comparison”.
  • Drag multiple graphs: If the user drags multiple graphs (all graphs of a group) onto a compared graph, only the first graph is compared.
  • Load Favorite: If a favorite is loaded onto a compared graph, only the first graph of each panel of the favorite is compared with the baseline test run.
  • Change Color: The user can change the color of the baseline from the lower panel.

Compare Changes for Auto Scaling

This feature allows a user to compare test runs even if the indices are not available in both measurements. In this case, the missing indices are mapped to indices of the other measurement.

The following example illustrates this feature.

Graphs of the first test run:

In this test run, the indices for Sys Stats Linux Extended are:

  • Cavisson>Ubuntu54
  • Cavisson>NSAppliance
  • GUIDevTier>Ubuntu52
  • GUIDevTier>Ubuntu51

Graphs of the second test run:

In this test run, the indices for Sys Stats Linux Extended are:

  • Cavisson>Ubuntu54
  • Cavisson>NSAppliance
  • QATier>Ubuntu47
  • QATier>Ubuntu48

Now, on applying a comparison between these two test runs, the following window is displayed:

In the highlighted section, it can be seen that if indices are not available in both measurements, the system maps them to the indices of the other measurement.
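The mapping above can be sketched as follows. This is a minimal sketch under an assumption: the document does not specify the exact mapping rule, so here indices with identical names are paired directly and the remaining unmatched indices are paired by position, which reproduces the pairing shown in the example (e.g. GUIDevTier>Ubuntu52 with QATier>Ubuntu47). The function name is hypothetical.

```python
def map_indices(current, baseline):
    """Pair the current measurement's indices with baseline indices.

    Assumed rule (not confirmed by the documentation): identical
    names are matched directly; leftover indices on each side are
    then paired in the order they appear.
    """
    mapping = {}
    baseline_set = set(baseline)
    # Direct matches: index names present in both test runs
    common = [name for name in current if name in baseline_set]
    for name in common:
        mapping[name] = name
    # Remaining indices on each side, paired positionally
    left = [name for name in current if name not in mapping]
    right = [name for name in baseline if name not in set(common)]
    for cur, base in zip(left, right):
        mapping[cur] = base
    return mapping

first = ["Cavisson>Ubuntu54", "Cavisson>NSAppliance",
         "GUIDevTier>Ubuntu52", "GUIDevTier>Ubuntu51"]
second = ["Cavisson>Ubuntu54", "Cavisson>NSAppliance",
          "QATier>Ubuntu47", "QATier>Ubuntu48"]
print(map_indices(first, second))
```

With the indices from the two test runs above, the Cavisson entries match by name while the GUIDevTier indices are mapped onto the QATier indices.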

Disable Compare

This feature is used to disable the comparison applied to the test runs. On clicking the Disable Compare menu item, the graphs in the panel are displayed in their original form (without comparison).

Update Compare

This feature is used to update the comparison parameters, such as the time duration and the addition or removal of measurements.