...

How to build and run regression tests is described below, along with tips for troubleshooting errors and upgrading tests. The final two steps of the process are undertaken by the user. 

Building regression tests

To build a regression test, you must first set up the test project and create the results that will provide the baseline for comparison. Once this is done, the regression test is built using the Test Builder. The following steps illustrate this process: 

  1. Open or create a project with the scenarios that you want to test.
  2. For each scenario that you wish to test: 
    1. Before running, ensure that you are recording only the minimum set of parameters relevant to what you would like to test in your regression test. 
    2. Configure and run your scenario, noting that you must run at least one analysis type (e.g., Single analysis) from one of the test scenarios.
    3. Ensure that all the recorded results give you the answers you expect.
  3. Open the Regression Test Builder (Figure 1) using Tools » Regression Tests » Test Builder.
  4. Click the ellipsis button (…) to open a Select Folder window and navigate to the folder (e.g., C:\) where you want to save your baseline data and test project for the regression test. Click the Select Folder button in the Select Folder window.
  5. Select an option (e.g., Fast) from the Configuration Type dropdown (Figure 1), select any available options from the Scenario and Analysis box (e.g., Automatic Calibration - Single Analysis), and leave the Ignore checkbox unchecked.
  6. Click Save to save the Test Builder files to the chosen folder location. 
Figure 1. Regression tests, Test Builder


Figure 1 is an example from a Source project (e.g., RegressionTests_example_V520.rsproj) with two scenarios: (a) Automatic Calibration and (b) Manual Calibration. The Automatic Calibration scenario is run under the Single analysis analysis type, while the Manual Calibration scenario is run under Flow calibration analysis. 
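The comparison the built test performs can be thought of as checking each recorded parameter against the saved baseline. The following is a conceptual sketch only, not the Source implementation; the function and parameter names are hypothetical:

```python
# Conceptual sketch (not the Source API): a regression test compares a
# newly generated result series against the saved baseline, flagging any
# recorded parameter whose values differ beyond a small tolerance.

def compare_results(baseline: dict, current: dict, tol: float = 1e-9) -> list:
    """Return the names of recorded parameters that differ from the baseline."""
    failures = []
    for name, base_values in baseline.items():
        new_values = current.get(name)
        # A missing series or a length mismatch is itself a failure.
        if new_values is None or len(new_values) != len(base_values):
            failures.append(name)
            continue
        if any(abs(b - n) > tol for b, n in zip(base_values, new_values)):
            failures.append(name)
    return failures

# Example: identical series pass; a changed series is flagged.
baseline = {"Downstream Flow": [10.0, 12.5, 11.2]}
print(compare_results(baseline, {"Downstream Flow": [10.0, 12.5, 11.2]}))  # → []
print(compare_results(baseline, {"Downstream Flow": [10.0, 12.5, 99.0]}))  # → ['Downstream Flow']
```

This is why step 2 above recommends recording only the minimum parameters needed: every recorded series becomes part of the baseline that later runs are checked against.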

...

Scenario and analysis list: this box displays all the available scenarios combined with all possible run configuration analysis types in your Source project file (information on run configuration analysis types can be found at Configuring Scenarios: https://wiki.ewater.org.au/display/SD540/Configuring+Scenarios). The user needs to select the applicable scenario (such as Automatic Calibration) and the applicable analysis type (such as Single Analysis). Multiple scenario and run analysis type combinations can be selected. Scenario and run analysis type combinations that are not relevant for a project (such as Run with warm up, which is not applicable for the catchment model in this example) will not be available. 
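In effect, the list offers the cross product of scenarios and run configuration analysis types, minus any combination that does not apply to the project. A minimal sketch of that idea, with all names hypothetical:

```python
# Illustration only: build the list of selectable combinations as the
# cross product of scenarios and analysis types, excluding those that
# are not applicable to the project (names here are made up).
from itertools import product

scenarios = ["Automatic Calibration", "Manual Calibration"]
analysis_types = ["Single analysis", "Flow calibration analysis", "Run with warm up"]

# Combinations Source would hide for this example project.
not_applicable = {
    ("Automatic Calibration", "Run with warm up"),
    ("Manual Calibration", "Run with warm up"),
}

available = [(s, a) for s, a in product(scenarios, analysis_types)
             if (s, a) not in not_applicable]
for scenario, analysis in available:
    print(f"{scenario} - {analysis}")
```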

...

Figure 2. Regression tests, Test Runner


The Regression Test Runner interface (Figure 2) options are explained below: 

...

Figure 3. Regression tests, Test Runner results


The error message helps the user identify the problem. For example, in Figure 3 the error messages in the first two lines show there is one failed item, from the scenario Automatic Calibration running under Single Analysis: the modelled flow (the downstream flow) at the gauge node Gauge site was changed. Note that in this case a big difference (50%) was introduced manually for demonstration purposes. 
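The 50% figure reported here is simply the relative change between the baseline and the current value. As a quick illustration (the formula is standard; it is not taken from the Source internals):

```python
# Relative (percentage) difference between a baseline value and the
# current value, as one might compute for a flagged result.

def percent_diff(baseline: float, current: float) -> float:
    """Absolute percentage change of `current` relative to `baseline`."""
    if baseline == 0:
        return float("inf") if current != 0 else 0.0
    return abs(current - baseline) / abs(baseline) * 100.0

# A modelled flow of 150 against a baseline of 100 is a 50% change.
print(percent_diff(100.0, 150.0))  # → 50.0
```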

Figure 4. Regression tests, the Chart for Diff. Chart icon


If no error is found during the test run, a green tick with the label Success will be displayed at the bottom left of the Test Runner window instead of a red cross, and the Diff. Chart icon will not be displayed. 

...