Reactis Tester#

What is Reactis Tester?#

Reactis Tester automatically generates test suites from Simulink®/Stateflow® models of embedded control software. The test suites provide comprehensive coverage of different test-quality criteria while minimizing redundancy in tests.

Can I restrict the values that a top-level inport assumes during a test?#

Yes. For details on how to do this, see Chapter The Reactis Info File Editor. In brief, do the following:

  1. Load your model in Reactis.

  2. Select menu item Edit > Inports…

  3. A window titled Reactis Info File Editor: Inports will appear. In this window, double-click on the row corresponding to the inport you wish to constrain. A dialog will appear for specifying the set of values the port may assume.

Is it possible to generate tests only for a subsystem instead of the whole model?#

Yes, Reactis supports two ways to generate tests for a subsystem.

Using a Harness#

  1. Load your model in Reactis.

  2. Right-click on the subsystem for which you want to generate tests and select Create New Harness…

  3. Enter a name for your harness.

  4. Tests will now be generated for the harness subsystem.
    You can switch back to the top level by selecting Edit > Harness.

Extracting a Subsystem Into a Separate Model#

  1. Load your model in Reactis.

  2. Right-click on the subsystem for which you want to generate tests and select Extract Subsystem.

  3. Specify a filename for the extracted model.

  4. Reactis will then create a new .slx file containing the extracted subsystem at the top level and load it into Reactis.

I have added/removed top-level inports from my model. How do I make these changes show up in the Port Type Editor in Reactis?#

Whenever you add/remove top-level inports to/from your model, or change the type of a top-level inport, you should:

  1. Load your model in Reactis.

  2. Select menu item Edit > Inports…

  3. A window titled Reactis Info File Editor: Inports will appear. In this window, select menu item Tools > Synchronize Inports, Outports, and Test Points.

I have added/removed top-level inports or outports to/from my model. How do I reuse old test suites created for the model that contain a different set of inputs and outputs?#

Whenever you add/remove top-level inports or outports to/from your model, you can convert old test suites to the new input/output signature as follows:

  1. Load your model in Reactis.

  2. Start Simulator.

  3. Select menu item Test Suite > Import…

  4. A window titled Import Tests will appear. Set the file type to Reactis Test Suite (*.rst) and select the old test suite to import. A wizard will appear that will aid you in remapping the test suite data to the current model.

A workspace data item is used to specify one of several configurations of my model. Depending on the value of this item, various parts of my model are unreachable; however, all parts of the model are reachable when all configurations of the model are considered. Is it possible for Reactis to generate tests for all the different configurations and give cumulative coverage information?#

Yes, Reactis uses a facility called configuration variables to do this. The idea is that you may tag a workspace data item as a configuration variable, and specify the different values it may assume. When generating tests, Reactis will change the values of configuration variables only between tests (not during a test). To tag a workspace data item as a configuration variable, do the following:

  1. Load the model in Reactis.

  2. Select menu item Edit > Configuration Variables…

  3. A window titled Reactis Info File Editor: Configuration Variables will appear. In this window, select menu item Edit > Add… and then select the data item to become a configuration variable.

  4. The newly added configuration variable will now appear in the list of configuration variables. Double-click on it to obtain a dialog for specifying the set of values the variable may assume.

I created a test suite from a model using Reactis, but when I run the tests within Reactis, the output values are different from those stored in the tests. Why is this?#

The probable explanation is that your model was modified after the tests were generated, but before they were run. In this case, input values from the tests might yield different responses from the model. To update a test suite so that the new model responses are recorded in the tests, load the model and test suite in Simulator and then select Test Suite > Update Outputs…

Another possible explanation for the differences is that the model contains an S-function that maintains its own state from one call to the next. (For example, the S-function might modify global or statically allocated variables.) During the test-generation process Reactis repeatedly restarts simulations from intermediate model configurations. In order to restore configurations for S-functions appropriately, Reactis must be able to extract any persistent state information the function might have. Reactis can do this if the S-function adheres to the constraints described in Chapter Preparing Models for Use with Reactis of the Reactis User’s Guide and only uses storage allocated by mdlInitializeSizes() to store information between calls to the S-function.
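
The idea can be pictured with a hypothetical C MEX S-function sketch (not taken from the Reactis documentation; the block name counter_sfun, the ports, and the accumulator behavior are invented for illustration, and whether a given S-function meets Reactis’s requirements is governed by the User’s Guide chapter cited above). Instead of holding its running total in a static or global variable, the S-function declares a DWork vector in mdlInitializeSizes(), so the persistent state is owned by the simulation engine and can be saved and restored:

    /* Hypothetical S-function: persistent state lives in an engine-managed
     * DWork vector declared in mdlInitializeSizes(), not in static/global
     * variables, so it can be saved and restored between simulation runs. */
    #define S_FUNCTION_NAME  counter_sfun   /* invented name */
    #define S_FUNCTION_LEVEL 2
    #include "simstruc.h"

    static void mdlInitializeSizes(SimStruct *S)
    {
        ssSetNumSFcnParams(S, 0);
        if (!ssSetNumInputPorts(S, 1)) return;
        ssSetInputPortWidth(S, 0, 1);
        ssSetInputPortDirectFeedThrough(S, 0, 1);
        if (!ssSetNumOutputPorts(S, 1)) return;
        ssSetOutputPortWidth(S, 0, 1);
        ssSetNumSampleTimes(S, 1);

        /* Declare the persistent state here: one double, owned by the engine. */
        ssSetNumDWork(S, 1);
        ssSetDWorkWidth(S, 0, 1);
        ssSetDWorkDataType(S, 0, SS_DOUBLE);
    }

    static void mdlInitializeSampleTimes(SimStruct *S)
    {
        ssSetSampleTime(S, 0, INHERITED_SAMPLE_TIME);
        ssSetOffsetTime(S, 0, 0.0);
    }

    #define MDL_INITIALIZE_CONDITIONS
    static void mdlInitializeConditions(SimStruct *S)
    {
        real_T *state = (real_T *) ssGetDWork(S, 0);
        state[0] = 0.0;                 /* reset the accumulator at start */
    }

    static void mdlOutputs(SimStruct *S, int_T tid)
    {
        InputRealPtrsType u = ssGetInputPortRealSignalPtrs(S, 0);
        real_T *y     = ssGetOutputPortRealSignal(S, 0);
        real_T *state = (real_T *) ssGetDWork(S, 0);
        (void) tid;                     /* unused */

        state[0] += *u[0];              /* accumulate in engine-managed storage */
        y[0] = state[0];
    }

    static void mdlTerminate(SimStruct *S) { (void) S; }

    #ifdef MATLAB_MEX_FILE
    #include "simulink.c"
    #else
    #include "cg_sfun.h"
    #endif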

When I run Reactis Tester multiple times on the same model, I get different test suites. Why is this?#

The reason for this is that the test generation algorithm employed by Tester includes a pseudo-random component: during guided simulation, some inputs are selected using a random number generator. Different runs therefore draw different random numbers and hence produce different test steps.

Is it possible to make Tester generate the same test suite on different runs on the same model? In other words, is it possible to get reproducible results with Tester?#

Yes, you can do this by specifying that the random number generator use the same seed for different runs. For a model foo.slx, this is done as follows:

  1. Run Tester for the first time to create a test suite foo.rst.

  2. In Reactis, select Test Suite > Browse to load the suite in the Reactis Test Suite Browser, then select the Test History tab.

  3. In the test history, find the line starting with --- Created by Tester and select it to display the full text in the bottom panel. The entry will have the form:

   --- Created by Tester: 'rtest adaptive_cruise.slx ...  -s 3670373.596'

The -s flag gives the random number generator seed (3670373.596 in the above example).

  4. Run Tester a second time, but in the Additional Parameters text entry box of the Tester launch dialog, enter -s followed by the seed value (e.g., -s 3670373.596).

Reactis can also automatically save all relevant Tester parameters (including the random seed) to a file when generating a test suite. To enable this feature, select File > Settings > General in Reactis and make sure the When creating a test suite, also create a parameters file (extension ‘.rtp’) option is enabled. To use the .rtp files generated by this feature, click the Load button in the Tester launch dialog and select the .rtp file created by an earlier Tester run.

Why do test suites with the same coverage level have different numbers of steps?#

This is due to two factors. First, Tester employs a random number generator to select some inputs. Second, Tester includes a “pruning” phase during which test steps that do not improve coverage are eliminated. So even though two test suites achieve the same level of coverage, the steps they contain, and hence their lengths, can be quite different.
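
As a purely illustrative sketch (this is not the Reactis test-generation algorithm; the coverage model and numbers are invented), the following C program keeps a randomly generated step only when it exercises a coverage target that is not yet covered. Runs with different seeds end up with the same full coverage but retain different numbers of steps:

    /* Toy illustration: random input selection plus pruning of steps that
     * add no new coverage yields suites with equal coverage but different
     * lengths. */
    #include <stdio.h>
    #include <stdlib.h>

    #define NUM_TARGETS 16   /* invented number of coverage targets     */
    #define NUM_STEPS   200  /* invented number of candidate test steps */

    /* Generate NUM_STEPS random steps and keep only those that cover at
     * least one previously uncovered target ("pruning" the rest). */
    static int count_kept_steps(unsigned seed)
    {
        int covered[NUM_TARGETS] = { 0 };
        int kept = 0;
        srand(seed);
        for (int step = 0; step < NUM_STEPS; step++) {
            int adds_new = 0;
            /* Pretend each step happens to exercise three random targets. */
            for (int k = 0; k < 3; k++) {
                int t = rand() % NUM_TARGETS;
                if (!covered[t]) { covered[t] = 1; adds_new = 1; }
            }
            if (adds_new) kept++;   /* otherwise the step is pruned */
        }
        return kept;
    }

    int main(void)
    {
        /* Different seeds: same (full) coverage, different step counts. */
        for (unsigned seed = 1; seed <= 3; seed++)
            printf("seed %u: %d steps kept\n", seed, count_kept_steps(seed));
        return 0;
    }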