Chapter 3 Getting Started with Reactis
This chapter provides a quick overview of Reactis. It contains a brief description of each main component of the tool suite in the form of an extended “guided tour” of its features. Each “stop” in the tour is indicated by a §. The tour uses as a running example a simple Simulink / Stateflow model of an automobile cruise control that is included in the Reactis distribution.
3.1 A Note on Model Preparation
The cruise-control example included with the distribution does not require any special processing before you run Reactis on it; you may load the file immediately and start the guided tour. However, there is an important step that you should undertake when you are preparing models for use with Reactis. This section describes this preparatory step and discusses the Simulink operations needed to perform it.
Reactis supports a large portion of the discrete-time subset of Simulink and Stateflow. As it processes a Simulink / Stateflow model, it also interacts with MATLAB in order to evaluate MATLAB expressions, including references to workspace data items, that a model may contain.
For Reactis to process a model, it must be able to automatically set up the MATLAB workspace data used by the model. For this reason, any workspace data items that a model uses must be initialized within one of the following locations (for more details see Section 14.1.1):
Below we describe how the workspace initialization was established in the cruise-control model file included in the Reactis release. In that example the MATLAB file cruise_constants.m defines two workspace variables that are used in the Simulink / Stateflow model file cruise.slx. The cruise_constants.m file was attached to cruise.slx using the following steps.
This saved cruise.slx file is distributed with Reactis, so you do not need to perform the above steps yourself in order to load and process the file in Reactis; they are given only for illustration.
3.2 Reactis Top Level
The Reactis top-level window contains menus and a tool bar for launching and controlling verification and validation activities. Reactis is invoked as follows:
You now see a Reactis window like the one shown in Figure 3.1. A model may be selected for analysis as follows:
Loading the model causes the top-level window to change as shown in Figure 3.2. The panel to the right shows the top-level Simulink diagram of the model; the panel to the left shows the model hierarchy. In addition, the title bar now reports the model currently loaded, namely cruise.slx, and the Reactis info file cruise.rsi, which contains auxiliary model information used by Reactis (info files have the .rsi file-name extension). Info files are explained in more detail in the next section (Section 3.3).
It is also worth noting that if during installation you chose to associate the .rsi file extension with Reactis, then you can start Reactis and open cruise.slx in a single step by opening cruise.rsi in Windows Explorer.
A node in the hierarchy panel may be tagged with a plus sign (+) to indicate the node contains subsystems not currently displayed. The hierarchy may be expanded or collapsed by clicking the + or - tags. Subsystems in the hierarchy panel are also tagged with icons indicating whether they are Simulink (SL) or Stateflow (SF) subsystems, Validator objectives, virtual sources, or configuration variables. Validator objectives, virtual sources, and configuration variables are discussed in more detail later. Additionally, if you are using the Reactis for C Plugin (see Chapter 16 for details), then you may also see C source files (C), C libraries (LIB), and S-Functions (FN) in the hierarchy panel.
Reactis provides a signal-tracing mechanism which allows the path taken by a signal to be quickly identified. To trace a signal, left-click on any part of the signal line.
As shown in Figure 3.3, a signal is highlighted in yellow when left-clicked. The route of the highlighted signal can then be easily identified. To turn off the highlighting, left-click on empty space in the main window.
3.3 The Info File Editor
Reactis does not modify the .slx files for a model. Instead the tool stores model-specific information that it requires in a .rsi file. The primary way for you to view and edit the data in these files is via the Reactis Info File Editor, which is described briefly in this section and in more detail in Chapter 5.
The next stop in the guided tour explains how this editor may be invoked on
This starts the Reactis Info File Editor, as shown in Figure 3.4. Note that the contents of the .rsi file may only be modified when Simulator is disabled. When Simulator is running, the Info File Editor operates in read-only mode as indicated by “[read only]” in the editor window’s title bar.
.rsi files contain directives partitioned among the following panes in the Info File Editor.
The types that may be specified are the base Simulink / Stateflow types extended with notations to define ranges, subsets and resolutions, and to constrain the allowable changes in value from one simulation step to the next. More precisely, acceptable types can have the following basic forms.
Table 3.1 gives examples of types and the values they contain. For vector, matrix or bus inports, the above types can be specified for each element independently.
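As a conceptual illustration only (this is not Reactis's actual type notation, which Table 3.1 documents), membership in a range type with a resolution, and a per-step change constraint, might be checked like this:

```python
def in_range_type(v, lo, hi, res=None, tol=1e-9):
    """Hypothetical check: v lies in [lo, hi] and, if a resolution res is
    given, on the grid lo, lo+res, lo+2*res, ... (within a float tolerance)."""
    if not (lo <= v <= hi):
        return False
    if res is not None:
        k = round((v - lo) / res)          # nearest grid point
        return abs((lo + k * res) - v) <= tol
    return True

def step_ok(prev, cur, max_delta):
    """Hypothetical per-step change constraint: the value may move by at
    most max_delta between consecutive simulation steps."""
    return abs(cur - prev) <= max_delta
```

For example, `in_range_type(2.5, 0, 10, res=0.5)` holds while `in_range_type(2.3, 0, 10, res=0.5)` does not, since 2.3 is off the 0.5 grid.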
If any change is made to a port type, then “[modified]” appears in the title bars of the Info File Editor and the top-level Reactis window. You may save changes to disk by selecting File -> Save from the Info File Editor or File -> Save Info File from the top-level window.
If no .rsi file exists for a model, Reactis will create a default file the first time you open the Info File Editor, or start Simulator or Tester. The default type for each inport is the base type inferred for the port from the model. If you add an inport to or remove one from your model, you can synchronize your .rsi file with the new version of the model by selecting Tools -> Synchronize from the Info File Editor.
3.4 Reactis Simulator
Reactis Simulator provides a rich array of facilities for viewing the
execution of models. To continue with the guided tour:
This causes a number of the tool-bar buttons that were previously disabled to become enabled.
Simulator performs simulations in a step-by-step manner: at each simulation step inputs are generated for each top-level inport, and resultant outputs reported on each outport. You can control how Simulator computes top-level inport values using two different mechanisms:
To set the input source for an inport not controlled by a virtual source, use the Source-of-Inputs dialog in the tool bar (see Figure 3.5). The next part of the guided tour illustrates the use of each of these input sources; interspersed with this discussion are asides on coverage tracking, data-value tracking, and other useful features.
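Abstractly, the step-by-step simulation loop just described can be sketched as follows. The model and input source below are hypothetical stand-ins for illustration, not Reactis APIs:

```python
import random

def simulate(model, input_source, n_steps):
    """Sketch of the Simulator loop: at each step, obtain a value for every
    top-level inport from the configured source, run the model one step,
    and record the resulting outport values."""
    trace = []
    for _ in range(n_steps):
        inputs = input_source()      # random, drawn from a test, or user-supplied
        outputs = model(inputs)      # one synchronous step of the model
        trace.append((inputs, outputs))
    return trace

# Hypothetical stand-ins: a one-inport "model" and a random input source.
toy_model = lambda ins: {"out": ins["in"] * 2}
random_source = lambda: {"in": random.randint(0, 10)}
```

Swapping `random_source` for a function that replays stored values, or one that prompts the user, corresponds to the three input mechanisms described below.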
3.4.1 Generating Random Inputs
As the random input source is the default, no action needs to be taken
to set this input mode.
During simulation, blocks, signals, states, and transitions in the diagram are highlighted in green as they are entered and executed. The simulation stops automatically when the number of simulation steps reaches the number contained in the entry box to the left of the Source-of-Inputs dialog. Before then, you may pause the simulation by clicking the Run/Pause Simulation button a second time. Note that simulation will likely pause in the middle of a simulation step. You may then click:
3.4.2 Tracking Model Coverage
While Simulator is running you may also track
coverage information regarding which parts of your model
have been executed and which have not.
These coverage-tracking features work for all input-source modes.
The next portion of the guided tour illustrates how these features are used.
A dialog summarizing coverage (which blocks/states have been entered and which transitions have fired) now appears, with elements of the diagram not yet exercised drawn in red, as shown in Figure 3.6. Note that poor coverage is not uncommon with random simulation. You may hover over a covered element to determine (1) the test that first exercised the item (in the case considered here, the “test” is the current simulation run) and (2) the step within that test at which this happened. This test and step coverage information is displayed with a message of the form
The Coverage Details dialog, shown in Figure 3.7, has two tabs: Decision and MCC. The Decision tab displays details for decision coverage, condition coverage, and MC/DC. The table in this figure gives information for the decision:
This decision contains two conditions:
Conditions are the atomic boolean expressions that are used in decisions. The
first two columns of the table list the test/step information for when the
decision first evaluated to true and when it first evaluated to false. A value
MC/DC Coverage requires that each condition independently affect the
outcome of the decision in which it resides. When a condition has
met the MC/DC criterion in a set of tests, the sixth and seventh
columns of the table explain how. Each element of these two columns
has the form bb:test/step, where each b reports the
outcome of evaluating one of the conditions in the decision during the
test and step specified. Each b is either
The MCC tab of the Coverage Details dialog displays details of multiple condition coverage (MCC), which tracks whether all possible combinations of condition outcomes for a decision have been exercised. The table includes a column for each condition in the decision. The column header is the condition, and each subsequent row contains an outcome for the condition: True, False, or x (indicating the condition was not evaluated due to short-circuiting). Each row also contains the outcome of the decision (True or False) and, when covered, the test and step during which the combination was first exercised.
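To make the MCC and MC/DC bookkeeping concrete, the following sketch (an illustration only, not Reactis code) enumerates the MCC rows for a hypothetical two-condition decision `A or B` under short-circuit evaluation, then finds the row pairs demonstrating MC/DC independence for each condition:

```python
def mcc_rows_or():
    """MCC rows for the decision `A or B` with short-circuiting: when A is
    true, B is never evaluated, which the table marks with 'x'.
    Each row is (A outcome, B outcome, decision outcome)."""
    rows = [("T", "x", True)]               # A true: B short-circuited
    for B in (True, False):                 # A false: B decides the outcome
        rows.append(("F", "T" if B else "F", B))
    return rows

def independence_pairs(rows, idx):
    """Row pairs showing the condition at position idx independently affects
    the decision: the outcomes differ, that condition flips, and every other
    evaluated condition agrees (an 'x' entry matches anything, being masked)."""
    def agree(a, b):
        return a == "x" or b == "x" or a == b
    pairs = []
    for i, r1 in enumerate(rows):
        for r2 in rows[i + 1:]:
            if (r1[-1] != r2[-1]
                    and r1[idx] != "x" and r2[idx] != "x"
                    and r1[idx] != r2[idx]
                    and all(agree(r1[j], r2[j])
                            for j in range(len(r1) - 1) if j != idx)):
                pairs.append((r1, r2))
    return pairs
```

Here condition B is shown independent by the pair (F,T → True) / (F,F → False), while for A the masked `x` entry for B matches any value, mirroring how short-circuited conditions are treated in the table.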
It should be noted that the previous scenario relied on randomly generated input data, so replaying the steps outlined above will yield coverage information different from that depicted in Figure 3.7.
An alternative way to query coverage information is to invoke the Coverage-Report Browser by selecting Coverage -> Show Report. This is a tool for viewing or exporting coverage information that explains which model elements have been covered along with the test and step where they were first exercised.
A simulation run, and its associated coverage statistics, may be reset by clicking the Reset to First Step button in the tool bar. Note that in the Coverage Summary window, not all percentages will be zero, since some items are covered in the model’s initial configuration.
3.4.3 Reading Inputs from Tests
Simulation inputs may also be drawn from tests in a Reactis test
suite. Such a test suite may be generated automatically by Reactis
Tester, constructed manually in Reactis Simulator, or
imported from a file storing test data in comma-separated-value format. By convention, files storing Reactis test suites have a .rst file-name extension.
A Reactis test suite may be loaded into Simulator as follows.
This causes cruise.rst, the name of a test suite file generated
by Reactis Tester, to appear in the title bar and the contents of the
Source-of-Inputs dialog to change; it now contains a
list of tests that have been loaded. To view this list:
Each test in the suite has a row in the dialog that contains a test
number, a sequence number, a name, and the number of steps in the test.
Clicking the “all” button in the lower left corner specifies that
all tests in the suite should be executed one after another. To
execute the longest test in the suite:
If you look at the bottom-right corner of the window, you can see that the test is being executed (or has completed), although the results of each execution step are not displayed graphically. When the test execution completes, the exercised parts of the model are drawn in black. If the Run Simulation button is clicked instead, then the results of each simulation step are rendered graphically, with the consequence that simulation proceeds more slowly.
You may specify that a subset of tests should be run by holding down the control key and clicking on each test to be run with the left mouse button. The tests will be run in the order they are selected. As tests are selected the sequence number column is updated to indicate the execution order of the tests.
You may also use the Source-of-Inputs dialog to change the name of a test. To do so, select the test by single clicking on it, then click on the name and, when the cursor appears, type in a new name and press return.
Whenever tests are executed in Simulator, the value computed by the model for each top-level outport and test point is compared against the corresponding value stored in the test suite. The tolerance for this comparison can be configured in the Info File Editor. An HTML report listing any differences (as well as any runtime errors encountered) can be generated by loading a test suite in Simulator and then selecting Simulate -> Fast Run With Report.
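Conceptually, the comparison works like the following sketch. This is an illustration with a simple absolute tolerance; the actual tolerance semantics Reactis applies are those configured in the Info File Editor:

```python
def outputs_match(actual, expected, tol=0.0):
    """Compare one computed output value against the stored value,
    within an absolute tolerance (a simplifying assumption here)."""
    return abs(actual - expected) <= tol

def diff_report(step_values, suite_values, tol=1e-6):
    """Collect (step, actual, expected) triples for every mismatch,
    roughly what a differences report would list."""
    return [(i, a, e)
            for i, (a, e) in enumerate(zip(step_values, suite_values))
            if not outputs_match(a, e, tol)]
```

For instance, comparing `[1.0, 2.0, 3.05]` against stored values `[1.0, 2.0, 3.0]` with a tolerance of 0.01 flags only step 2.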
3.4.4 Tracking Values of Data Items
When Simulator is paused, you may
view the current value of a given data item (Simulink block
or signal line, Stateflow variable, or C variable) by hovering over
the item with your mouse cursor.
You may also select data items whose values you wish to track during simulation using the watched-variable and scope facilities of Simulator.
The bottom of the Simulator window now changes to that indicated in Figure 3.8.
The watched-variable panel shows the values of watched data items in the current simulation step, as does hovering over a data item with the mouse. Variables may be added to, and deleted from, the watched-variable panel by selecting them and right-clicking to obtain a menu. You may also toggle whether the watched-variable list is displayed or not by selecting the View -> Show Watched Variables menu item.
Scopes display the values a given data item has assumed
since the beginning of the current simulation run. To open a scope:
Scopes also have a zoom feature which is particularly useful for viewing the details of long signals. The zoom feature uses the mouse to select an initial region of interest, which can be subsequently repositioned using the mouse. See Section 7.5.3 for more details on scopes.
3.4.5 Querying the User for Inputs
The third way for Simulator to obtain values for inports is for you to provide them. To enter this mode of operation:
Upon selecting this input mode, a Next Input Values dialog appears, as shown in Figure 3.10, that allows you to specify the input values for the next simulation step. Each top-level inport of the model has a row in the dialog containing five columns; these determine the next input value for the corresponding inport as follows.
When Run Simulation or Run Fast Simulation is selected, the inport values specified are used for each subsequent simulation step until the simulation is paused.
3.4.6 Controlling Inputs with Virtual Sources
Previous sections described three ways to control the model inputs during simulation: random generation, reading values from tests, and querying the user. Virtual sources offer an alternative mechanism to control one or more inputs with a Simulink subsystem (outputs of the virtual source feed into inputs of the model). Signal Builder blocks are especially convenient for implementing virtual sources. Virtual sources are created using Simulink and saved in a separate library. They are then wired into your model within Reactis, leaving your model unmodified by the testing instrumentation.
After selecting the new .rsi file, two new items will appear at the top level of the model (to the left): a virtual source ActiveCheckScenario and an assertion ActiveCheck, as shown in Figure 3.11. The virtual source is implemented with a Signal Builder block that specifies sequences of values for the first six inputs of the model and a sequence of expected responses to this scenario for the active output of the cruise control. The assertion then compares the expected value of this output to the actual value computed by the model.
The following steps show how to enable and disable virtual sources:
Perform the following to observe how a virtual source controls inports
3.4.7 Other Simulator Features
Simulator has several other noteworthy features. You may step both forward and backward through a simulation using toolbar buttons:
You may specify the number of steps taken when the step buttons are pressed by adjusting the number in the adjacent text-entry box. When a simulation is paused at the end of a simulation step (as opposed to in the middle of one), the current simulation run may be added to the current test suite by selecting the menu item Test Suite -> Add/Extend Test. After the test is added it appears in the Source-of-Inputs dialog. After saving the test suite with Test Suite -> Save, the steps in the new test may be viewed by selecting Test Suite -> Browse. A model (or portion thereof, including coverage information) may be printed by selecting File -> Print....
A breakpoint symbol is drawn on a model item when a breakpoint is set. During a simulation run, whenever a breakpoint is hit, Simulator pauses immediately.
3.5 Reactis Tester
Tester may be used to generate a test suite (a set of tests) automatically from a Simulink / Stateflow model as shown in Figure 3.13. The tool identifies coverage targets in the model and aims to maximize the number of targets exercised by the generated tests.
To start Tester:
This causes the window shown in Figure 3.14 to appear. Tester employs a three-phase algorithm to generate test suites. In the Preload Phase, a set of previously constructed test suites may be loaded; Tester then attempts to extend the pre-loaded suite(s). In the Random Phase, Tester uses a fast random search strategy to generate test cases. In the Targeted Phase, the tool uses sophisticated algorithms to build tests that exercise previously uncovered parts of the model.
The Tester launch dialog consists of several sections. The first section (Preload Files) lists previously-generated test suites to be pre-loaded. The second section (Run for) determines how long Tester should run. The third section (Coverage Objectives) lists the metrics which Tester will focus on during the targeted testing phase. In the fourth section, you specify the name of the output file in which Tester will store the new test suite (Output File).
There are three options provided in the Run for section. The default option, as shown in Figure 3.14, is to run Tester for 20,000 steps, with Tester choosing how many of these steps will be spent during random and targeted testing. Alternatively, you may choose to run Tester for a fixed length of time by clicking on the top radio button in the Run for section, after which the steps entry box will be disabled and the hours and minutes entry boxes will be enabled. These values are added together to determine the total length of time for which Tester will run, so that entering a value of 1 for hours and 30 for minutes will cause Tester to run for 90 minutes.
The bottom radio button in the Run for section lets you specify how many steps Tester spends on random testing and targeted testing. When selected, there are three parameters which can be edited. The first two (tests in random phase and steps per random test) determine the number of tests and steps per test to be generated in the random phase. It should be noted that, upon completion of the random phase, redundant steps are pruned from the tests, so the lengths of the final tests are usually shorter than the value entered for steps per random test. Note that the random phase is far less optimized than the targeted phase; therefore, the random phase should be allocated no more than a few thousand steps, leaving the vast majority of test-generation time for the targeted phase. The third parameter (steps in targeted phase) specifies the total number of execution steps which Tester will take during the targeted testing phase.
The Coverage Objectives section contains check boxes which are used to select the kinds of targets Tester will focus on during the targeted testing phase. Chapter 6 describes the different types of coverage tracked by Reactis.
To generate a test suite in the guided tour, retain the default
The Tester progress dialog, shown in Figure 3.15, is displayed during test-suite generation. When Tester terminates, a results dialog is shown, and a file cruise.rst containing the suite is produced. The results dialog includes buttons for loading the test suite into the Test-Suite Browser (see below), Reactis Simulator, or a coverage viewer.
3.6 The Test-Suite Browser
The Test-Suite Browser is used to view the test suites created by Reactis. It may be invoked from either the Tester results dialog or the Reactis top-level window.
A Test-Suite Browser window like the one shown in Figure 3.16 is
then displayed. The Test Data tab of the browser displays
the test selected in the button/dialog located near the center of the
browser’s tool bar. The main panel in the browser window shows the
indicated test as a matrix, in which the first column gives the names of
input and output ports of the model and each subsequent column lists the
values for each port for the corresponding simulation step. The simulation
time is displayed in the output row labeled “
The Filter entry box on the right side of the toolbar lets you search
for test steps that satisfy a given condition. You enter a boolean expression
in the search box and then select the filter check box to search for test steps
in the suite for which the expression evaluates to true. For example, to see
each step where the cruise control is active enter “
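Conceptually, the filter behaves like the following sketch; the port names and step values here are hypothetical, and a real filter expression would be written in the browser's own syntax:

```python
def filter_steps(steps, predicate):
    """Sketch of the Filter feature: return the indices of the test steps
    for which a boolean expression over the step's port values is true."""
    return [i for i, step in enumerate(steps) if predicate(step)]

# Hypothetical test: each step maps port names to values.
steps = [
    {"active": 0, "speed": 0},
    {"active": 1, "speed": 55},
    {"active": 1, "speed": 57},
]
active_steps = filter_steps(steps, lambda s: s["active"] == 1)
```

With this data, the filter selects steps 1 and 2, the steps where the cruise control is active.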
The Test-Suite Browser may also be used to display the entire set of values
passing over a port during a single test or set of tests.
A dialog similar to that shown in Figure 3.17 appears. In the figure, each value assumed by the inport drag is represented by a yellow dot.
3.7 Reactis Validator
Reactis Validator is used for checking whether models behave correctly. It enables the early detection of design errors and inconsistencies and reduces the effort required for design reviews. The tool gives you three major capabilities.
Conceptually, assertions may be seen as checking system behavior for potential errors, user-defined targets monitor system behavior in order to check for the presence of certain desirable executions (tests), and virtual sources generate sequences of values to be fed into model inports. Syntactically, Validator assertions, user-defined targets, and virtual sources have the same form and are collectively referred to as Validator objectives.
Validator objectives play key roles in checking a model against requirements given for its behavior. A typical requirements specification consists of a list of mandatory properties. For example, a requirements document for an automotive cruise control might stipulate that, among other things, whenever the brake pedal is depressed, the cruise control should disengage. Checking whether or not such a requirement holds of a model would require a method for monitoring system behavior to check this property, together with a collection of tests that would, among other things, engage the cruise control and then apply the brake. In Validator, the behavior monitor would be implemented as an assertion, while the test scenario involving engaging the cruise control and then stepping on the brake would be captured as a user-defined target.
3.7.1 Manipulating Validator Objectives
This section gives more information about Validator objectives.
After performing these operations, you now see a window like the one depicted in Figure 3.18. The three kinds of objectives are represented by different icons: assertions are denoted by a lightning bolt, targets by a cross-hair symbol, and virtual sources by an icon of their own.
Validator objectives may take one of two forms.
To see an example expression objective, do the following.
A dialog like that shown in Figure 3.19 now appears. This dialog shows an expression intended to capture the following requirement for the cruise control: “Activating the brake shall immediately cause the cruise control to become inactive.” Note that the dialog consists of five parts:
To see an example diagram objective:
A dialog like the one in Figure 3.20 now appears. This dialog contains five sections.
Wiring information may be viewed by hovering over a diagram objective in the main Reactis model-viewer panel.
Diagram-based objectives may be viewed as monitors that read values from the model being observed and raise flags by setting their outport values appropriately (zero for false, non-zero for true). A diagram-based assertion in essence defines one “check” for each of its outports, with such an outport check being violated if it ever assumes the value zero. Similarly, a diagram-based target objective in essence defines one “target” for each outport; such a target is deemed covered if it becomes non-zero.
To view the diagram associated with SpdCheck:
These operations display the Stateflow diagram shown in Figure 3.21. This diagram encodes the following requirement: “While it is engaged, the cruise control shall not allow the desired and actual speeds to differ by more than 1 mile per hour for more than three time units.”
To understand how this diagram captures this requirement, note that the SpdCheck diagram has two top-level states, one corresponding to when the cruise control is active and one when it is inactive. The active state has three child states:
Note that the transition action executed as state Error is entered sets the outport ok to 0. This is how an assertion violation is flagged.
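In imperative form, one plausible encoding of the same requirement (an illustrative sketch, not the actual Stateflow diagram) looks like this:

```python
def spd_check(trace, max_diff=1.0, max_steps=3):
    """Monitor for the SpdCheck requirement: while the cruise control is
    active, the desired and actual speeds must not differ by more than
    max_diff for more than max_steps consecutive time units.  Returns the
    ok flag (1/0) for each step; 0 flags a violation, mirroring how the
    Stateflow assertion sets its ok outport to 0 on entering Error."""
    ok_flags = []
    over = 0                       # consecutive steps with |diff| > max_diff
    for active, desired, actual in trace:
        if active and abs(desired - actual) > max_diff:
            over += 1              # corresponds to advancing toward Error
        else:
            over = 0               # back in range (or inactive): reset
        ok_flags.append(0 if over > max_steps else 1)
    return ok_flags
```

Feeding four consecutive active steps with a 2 mph discrepancy drives the counter past three and drops the ok flag to 0 on the fourth step, just as the Error state would be entered in the diagram.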
Diagram objectives give you the full power of Simulink / Stateflow to formulate assertions and targets. The objectives may use any Simulink blocks supported by Reactis, including full Stateflow. The diagrams are created using Simulink and Stateflow in the same way standard models are built; they are stored in a separate .slx file from the model under validation.
Diagram wiring is managed by Reactis, so the model under validation need not be modified at all. As this information is stored in the .rsi file associated with the model, it also persists from one Reactis session to the next. After adding a diagram objective to a model, the diagram is included in the model’s hierarchy tree, just as library links are. See Chapter 9 for more details on using Reactis Validator.
3.7.2 Launching Validator
To use Validator to check for assertion violations:
A dialog like the one in Figure 3.22 now appears. The dialog is similar to the Tester launch screen in Figure 3.14 because the algorithms underlying the two tools are very similar. Conceptually, Validator works by generating thorough test suites from the instrumented model using the Tester test-generation algorithm and then applying these tests to the instrumented model to see if any violations occur. Note that when a model is instrumented with Validator objectives, the test-generation algorithm uses the objectives to direct the test construction process. In other words, Reactis actively attempts to find tests that violate assertions and cover user-defined targets. Validator stores the test suite it creates in the file specified in this dialog. The tests may then be used to diagnose any assertion violations that arise.