Reactis User's Guide

Chapter 7   Reactis Simulator

Simulator provides an array of features for simulating Simulink / Stateflow models (and C code if using the Reactis for C Plugin), including single and multi-step forward and backward execution, breakpoints, and simulations driven by Tester-generated tests or user inputs. The tool also allows visual tracking of coverage data and of the values that data items in the model assume during simulation.


Figure 7.1: The Reactis Simulator toolbar controls.
images/simToolbarD_web.png

Figure 7.1 contains an annotated screen shot of a portion of the top-level Reactis window when Simulator is enabled. Some of the buttons and pull-down menus on the leftmost part of the window have been elided; Chapter 4 contains descriptions of these items. The next section describes the labeled items in Figure 7.1, while the section following discusses the Simulator-related menu entries. The subsequent sections discuss the different modes for generating inputs during simulation, ways to track data values, how to monitor model coverage, importing and exporting test suites, and the different model highlighting styles used by Simulator.

7.1  Labeled Window Items

  1. Disable Reactis Simulator.
  2. Enable Reactis Simulator.
  3. Reset the simulation; the model is returned to the start state, and coverage information is appropriately reset.
  4. Take n steps back, where n is specified by window item 10. Coverage information is updated appropriately upon completion of the last backward step.
  5. Take one step back. Coverage information is updated appropriately.
  6. Advance by one mini-step. In Simulink, a mini-step evaluates the next block in the evaluation order. In Stateflow, a mini-step evaluates the next transition segment condition or transition action. In C, a mini-step executes a single C statement, stepping into a function when at a function call.
  7. Advance forward by one full step; that is, values are read on the harness inports, the model’s response is computed and values are written to the harness outports. If a step has been partially computed using the step-into button (window item 6), then execution picks up with the current partially computed step and continues until the end of the step, at which point values are written to the harness outports.
  8. Execute n slow simulation steps, where n is specified by window item 10. The diagram in the main panel (window item 17) is updated during simulation to reflect the currently active Simulink block, Stateflow state / transition, or C statement. When Coverage > Show Details is selected, coverage targets will change from red to black as they are covered during the simulation run. If a single test is selected and the end of the test is reached before n steps execute, then simulation stops. When executing multiple tests (including an entire suite) the tests are executed one after another and simulation stops when n steps have executed or the end of the test suite is reached.

    When a slow simulation is running, clicking this button pauses the simulation.

  9. Execute n fast simulation steps, where n is specified by window item 10. The diagram in the main panel (window item 17) is not updated while the simulation is in progress but is updated when simulation halts. If the end of the current test or test suite is reached then simulation halts.

    When a fast simulation is running, clicking this button pauses the simulation.

  10. This window item determines how many steps are taken when the buttons corresponding to window items 4, 8, or 9 are clicked. When the Source-of-Inputs dialog (window item 11) is set to a test or test suite, the number of steps may be set to 0 to indicate that the entire test or test suite should be executed.
  11. The Source-of-Inputs dialog determines what values are fed into inports to drive a simulation: random values, values from the user, or values from a test. See Section 7.4 for details.
  12. Create a new, empty test suite. The name of the .rst file containing the suite is initially “unnamed.rst” and is displayed in the title bar of the Reactis window.
  13. Open a dialog for selecting a test suite (.rst file) to be loaded into Simulator. After it is loaded, the test suite’s name is displayed in the title bar, and the tests are listed in the Source-of-Inputs dialog (window item 11).
  14. Save the current test suite.
  15. View Reactis Simulator help.
  16. The model hierarchy panel (not shown explicitly) supports structure-based navigation of a model, as described in Section 4.1. Right-clicking on an item in the navigation panel brings up a menu that also allows you to view data items and set breakpoints. Data viewing is covered in more detail in Section 7.5. Breakpoints may be set by right-clicking on a subsystem or Stateflow state in the hierarchy panel and selecting Toggle Breakpoint. The name in the hierarchy panel is then decorated with a “stop sign” icon ( images/stopSignBtn_web.png ). When a subsystem breakpoint is set, simulation pauses whenever an item in the subsystem (Simulink block or Stateflow transition) executes. When a Stateflow state breakpoint is set, simulation pauses whenever the state is entered or exited.
  17. The main panel displays the currently selected Simulink or Stateflow diagram, C code if you are using the Reactis for C Plugin, or Embedded MATLAB code if you are using the Reactis for EML Plugin. You may interact with the diagram in a number of ways using the mouse, including hovering over model items, double-clicking on items, or right-clicking in various parts of the panel. Section 4.1 describes how you interact with the main panel when Simulator is disabled. The following mouse operations are available when Simulator is enabled:
    Hovering...

    • over a data item (Simulink block or signal line, Stateflow variable, or C variable) will display its current value and type.
    • over a Goto block will cause it and its associated From block(s) to be highlighted in yellow.
    • over a From block will cause it and its associated Goto block to be highlighted in yellow.
    • over any Tester coverage target will display the test and step within the test during which the target was first executed. This information is presented in a message of the form “Covered: test/step”. A “.” in the test location indicates the current simulation run. “-/-” indicates the target has not yet been covered. For more details on querying coverage information see Section 7.6, Chapter 10, and Chapter 6.
    • over a Validator objective will cause its wiring information to be drawn in blue.
    • over an output of a Logical Operator block that roots a multi-block decision coverage group will cause blue lines to be drawn to the conditions of the group (when multi-block decision coverage is enabled). See Section 6.3.1 for a description of Multi-block decision coverage.
    Double-Clicking...

    • on a Scope block will open a scope window for that block (see Section 7.5.3).
    • on a Display block will add that block as a watched variable (see Section 7.5.2).
    • on a From or Goto block will open a dialog listing all other From or Goto blocks in the model associated with the block.
    • on a Data Store Memory, Data Store Read, or Data Store Write block will open a dialog listing all matching Data Store blocks.
    • on a Simulink subsystem will cause the subsystem diagram to be displayed in the main panel.
    • on a Stateflow state will cause the state’s diagram to be displayed in the main panel.
    • on a harness input port while running in user-guided simulation mode will bring up a panel to modify that inport’s current input value.
    • on a configuration variable in the Configuration Variable Panel (see Section 4.4) while Simulator is in the initial state (no simulation steps have been taken) will bring up a panel to modify the variable’s current value.
    • on a line number within a C source file will toggle a breakpoint on that line.
    • on any other Simulink block will display the block’s parameters.
    Right Clicking...

    Causes different pop-up menus to be displayed. The contents of the menus vary based on where the click occurs and whether or not Simulator is enabled. A summary of the menu items available when Simulator is enabled follows. For descriptions of the menu entries available when Simulator is disabled, see Section 4.1.
    Right-Click Location
    Menu Entries (when Simulator is enabled)
    Simulink signals, Simulink blocks, Stateflow variables
    Inspect value
    Inspect the value of a bus (see Section 7.5.1).
    Add To Watched
    Add item to watched variables list (see Section 7.5.2).
    Open Scope
    Display item in scope (see Section 7.5.3).
    Open Distribution Scope
    Display item in distribution scope (see Section 7.5.4).
    Add To Scope
    Add item to previously opened scope. This item only appears when other scopes are open.
    User defined target or assertion
    Run to Violation
    If an assertion is violated, this menu entry becomes enabled. Selecting it directs Reactis to switch to the test that caused the violation and run to the step where the violation occurs.
    View Properties
    View assertion, user defined target, or virtual source properties in read-only mode.
    Logical Operator block, Lookup Table, harness Inport, non-else outport of If block, Stateflow transition segment, or Stateflow state
    View Coverage Details
    Display a dialog containing detailed coverage information for the item (see Sections 6.1.3, 6.3.1, 6.3.2, 6.2, and 7.6.2).
    Simulink blocks
    View Block Parameters
    Display Simulink block parameters, mask parameters, mask initialization.
    Logical Operator block, Branching block, Lookup table, harness Inport, Stateflow transition segment, Stateflow state, or a test/step within a Coverage Details dialog
    Track Coverage
    Open a dialog from which the target can be excluded from coverage (with or without assertion), or included in coverage (see Section 6.5.3).

    Configuration variable in Configuration Variable Panel (see Section 4.4)

    Change Value
    Modify the current value of the selected configuration variable. Note that configuration variables may only be updated between tests (not during a test).
    Edit Type
    Open the Type Editor to view the type of the selected inport/configuration variable. Note that the type cannot be changed while Simulator is enabled.
    Harness outport
    Open Difference Scope
    This menu item is enabled when a test suite is loaded. The feature is used when differences exist between the value stored in the test for the outport and the value computed by the model for the outport. The resulting scope plots the expected value (from the test) against the actual value (from the model) as shown in Figure 7.13.
    Edit Properties
    Open the Info File Editor to view or change the output tolerance and interval targets for the selected outport. Note that outport properties cannot be changed while Simulator is enabled.
    Non-virtual Simulink block or Stateflow transition
    Toggle Breakpoint
    Enable or disable breakpoint for an item.
    Simulink subsystem
    Extract Subsystem
    Extract a subsystem and save it in a separate model file (see Section 4.5).


    Left-Clicking on Signals...

    Causes the signal wire to be highlighted in yellow. The highlighting travels in both directions: back to the signal's block source and forward to one or more block destinations. To make it easy to identify the relevant signal path, the subsystems the signal penetrates are highlighted as well. The signal highlighting travels through virtual blocks such as Subsystems, Froms, Gotos, Inports, Outports, Data Store Reads, Data Store Writes, Bus Creators, and Bus Selectors. To continue tracing a signal through a block, click on the wire on the non-highlighted side of the block. To remove signal highlighting, left-click in empty space.

    The following five labeled window items are only available in the toolbar if you are using the Reactis for C Plugin to step through C code.

  18. Step backward to the point just before the current function was called.
  19. Step backward one statement. Any function calls performed by the statement are stepped over.
  20. Step backward one statement. If the statement performed any function calls, stop at the end of the last function which was called.
  21. Step forward one statement. Any function calls performed by the statement are stepped over.
  22. Advance until the currently executing function returns.

7.2  Menus

Except for the documented exceptions related to editing .rsi files, the menus described in Section 4.2 work in the same manner when Simulator is enabled. The following additional menu items are also active when Simulator is enabled.

View menu.
The following entries become enabled when Simulator is “on”.
Show Watched Variables.
Toggle whether or not the watched-variable list is displayed. The default is not to show the list; adding an item to the list automatically causes it to be displayed.
Add Watched Variables...
Add data items (Simulink blocks or signal lines, Stateflow variables, or C variables) to the watched-variable list. Selecting this entry brings up a list of data items. You can toggle whether or not an entry in the list is selected by control-left-clicking on it; clicking OK causes the selected items to be added to the watch list.
Clear Watched Variables.
Remove all items from the watch list.
Open Scopes...
Open scopes for data items (Simulink blocks or signal lines, Stateflow variables, or C variables). Selecting this entry brings up a list of items. You can toggle whether or not an entry in the list is selected by control-left-clicking on the variable; clicking OK causes scopes to be opened for each selected item.
Open Distribution Scopes...
Open distribution scopes for data items (Simulink blocks or signal lines, Stateflow variables, or C variables). Selecting this entry brings up a list of items. You can toggle whether or not an entry in the list is selected by control-left-clicking on the variable; clicking OK causes distribution scopes to be opened for each selected item.
Close All Scopes.
Close all open scopes.
Open Scope (Signal Group).
Open a scope for the specified signal group. A signal group is created by clicking the save button ( images/saveBtn_web.png ) in a scope to save the current configuration of the scope as a signal group (set of signals along with the scope settings for displaying them).
Delete Signal Group.
Delete the selected signal group.
Save Profile as...
Save the current view profile under a new name. The view profile contains the currently opened scopes and watched variables. Profiles are saved in a file with the .rsp suffix.
Load Profile...
Load a different view profile (.rsp file). This will automatically open all scopes and watched variables stored in the profile.
Simulate menu.
The following entries are available when Simulator is enabled. Note that some entries (Step Over, Step Out Of, Reverse Step Into, Reverse Step Over, Reverse Step Out Of) are only available when using the Reactis for C Plugin with a model containing C code.
Simulator on/off.
Enable or disable Simulator. When disabled, Reactis behaves as a model viewer; that is, the model can be viewed but simulation capabilities are unavailable.
Fast Run with Report.
Execute a fast simulation and produce a report. See Section 7.3 for details.
Fast Run.
Same as window item 9.
Run.
Same as window item 8.
Step.
Same as window item 7.
Step Into.
Same as window item 6.
Step Over.
Same as window item 21.
Step Out Of.
Same as window item 22.
Stop.
Stop a fast or slow simulation run.
Reverse Step Into.
Same as window item 20.
Reverse Step Over.
Same as window item 19.
Reverse Step Out Of.
Same as window item 18.
Back.
Same as window item 5.
Fast Back.
Same as window item 4.
Reset.
Same as window item 3.
Toggle Breakpoint.
Sets a breakpoint for the currently selected item in the model-hierarchy panel if none exists, or clears the breakpoint if one has already been set. Simulation will halt when the item becomes active, which may be in the middle of a simulation step. The simulator controls may then be used to continue execution of the model.
Clear Breakpoints.
Removes all breakpoints.
Set Animation Delay...
When running a slow simulation, this value specifies the duration of the pause between the evaluation and highlighting of different model elements.
Switch Configuration Variable Set.
When hovered over or clicked, this entry displays a list of available configuration variable sets, from which one set may be selected. See Section 5.5.1 for more details.
Update Configuration Variable...
Initiates a dialog for changing values of configuration variables, which are workspace variables whose values can only change between tests/simulation runs (but not during a test/simulation run). The simulation must be reset to the start state (by clicking the reset button images/resetBtn_web.png , window item 3) before the value of a configuration variable may be updated. Note also that whenever inputs are read from a test, the configuration variable values from the test will be used. In other words, manual updates to a configuration variable using this menu item will only have effect when in random or user input mode.
Test Suite menu.
New.
Same as window item 12.
Open.
Same as window item 13.
Save.
Same as window item 14.
Save and Defragment.
Removing tests from a test suite can cause the test suite to become fragmented, meaning that space within the file becomes unused. Reactis will reuse those gaps when you add tests. Selecting this menu item will save the current test suite and reorganize it, removing all gaps.
Save As...
Save current test suite in an .rst file. A file-selection dialog is opened to determine into which file the test suite should be saved.
Import...
Import tests and add them to the current test suite. Importing is described in more detail in Section 7.7.2.
Export...
Export the current test suite in different formats. Exporting is described in more detail in Section 7.7.1.
Create...
Launch Reactis Tester. See Chapter 8 for details.
Update...
Create a new test suite by simulating the current model using inputs from the current test suite, but recording values for outputs and test points generated by the model. This feature is described in Section 7.8.
Browse...
Open a file selection dialog, and then launch the Test-Suite Browser on the selected file. See Chapter 11 for details.
Browse Current.
Launch the Test-Suite Browser on the currently loaded test suite. See Chapter 11 for details.
Add/Extend Test.
At any point during a simulation, the current execution sequence (from the start state to the current state) may be added as a test to the current test suite by selecting this menu item. After the test is added, it will appear in the Source-of-Inputs dialog (window item 11). Note that the new test will not be written to an .rst file until the current test suite has been saved using window item 14 or the Test Suite > Save menu item.
Remove Test.
Remove the current test from the current test suite. Note that the test will not be removed from the .rst file until the current test suite has been saved using window item 14 or the Test Suite > Save menu item.
Compare Outputs.
Specify whether or not Simulator should compare the simulation outputs against the outputs contained in the test suite being executed. When enabled, a warning is reported for each significant difference between the computed value and the value stored in the test suite. A difference scope may then be opened by right-clicking on a harness outport and selecting Open Difference Scope (see Figure 7.13). The tolerance used to determine which differences are significant may be specified as described in Section 5.7.
Validate menu.
See Chapter 9 for a description of this menu.
Coverage menu.
The Coverage menu contains the following entries. Details about the different coverage objectives may be found in Chapter 6. The coverage information available from the various menu items is for the current simulation run. If a test suite is being executed, the coverage data is cumulative; that is, all targets covered by the portion of the current test executed so far, plus those targets exercised in previous tests, are listed as covered.
Show Summary.
Open the coverage summary dialog shown in Figure 7.15.
Show Details.
Report coverage information by coloring diagram elements as defined in the Line Style dialog shown in Figure 7.24. Generally, uncovered targets are drawn in red.
Show Report.
Start the Coverage-Report Browser. See Chapter 10 for details.
Show Quick HTML Report
Shows an HTML coverage report. This is the same as selecting Coverage > Show Report..., selecting Report > Export... within the report, and then clicking the “Preview” button. The window for this report stays open but is not updated as coverage changes while working with Simulator. To get an updated report after taking steps in Simulator, re-execute the Show Quick HTML Report item.
Highlight Subsystems, Branches, Lookup Targets, States, CSEPT, Condition Actions, Transition Actions, Decisions, Conditions, MC/DC, MCC, Boundaries, User-Defined Targets, Assertion Violations, C Statements.
Each of these menu entries corresponds to one of the model coverage metrics tracked by Reactis and described in Chapter 6. When a menu entry is selected and Show Details is enabled, any uncovered target of the corresponding coverage metric will be colored.
Select All.
When Show Details is selected, show coverage information for all metrics.
Deselect All.
When Show Details is selected, show no coverage information.
Highlight Unreachable Targets.
When Show Details is selected, color unreachable targets. A target is unreachable if it can be determined (without executing the model) that the target will never be covered regardless of the input values used during testing. The analysis used is conservative: marked items are always unreachable, but some unmarked items may also be unreachable.

7.3  Creating Test Execution Reports

Fast Run with Report executes all tests within the current test suite and produces a report which lists all runtime errors (divide-by-zero, overflow, memory errors, missing cases, assertion violations, etc.) and all significant differences between outport values stored in the test suite and those computed by the model.


Figure 7.2: The Reactis Test Execution Report Options dialog is used to select which items appear in a test execution report.
images/testExecReportOptionsA_web.png

When Simulate > Fast Run with Report... is selected, the dialog shown in Figure 7.2 will appear. This dialog is used to select the items which will appear in the report. The following items are labeled in Figure 7.2:

  1. The Report Options panel is used to select optional report items. These include the date, pathnames of input files, coverage information, and plots of test data. If file paths are included in the report, the model version will also be included within parentheses following the model name. By default, the version number is the return value after calling the MATLAB function:
        get_param('modelName','ModelVersion');
      
    where modelName is the name of your model. If you wish to include a different version number, you can redefine the workspace variable reactis_model_version in the Reactis Post-Load Function. For example, when your model is loaded, select Edit > Callbacks..., then in the Post-Load entry box enter:
        reactis_model_version = my_function_to_construct_version();
      

    If Include step-to-covered-targets map is selected, then the generated report will include a table similar to that shown below for each test. The table lists each step in the test that covers some target along with the targets covered by the step.

    images/stepToCoveredTgtsMap_web.png
  2. When Include coverage report is selected in the Report Options panel, the Coverage Metrics panel is used to select which coverage metrics are included in the test execution report. There are three choices for each metric:
    • Summary & Details. Targets of the metric will appear in both the coverage summary and coverage details sections of the report.
    • Summary Only. Targets of the metric will appear in the coverage summary only.
    • None. Targets of the metric will be omitted from the report entirely.
    Note that due to dependencies between metrics, some combinations are not allowed. For example, Summary & Details cannot be selected for Condition targets unless Summary & Details is also selected for Decision targets.
  3. The Output Format panel is used to select the format of the exported report (HTML or RRX/XML). If Preview before saving is checked, the exported report will be displayed and you will have the option of saving or discarding it.
  4. The Output panel is used to specify where the report will be stored. A test execution report can be stored in a single file, or it can be stored in multiple files (one file per test).
  5. When you are satisfied with the selected report options, clicking on this button will close the dialog and start the simulation run.
  6. Clicking on this button will close the dialog without initiating a simulation run.

Figure 7.3: A test execution report can be generated by loading a test suite in Simulator and selecting Simulate > Fast Run with Report...
images/testExecutionReport_web.png

Once the simulation run begins, it does not stop until all tests have been executed. During each test, if a runtime error is encountered, the remaining steps of the test are skipped and Simulator continues execution with the following test. After the last test is executed, a window containing the test execution report will appear, as shown in Figure 7.3. An HTML version of the report can be saved by clicking the Save button in the report window.

An HTML test execution report will contain some or all of the following sections, depending on which options are selected:

  1. A report header listing the date, input files, Reactis version, etc.
  2. A test summary listing the tests/steps which were executed and the number of errors and differences detected for each test. Non-zero error and difference totals can be clicked to jump to a detailed description of the corresponding error or difference.
  3. The tolerance used to test the value of each output and test point.
  4. The error detection settings (e.g. integer overflow) used when generating the report.
  5. A list of test details. For each test, this section includes details for each error and difference that occurred, along with plots of test data. The plots for a test are hidden by default, but they can be viewed either by clicking on the ± to the left of the signal name or by clicking on Open all. See Section 7.3.1 for details.
  6. The model hierarchy. The name of each member of the hierarchy can be clicked on to jump to the coverage details for that member.
  7. Coverage details for each component of the model. The coverage details for a model component begin with a summary of the local and cumulative coverage, followed by details for each metric. The details for a metric show, for each target, whether or not the target was covered, and if the target was covered, the test step when coverage occurred. The contents of this section are identical to a coverage report (Section 10.4).

7.3.1  Test Data Plots


Figure 7.4: An output plot from a test execution report.
images/testExecutionReportPlot_web.png

Figure 7.4 shows a typical plot from a test execution report. Test data (inputs, outputs, or test points) are plotted with the simulation time on the x-axis and the data value(s) on the y-axis. For outputs and test points, two values are shown: the test value (green), and the computed value (blue). The test value is the value stored in the test being executed for the output. The computed value is the value computed by the model for the output while executing the test. The acceptable tolerance between the two values is shaded yellow. Regions where the difference between the two signals is larger than the tolerance are shaded red.

Plots can be inspected when viewed from within a web browser or the preview dialog. The current focus of inspection is indicated by the intersection of two gray dashed lines. The focus can be moved either with the mouse or with the left and right arrow keys. Pressing S moves the focus to the start of the plot, and pressing E moves it to the end.

There are six values displayed at the top of the plot for the current focus: (1) the step number, (2) the simulation time, (3) the test value (y value of the green line), (4) the computed value (y value of the blue line), (5) the difference between the test value and the computed value, and (6) the maximum tolerated difference between the test and computed values. These six values are updated whenever the focus is moved.
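
The relationship between values (5) and (6) determines whether a step is flagged as a significant difference. The sketch below illustrates the idea only; the absolute-plus-relative tolerance form and all names are assumptions for illustration, and Reactis's actual tolerance settings are described in Section 5.7.

```python
# Hypothetical sketch: flag the steps where the computed signal deviates
# from the test (expected) signal by more than the allowed tolerance.
# The absolute + relative tolerance form is an assumption for
# illustration; Reactis's actual tolerance settings are described in
# Section 5.7.

def significant_differences(test_vals, computed_vals, abs_tol=0.0, rel_tol=0.0):
    """Return the step indices where |computed - test| exceeds the tolerance."""
    flagged = []
    for step, (expected, actual) in enumerate(zip(test_vals, computed_vals)):
        allowed = abs_tol + rel_tol * abs(expected)
        if abs(actual - expected) > allowed:
            flagged.append(step)
    return flagged

# Example: only step 2 falls outside the tolerance band.
print(significant_differences([1.0, 2.0, 3.0], [1.01, 2.02, 3.5],
                              abs_tol=0.05))  # -> [2]
```

In a report plot, the steps returned by such a check correspond to the regions shaded red in Figure 7.4.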

7.4  Specifying the Simulation Input Mode


Figure 7.5: The Source-of-Inputs dialog enables you to specify how Simulator computes input values.
images/inputsSource4_web.png

Reactis Simulator performs simulations in a step-by-step manner: at each simulation step, inputs are generated for each harness inport, and the resulting outputs are reported on each harness outport. You can control how Simulator computes input values using the Source-of-Inputs dialog (window item 11 in Figure 7.1) shown in Figure 7.5. This dialog always includes the Random Simulation and User Guided Simulation entries; if a test suite has been loaded, the dialog also includes an entry for each test, and the All button becomes enabled. The dialog is used to specify how input values are generated as follows.

Random Simulation.
For each inport, Reactis randomly selects a value from the set of allowed values for the inport, using type and probability information contained in the associated .rsi file. See Chapter 5 for a description of how to enter this information using the Reactis Info File Editor.
User Guided Simulation.
You determine the value for each inport using the Next Input Values dialog, which appears when the User Guided Simulation entry is selected. See Section 7.4.1 below for more information on this mode.
Individual Tests.
When a test suite is loaded, each test in the suite has a row in the dialog that contains a test number, a sequence number, a name, and the number of steps in the test. Selecting a test and clicking OK will cause inputs to be read from the test.
Subset of Tests.
You may specify that a subset of tests should be run by holding down the control key and clicking on each test to be run with the left mouse button. You can also hold the shift key while clicking to select a block of tests to be executed. The tests will be run in the order they are selected. As tests are selected, the sequence number column is updated to indicate the execution order of the tests. When a new test is started, the model is reset to its starting configuration, although coverage information is not reset, thereby allowing you to view cumulative coverage information for the subset of tests.
All Tests.
Clicking the All button in the lower left corner specifies that all tests in the suite should be executed one after another. The tests are executed sequentially. When a new test is started, the model is reset to its starting configuration, although coverage information is not reset, thereby allowing you to view cumulative coverage information for the entire test suite. Section 7.4.2 contains more information on this mode.
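
The cumulative coverage bookkeeping described above amounts to a set union over the executed tests: the model state is reset between tests, but the set of covered targets keeps growing. The sketch below is purely conceptual; the target and test names are hypothetical and are not part of any Reactis API.

```python
# Conceptual sketch of cumulative coverage over a sequence of tests:
# the model state is reset between tests, but covered targets accumulate.
# All names below are illustrative, not part of any Reactis API.

all_targets = {"t1", "t2", "t3", "t4", "t5"}

# Targets exercised by each executed test, in execution order.
covered_by_test = {
    "test1": {"t1", "t2"},
    "test2": {"t2", "t3"},
    "test3": {"t5"},
}

cumulative = set()
for name, covered in covered_by_test.items():
    cumulative |= covered  # coverage is not reset between tests
    pct = 100 * len(cumulative) / len(all_targets)
    print(f"after {name}: {len(cumulative)} of {len(all_targets)} targets ({pct:.0f}%)")
```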

You can change the sorting order of the tests in the table by clicking on the column headers. For example, to sort the tests by the number of steps, simply click on the header of the “Steps” column. Clicking again on that header will sort by number of steps in descending order.

You may also use the Inputs Source dialog to change the name of a test. To do so, select the test by clicking on it, then click on the name and, when the cursor appears, type in a new name.

7.4.1  User Input Mode

When the User Guided Simulation mode is selected from the Source-of-Inputs dialog, you provide values for inports at each execution step. This section describes how this is done.


Figure 7.6: The Next Input Values dialog lets you control the simulation by specifying the next value for inputs (item 4) and clicking the stepping buttons (item 10).
images/guidedSimB_web.png

To enter the user-guided mode of operation, select User Guided Simulation from the Source-of-Inputs dialog (window item 11 in Figure 7.1). Upon selecting user-guided mode, a Next Input Values dialog appears, as shown in Figure 7.6, that allows you to specify the input values for the next simulation step. Initially, each harness inport of the model has a row in the dialog. You can remove inputs from the dialog or add outputs, test points, and configuration variables by clicking the gear button ( images/gearBtn_web.png ) in the toolbar of the Next Input Values dialog. Each row in the dialog contains 6 items (labeled 1-6 in Figure 7.6). The toolbar for the dialog contains items 7-13. The elements of the dialog work as follows:

  1. The name of an item (inport, outport, test point, or configuration variable).
  2. This checkbox toggles whether the item is included in a scope displaying a subset of the signals from the Next Input Values dialog.
  3. This pull-down menu has five entries that determine how the next value for the port is specified:
    Random
    Randomly select the next value for the inport from the type given for the inport in the .rsi file.
    Entry
    Specify the next value with the text-entry box in column four of the panel.
    Min
    Use the minimum value allowed by the inport’s type constraint.
    Max
    Use the maximum value allowed by the inport’s type constraint.
    Test
    Read data from an existing test suite (see Section 7.4.1.4).
  4. If the pull-down menu in column three is set to “Entry”, then the next input value is taken from this text-entry box. The entry can be a concrete value (e.g. integer or floating point constant) or a simple expression that is evaluated to compute the next value. These expressions can reference the previous values of inputs, the simulation time, or the current values of other inputs. For example, a ramp for input drag can be specified by pre(drag) + 0.0001. A sine wave can be generated by sin(t) * 0.001. If x should always be the opposite of y, this can be generated by setting the entry for x to be !(y). For the full description of the expression notation see Section 7.4.1.1 below.
  5. If the pull-down menu in column three is set to “Entry”, then clicking the history button (labeled H) displays recent values the inport has assumed. Selecting a value from the list causes it to be placed in the text-entry box of column four.
  6. The arrow buttons in this column enable scrolling through the possible values for the port. The arrows are available for inports or configuration variables:
    • having a base type of integer, boolean or fixed point; or
    • having a base type of double or single and either a resolution or subset of values constraint.
  7. When you enter a search string in this box, Reactis displays only the rows for items whose names contain the given search string.
  8. When you check this box, all signals in the Next Input Values dialog are plotted in a scope. When you uncheck this box, all signals are removed from the scope and no scope is displayed.
  9. This pull-down menu sets the input mode for all inports at once to either “Random”, “Entry”, “Min”, “Max”, “Test”, or “Zero”. When “Zero” is selected, each inport is set to “Entry” with a value of zero.
  10. These buttons control the simulation stepping exactly as they do in the top-level main Simulator window.
  11. The entry in this box is a positive integer which specifies how many steps to take when clicking one of the stepping buttons that triggers multiple steps (e.g. fast simulation button).
  12. Open a dialog to select the set of signals (inputs, outputs, test points, configuration variables) to be included in the Next Input Values dialog.
  13. Save the current configuration of the Next Input Values dialog for future use or load a previously saved configuration.

When “run fast simulation” (window item 9 in Figure 7.1) is selected, the inport value specifications in the Next Input Values dialog are used for each step in the simulation run.

7.4.1.1  Syntax of Next Input Value Expressions

The value an input should assume in the next simulation step can be specified from its row by selecting Entry in column 3 and then entering an expression in the box in column 4. We now describe the language used to define an expression.

Assume foo and bar are inputs. Then the following examples demonstrate some possible expressions to specify the next value of foo.

Expression      Value foo will have in the next step
4.0             4.0
pre(foo)        The value foo had in the previous step
pre(foo,2)      The value foo had two steps back
pre(foo) + 1    Add 1 to the value foo had in the previous step
pre(u)          Shorthand denoting the value of foo in the previous step
t               The current simulation time
sin(t)          The sine of the current simulation time (i.e. generate a sine wave)
!(bar)          If bar is true then foo is false; otherwise foo is true.
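The pre operator can be understood as a lookup into the history of values an input has assumed. The following Python sketch is an illustration only (not how Reactis is implemented): it models an input's history as a list and shows how pre(name) and pre(name, n) index into it.

```python
# Hypothetical sketch of pre(name, n) semantics: each input keeps a
# history of the values it assumed in previous simulation steps.
history = {"foo": [1.0, 2.0, 3.0]}  # values at steps 1, 2, 3

def pre(name, n=1):
    """Return the value the input had n steps back (default: previous step)."""
    return history[name][-n]

assert pre("foo") == 3.0     # value in the previous step
assert pre("foo", 2) == 2.0  # value two steps back
```

With this reading, an expression such as pre(foo) + 1 simply adds 1 to the most recent history entry at each step.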

The complete syntax of a next input value expression NIV is specified by the following grammar.

  NIV:  numericConstant | true | false     Constant value
     |  inportName                         Name of an input
     |  u                                  Shorthand for current input (must be within pre)
     |  pre(NIV)                           Go one step back when retrieving values for inputs in NIV
     |  pre(NIV,n)                         Go n steps back when retrieving values for inputs in NIV
     |  t                                  Simulation time
     |  NIV relOp NIV                      Relational operation
     |  ! NIV                              Logical not
     |  NIV && NIV                         Logical and
     |  NIV || NIV                         Logical or
     |  - NIV                              Negation
     |  NIV arithOp NIV                    Arithmetic operation
     |  function ( NIVL )                  Function call
     |  [ rowL ]                           Matrix
     |  { fieldL }                         Record
     |  NIV . fieldName                    Access field of record
     |  if NIV then NIV else NIV           If-then-else
     |  ( NIV )                            Parentheses

  relOp:  <  |  <=  |  ==  |  !=  |  >=  |  >

  arithOp:  +  |  -  |  *  |  /

  field:  fieldName = NIV

  function:  testval_step | testval_time   Read data from a test suite (see below)
          |  port()                        Name of the input port whose value is being specified by this expression
          |  step()                        Current step number within the test being constructed
          |  abs | fabs                    Absolute value
          |  sin | cos | tan | asin | acos | atan | atan2 | sinh | cosh
                                           Trigonometric functions
          |  floor | ceil                  Rounding functions
          |  hypot(a,b)                    Length of the hypotenuse c of a right triangle whose other two sides have lengths a and b
          |  ln | log | log10 | pow | exp  Log and exponent functions
          |  rem | sgn | sqrt

  NIVL:  list of NIV delimited by ,

  rowL:  list of NIVL delimited by ;

  fieldL:  list of field delimited by ,
  

7.4.1.2  Reading Data from Existing Test Suites in Expressions

Within the expression for specifying the next input value for a port, the testval_step and testval_time functions can be used to read values from an existing test suite.

testval_step(suitefilename, portname, testnum, stepnum)
Reads a value from a test suite, specified by the step number within the test suite.
suitefilename
The file name of the test suite from which data should be read. If a relative path is used then it is relative to the directory in which the current model resides. Must be enclosed in double-quotes.
portname
The name of the port from which data should be read. This can be an input port, test point or output port, as they appear in the Reactis Test Suite browser. You can use the function "port()" here to use the same name as the port whose input you are specifying. The name must be enclosed in quotes.
testnum
The number of the test in the test suite from which data should be read. This number is 1-based, i.e. the first test is 1.
stepnum
The step number within the test. Use the "step()" function to refer to the current simulation step. You can use arithmetic expressions to adjust the step number, for example to add an offset.

Example: testval_step("cruise.rst", "onOff", 3, step()+5) will read data from the 3rd test of test suite "cruise.rst", port "onOff", offset by 5 steps, i.e. the first step read from the test suite will be step 6.

testval_time(suitefilename, portname, testnum, time, interpolate)
Reads a value from a test suite, specified by the simulation time in the test suite.
suitefilename
The file name of the test suite from which data should be read. If a relative path is used then it is relative to the directory in which the current model resides. Must be enclosed in double-quotes.
portname
The name of the port from which data should be read. This can be an input port, test point or output port, as they appear in the Reactis Test Suite browser. You can use the function "port()" here to use the same name as the port whose input you are specifying. The name must be enclosed in quotes.
testnum
The number of the test in the test suite from which data should be read. This number is 1-based, i.e. the first test is 1.
time
The simulation time within the test. Use "t" to refer to the current simulation model time. You can use arithmetic expressions to adjust the time value, for example to add an offset or scaling.
interpolate
If the time value specified falls between the time for two steps in the test suite then this parameter defines how the value is computed. If the interpolate parameter is 0 then the value from the test suite corresponding to the step before the time value is used. If interpolate is 1 then the value is interpolated (linear) between the steps before and after the simulation time.

Example: testval_time("cruise.rst", port(), 4, t/10, 1) will read data from the 4th test of test suite "cruise.rst" matching the current port name. Time will be slowed down by factor 10 and values will be interpolated between steps.

7.4.1.3  Select Signals Dialog

The Select Signals dialog lets you choose which signals to include in the Next Input Values dialog during user-guided simulation. Initially all harness inputs are included, but you can remove inputs or add outputs, test-points, and configuration variables by clicking the images/gearBtn_web.png button in the Next Input Values dialog and then using the Select Signals dialog to specify the desired subset of signals. Note that in the case of outputs and test points the signal values are only observed, not controlled.


Figure 7.7: The Select Signals dialog lets you choose the subset of signals to include in the Next Input Values dialog during user-guided simulation.
images/selectSignals_web.png

The labeled items in Figure 7.7 work as follows. Note that since outputs and test points are only observed and not controlled, those tabs include only a subset of the columns described below.

  1. This tab lets you select harness inputs.
  2. This tab lets you select harness outputs.
  3. This tab lets you select test points.
  4. This tab lets you select configuration variables.
  5. Port number.
  6. Signal name.
  7. Toggles whether or not the item is included in the Next Input Values dialog.
  8. Generate a random next input value.
  9. Generate the smallest allowed next input value.
  10. Generate the largest allowed next input value.
  11. Reads a value from an existing test suite (see Section 7.4.1.4).
  12. Use an expression (item 13) to specify the next input value.
  13. An expression used to generate the next input value.

Note that if item 7 is checked for an input, then the settings specified by items 8-12 will determine the initial configuration of the Next Input Values dialog when it is first opened. The settings can subsequently be modified while stepping. If item 7 is not checked for an input, then the settings specified by items 8-12 will be the only ones used during stepping.

7.4.1.4  Reading Data from Existing Test Suites

In user-guided simulation mode you can set up any number of inputs to read data from one or more existing test suites:

  • To set all inputs to receive data from the same test suite, either select the “Test” entry in the “all ports” drop-down box in the Next Input Values window (item 9 in Figure 7.6) or click on the header of the “Test” column in the Select Signals dialog (item 11 in Figure 7.7).
  • To set a single input to receive data from a test suite, either select the “Test” entry in the drop-down box for the desired input in the Next Input Values window (item 3 in Figure 7.6) or click the radio button in the “Test” column for the desired input in the Select Signals dialog (item 11 in Figure 7.7).
  • To set a range of inputs to receive data from the same test suite, open the Select Signals dialog (Figure 7.7) and click in the “Test” column of the first input to be set to a test. Make your selections in the test selection dialog (see below) and click OK. Then hold down the shift key and click the last input to be set to a test, which will again bring up the test selection dialog, pre-configured with your previous selection. Click the OK button again.

Doing any of the above will bring up the dialog shown in Figure 7.8; the labeled items work as follows.


Figure 7.8: Selecting data to be read from a test suite.
images/selectTest_web.png

  1. Test data will be read from the test suite currently loaded in Simulator. Note that the test suite must be saved before it can be selected here.
  2. Test data will be read from the test suite selected here.
  3. Selects the test from which data will be read.
  4. Data item (input port, output port, configuration variable or test point) in the test suite from which the data will be read. If set to “[Current]”, data will be read from the port in the test suite whose name matches the current port name. This choice is disabled when selecting test data for multiple inputs.
  5. If this mode is selected then data will be read from the test suite on a step-by-step basis, disregarding simulation time. This mode is more efficient and avoids possible rounding errors if the sample rate of the model and the data in the test suite are identical. If test data is read for steps past the end of the test data, the last step value will be repeated.
  6. This lets you set an offset to the step number in the test suite. For example, if set to 5, the first step read from the test suite will be step number 6. If this is a negative value, the data read before step 1 will be the same as the data for step 1.
  7. If this mode is selected then data will be read from the test suite based on the current simulation time. This is useful if either the sample rate of the test data does not match the model’s sample rate or to perform time scaling (see below).
  8. If this is checked and the time from which data is to be read from the test falls between two time steps in the test suite then linear interpolation will be used to calculate the test data value. If this is not checked then the data from the previous step in the test suite will be used. This also influences the behavior if test data is read for time less than zero or past the end of the test. If checked then the two first or last steps in the test data are used to extrapolate the test data value for the given time. Otherwise the test data for the first or last step is used.
  9. Provide a time offset when reading data from the test suite. This value will be added to the current simulation time to determine the time for which data is read from the test suite.
  10. Provide a time scaling when reading data from the test suite. The current simulation time will be multiplied by this value (before adding the offset) to determine the time for which data is read from the test suite. This lets you speed up or slow down time. For example, if set to 0.5, data from the test suite will be read at half the speed at which the model simulation time progresses.
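The step-offset and time-scaling computations described above can be summarized in a small sketch. The function names and the clamping behavior at the boundaries are illustrative assumptions derived from the descriptions of the dialog items, not Reactis internals.

```python
def step_index(sim_step, offset, n_steps):
    # Step-based mode: the offset is added to the simulation step and the
    # result is clamped, so data requested before step 1 repeats step 1
    # and data requested past the end of the test repeats the last step.
    return max(1, min(sim_step + offset, n_steps))

def lookup_time(sim_t, scale, offset):
    # Time-based mode: simulation time is scaled first, then offset.
    return sim_t * scale + offset

assert step_index(1, 5, 100) == 6    # offset 5: first step read is step 6
assert step_index(1, -3, 100) == 1   # negative offset clamps to step 1
assert lookup_time(10.0, 0.5, 0.0) == 5.0  # scale 0.5: half speed
```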

Note that this dialog provides a more user-friendly way to specify the parameters to the testval_step() and testval_time() functions described above in Section 7.4.1.2. Using the functions directly provides more flexibility in how test data is read. If you set up a port to read data from a test using this dialog and subsequently switch the input-mode selector (item 3 in Figure 7.6) to “Entry”, you will see the function expression that was constructed to reflect your dialog choices.

7.4.2  Test Input Mode

Simulation inputs may also be drawn from tests in a Reactis test suite. Such tests may be generated automatically by Reactis Tester, constructed manually in Reactis Simulator, or imported using a comma separated value file format. By convention files storing Reactis test suites have names suffixed by .rst.

A Reactis test suite may be loaded into Simulator by clicking the open button ( images/openBtn_web.png ) in the toolbar to the right of the Source-of-Inputs dialog (window item 13 in Figure 7.1) or by selecting the Test Suite > Open... menu item.

When a test suite is loaded, the name of the test suite appears in the Reactis title bar and the tests of the suite are listed in the Source-of-Inputs dialog.

When executing in test input mode while Test Suite > Compare Outputs is selected, Simulator compares the values computed by the model for test points and harness output ports against the values stored in the test suite for those items. These comparisons are performed after each simulation step. A difference is flagged if it exceeds the tolerance specified for that port. See Section 5.7 for more information on specifying tolerances for output ports.

If a value in the test differs from that computed by the model for a harness outport, the difference may be observed (as shown in Figure 7.13) by right-clicking on the outport and selecting Open Difference Scope.
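As a simplified sketch, the per-step comparison performed when Compare Outputs is selected can be thought of as follows. This assumes a simple absolute tolerance for illustration; see Section 5.7 for how tolerances are actually specified per port.

```python
def difference_flagged(model_value, test_value, tolerance):
    # Flag a difference when the value computed by the model deviates
    # from the value stored in the test by more than the tolerance.
    return abs(model_value - test_value) > tolerance

assert difference_flagged(55.25, 55.0, 0.1)      # 0.25 > 0.1: flagged
assert not difference_flagged(55.05, 55.0, 0.1)  # within tolerance
```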

7.5  Tracking Data-Item Values

Reactis Simulator includes several facilities for interactively displaying the values that data items (Simulink blocks or signal lines, Stateflow variables, or C variables) assume during simulation. The watched-variable list, or “watch list” for short, displays the current values of data items designated by the user as “watched variables.” You may also attach scopes to data items in order to display values as they vary over time. Scopes behave like Simulink Scope blocks except that they are not hard-wired into models and are instead opened and closed during simulation. Distribution scopes enable you to view the set of values a data item has assumed during simulation (but not the time at which they occur). Difference scopes may be opened for harness outports when reading inputs from a test in order to plot the values computed by the model against the values stored in the test for the outport.

You may add data items to the watch list, or attach scopes to them, as follows.

Using the View menu.
The View menu contains operations for adding data items to the watch list, opening scopes, and opening distribution scopes. These are described in more detail in Section 7.2.
Using pop-up menus in the model hierarchy panel.
Right-clicking on a subsystem in the hierarchy panel brings up a pop-up menu that includes the entries:
  • Add Watched Variables...
  • Open Scopes...
  • Open Distribution Scopes...
Selecting one of these entries will cause a dialog to appear listing data items in the subsystem which may be added to the watched variable list or to which scopes may be attached.
Using pop-up menus in the main panel.
Right-clicking on a data item in the main panel of Simulator invokes a menu that enables you to add the data item to the list of watched variables or open a scope or distribution scope to monitor the values of the data item during simulation. This menu also includes an entry Add To Scope that enables you to plot the data item on a previously opened scope.

You may save the current configuration of the data tracking facilities (the variables in the watch list and currently open scopes along with their locations) for use in a future Simulator session. Do so by selecting View > Save Profile As... and using the resulting file selection dialog to specify a file in which to save a Reactis profile (.rsp file). The profile may be loaded at a future time by selecting View > Load Profile....

7.5.1  Selecting Bus Components


Figure 7.9: Opening a scope on a bus component.
images/openScopeOnBus_web.png

When any of the standard value-inspection operations (Inspect value..., Add Watched Variables..., Open Scopes..., Open Distribution Scopes...) are applied to a bus, an intermediate menu appears that allows you to select a bus component of interest. An example is shown in Figure 7.9, where the second element of bus member S2 is about to be selected.

7.5.2  The Watched-Variable List


Figure 7.10: The watched variable panel tracks the current values of data items.
images/watchedA_web.png

The watch list is displayed in a panel at the bottom of the Simulator screen as shown in Figure 7.10. By default this panel is hidden, although adding a variable to the watch list causes the panel to become visible. Visibility of the panel may also be toggled using the View menu as described in Section 7.2. The panel displays a list of data items and their values. The values are updated when Simulator pauses.

The contents of the watch list may be edited using a pop-up menu that is activated from inside the watch-list panel. Individual data items in the panel may be selected by left-clicking on them. Once an item is selected, right-clicking invokes a pop-up menu that enables the selected item(s) to be deleted, have a scope opened, or have a distribution scope opened. If no item is selected, then these choices are grayed out. The right-click pop-up menu also includes an entry Add Variables which displays a list of all data items in the model which may be added to the watch list.

The View menu contains operations for displaying / hiding the watch list, adding data items to the watch list, and clearing the watch list.

Right-clicking on a variable in the list brings up a menu containing the following options:

Add Variables...
Same as View > Add Watched Variables...
Remove Variable
Remove the currently selected variable from the list.
Set Significant Digits
Specify the number of significant digits to display for the currently selected variable in the list. Specifying -1 will reset to the default number of digits.
Inspect Value...
Display the current value of the currently selected variable.
Add Component to Watched...
If the currently selected variable in the list is a vector, matrix or bus, selecting this menu item will bring up a selection dialog (see Section 7.5.1) that allows you to add a sub-element of the current selection as a watched variable.
Show Location
Highlights the location of the currently selected watched variable in the main panel.
Open Scope
Opens a scope for the currently selected variable.
Open Distribution Scope
Opens a distribution scope for the currently selected variable.
Highlight updated entries
By default, Reactis highlights variables in the watched variable list if their values have changed. This sub-menu allows you to modify the highlighting behavior:
Do not highlight
Never highlight any variables in the list
Highlight when updated
Always highlight a variable when it is assigned a new value, even if the new value is the same as the previous value.
Highlight when updated with different value
Highlight a variable if it is assigned a value different from its previous value. This is the default.

7.5.3  Scopes

Scopes appear in separate windows, an example of which may be found in Figure 7.11. The tool bar of each scope window contains nine or more items.


Figure 7.11: A scope window plotting desired speed (green) and actual speed (yellow).
images/scopeB_web.png

7.5.3.1  Labeled Window Items

  1. Reset the zoom level of the scope to fit the whole plot (see more on zooming below).
  2. Plot signal as solid lines.
  3. Plot signal as points.
  4. If a scope displays multiple signals, this button toggles whether or not all signals share the same y-axis or each is plotted on its own y-axis.
    images/multiSignalScope_web.png
  5. Save the current scope configuration as a signal group. A signal group is a set of signals along with the settings for displaying the signals in a scope. After saving a signal group, you can reopen a scope for the group in future Reactis sessions. You can add additional signals to a signal group by right-clicking on a signal in the main Reactis panel (when Simulator is enabled), selecting Add to Scope, and selecting the signal group to be extended. To reorder the signals in a group or remove a signal, open a scope for the signal group then click the Settings button (item 8).
  6. Export scope data as either text (csv) or graphics (png, gif, bmp, tif or jpg).
  7. Copy a screen shot of the scope to the clipboard.
  8. Configure the scope settings, including reordering the signals or deleting a signal.
  9. Display help for scopes.
  10. Toggle display of signal 1.
  11. Toggle display of signal 2.

To zoom in to a region of interest of the signal shown in the scope, left-click the top-left corner of the region, hold the button and drag to the lower right corner of the region. The scope will zoom in to the selected region. To zoom out, either click the zoom-to-fit button in the toolbar or right-click in the scope window. Right-clicking will return to the zoom level that was active before the last zoom.

When zoomed in, it is possible to move the displayed region within the scope window. To move the region, hold down the CTRL key and click-and-drag with the left mouse button.

If more than one data item is plotted on a scope, then a toggle button will appear in the tool bar for each data item (window items 10 and 11 in Figure 7.11). Turning one of these buttons off will hide the corresponding data item in the scope. Hovering over the button will display the data item to which the button corresponds.

7.5.4  Distribution Scopes

Distribution scopes also appear in separate windows, an example of which may be found in Figure 7.12. The values a data item assumes are displayed as data points distributed across the X-axis. Zooming in distribution scopes works the same as in regular scopes.


Figure 7.12: Distribution scopes plot the values a data item has assumed during simulation.
images/distributionScopeB_web.png

7.5.5  Difference Scopes

When executing tests from a test suite, a difference scope may be opened by right-clicking on a harness outport and selecting Open Difference Scope. The resulting scope plots the expected value (from the test) against the actual value (from the model) as shown in Figure 7.13. If the difference between the two values exceeds the tolerance specified for the output port (see Section 5.7), then a red background in the difference scope and a red bar on the X-axis highlight the difference.

After zooming into an area of difference, white, yellow, and green background regions around the plotted values highlight the tolerances, as shown in Figure 7.14. The green region represents the overlap between the tolerance intervals of the test and model values. A difference is flagged whenever the test or model value lies outside the green region.


Figure 7.13: A difference scope may be opened by right-clicking on a harness outport and selecting Open Difference Scope. The scope plots the values stored in a test for an output and the values computed by the model for the output. Differences are flagged in red.
images/differenceScope_web.png


Figure 7.14: The white and yellow colored backgrounds around the value lines highlight the tolerance intervals of the test and model values. The overlap between the yellow and white regions is colored green. If either the test or model value lies outside the green region, a difference is flagged.
images/differenceScope2_web.png

7.5.6   Non-executed blocks and signals

Some blocks may not be executed at all during a simulation step, either because they are located within conditional (i.e. triggered or enabled) subsystems that do not execute during the step or because of conditional input branch execution (see section 15.1). Furthermore, when Simulator hits a breakpoint or while mini-stepping through a simulation step some blocks may not have been executed yet.

When you hover over a signal, or when the output signal of a block is added as a watched variable, Reactis shows one of the following:

not computed
means that the block outputting the signal has not (yet) been executed during this simulation step.
not fully computed
means that the signal consists of multiple elements (i.e. a bus or multiplexed signals) and at least one block whose output is part of the signal has not yet executed.
value
If the signal has been computed then Reactis will show its value.
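The three cases above can be sketched as a simple rule over the per-element computed flags of a composite signal. This is a hypothetical illustration, where None marks an element that has not yet been computed:

```python
def signal_status(elements):
    # elements: per-element values of a (possibly composite) signal,
    # with None marking an element not yet computed this step.
    if all(e is not None for e in elements):
        return "value"           # fully computed: show the value
    if any(e is not None for e in elements):
        return "not fully computed"  # some, but not all, elements computed
    return "not computed"        # producing block(s) not yet executed

assert signal_status([1.0, 2.0]) == "value"
assert signal_status([1.0, None]) == "not fully computed"
assert signal_status([None]) == "not computed"
```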

For technical reasons the following blocks will always show a value and never show “not computed”:

  • Top-level input ports
  • Merge blocks
  • Outputs of conditionally executed (i.e. triggered or enabled) subsystems
  • Virtual blocks (From/Goto, subsystem input ports, virtual buses) receiving their input from other blocks in this list.

In scopes, signals only show a trace for times at which they were computed.

Note that Reactis can be configured to always show the signal value by switching the “Indicate when signal values were not computed” setting in Edit > General to “Never”.

7.6  Tracking Model Coverage

Chapter 6 describes the coverage metrics that Reactis employs for measuring how many of the coverage targets of a given class of syntactic constructs appearing in a model have been executed at least once. Simulator includes extensive support for viewing coverage information about the parts of the model that have been exercised by the current simulation run. If a test suite is being executed, the coverage data is cumulative. That is, all targets covered by the portion of the current test executed so far, plus those targets exercised in previous tests, are listed as covered.

7.6.1  The Coverage Summary Dialog

The Coverage Summary Dialog shown in Figure 7.15 may be invoked at any time Simulator is enabled by selecting Coverage > Show Summary. The dialog reports summary statistics for each coverage metric tracked by Reactis. Each row in the dialog corresponds to one of the metrics and includes five columns, described below from left to right.

  1. The name of the coverage metric reported in the row.
  2. The number of targets in the metric that have been exercised at least once.
  3. The number of targets in the metric that are unreachable. A conservative analysis is performed to check for unreachable targets. Any target listed as unreachable is provably unreachable; however, some unreachable targets might not be flagged as unreachable.
  4. The number of reachable targets in the metric that have not been exercised.
  5. The percentage of reachable targets in the metric that have been exercised at least once.
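The percentage in the last column follows from the other columns: unreachable targets are excluded, so it is the number of exercised targets divided by the number of reachable targets (exercised plus reachable-but-unexercised). A sketch of this computation, using a hypothetical helper:

```python
def coverage_percent(exercised, unreachable, unexercised_reachable):
    # Unreachable targets (column 3) are excluded from the denominator;
    # reachable = exercised (column 2) + unexercised reachable (column 4).
    reachable = exercised + unexercised_reachable
    return 100.0 * exercised / reachable if reachable else 100.0

assert coverage_percent(45, 2, 5) == 90.0  # 45 of 50 reachable targets
```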

Figure 7.15: The Coverage Summary Dialog
images/cvgSummary_web.png

7.6.2  Coverage Information in the Main Panel

Selecting Coverage > Show Details causes uncovered targets to be drawn in red in the main panel. Targets that have been covered are drawn in black. Hovering over an exercised target displays a pop-up giving the test and step in which the target was first executed, in the form test/step. A “.” appearing in the test position (./step) denotes the current simulation run, which has not yet been added to a test suite. If -/- is displayed, the target has not yet been covered.


Figure 7.16: Viewing MC/DC-related coverage information for a Stateflow transition.
images/mcdcTableB_web.png

For items included in the MC/DC coverage metric (Simulink Logic and If blocks and Stateflow transition segments whose label includes an event and/or condition) or CSEPT coverage (states, transition segments), detailed coverage information may be obtained by right-clicking on the item and selecting View Coverage Details. A dialog similar to that in Figure 7.16 will appear and give coverage information for decision coverage, condition coverage, and MC/DC.

The table in this figure describes coverage for the decision:

set == 1 && deactivate == 0

This decision contains the following two conditions:

  • set == 1
  • deactivate == 0

Conditions are the atomic boolean expressions that are used in decisions. The first two columns of the table list the test/step information for when the decision first evaluated to true and when it first evaluated to false. A value of -/- indicates that a target has not yet been exercised. The third column lists the conditions that make up the decision, while the fourth and fifth columns give test/step information for when each condition first evaluated to true and to false.

MC/DC Coverage requires that each condition independently affect the outcome of the decision in which it resides. When a condition has satisfied the MC/DC metric in a set of tests, the sixth and seventh columns of the table explain how. Each element of these two columns has the form bb:test/step, where each b reports the outcome of evaluating one of the conditions in the decision during the test and step specified. Each b is either T to indicate the condition evaluated to true, F to indicate the condition evaluated to false, or x to mean the condition was not evaluated due to short circuiting.
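The bb:test/step notation above can be made concrete with a small sketch. The following Python fragment (illustrative only, not Reactis code) evaluates the example decision set == 1 && deactivate == 0 with short-circuiting and records each condition's outcome as T, F, or x:

```python
# Illustrative sketch (not Reactis internals): evaluate the decision
# "set == 1 && deactivate == 0" with short-circuit semantics,
# recording each condition outcome as T, F, or x (not evaluated).
def evaluate(set_val, deactivate):
    outcomes = []
    c1 = (set_val == 1)
    outcomes.append("T" if c1 else "F")
    if not c1:                    # && short-circuits on false
        outcomes.append("x")
        return "".join(outcomes), False
    c2 = (deactivate == 0)
    outcomes.append("T" if c2 else "F")
    return "".join(outcomes), c1 and c2

# Two evaluations that differ in exactly one condition outcome and
# produce different decision outcomes demonstrate that condition's
# independent effect, which is what MC/DC requires.
print(evaluate(1, 0))   # ('TT', True)
print(evaluate(0, 0))   # ('Fx', False)
```

Here the pair TT and Fx shows that set == 1 independently affects the decision, which is the kind of evidence the sixth and seventh columns report.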


Figure 7.17: Viewing MCC coverage information for a Stateflow transition.
images/mccCvgDetails_web.png

In addition to MC/DC, Reactis can also measure Modified Condition Coverage (MCC). MCC targets every possible combination of conditions within a decision, so a decision containing N conditions can yield as many as 2^N MCC targets, although the actual number may be smaller when short-circuit evaluation is used.
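To see why short-circuiting reduces the target count, consider a sketch (illustrative Python, not Reactis code) that enumerates the distinct MCC targets of an N-condition &&-chain:

```python
from itertools import product

# Illustrative sketch: enumerate the distinct MCC targets of a
# decision that is an &&-chain of n conditions.  A full truth table
# has 2**n rows, but rows collapse under short-circuiting: once a
# condition is False, the remaining conditions are never evaluated
# and are recorded as 'x'.
def mcc_targets_and_chain(n):
    targets = set()
    for row in product([True, False], repeat=n):
        pattern = []
        for val in row:
            pattern.append("T" if val else "F")
            if not val:           # && short-circuits on False
                pattern.extend("x" * (n - len(pattern)))
                break
        targets.add("".join(pattern))
    return sorted(targets)

print(mcc_targets_and_chain(2))   # ['Fx', 'TF', 'TT'] -- 3 targets, not 4
```

For the two-condition decision in Figure 7.16, this yields three targets (Fx, TF, TT) rather than four, matching the rows one would expect in the MCC details table.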

Figure 7.17 shows the MCC coverage details for a Stateflow transition. The MCC coverage details take the form of a table in which all but the last two columns correspond to conditions and each row corresponds to a single MCC target. For each target, every condition has one of three possible values:

True.
The condition is true.
False.
The condition is false.
x.
The condition is not evaluated due to short-circuiting.

The last two columns of the MCC details contain the result of the decision and the test/step when the target was covered.


Figure 7.18: Filtering MCC coverage information for a Stateflow transition.
images/mccCvgDetailsFilter_web.png

As shown in Figure 7.18, MCC coverage details can be filtered by clicking on the column headers. A filtered column header is indicated by a prefix of T:, F:, or x:, corresponding to the column values True, False, and x, respectively. Clicking on a column header advances the filter setting for that column to the next feasible value, eventually cycling back to unfiltered. The Covered column supports one additional prefix, E:, which displays the targets that have been excluded (see Section 6.5.3). All columns can be reset to the unfiltered state at any time by clicking the Clear Filter button. Note that the individual filters for each column are combined conjunctively (i.e., using the Boolean and operator), so only targets that satisfy all active filters are shown. Figure 7.18 (c) shows the results of setting the filters for set==1.0 and deactivate==0.0 to true; in this case, the only target shown is the one where both of these conditions are true.
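The conjunctive combination of column filters can be sketched as follows (a hypothetical Python fragment, not the tool's implementation; the row and filter data are invented for illustration):

```python
# Hypothetical sketch of conjunctive filter combination: a target row
# is shown only if it matches every active column filter.
rows = [
    {"set==1.0": "T", "deactivate==0.0": "T", "Decision": "T"},
    {"set==1.0": "T", "deactivate==0.0": "F", "Decision": "F"},
    {"set==1.0": "F", "deactivate==0.0": "x", "Decision": "F"},
]

# Filters as set in Figure 7.18 (c): both conditions filtered to True.
filters = {"set==1.0": "T", "deactivate==0.0": "T"}

shown = [r for r in rows
         if all(r[col] == val for col, val in filters.items())]
print(len(shown))   # 1 -- only the all-true target remains
```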


Figure 7.19: If a Stateflow state or transition segment has CSEPT targets, then you can right click on the state or transition and select View Coverage Details to view the relevant CSEPT coverage information.
images/CSEPTCvgDetails_web.png

Figure 7.19 shows the Coverage Details dialog for Child State Exit via Parent Transition (CSEPT) coverage. For the full definition of this metric, see Section 6.2.4. Conceptually, CSEPT tracks whether each child of a Stateflow state S has been exited by each transition that causes S to exit. In the figure, CSEPT tracks whether each of the states Inactive, Active, and Init has been exited as a result of the transition from On to Off firing.

Right-clicking on a (non-top-level) state S and selecting View Coverage Details causes the display of a dialog that lists the transitions that cause the parent of S to exit. Clicking the Highlight buttons in the dialog lets you identify the state and transitions from the list in the main panel.

Right-clicking on a transition segment that is part of a transition T that causes a parent state to exit and selecting View Coverage Details causes the display of a dialog that lists the child states that can be exited as a result of T firing. Clicking the Highlight buttons causes the corresponding items to be highlighted in the main panel.

7.6.3  The Coverage Report Browser

The Coverage-Report Browser enables you to view detailed coverage information and export the reports in HTML format. It is invoked by selecting Coverage > Show Report and is described in detail in Chapter 10.

7.7  Exporting and Importing Test Suites

7.7.1  Exporting Test Suites


Figure 7.20: The Reactis test-suite export window.
images/export_web.png

The export feature of Reactis allows you to save .rst files in different formats so that they may be processed easily by other tools. The feature is launched by selecting Test Suite > Export... when a test suite is loaded in Simulator. You specify the format and name of the exported file in the General tab of the Export Dialog (Figure 7.20). For some export formats, other tabs appear in the dialog to enable you to fine-tune exactly what is included in the exported file. In the case of .csv files, you may specify a subset of tests from the test suite to be exported as well as which data items (inputs, outputs, test points, configuration variables) should be included in each test step. The following formats are currently supported:

.m files:
Suites may be saved as MATLAB scripts so that they may be run in The MathWorks’ Simulink / Stateflow environment. Section 12.2 describes how to execute exported .m files in Simulink and how the rsRunTests utility distributed with Reactis enables you to load an exported .m file, execute the tests therein, and report any differences between the output values computed by Simulink and the values stored in the tests. When exporting to this format you have a choice of exporting fixpoint values on harness input or output ports as either “double” values (easier to read) or Simulink fixpoint objects (more precise). Enumerated values can be exported as either their underlying integer values or instances of their enumeration objects.
.mat files:
Suites may be saved as .mat files so that they may be run using The MathWorks’ Simulink / Stateflow environment. This binary format enables values in tests to be represented with more precision than is possible in the ASCII-based .m file format. When running a Reactis-generated test suite on a Simulink model, the higher precision of test data helps avoid some rounding errors. Section 12.2 describes how to execute exported .mat files in Simulink. The rsRunTests utility works for .mat files exactly as described above for .m files. When exporting to this format, fixpoint values will always be exported as “double” values and enumerated values will be exported as their underlying integers.
.mat files (for FromWorkspace blocks):
Suites may be saved in an alternative .mat file format so that they may be run using The MathWorks’ Simulink / Stateflow environment on a modified version of the model that uses ’FromWorkspace’ blocks in place of harness inports. Section 12.2 describes the contents of these exported files and how to execute them in Simulink.
.csv files:
Suites may be saved as comma separated value (CSV) files. The different tabs of the export dialog enable you to specify which data from a test suite should be exported. Namely, you can indicate which tests should be exported and for each test step which inputs, outputs, test points, and configuration variables should have values recorded.

The first line of an exported file will contain a comma separated list of the names of the model’s input and output ports, test points, and configuration variables that were selected for export. A column recording the simulation time has the label ___t___. Any names containing non-alphanumeric characters will be surrounded by double quotes (") and newlines in names will be translated to \n. Subsequent lines contain either:

  • A comma-separated list of values that includes one value for each item appearing in the first row. The order of the values in a row corresponds to the order the items appeared in the first line. Each such line contains the values for one simulation step.
  • An empty line signaling the end of a test.
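The exported CSV layout described above (header row, value rows, empty line terminating each test) can be consumed by other tools with a few lines of code. The following Python sketch is illustrative only; the sample data is invented, and a real exported file may contain quoted names and additional columns:

```python
import csv

# Illustrative parser for the exported CSV layout described above:
# a header row, then one row of values per simulation step, with an
# empty line marking the end of each test.
def parse_exported_suite(text):
    lines = text.splitlines()
    header = next(csv.reader([lines[0]]))
    tests, current = [], []
    for line in lines[1:]:
        if line.strip() == "":        # empty line ends a test
            if current:
                tests.append(current)
            current = []
        else:
            current.append(next(csv.reader([line])))
    if current:                       # final test may lack a trailing blank
        tests.append(current)
    return header, tests

sample = '___t___,set,deactivate\n0.0,1,0\n0.2,0,0\n\n0.0,1,1\n'
header, tests = parse_exported_suite(sample)
print(header)       # ['___t___', 'set', 'deactivate']
print(len(tests))   # 2
```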

The options for configuring the CSV output work as follows:

Compress output.
If selected, a test step is omitted when every item that would be recorded in the step (other than the simulation time) matches the corresponding value in the previously recorded step. This is especially useful when exporting only inport data for a test in which inputs are held constant for a number of steps.
Export vector, matrix and bus elements in separate columns.
If checked, each of these items is placed in its own column. If not checked, when a port carries a vector or bus signal, the values of the signal appear within double quotes (") as a comma-separated list.
Prepend ’|’ to configuration variable names (to prevent ambiguities).
This option helps avoid problems if a port and configuration variable have the same name.
Export .rsi file revision.
If checked, an extra column labeled RsiRevision is included in the exported CSV. In this column the first step of the first test lists the current revision of the .rsi file.
Export each test to a separate file.
Instead of exporting to a single file and separating tests by an empty line, each test is exported to a separate file. The file name specified in the Output File box will be used as the base file name to which N.csv will be appended to form the file name (where N is the test number).
Export test names to column.
If checked, a column with the specified name as a header is included. For each test step, the column contains the name of the test.
Export test point column headers as.
Selects the column header content for exported test points: “name” shows the test point’s name as specified in the test point parameter dialog; “location” shows the test point’s complete path within the model hierarchy. Select “legacy” to use the same header format as Reactis V2016.2 and earlier for backwards compatibility.
Export boolean values as.
Select whether boolean values will be exported as integers (0, 1) or doubles (0.0, 1.0).
Export enumerated values as.
Lets you specify whether each enumerated value should be exported as its underlying integer or as its name.
Export fixed-point values as.
Lets you specify whether each fixed-point value should be exported as its real-world value (default) or as its stored integer.
Significant digits for floating-point values.
Lets you specify the number of significant digits to be used when exporting floating-point values.
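The Compress output rule described above can be sketched in a few lines (an illustrative Python fragment, not Reactis code; steps are modeled as (time, values) pairs):

```python
# Illustrative sketch of the "Compress output" rule: a step is kept
# only when some recorded value other than the simulation time
# differs from the previously kept step.
def compress(steps):
    kept, prev = [], None
    for t, values in steps:
        if prev is None or values != prev:
            kept.append((t, values))
            prev = values
    return kept

steps = [(0.0, (1, 0)), (0.2, (1, 0)), (0.4, (0, 0))]
print(compress(steps))   # [(0.0, (1, 0)), (0.4, (0, 0))]
```

The step at time 0.2 is dropped because its values match the step at time 0.0; only steps where something changes survive.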

7.7.2  Importing Test Suites

Reactis can also import tests and add them to the current test suite. Test suites may be imported if they are stored in Reactis’s native .rst file format or in the comma separated value (CSV) format (described above) that Reactis exports. The import feature is launched by selecting Test Suite > Import... when Simulator is enabled.

To execute a test suite in Simulator, the test suite must match the executing model. A test suite matches a model if it contains data for the same set of inports, outports, test points, and configuration variables as the model. If an externally visible data item (inport, outport, test point, or configuration variable) is added to or removed from a model, then previously constructed test suites will no longer match the new version of the model. The import facility gives you a way to transform the old test suite so that it matches the new version of the model. Such remapping is also available when importing .csv files.


Figure 7.21: The Import Test Suite dialog allows you to import external test data (comma separated value format) and if necessary transform the data to produce a test suite that matches a model. The import facility is also used to transform an .rst file to make it match a model.
images/import_web.png

The Import Dialog, shown in Figure 7.21, is used to specify how test data should be remapped during import. The dialog contains a tab for each type of data item stored in a test suite (inputs, outputs, test points, configuration variables). In the case of .csv files, the import dialog also contains a tab Not Imported that lists items present in the CSV file that are not scheduled to be imported into the new test suite. When an .rst file includes an item not scheduled to be imported, it is placed at the bottom of the appropriate tab. For example, if a test suite contains an inport X and is being imported with respect to a model that has no inport X, then X will appear at the bottom of the Input Ports tab and be highlighted in yellow.

Each data item tab (e.g., Input Ports) includes a column (e.g., Model Port Name) listing the model items in that category. The Suite column lists items from the file being imported that map to the corresponding model item. In most cases a data item X in the test suite being imported will map to an item with the same name in the model. If the model contains an item not found in the test suite being imported, then the corresponding Suite column entry is listed as Not Assigned and highlighted in yellow (as shown in Figure 7.21 for inport brake). A drop-down box at the bottom of the import dialog lets you select Zero or Random; this specifies the value to be used for Not Assigned items. If a type (e.g., enumeration or fixed-point) or type constraint (e.g., the set 1,2,3) does not allow a zero value, then the value closest to zero allowed by the type is used. If Random is selected, unassigned inputs receive a new random value at each step, while unassigned configuration variables receive a new random value for each test, remaining unchanged throughout the test. When importing buses/arrays, all leaf elements are set to the Random/Zero value as selected. The range of the random values is the full range of the type constraint of the input; if there is no type constraint, random values are drawn from the full range of the input’s type.


Figure 7.22: Importing test data into a bus.
images/importBus_web.png

If the model has bus input ports, the Import Test Suite dialog marks these with a [+] or [-] to the left of the port name. A name preceded by [+] indicates that there are hidden sub-components which are not currently being displayed. Conversely, a name preceded by [-] indicates sub-components that appear immediately below the name, indented to indicate their status as sub-components. Left-clicking on the name of a bus input toggles the display of its sub-components. Figure 7.22 shows a test suite being imported into a model containing a bus input.

Finally, note that when loading a test suite stored in the pre-V2006 format, the test suite will be automatically converted to the new format.

7.8  Updating Test Suites

When a test suite is loaded in Simulator, this feature can be invoked by selecting menu item Test Suite > Update... to open the dialog shown in Figure 7.23. The dialog offers two methods for updating an existing test suite (.rst file):

  • The first method Update using Reactis executes a test suite in Reactis, updates items (inputs, configuration variables, test points, and outputs) as configured by the checkboxes, and writes the modified suite to a new .rst file.
  • The second method Update outputs using Simulink exports the test suite to .mat format, executes the exported suite in Simulink, updates the test suite to contain the output values computed by Simulink, and saves the updated suite to a .mat or .m file (in the format supported by the Reactis rsRunTests utility).

Figure 7.23: The Update Test Suite dialog offers several options to configure how a test suite is updated.
images/updateTestSuite_web.png

7.8.1  Update Using Reactis

This method creates a new test suite by simulating the current model using inputs from the current test suite, while recording the values of outputs and test points generated by the model. Updating the outputs can be useful when a model is modified but its input ports remain unchanged. The result is a new test suite with the same input values at each step, but with outputs and test points (as specified in the dialog) updated with values generated by the currently loaded version of the model.

Additionally, inputs and configuration variables can also be updated to conform with constraints specified in the .rsi file. When these options are selected, if an input or configuration variable has a test suite value not conforming to its constraint, then the test suite is updated to contain the closest value adhering to the constraint.

The five check-boxes specify what is written to the new test suite as follows:

Adapt harness input port values to match constraints.
Checked: If an input value in the test suite does not conform to the constraint specified for that input in the .rsi file, convert the value to the closest value adhering to the constraint.
Not checked: Do not change an input value, even if it violates the input’s constraint.
Update inputs controlled by virtual sources.
Checked: Each input currently controlled by a virtual source is updated with the values computed by the controlling virtual source at each step as the tests execute.
Not checked: The values for each input are those from the original test suite.
Adapt configuration variable values to match constraints.
Checked: If a configuration variable value in the test suite does not conform to the constraint specified for that variable in the .rsi file, convert the value to the closest value adhering to the constraint.
Not checked: Do not change a configuration variable value, even if it violates the variable’s constraint.
Update test points.
Checked: Each test point is updated with the value computed by the model for the test point as the tests execute.
Not checked: The values for each test point are those from the original test suite.
Update outputs.
Checked: Each output is updated with the value computed by the model for the output as the tests execute.
Not checked: The values for each output are those from the original test suite.

7.8.2  Update Outputs Using Simulink

This method creates a new test suite by loading the current model in Simulink, executing it (in Simulink) using the input values from the current test suite, and capturing the outputs produced by the model. The inputs and outputs are stored using the .m or .mat format supported by the Reactis rsRunTests utility.

7.9  Model Highlighting

Simulator renders model diagrams using a number of different colors and line styles to convey information during simulation. In this section, we describe these different drawing styles and their semantics.

Some of the default drawing colors are as follows. During slow, single-step, or mini-step simulation, a model element is drawn in green while it is being evaluated. Selecting Coverage > Show Details configures Reactis to highlight uncovered model elements in red and unreachable model elements in purple. Please refer to Chapter 6 for a description of the different coverage metrics tracked by Reactis.


Figure 7.24: The Select Line Style dialog.
images/lineStyle_web.png

The dialog shown in Figure 7.24 (invoked by selecting View > Select Line Styles...) enables you to configure how Simulator should draw various diagram elements. Each row in the dialog specifies the rendering of one group of model elements. The different groups of configurable diagram elements are:

Uncovered Block.
A Simulink block B is in this group if it has not been fully exercised, i.e. one of the following holds:
  • B is a block included in the branch coverage metric and at least one of B’s branches remains uncovered.
  • B is a conditional subsystem that has never been exercised.
  • B is a logical operator block that has not satisfied the requirements for all MC/DC-related coverage metrics (decision, condition, and MC/DC).
  • B is a relational operator block and some boundary value has not been exercised.
  • B is a harness inport and not all boundary values have been exercised.
  • B is a block with saturate on integer overflow enabled and some boundary value has not been exercised.
  • B is a Lookup table and has uncovered lookup targets.
  • B is an uncovered user-defined target.
  • B is a violated assertion.
Uncovered State.
A Stateflow state is in this group if it has never been entered.
Uncovered Condition Action.
A Stateflow transition segment is in this group if it has not met the requirements for condition action coverage. In other words, its condition action has never been evaluated. If a segment has no condition action, then it is considered uncovered according to condition action coverage if its condition has never evaluated to true. Note that a segment with an empty condition is assumed to evaluate to true whenever the segment is evaluated during simulation.
Uncovered Transition Target.
A Stateflow transition segment is in this group if it has met the requirements for condition action coverage, but has not met the requirements for one of the other coverage metrics associated with transition segments. These metrics include transition action coverage, decision coverage, condition coverage 3, MC/DC, and CSEPT.
Unreachable Block.
A Simulink block B is in this group if Reactis has determined that some aspect of the block’s behavior can never happen and that all behaviors that are possible have occurred. If some reachable behavior has not occurred, then the block is considered a member of the “Uncovered Block” group and rendered accordingly.
Unreachable State.
A Stateflow state is in this group if it can never be entered.
Unreachable Condition Action.
A Stateflow transition segment is in this group if it can never meet the condition action coverage requirement.
Unreachable Transition Target.
A Stateflow transition segment is in this group if it has met the requirements for condition action coverage, but cannot meet the requirements for one of the other coverage metrics associated with transition segments.
Active Block.
A Simulink block is in this group if it is currently being evaluated.
Active State.
A Stateflow state is in this group if it is currently active.
Active Condition Action.
A Stateflow transition segment is in this group after its condition evaluates to true and until the next model element is highlighted as active.
Active Transition Action.
A Stateflow transition segment is in this group as it is firing as a part of a transition.

1
Any operation which modifies the .rsi file is disabled when Simulator is enabled.
2
Only the first three are shown for inputs.
3
Note that “condition coverage” and “condition action coverage” are two distinct metrics.