Automated Testing and Validation with Reactis®

 
 Reactis for C User's Guide

Chapter 8   Reactis Simulator

Simulator provides an array of features — including single- and multi-step forward and backward execution, breakpoints, simulations driven by Tester-generated test suites, and interactive simulation — for simulating your source code. The tool also allows visual tracking of coverage data and the values data items assume during simulation.


simToolbarB_web.png
Figure 8.1: The Reactis Simulator toolbar.

Figure 8.1 contains an annotated screenshot of the Simulator toolbar. Some of the buttons and menus on the leftmost part of the window have been elided; Chapter 4 contains descriptions of these items. The next section describes the labeled items in Figure 8.1, while the section following discusses the menu entries related to Simulator. The subsequent sections discuss the different modes for generating inputs during simulation, the ways to track data values, how to monitor code coverage, the importing and exporting of test suites, and the different code highlighting styles used by Simulator.

8.1  Labeled Window Items

  1. Disable Reactis Simulator.
  2. Enable Reactis Simulator.
  3. Clicking this button resets the simulation; the code execution is returned to the start state, and coverage information is appropriately reset.
  4. Clicking this button causes the simulation to take n steps back, where n is specified by window item 15. Coverage information is updated appropriately upon completion of the last backward step.
  5. Clicking this button causes the simulation to take one step back. Coverage information is updated appropriately.
  6. Step backward to the point just before the current function was called.
  7. Step backward one statement. Any function calls performed by the statement are stepped over.
  8. Step backward one statement. If the statement performed any function calls, stop at the end of the last function which was called.
  9. Clicking this button executes a single C statement, stepping into a function when at a function call. The button is disabled during fast simulation and reverse simulation.
  10. When paused at a function call, clicking this button steps over the function (executes the function and pauses at the following statement).
  11. Step out of the currently executing function.
  12. Clicking this button causes the simulation step to advance forward by one full step; that is, values are read on the harness inputs, the program’s response is computed and values are written to the harness outputs. If a step has been partially computed, then execution picks up with the current partially computed step and continues until the end of the step, at which point values are written to the harness outputs.
  13. When paused, clicking this button causes n forward simulation steps to be taken, where n is specified by window item 15. The diagram in the main panel is updated during simulation to reflect the currently executing code. When Coverage -> Show Details is selected, coverage targets will change from red to black as they are covered during the simulation run. If the end of the current test or test suite is reached or you click the Run/Pause button again (window item 13), then simulation stops at the end of the current simulation step.
  14. Clicking this button causes n simulation steps to be executed, where n is specified by window item 15. The diagram in the main panel is not updated while the simulation is in progress but is updated when simulation halts. If the end of the current test or test suite is reached then simulation halts.

    When a fast simulation is running, clicking this button pauses the simulation at the end of the currently executing step.

  15. This window item determines how many steps are taken when the buttons corresponding to window items 4, 13, or 14 are clicked. When the Source-of-Inputs Dialog (window item 16) is set to a test or test suite, the number of steps may be set to 0 to indicate that the entire test or test suite should be executed.
  16. The Source-of-Inputs Dialog determines how input values are computed during simulation. See Section 8.4 for details.
  17. Clicking this button causes a new, empty test suite to be created. The name of the .rst file containing the suite is initially “unnamed.rst” and is displayed in the title bar of the Reactis for C window.
  18. Clicking this button displays a dialog for selecting a test-suite (.rst file) to be loaded into Simulator. After it is loaded, the test suite’s name is displayed in the title bar, and the tests are listed in the Source-of-Inputs Dialog (window item 16).
  19. Clicking this button causes the current test suite to be saved.
  20. View Reactis for C help.
  21. The hierarchy panel (not shown explicitly) supports the navigation of the project, as described in Section 4.1. It shows the root build file (.rsm file) at the top, and the C files and libraries below.
  22. The main panel displays the contents of the C or .rsm file currently selected in the hierarchy panel. You may interact with the panel in a number of different ways using the mouse. These include hovering over items in the code (e.g. variables, function names, macros) or right-clicking in various parts of the panel. The following mouse operations are available when Simulator is enabled:
    Hovering...

    • over a variable name (global or local) or function parameter displays:
      • the current value of the variable or “[out of scope]” if the variable is not currently in scope.
      • where the variable is defined.
      • where the variable was last modified.
    • over a function name displays the return type and types of its parameters along with the location of the function definition.
    • over a macro name in a macro invocation displays the expansion of the macro.
    • over a typedef shows the typedef declaration.
    • over the argument of a #include directive displays the pathname of the file which was included.
    • over any Reactis coverage target will display the test and step within the test during which the target was first executed. This information is presented in a message of the form “Covered: test/step”. A “.” in the test location indicates the current simulation run. For boundary value coverage, you may hover over the definition of inputs or configuration variables. In the case of inputs, this is either the definition of a parameter of the entry function or a global variable declaration. In the case of configuration variables, this is a global variable declaration.

      For more details on querying coverage information see Section 8.6, Chapter 12, and Chapter 7.

    Right Clicking...

    Causes different pop-up menus to be displayed. The contents of the menus vary based on where the click occurs and whether or not Simulator is enabled. A summary of the menu items available when Simulator is enabled follows. For descriptions of the menu entries available when Simulator is disabled, see Section 4.1.
    Right-Click Location
    Menu Entries (when Simulator is enabled)
    Global variable or static local variable
    Add To Watched
    Add item to watched variables list (see section 8.5.1).
    Open Scope
    Display item in scope (see section 8.5.2).
    Open Distribution Scope
    Display item in distribution scope (see section 8.5.3).
    Add To Scope
    Add item to previously opened scope. This item only appears when other scopes are open.

    Decision coverage target

    View Coverage Details
    Display dialog containing detailed coverage information for the decision (i.e. status of decision, condition, and MC/DC targets).
    Harness output
    Open Difference Scope
    This menu item is enabled when a test suite is loaded. It opens a scope which displays the differences between the output value computed by the program under test and the output value stored in the test suite. See Section 8.5.4 for details.
    In the line number bar to the left of the main panel.
    Toggle Breakpoint
    Enable or disable breakpoint for the line.
    Double Clicking...

    • in the line number bar to the left of the main panel toggles a breakpoint for the line.
    • on any other item opens a separate window which contains the same information displayed when hovering on that item. This is useful when the information in the hover window is clipped due to excessive length.

8.2  Menus

Except for the documented exceptions related to editing .rsh files, the menus described in Section 4.2 work in the same manner when Simulator is enabled. The following additional menu items are also active when Simulator is enabled.

View menu.
The following entries become enabled when Simulator is “on”.
Show Watched Variables.
Toggle whether or not the watched-variable list is displayed. The default is not to show it; adding to the list automatically causes the list to be displayed.
Clear Watched Variables.
Remove all items from the watch list.
Open Difference Scopes...
Opens a dialog which allows you to select one or more harness outputs and opens a difference scope on each selected output. This item is only enabled when the input source is a test suite. See Section 8.5.4 for details.
Close All Scopes.
Closes all open scopes.
Save Profile as...
Save the current view profile under a new name. The view profile contains the currently opened scopes and watched variables. Profiles are saved with the .rsp suffix.
Load Profile...
Load a different view profile (.rsp file). This will automatically open all scopes and watched variables stored in the profile.
Simulate menu.
The following entries are available when Simulator is enabled.
Simulator on/off.
Enable or disable Simulator. When disabled, Simulator behaves as a source code viewer; that is, the code can be viewed but simulation capabilities are disabled.
Fast Run with Report...
Execute a fast simulation and produce a report. See Section 8.3 for details.
Fast Run.
Same as window item 14.
Run.
Same as window item 13.
Step.
Same as window item 12.
Step Into.
Same as window item 9.
Step Over.
Same as window item 10.
Step Out Of.
Same as window item 11.
Stop.
Stop a fast or slow simulation run.
Reverse Step Into.
Same as window item 8.
Reverse Step Over.
Same as window item 7.
Reverse Step Out Of.
Same as window item 6.
Back.
Same as window item 5.
Fast Back.
Same as window item 4.
Reset.
Same as window item 3.
Clear Breakpoints.
Removes all breakpoints.
Set Animation Delay...
When running a slow simulation, this value specifies the duration of the pause between the evaluation and highlighting of different statements.
Update Configuration Variable...
Initiates a dialog for changing values of configuration variables, which are global or static local variables whose values can only change between tests/simulation runs (but not during a test/simulation run). The simulation must be reset to the start state (by clicking the reset button, window item 3) before the value of a configuration variable may be updated. Note also that whenever inputs are read from a test, the configuration variable values from the test will be used. In other words, manual updates to a configuration variable using this menu item will only have effect in random or user input mode.
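A configuration variable might look like the following minimal sketch (the variable and function names are hypothetical, not taken from the guide):

```c
/* Hypothetical sketch of a configuration variable: a global (or
 * static local) variable whose value stays fixed for the duration of
 * a simulation run, but may be changed between runs via the
 * Update Configuration Variable... dialog. */
static int g_speedLimit = 120;  /* configuration variable */

/* Ordinary program logic that depends on the configuration value. */
int clamp_speed(int requested)
{
    return requested > g_speedLimit ? g_speedLimit : requested;
}
```

Because `g_speedLimit` never changes during a run, every step of a given test sees the same limit; a different run may use a different value.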
Test Suite menu.
New.
Same as window item 17.
Open.
Same as window item 18.
Save.
Same as window item 19.
Save and Defragment.
Removing tests from a test suite can cause the test suite to become fragmented, meaning that space within the file becomes unused. Reactis will reuse those gaps when you add tests. Selecting this menu item will save the current test suite and reorganize it, removing all gaps.
Save As...
Save the current test suite in an .rst file. A file-selection dialog is opened to determine into which file the test suite should be saved.
Import...
Import tests and add them to the current test suite. Importing is described in more detail in Section 8.7.2.
Export...
Export the current .rst file in different formats. Exporting is described in more detail in Section 8.7.1.
Create...
Launch Reactis Tester. See Chapter 9 for details.
Update...

Create a new test suite by simulating the current code using inputs from the current test suite, but recording values for outputs generated by the code. This feature is described in Section 8.8.

Browse...
Open a file selection dialog, and then launch the Test-Suite Browser on the selected file. See Chapter 11 for details.
Browse current...
Launch the Test-Suite Browser on the currently loaded test suite. See Chapter 11 for details.
Add/Extend Test.
At any point during a simulation, the current execution sequence (from the start state to the current state) may be added as a test to the current test suite by selecting this menu item. After the test is added it will appear in the Source-of-Inputs Dialog (window item 16). Note that the new test will not be written to an .rst file until the current test suite has been saved using window item 19 or the Test Suite -> Save menu item.
Remove Test.
Remove the current test from the current test suite. Note that the test will not be removed from the .rst file until the current test suite has been saved using window item 19 or the Test Suite -> Save menu item.
Compare Outputs.
Specify whether or not Simulator should compare the simulation results against the results contained in the test suite being executed. When enabled, if a difference is detected, the discrepancy between the computed value and the value stored in the test suite is reported in a warning.
Coverage menu.
The Coverage menu contains the following entries. Details about the different coverage objectives may be found in Chapter 7. The coverage information available from the various menu items is for the current simulation run. If a test suite is being executed, the coverage data is cumulative. All targets covered by the portion of the current test executed so far, plus those targets exercised in previous tests are listed as covered.
Show Summary.
Open the coverage summary dialog shown in Figure 8.10.
Show Details.
Toggle the reporting of coverage information by coloring, underlining, or overlining code elements in the main panel.
Show Report.
Start the Coverage-Report Browser. See Chapter 12 for details.
Highlight
Decisions, Conditions, MC/DC, MCC, Boundaries, User-Defined Targets, Assertion Violations, C Statements. Each of these menu entries corresponds to one of the coverage metrics tracked by Reactis and described in Chapter 7. When a menu entry is selected and Show Details is selected, any uncovered target in the corresponding coverage criterion will be colored.
Select All.
When Show Details is selected, show coverage information for all metrics.
Deselect All.
When Show Details is selected, show no coverage information.
Highlight Unreachable Targets.
When Show Details is selected, color unreachable targets. A target is unreachable if it can be determined, prior to simulating the code, that the target will never be covered. The analysis used is conservative: marked items are always unreachable, but some unmarked items may also be unreachable.

8.3  Creating Test Execution Reports

Fast Run with Report executes all tests within the current test suite and produces a report which lists all runtime errors (divide-by-zero, overflow, memory errors, missing cases, assertion violations, etc.) and significant differences between output values stored in the test suite and those computed by the program under test.
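Two of the runtime error classes listed above can be illustrated with the following hypothetical functions (neither is from the guide; they exist only to show code that a test run could drive into an error):

```c
#include <limits.h>

/* Divide-by-zero: a test step that sets units to 0 triggers a
 * runtime error that would be listed in the report. */
int per_unit_cost(int total, int units)
{
    return total / units;   /* divide-by-zero when units == 0 */
}

/* Signed integer overflow: incrementing INT_MAX overflows, another
 * error class flagged during test execution. */
int next_id(int id)
{
    return id + 1;          /* overflow when id == INT_MAX */
}
```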


cTestExecReportOptsA_web.png
Figure 8.2: The Reactis Test Execution Report Options dialog is used to select the items which appear in a test execution report.

When Simulate -> Fast Run with Report... is selected, the dialog shown in Figure 8.2 will appear. This dialog is used to select the items which will appear in the report. The following items are labeled in Figure 8.2:

  1. The Report Options panel is used to select optional report items, such as the date, pathnames of input files, etc.
  2. When Include coverage report is selected in the Report Options panel, the Coverage Metrics panel can be used to select which coverage metrics are included in the test execution report. There are three choices for each metric:
    • Summary & Details. Targets of the metric will appear in both the coverage summary and coverage details sections of the report.
    • Summary Only. Targets of the metric will appear in the coverage summary only.
    • None. Targets of the metric will be omitted from the report entirely.
    Note that due to dependencies between metrics, some combinations are not allowed. For example, Summary & Details cannot be selected for Condition targets unless Summary & Details is also selected for Decision targets.
  3. The Output Format panel lets you choose between HTML and XML output formats. There is also an option to preview the results before saving the report.
  4. The Difference limit prevents reports for test runs with many output differences from becoming excessively long. Once the limit is reached, output differences are still counted but no details are included in the report.
  5. When you are satisfied with the selected report options, clicking on this button will close the dialog and start the simulation run.
  6. Clicking on this button will close the dialog without initiating a simulation run.

cTestExecReport_web.png
Figure 8.3: A test execution report can be generated by loading a test suite in Simulator and selecting Simulate -> Fast Run with Report...

Once the simulation run begins, it does not stop until all tests have been executed. During each test, if a runtime error is encountered, the remaining steps of the test are skipped and Simulator continues execution with the following test. After the last test is executed, a window containing the test execution report will appear, as shown in Figure 8.3. An HTML version of the report can be saved by clicking the Save button in the report window.

8.4  Specifying the Simulation Input Mode


inputsSource4_web.png
Figure 8.4: The Source-of-Inputs Dialog enables you to specify how Simulator computes input values.

Reactis Simulator performs simulations in a step-by-step manner: at each simulation step values are generated for each input, and resultant output values are reported. You control how Simulator computes input values using the Source-of-Inputs Dialog (window item 16 in Figure 8.1) shown in Figure 8.4. This dialog always includes the Random Simulation and User Guided Simulation entries; if a test suite has been loaded, then the dialog includes an entry for each test and the All button becomes enabled. The dialog is used to specify how input values are generated as follows.

Random Simulation.
For each input, Reactis for C randomly selects a value from the set of allowed values for that variable, using type and probability information contained in the associated .rsh file. See Chapter 6 for a description of how to enter this information using the Reactis Harness Editor.
User Guided Simulation.
You determine the value for each input using the Next Input Value dialog, which appears when the User Guided Simulation entry is selected. See Section 8.4.1 below for more information on this mode.
Individual Tests.
When a test suite is loaded, each test in the suite has a row in the dialog that contains a test number, a sequence number, a name and the number of steps in the test. Selecting a test and clicking on Ok will cause inputs to be read from the test.
Subset of Tests.
You may specify that a subset of tests should be run by holding down the control key and clicking on each test to be run with the left mouse button. The tests will be run in the order they are selected. As tests are selected the sequence number column is updated to indicate the execution order of the tests. When a new test is started, the code execution is reset to its starting configuration, although coverage information is not reset, thereby allowing users to view cumulative coverage information for the subset of tests.
All Tests.
Clicking the All button in the lower left corner specifies that all tests in the suite should be executed one after another. The tests are executed sequentially. When a new test is started, the code execution is reset to its starting configuration, although coverage information is not reset, thereby allowing you to view cumulative coverage information for the entire test suite. Section 8.4.2 contains more information on this mode.

You can change the sorting order of the tests in the table by clicking on the column headers. For example, to sort the tests by the number of steps, simply click on the header of the Steps column. Clicking again on that header will sort by number of steps in descending order.

You may also use the Source-of-Inputs Dialog to change the name of a test. To do so, select the test by clicking on it, then click on the name and, when the cursor appears, type in a new name.

8.4.1  User Input Mode

When the User Guided Simulation entry is selected from the Source-of-Inputs Dialog, users must provide values for input variables at each execution step. This section describes how this is done.


guidedSimB_web.png
Figure 8.5: The Next Input Values dialog.

To enter the user-guided mode of operation, select User Guided Simulation from the Source-of-Inputs Dialog (window item 16). Upon selecting user-guided mode, the Next Input Values dialog shown in Figure 8.5 appears; it allows you to specify the input values for the next simulation step. Each harness input has a row in the dialog containing entries 1-5 (see numbering in Figure 8.5). The header row includes the elements labeled 6-8.

  1. The input name.
  2. This pull-down menu has three entries that determine how the next value for the input is specified:
    Random.
    Randomly select the next value for the variable from the type given for the input in the .rsh file.
    Entry.
    Specify the next value with the text-entry box in column three.
    Panel.
    Open a sub-panel to specify the next value.
  3. If the pull-down menu in column two is set to Entry, then the next input value is taken from this text-entry box.
  4. If the pull-down menu in column two is set to Entry, then clicking this history button displays recent values the variable has assumed. Selecting a value from the list causes it to be placed in the text-entry box of column three.
  5. The arrow buttons in this column enable scrolling through the possible values for the variable. The arrows are only available for variables having a type or base type of double or single for which a resolution or an enumerated set of values is also specified.
  6. This pull-down menu sets the input type for all input variables at once to either Random or Entry.
  7. Clicking this button sorts the rows by input number.
  8. Clicking this button sorts the rows by input name.

When Simulate -> Run or Simulate -> Fast Run is selected or the corresponding toolbar button (13 or 14 in Figure 8.1) is pressed, the input value specifications in the Next Input Values dialog are used for each step in the simulation run.

8.4.2  Test Input Mode

Simulation inputs may also be drawn from tests in a Reactis test suite. Such tests may be generated automatically by Reactis Tester, constructed manually in Reactis Simulator, or imported using a comma separated value file format. By convention files storing Reactis test suites have names suffixed by .rst.

A Reactis test suite may be loaded into Simulator by clicking the open button in the toolbar to the right of the Source-of-Inputs Dialog (window item 18 in Figure 8.1) or by selecting the Test Suite -> Open menu item.

When a test suite is loaded, the name of the test suite appears in the Reactis for C title bar and the tests of the suite are listed in the Source-of-Inputs Dialog.

When executing in test input mode while Test Suite -> Compare Outputs is selected, after each simulation step, Simulator compares the values computed by the code against the values stored in the test suite for those items. Any difference is flagged if it exceeds the error tolerance specified in the Global Settings dialog (item 1 in Figure 8.6). More precisely, for an output p, let

tol be the relative tolerance specified in the Settings Dialog
v1 be the value in the test suite for p at step i of test j
v2 be the value computed by Reactis for C for p at step i of test j


then the value computed for output p at step i of test j exceeds the specified tolerance if | v1 − v2 | > tol × | v1 |.
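The tolerance check can be sketched in C as follows (the function name is illustrative; this is not the actual Reactis implementation). Note that when the stored value v1 is 0, any nonzero difference exceeds the tolerance, since the right-hand side of the comparison is 0.

```c
#include <math.h>

/* Relative-tolerance check, as described above.
 * v1:  value stored in the test suite for the output
 * v2:  value computed by the program under test
 * tol: relative tolerance from the Global Settings dialog
 * Returns 1 when the difference exceeds the tolerance. */
int exceeds_tolerance(double v1, double v2, double tol)
{
    return fabs(v1 - v2) > tol * fabs(v1);
}
```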


rfcRelativeTol_web.png
Figure 8.6: The relative tolerance used by Reactis Simulator to compare values computed by the program against values stored in test suites may be specified in the Global Settings dialog.

8.5  Tracking Data-Item Values

Reactis Simulator includes several facilities for interactively displaying the values that data items assume during simulation. The watched-variable list, or “watch list” for short, displays the current values of data items designated by the user as “watched variables.” You may also attach scopes to global or static local variables in order to display their values at the end of a simulation step plotted on a graph with time on the horizontal axis. Scopes let you easily see how a variable changes during a simulation run. Distribution scopes enable you to view the set of values a data item has assumed during simulation (but not the time at which they occur).

Difference scopes may be opened for harness outputs when reading inputs from a test in order to plot the values computed by the program under test against the values stored in the test for the output.

You may add data items to the watch list, or attach scopes to them, by right-clicking on a data item in the Reactis main panel and selecting an entry from the resulting menu as described in Section 8.1.

You may save the current configuration of the data tracking facilities (the variables in the watch list and currently open scopes along with their locations) for use in a future Simulator session. You do so by selecting View -> Save Profile As... and using the resulting file selection dialog to specify a file in which to save a Reactis profile (.rsp file). The profile may be loaded at a future time by selecting View -> Load Profile....

8.5.1  The Watched-Variable List

The watch list is displayed in a panel at the bottom of the Simulator screen as shown in Figure 3.8. By default this panel is hidden, although adding a variable to the watch list causes the panel to become visible. Visibility of the panel may also be toggled using the View menu as described in Section 8.2. The panel displays a list of data items and their values. The values are updated when Simulator pauses.

The contents of the watch list may be edited using a pop-up menu that is activated from inside the watch-list panel. Individual data items in the panel may be selected by left-clicking on them. Once an item is selected, right-clicking invokes a pop-up menu that enables the selected item(s) to be deleted, have a scope opened, or have a distribution scope opened. If no item is selected, then these choices are grayed out. The right-click pop-up menu also includes an entry Add Variables which displays a list of all data items in the program under test which may be added to the watch list.

The View menu contains operations for displaying/hiding the watch list, adding data items to the watch list, and clearing the watch list.

8.5.2  Scopes

Scopes appear in separate windows, an example of which may be found in Figure 8.7. The tool bar of each scope window contains seven or more items. The first two, window items 1 and 2, are toggle buttons for controlling the scaling of the coordinate system used to display values, and the third (window item 3) is a “zoom to fit” button. If both toggles are “on” (the default), then left-clicking in the scope window causes the view to become zoomed-in and re-centered; right-clicking causes the view to become zoomed-out and re-centered. Turning off either of buttons 1 or 2 disables scaling for the indicated axis: X or Y respectively.


scopeB_web.png
Figure 8.7: A scope window plotting desired speed (yellow) and actual speed (green).

The fourth and fifth tool bar items are toggle buttons to indicate whether data items should be plotted as solid lines (window item 4) or as points (window item 5).

Clicking the CSV button (window item 6) causes the data presented in the scope to be saved to a CSV (comma separated value) file.

Clicking the ? button displays help for scopes.

If more than one data item is plotted on a scope, then a toggle button will appear in the tool bar for each data item (window items 8 and 9). Turning one of these buttons off will hide the corresponding data item in the scope. Hovering over the button will display the data item to which the button corresponds.

8.5.3  Distribution Scopes

Distribution scopes also appear in separate windows, an example of which may be found in Figure 8.8. The values a data item assumes are displayed as data points distributed across the X-axis. Left-clicking in the distribution scope causes the view to be zoomed-in and re-centered; right-clicking causes the scale to become zoomed-out and re-centered. Clicking the “zoom to fit” button re-scales the view so that the minimum value appears at the left edge of the plot and the maximum value appears at the right edge.


distributionScopeB_web.png
Figure 8.8: Distribution scopes plot the values a data item has assumed during simulation.

8.5.4  Difference Scopes

Difference scopes are useful when one or more values stored in a test for an output do not match the corresponding values computed by the program under test. A difference scope plots the expected value (from the test) against the actual value (from the program under test), as shown in Figure 8.9. When executing tests from a test suite, a difference scope may be opened by either (1) right-clicking on a top-level output and selecting Open Difference Scope, or (2) selecting View -> Open Difference Scopes... from the menu bar at the top of the Reactis main window.

Differences between the two values are highlighted by a red bar on the X-axis.


differenceScope_web.png
Figure 8.9: A difference scope may be opened by right-clicking on a top-level output and selecting Open Difference Scope. The scope plots the values stored in a test for an output and the values computed by the program under test for the output. Differences are flagged in red.

8.6  Tracking Code Coverage

Chapter 7 describes the coverage metrics that Reactis for C employs for measuring how many of the coverage targets of a given class of syntactic constructs appearing in the code have been executed at least once. Simulator includes extensive support for viewing this coverage information about the parts of the code that have been exercised by the current simulation run. If a test suite is being executed, the coverage data is cumulative. That is, all targets covered by the portion of the current test executed so far, plus those targets exercised in previous tests, are listed as covered.

8.6.1  The Coverage Summary Dialog

The Coverage Summary Dialog shown in Figure 8.10 may be invoked at any time Simulator is enabled by selecting Coverage -> Show Summary. The dialog reports summary statistics for each coverage criterion tracked by Reactis. Each row in the dialog corresponds to one of the criteria and includes the five columns described below from left to right.

  1. The name of the coverage criterion reported in the row.
  2. The number of targets in the criterion that have been exercised at least once.
  3. The number of targets in the criterion that are unreachable. A conservative analysis is performed to check for unreachable targets. Any target listed as unreachable is provably unreachable; however, some unreachable targets might not be flagged as unreachable.
  4. The number of reachable targets in the criterion that have not been exercised.
  5. The percentage of reachable targets in the criterion that have been exercised at least once.
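The relationship among these five columns can be expressed as a small computation. This is a sketch for illustration only; the function and field names are hypothetical:

```python
# Sketch of how one Coverage Summary row's figures relate to each other.
# `total` is the number of targets for the criterion, `covered` the number
# exercised at least once, `unreachable` the number proved unreachable.
def summary_row(name, total, covered, unreachable):
    reachable = total - unreachable
    uncovered = reachable - covered
    percent = 100.0 * covered / reachable if reachable else 100.0
    return (name, covered, unreachable, uncovered, round(percent, 1))

# Example: 40 decision targets, 2 provably unreachable, 19 exercised.
print(summary_row("Decision", 40, 19, 2))  # -> ('Decision', 19, 2, 19, 50.0)
```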

cvgSummary_web.png
Figure 8.10: The Coverage Summary Dialog

8.6.2  Coverage Information in the Main Panel

Selecting Coverage -> Show Details toggles the highlighting of unexercised targets in the main panel. Targets that have been covered are drawn in black, while uncovered targets are drawn in red. Please refer to Chapter 7 for a detailed description of how the different coverage metrics are highlighted in the main panel.

Hovering over an exercised target causes a pop-up to be displayed that gives the test and step in which the target was first executed. This coverage information is displayed as a message of the form test/step. A “.” appearing in the test position (./step) denotes the current simulation run, which has not yet been added to a test suite.


cDecisionDetails_web.png
Figure 8.11: Viewing coverage details for a decision.

For items included in the decision, condition, MC/DC, and MCC metrics, detailed coverage information may be obtained by right-clicking on the item and selecting View Coverage Details. A dialog similar to the one shown in Figure 8.11 will appear with coverage information. This dialog has two tabs, one titled Decision which contains Decision, Condition, and MC/DC details, and a second named MCC which contains MCC details.

Item 1 in Figure 8.11 shows the coverage details dialog with the Decision tab selected. The table within this tab describes the coverage status of all decision, condition, and MC/DC targets for the following decision:

             g_dsMode==M_INIT && (set && !deactivate)

This decision contains three conditions: g_dsMode==M_INIT, set, and !deactivate. Conditions are the atomic Boolean expressions that are used in decisions (for more details on decisions and conditions, see Sections 7.4 and 7.5).

Each decision details table contains seven columns, numbered 1-7 in Figure 8.11. These are interpreted as follows:

  1. The test/step during which the decision first evaluated to true.
  2. The test/step during which the decision first evaluated to false.
  3. The conditions from which the decision is composed.
  4. The test/step during which the condition first evaluated to true.
  5. The test/step during which the condition first evaluated to false.
  6. The condition values and test/step during which an MC/DC target was covered and the decision evaluated to true.
  7. The condition values and test/step during which an MC/DC target was covered and the decision evaluated to false.

Note that although all targets were covered in Figure 8.11, this will not necessarily be the case every time you view the details of a decision; a test/step value of -/- indicates that a target has not yet been covered.

MC/DC requires that each condition independently affect the outcome of the decision in which it resides. (See Section 7.6 for additional details.) When a condition has met the MC/DC criterion in a set of tests, columns 6 and 7 of the table explain how. Each element of these two columns has the form bbb:test/step, where each b reports the outcome of evaluating one of the conditions in the decision during the test and step specified. Each b is either T to indicate the condition evaluated to true, F to indicate the condition evaluated to false, or x to mean the condition was not evaluated due to short circuiting.
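The independent-effect idea can be illustrated with the decision from Figure 8.11. This is a sketch only; the Python function stands in for the C decision, and the variable names mirror the conditions above:

```python
# Sketch of the MC/DC "independent effect" idea for the decision
#     g_dsMode==M_INIT && (set && !deactivate)
# Two evaluations demonstrate independence for a condition when they
# differ only in that condition's value and the decision outcome flips.
def decision(c1, c2, c3):
    # c1: g_dsMode==M_INIT,  c2: set,  c3: deactivate
    return c1 and (c2 and not c3)

# The pair TTF / FTF differs only in c1, and the outcome flips from
# true to false, so it shows c1 independently affects the outcome.
assert decision(True, True, False) is True
assert decision(False, True, False) is False
```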


cMccDetails_web.png
Figure 8.12: Viewing MCC details for a decision.

In addition to MC/DC, Reactis can also measure Modified Condition Coverage (MCC). MCC targets every possible combination of conditions within a decision, so a decision containing N conditions can yield as many as 2^N MCC targets, although the actual number may be smaller when short-circuited evaluation is used.
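The effect of short-circuiting on the MCC target count can be seen by enumerating the example decision's evaluation patterns. This is an illustrative sketch; the helper name is hypothetical:

```python
from itertools import product

# Sketch: enumerate the distinct MCC targets for the short-circuited
# decision  c1 && (c2 && !c3). Unevaluated conditions are recorded as
# 'x', so several of the 2**3 = 8 full truth-table rows collapse into
# a single target.
def mcc_targets():
    rows = set()
    for c1, c2, c3 in product([False, True], repeat=3):
        if not c1:
            rows.add(('F', 'x', 'x', False))   # c2, c3 short-circuited
        elif not c2:
            rows.add(('T', 'F', 'x', False))   # c3 short-circuited
        else:
            rows.add(('T', 'T', 'T' if c3 else 'F', not c3))
    return sorted(rows)

print(len(mcc_targets()))  # -> 4 targets rather than 2**3 = 8
```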

Figure 8.12 shows the coverage details dialog with the MCC tab selected. Each row of this table corresponds to a single MCC target. The table columns are interpreted as follows:

  1. Each of these columns corresponds to a condition. Every condition within the decision is always represented by a single column in the table. For each target, each condition has one of three possible values:
     • True: The condition is true.
     • False: The condition is false.
     • x: The condition is not evaluated due to short-circuiting.
  2. The next-to-last column contains the decision outcomes for each MCC target.
  3. The last column gives the test/step in which the MCC target was covered.

cMccDetailsFilter_web.png
Figure 8.13: Filtering MCC coverage information.

MCC coverage details can be filtered by clicking on the column headers, as shown in Figure 8.13. A filtered column header is indicated by a prefix of T:, F:, or x:, which correspond to the column values True, False, and x, respectively. In Figure 8.13, items 1 and 2 refer to columns with active filters:

  1. Only MCC targets for which the condition g_dsMode==M_INIT evaluated to true are being shown, as indicated by the T: prefix in the header for this column.
  2. Only MCC targets for which the decision evaluated to false are being shown, as indicated by the F: prefix in the header for this column.

Clicking on a column header advances the filter setting for that column to the next feasible value, eventually cycling back to unfiltered. All columns can also be reset to the unfiltered state at any time by clicking on the Clear Filter button (item 3 in Figure 8.13). Note that the individual filters for each column are combined conjunctively (i.e., using the Boolean and operator), so that only targets which satisfy all active filters are shown.

8.6.3  The Coverage Report Browser

The Coverage-Report Browser enables you to view detailed coverage information and export the reports in HTML format. It is invoked by selecting Coverage -> Show Report and is described in detail in Chapter 12.

8.7  Exporting and Importing Test Suites

8.7.1  Exporting Test Suites


export_web.png
Figure 8.14: The Reactis test-suite export window.

The export feature of Reactis for C allows you to save .rst files in different formats so that they may be processed easily by other tools. The feature is launched by selecting Test Suite -> Export... when a test suite is loaded in Simulator. You specify the format and name of the exported file in the General tab of the Export Dialog (Figure 8.14). For some export formats, other tabs appear in the dialog to enable you to fine-tune exactly what is included in the exported file. In the case of .csv files, you may specify a subset of tests from the test suite to be exported as well as which data items (inputs, outputs, configuration variables) should be included in each test step. The following formats are currently supported:

.csv files:
Suites may be saved as comma separated value (CSV) files. The different tabs of the export dialog enable you to specify which data from a test suite should be exported. Namely, you can indicate which tests should be exported and, for each test step, which inputs, outputs, and configuration variables should have values recorded.

If the Compress output check box is selected, then a test step will be omitted if every item that would be recorded in the step has the same value as in the previously recorded step. This is especially useful when exporting only input data for a test in which inputs are held constant for a number of steps.

The first line of an exported file will contain a comma separated list of the names of the harness inputs, outputs, and configuration variables that were selected for export. A column recording the simulation time has the label ___t___. Any names containing non-alphanumeric characters will be surrounded by double quotes (") and newlines in names will be translated to \n.

After the first line, subsequent lines contain either:

  • A comma-separated list of values that includes one value for each item appearing in the first row. The order of the values in a row corresponds to the order the items appeared in the first line. Each such line contains the values for one simulation step. For array valued items, the elements of the array appear within double quotes (") as a comma-separated list.
  • An empty line signaling the end of a test.
.txt files:
Suites may be saved in a more verbose plain ASCII format to facilitate inspection of the test data.
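The CSV layout described above (a header line, one row of values per step, an empty line ending each test, and quoted fields for array values or special-character names) can be consumed with a short sketch. The function name and sample data below are hypothetical, for illustration only:

```python
import csv
import io

# Sketch of a reader for the exported CSV layout: one header line,
# then one row of values per simulation step, with an empty line
# terminating each test. Quoted fields (array values, names containing
# non-alphanumeric characters) are handled by the csv module.
def read_exported_suite(text):
    rows = list(csv.reader(io.StringIO(text)))
    header, tests, current = rows[0], [], []
    for row in rows[1:]:
        if not row:                 # empty line: end of the current test
            tests.append(current)
            current = []
        else:
            current.append(row)
    if current:                     # file may not end with a blank line
        tests.append(current)
    return header, tests

sample = ('___t___,speed,"gear,ratio"\n'
          '0.0,10,"1,3.5"\n'
          '0.2,12,"2,2.1"\n'
          '\n'
          '0.0,9,"1,3.5"\n')
header, tests = read_exported_suite(sample)
print(len(tests))   # -> 2 tests
print(tests[0][1])  # -> ['0.2', '12', '2,2.1']
```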

8.7.2  Importing Test Suites

Reactis can also import tests and add them to the current test suite. Test suites may be imported if they are stored in Reactis's native .rst file format or in the comma separated value (CSV) format (described above) that Reactis exports. The import feature is launched by selecting Test Suite -> Import... when Simulator is enabled.

To execute a test suite in Simulator, the test suite must match the currently selected harness. A test suite matches a harness if it contains data for the same set of inputs, outputs, and configuration variables as the harness. If an externally visible data item (input, output, or configuration variable) is added to or removed from the harness, then previously constructed test suites will no longer match the new version of the harness. The import facility gives you a way to transform the old test suite so that it matches the new version of the harness. Such remapping is also available when importing .csv files.

The Import Dialog, shown in Figure 8.15, is used to specify how test data should be remapped during import. The dialog contains a tab for each type of data item stored in a test suite (inputs, outputs, configuration variables). In the case of .csv files, the import dialog also contains a tab Not Imported that lists items present in the CSV file that are not scheduled to be imported. When an .rst file includes an item not scheduled to be imported, it is placed at the bottom of the appropriate tab. For example, if a test suite contains an input variable X and is being imported with respect to a harness that has no input variable X, then X will appear at the bottom of the Input variables tab and be highlighted in yellow.

Each data item tab (e.g. Inputs) includes a column (e.g. Harness Input Name) listing the code elements in that category. The Suite column lists the items from the file being imported that map to the corresponding harness items. In most cases a data item X in the test suite being imported will map to an item with the same name in the harness. If the harness contains an item not found in the test suite being imported, then the corresponding Suite column entry will be listed as Random Value and highlighted in yellow (as shown in Figure 8.15 for input brake). If this setting is not changed, then upon import a random value will be generated for the input at each test step. The value mapped to any harness item may be changed by double-clicking the corresponding entry in the Suite column (or by selecting the item and clicking the Select Suite Item button) and then using the resulting dialog either to select an item from the test suite being imported or to set it to Random Value.


import_web.png
Figure 8.15: The Import Dialog allows you to import external test data (comma separated value format) and if necessary transform the data to produce a test suite that matches a harness. The import facility is also used to transform an .rst file to make it match a program under test.

8.8  Updating Test Suites

Test Suite -> Update... creates a new test suite that retains the inputs from the current test suite, but updates the outputs with values computed by the execution of your code on the existing inputs. This provides a means to reuse previously constructed test suites when your code changes in a way that causes its externally visible behavior to change.
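Conceptually, the update operation replays each test's inputs through the changed program and records the freshly computed outputs. The sketch below is illustrative only; `step_fn` is a hypothetical stand-in for one simulation step of the program under test:

```python
# Conceptual sketch of Test Suite -> Update: keep each test's inputs
# unchanged, but recompute the outputs by replaying the inputs through
# the (changed) program, represented here by `step_fn`.
def update_suite(tests, step_fn):
    updated = []
    for inputs_per_step, _old_outputs in tests:
        new_outputs = [step_fn(inp) for inp in inputs_per_step]
        updated.append((inputs_per_step, new_outputs))
    return updated

# Example with a toy "program" whose behavior has changed.
suite = [([1, 2, 3], [2, 4, 6])]
print(update_suite(suite, lambda x: 2 * x + 1))
# -> [([1, 2, 3], [3, 5, 7])]
```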


1. Depending on the variable and its type, some of the hover information may not be available.
2. Depending on a number of factors, including the type of a right-clicked data item, some menu items may not be available. For example, scopes are not available for arrays.
3. Any operation which modifies the .rsh file is disabled when Simulator is enabled. Several entries from the Edit menu open the Reactis Harness Editor in read-only mode.