Automated Testing and Validation with Reactis®

 

Chapter 9  Reactis Validator

Reactis Validator searches for defects and inconsistencies in models. The tool enables you to formulate a property that model behavior should satisfy as an assertion, attach the assertion to a model, and perform an automated search for a violation of the assertion. If Validator finds an assertion violation, it returns a test that leads to the problem. This test may then be executed in Reactis Simulator to gain an understanding of the sequence of events that leads to the problem.

Validator also lets users specify test scenarios they want exercised, using one of two alternative mechanisms. User-defined targets enable you to specify an abstract test scenario, whereas virtual sources give you a way to easily specify a concrete scenario.

User-defined targets may be seen as user-defined extensions to the built-in coverage metrics supported by Reactis: in generating test data, Validator (and Tester) will attempt to generate test runs that satisfy the indicated scenarios. Like assertions, user-defined targets are attached to models.

Virtual sources are placed at the top level of a model to control one or more top-level inports as a model executes in Reactis. That is, you can specify a sequence of values to be consumed by an inport during simulation or test generation. Virtual sources can be easily enabled and disabled: when enabled, a virtual source controls a set of inports; when disabled, those inports are treated by Reactis as normal top-level inports.

Validator is particularly useful in requirements validation. Given a list of requirements on model behavior, you can formulate assertions (to check whether a requirement is being satisfied) and user-defined targets or virtual sources (to define test scenarios intended to “stress” the requirement).

Engineers use Validator as follows. First, a model is instrumented with assertions to be checked, user-defined targets, and virtual sources. We refer to assertions, user-defined targets, and virtual sources as Validator objectives. The tool is then invoked on the instrumented model to search for assertion violations and paths leading to the specified user-defined targets. The output of a Validator run is a test suite that includes tests leading to objectives found during the analysis.

Assertions and user-defined targets may be added to any Simulink system or Stateflow diagram in a model, whereas virtual sources may only be added at the top level of the model. When adding Validator objectives to a model, three mechanisms for formulating objectives are supported (only the first two are available for virtual sources):

Expression objectives
are C-like Boolean expressions.
Diagram objectives
are references to subsystems in standard Simulink/Stateflow libraries.
Timer objectives
are directives that tag a model data item as a timer or counter.

An expression objective is a C-like Boolean expression whose variables are wired to data items from the context of the model in which the objective is inserted. Expression objectives are easily attached and modified from the Reactis GUI. For more information on valid expressions see Sections 9.3.1 and 9.3.2.
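For example, a hypothetical expression assertion for a transmission model might be

gear >= 1 && gear <= 6

where gear is a variable of the expression wired to the data item in the model that holds the currently selected gear (the names here are purely illustrative).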

Diagram objectives give users the full power of Simulink / Stateflow to formulate assertions, user-defined targets, and virtual sources. The objectives may use any Simulink blocks supported by Reactis, including Stateflow diagrams. Diagram objectives are attached to a model using the Reactis GUI to select a Simulink system from a library and “wire” it into the model. The diagrams are created using Simulink and Stateflow in the same way standard models are built. After adding a diagram objective to a model, the diagram is included in the model’s hierarchy tree, as are other library links in a model.

A timer objective tags a model variable as a timer or counter. In the case of a timer target, the target is considered covered after the data item changes from its start value to its end value by a specified increment. For some models, adding timer targets will help Reactis Tester attain higher levels of coverage. Conceptually, this is because the timer target guides Reactis to make a timer expire or a counter to reach its upper bound. In the case of a timer assertion, the assertion is considered violated if the data item ever changes from its start value to the end value by a specified increment.
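To make the timer semantics concrete, the following C-like sketch approximates how a timer objective with start value 0, step size 1, and end value 5 might be tracked. The function and variable names are hypothetical; this illustrates the semantics described above and is not the Reactis implementation.

/* Called once per simulation step with the current value of the tracked
   data item. Returns 1 when the item has progressed from the start value
   to the end value by the specified increment. Reset handling omitted.   */
static double expected = 0.0;              /* next value expected          */

int timerTriggered(double trackedValue)
{
    if (trackedValue == expected) {
        if (expected == 5.0)
            return 1;                      /* start -> end reached: timer
                                              target covered, or timer
                                              assertion violated           */
        expected += 1.0;                   /* advance by the step size     */
    }
    return 0;
}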

9.1  The Meaning of Validator Objectives

We now explain in more detail how Reactis interprets objectives. Coverage of these objectives may be tracked in the standard manner, namely through highlighting in the main Reactis panel or in the Coverage-Report Browser. Also, in Simulator, hovering over a covered objective shows the test and step number when it was first covered.

9.1.1  Assertions

Assertions specify properties that should always be true for a model. For a given assertion, Validator searches a model for a simulation run that leads to a configuration in which the assertion does not hold. An expression assertion fails to hold if the given expression evaluates to false.1 A timer assertion fails to hold if the data item it tracks ever changes from its start value to the end value by a specified increment. A diagram assertion fails to hold if any output port assumes the boolean value “false” or a numeric value of zero. Note that if a diagram has more than one output port, the individual outports are treated as distinct assertions. That is, each individual outport is listed in the objective list of the Info File Editor. Also, the individual ports are highlighted independently in the main panel according to whether or not they are violated.

An assertion is considered “covered” when a violation is found. So in contrast to targets, where being covered is good, covering an assertion is bad. Covered assertions are therefore highlighted in red and yellow, whereas for targets it is the uncovered targets that are highlighted in red.

9.1.2  User-Defined Coverage Targets

User-defined coverage targets extend the Reactis built-in targets described in Chapter 6 (branch coverage, state coverage, condition-action coverage, MC/DC, etc). If a user-defined target is specified as an expression, Reactis will treat it as covered when the expression evaluates to a non-zero numeric value. If a user-defined target is specified as a timer, the target is considered covered after the data item tracked by the timer changes from its start value to its end value by its specified increment. If a user-defined target is specified as a diagram, Reactis will treat it as covered when all output ports of the associated subsystem have assumed either a boolean “true” value or any non-zero numeric value. Note that if a diagram has more than one output port, the individual port names are listed in the target list and highlighted independently according to whether they are covered or not.

9.1.3  Virtual Sources

Virtual sources control top-level inports of a model. If a virtual source is an expression, then at each simulation step, the expression is evaluated to compute a value that will be fed into the top-level inport controlled by the virtual source. If a virtual source is specified as a diagram, then each outport of the diagram may be fed into a top-level inport of the model. Some outports of the virtual source diagram may be left unconnected and monitored by an assertion.

9.2  Use Cases of Validator Objectives

9.2.1  Checking a Requirement with an Expression Assertion and an Expression User-Defined Target

Consider a cruise control application that has the following requirement:

The cruise control shall remain inactive if the speed of the vehicle is less than 30 mph.

To properly test this requirement we need to execute a test in which an attempt is made to activate the cruise control while the speed of the vehicle is less than 30 mph. We can capture this as a simple expression-based user-defined target:

on && activate && (speed < 30)

This expression would be "wired-in" such that on monitors whether the cruise control is turned on, activate monitors whether an attempt has been made to activate the cruise control, and speed is the speed of the vehicle. When this expression becomes true, we have found a test to check our requirement.

We can test whether the application's response to this test is correct with the following expression assertion:

!(active && (speed < 30))

This expression is wired in so that active monitors whether the cruise control is active. If this expression ever becomes false, then the requirement is violated. Alternatively, this requirement could be specified with an equivalent expression that uses the logical implication operator:

active -> (speed >= 30)
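Under the C-like convention used by the expression language (false is zero, true is any non-zero value), the implication a -> b is equivalent to !a || b, so the assertion could also be written as:

!active || (speed >= 30)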

9.2.2  Checking a Requirement with a Diagram Assertion

Consider another requirement for a cruise control:

When active, the cruise control shall not permit the actual and desired speeds to differ by more than 1 mph for more than 3 seconds.

This requirement can be checked by a diagram assertion that monitors the vehicle speed, whether the cruise control is active, and the speed at which the driver has set the cruise control. The top-level interface of the Simulink subsystem (in this case a Stateflow diagram) is shown in Figure 9.1.


images/speedChkTop_web.png
Figure 9.1: The diagram to check the speed maintenance requirement monitors three data items in the model and raises a flag on its output if the requirement is violated.

The simple Stateflow diagram that implements the assertion is shown in Figure 9.2. It works as follows:

  1. Initially the diagram is in the Inactive state.
  2. When the cruise control becomes active it enters state Active and child state OkDiff.
  3. When in OkDiff, the diagram computes the difference between the vehicle speed and the speed at which the driver has set the cruise control.
    1. If the difference exceeds the tolerance, then go to state BigDiff.
    2. If the difference is less than the tolerance, then stay in OkDiff.
  4. When entering BigDiff, start a counter.
    1. If the speed difference is corrected within 3 steps, then return to OkDiff.
    2. If the speed difference is not corrected within 3 steps, then flag an error by outputting the value 0 on its outport.

images/speedChk_web.png
Figure 9.2: The implementation of the speed maintenance assertion.
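
For readers who prefer code to diagrams, the following C sketch approximates the logic of the monitor in Figure 9.2. The names, the 1 mph tolerance, and the assumption that one simulation step corresponds to one second are taken from the description above; the sketch is purely illustrative and is not derived from the actual diagram.

#include <math.h>

/* Returns 0 to flag a violation, non-zero otherwise.
   Called once per simulation step.                                     */
static int inBigDiff = 0;      /* currently in state BigDiff?           */
static int count     = 0;      /* steps spent in BigDiff                */

int speedMaintained(int active, double speed, double setSpeed)
{
    if (!active) {                          /* state Inactive            */
        inBigDiff = 0;
        count = 0;
        return 1;
    }
    if (fabs(speed - setSpeed) <= 1.0) {    /* state OkDiff              */
        inBigDiff = 0;
        count = 0;
        return 1;
    }
    if (!inBigDiff) {                       /* entering state BigDiff    */
        inBigDiff = 1;
        count = 0;
    }
    count++;                                /* another step with too big
                                               a speed difference        */
    return (count <= 3);                    /* not corrected within 3
                                               steps: output 0           */
}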

9.2.3  Creating a Functional Test Using a Virtual Source

Virtual sources offer a convenient mechanism for capturing functional tests. Consider the need to test that the cruise control activates and deactivates as expected when executing the following scenario:

       Input                       Expected Output of Active
  1)   Turn cruise control on      Off
  2)   Press set                   On
  3)   Press cancel                Off
  4)   Press set                   On
  5)   Press brake                 Off
  6)   Press resume                On
  7)   Press gas                   Off

This scenario can be captured using the virtual source shown in Figure 9.3. The Signal Builder block shown implements the virtual source by capturing the scenario described above. The first six signals are fed into inputs of the model, while the final output is monitored by a simple assertion that compares the expected value for active against the actual value produced by the model.


images/activeChkVS_web.png
Figure 9.3: Virtual sources offer an easy way to implement functional tests. Any Simulink / Stateflow construct supported by Reactis may be used to implement the virtual source. Signal Builder blocks are especially useful. A Validator assertion can monitor an output of the virtual source, relieving the user of the tedious task of checking expected values against actual responses of the model.

9.3  Adding, Editing, and Removing Objectives

Adding, editing, and removing objectives is possible only when Simulator is disabled. This is necessary because objectives are linked into a model when Simulator is invoked.

A Validator objective may be added to a model by first selecting in the hierarchy panel the Simulink subsystem or Stateflow state in which the objective should be inserted and then:

  • right-clicking in an empty space in the subsystem in the main panel of the Reactis window and selecting Add Assertion or Add User-Defined Target or Add Virtual Source; or
  • by selecting the Validate -> Add Assertion or Validate -> Add User-Defined Target or Validate -> Add Virtual Source menu items from the main menu bar.

Usage of the resulting dialogs is explained in Section 9.3.1 (expression objectives within Simulink), Section 9.3.2 (expression objectives within Stateflow), Section 9.3.3 (timer objectives), and Section 9.3.4 (diagram objectives).

In Stateflow charts, only expression objectives are supported. Using a group of radio buttons in the wiring dialog, you can choose for the objective to be evaluated at state entry or exit, or every time the chart is triggered while the state is active. These correspond to the usual entry, exit, and during actions of Stateflow. Evaluation of the objective occurs at the end of the corresponding state actions (entry, during, exit).

To edit properties of an objective, either right-click on the objective and select Edit -> Properties..., or click on the objective to select it and then choose the Validate -> Edit Objective... menu item. Note that all the controls in the properties dialog are disabled if Simulator is currently active. To remove an objective, either:

  • right-click on the objective and select Remove, or
  • click on the objective to select it and then choose the Validate -> Remove Objective menu item, or
  • click on the objective to select it and then press the delete key.

Standard cut, copy, and paste operations are available for objectives both from the top-level Edit menu and by right-clicking on an objective. Edit -> Undo and Edit -> Redo enable you to undo and redo these operations. Objectives may be moved by dragging them with the mouse.

9.3.1  The Simulink Expression Objective Dialog


images/exprObj_web.png
Figure 9.4: The dialog for editing expression objectives located within Simulink subsystems.

The numbers below refer to the labels in Figure 9.4.

  1. Objective name. The name must be unique with respect to the subsystem where the objective resides.
  2. Enable or disable the objective. When disabled:
    • assertions are not checked
    • user-defined targets are not tracked
    • virtual sources do not control inports
  3. Expression. Reactis evaluates this expression at each simulation step and interprets the scalar result of numeric type as follows:
    Assertions.
    A zero result means a fail has occurred.
    User-Defined Targets.
    A non-zero result means the target is covered.
    Virtual Sources.
    The result is fed into the inport controlled by the virtual source.
    Free variables in the expression are declared in the Inputs section (window item 5) and wired to data items (Simulink blocks or Stateflow variables) visible at the place in the model where the objective resides. Valid operators and functions are listed in Table 9.1.

    Symbol/Name                        Operation
    +, -, *, /                         Add, Subtract, Multiply, Divide
    <, >                               Less than, Greater than
    <=, >=                             Less than or equal, Greater than or equal
    ==                                 Equal
    !=, ~=                             Not equal
    ->                                 Logical Implication
    !                                  Logical Not
    |, ||                              Logical Or
    &, &&                              Logical And
    []                                 Vector element access
    abs(), fabs()                      Absolute value
    x^y, pow(x,y), power(x,y)          x to the power y
    exp()                              Exponent
    ln(), log(), log10()               Logarithm
    sqrt()                             Square root
    rem()                              Division remainder
    inf                                Infinity value
    floor(), ceil()                    Rounding
    cos(), sin(), tan()                Trigonometric functions
    cosh(), sinh(), tanh()             Hyperbolic trigonometric functions
    acos(), asin(), atan(), atan2()    Inverse trigonometric functions

    Table 9.1: Simulink expression objective operators and functions.

  4. # steps to hold. This is a non-negative integer value. For assertions, this entry specifies the number of simulation steps that the expression must remain false before flagging an error. For user-defined targets, the entry specifies the number of steps that the expression must remain true before the target is considered covered. (A sketch of this behavior follows this list.)
  5. Inputs. This is where variables used to construct the expression are declared. Each variable can be viewed as an input to the objective. Clicking the Add Variable button adds a row to the section. The rightmost entry box of a row is where you enter the name of a variable. The column(s) to the left of the variable name specify which data item from the model is wired to the variable. Clicking the X button to the right of a row deletes the variable declaration.

    Note that you can leave any of these selections empty and then later connect them via drag-and-drop in the main panel (see Section 9.3.5). Any variables referenced in the expression that are not declared in this section will be automatically added to the list (with empty wiring).

    When the dialog is dismissed, the wiring specified here may be viewed from the main panel of the Reactis window by hovering over the objective. The wiring will be shown as blue lines, which can be thought of as virtual wiring. For each input of the objective, a blue line will be drawn from a data item in the model to the input to indicate how the input is wired.

  6. Auto-Wire. Clicking on this causes Reactis to attempt to automatically select the data items from the model that should feed into the inputs of the objective. For an inport X of an objective O, the pairing algorithm selects a data item D if:
    • D is a Simulink block named X, or
    • D is an outport named X of a subsystem with the same parent as O
    If there are no matches or more than one possible match, the auto-wiring attempt will fail for X and the inport will need to be manually bound to a data item.
  7. Output. This portion of the dialog is only present when creating virtual sources. The pull-down menu enables you to specify the model inport that the virtual source controls.
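
The effect of the # steps to hold entry (window item 4) on an expression assertion can be sketched as follows; the behavior for a user-defined target is analogous, with the expression required to remain true instead of false. The sketch uses hypothetical names, is not the Reactis implementation, and the exact boundary behavior may differ; it only illustrates that the expression must stay false for the specified number of consecutive steps before a violation is reported.

void reportViolation(void);                 /* hypothetical reporting hook  */
static int falseSteps = 0;                  /* consecutive false steps      */

void checkExpressionAssertion(int exprValue, int hold)
{
    if (exprValue == 0) {                   /* expression false this step   */
        falseSteps++;
        if (falseSteps >= hold)
            reportViolation();              /* held false long enough       */
    } else {
        falseSteps = 0;                     /* a true step resets the count */
    }
}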

9.3.2  The Stateflow Expression Objective Dialog


images/sfExprTargetEditor_web.png
Figure 9.5: The dialog for editing expression objectives located within Stateflow charts.

The numbers below refer to the labels in Figure 9.5.

  1. Objective name. The name must be unique with respect to the objectives in the state where the objective resides.
  2. Enable or disable the objective. When disabled:
    • assertions are not checked
    • user-defined targets are not tracked
  3. Expression. Reactis evaluates this expression at the point determined by window item 6 and interprets the scalar result of numeric type as follows:
    Assertions.
    A zero result means a fail has occurred.
    User-Defined Targets.
    A non-zero result means the target is covered.
    Valid operators and functions are listed in Table 9.2.

    Symbol/Name                        Operation
    +, -, *, /                         Add, Subtract, Multiply, Divide
    <, >                               Less than, Greater than
    <=, >=                             Less than or equal, Greater than or equal
    ==                                 Equal
    !=, ~=                             Not equal
    ->                                 Logical Implication
    !                                  Logical Not
    |, ||                              Logical Or
    &, &&                              Logical And
    []                                 Vector element access
    abs(), fabs()                      Absolute value
    x^y, pow(x,y)                      x to the power y
    exp()                              Exponent
    log(), log10()                     Logarithm
    sqrt()                             Square root
    floor(), ceil()                    Rounding
    cos(), sin(), tan()                Trigonometric functions
    cosh(), sinh(), tanh()             Hyperbolic trigonometric functions
    acos(), asin(), atan(), atan2()    Inverse trigonometric functions
    double(), single(), boolean(),
    int8(), int16(), int32(),
    uint8(), uint16(), uint32()        Type cast functions

    Table 9.2: Stateflow expression objective operators and functions.

  4. # steps to hold. This is a non-negative integer value. For assertions, this entry specifies the number of simulation steps that the expression must remain false before flagging an error. For user-defined targets, the entry specifies the number of steps that the expression must remain true before the target is considered covered.
  5. Visible variables and constants. This lists the variables and constants visible in the scope where the objective is located. Double-clicking on an item in this list will insert it into the expression (window item 3) text box.
  6. Specifies the point during execution when this objective is checked:
    When executing state entry actions
    The objective is checked after the “entry” actions of the state in which it is located execute.
    When executing state during actions
    The objective is checked after the “during” actions of the state in which it is located execute.
    When executing state exit actions
    The objective is checked after the “exit” actions of the state in which it is located execute.
    On chart entry
    The objective is checked every time the chart becomes active, before any events are processed within the chart.
    On chart exit
    The objective is checked every time the chart becomes inactive, after all events have been processed within the chart.

9.3.3  The Timer Objective Dialog

The numbers below refer to the labels in Figure 9.6.


images/timerObjDialog_web.png
Figure 9.6: The dialog for inserting/modifying timer objectives.

  1. Name. The name must be unique with respect to the subsystem where the objective resides:
    • if a timer objective is located in Simulink, then its name must not be the same as any block or other Validator objective residing in the same subsystem.
    • if a timer objective is located in Stateflow, then its name must not be the same as any other Validator objective residing in the same state.
  2. Timer. The data item from the model to be tracked as a timer. The pull-down menu lists all data items in the context where the objective resides.
  3. Enable or disable the objective. When disabled the timer target is not tracked.
  4. Start value. A double specifying the initial value of the timer/counter.
  5. Step size. A double specifying the step size; the objective tracks intermediate steps of this size between the start and end values. The step size should typically be set to the increment used in the model to change the value of the tracked data item.
  6. End value. A double specifying the end value of the timer/counter.
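
As a hypothetical example, for a counter that the model initializes to 0 and increments by 1 up to an upper bound of 10, you would enter a start value of 0, a step size of 1, and an end value of 10; a timer target with these settings is considered covered (and a timer assertion violated) once the counter has counted all the way from 0 to 10.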

9.3.4  The Diagram Objective Dialog

The Diagram Objective Dialog is the vehicle for adding diagram-based assertions and user-defined targets. It is also used to modify existing diagram objectives. You invoke it by right-clicking in the main panel in white space within a Simulink subsystem and selecting either:

  • Add User-Defined Target -> Diagram, or
  • Add Assertion -> Diagram

The numbers below refer to the labels in Figure 9.7.


images/diagramObjDialogB_web.png
Figure 9.7: The dialog for inserting/modifying diagram objectives.

  1. Objective name. The name must not be the same as any block or Validator objective within the subsystem where the objective resides.
  2. Enable or disable the objective. When disabled:
    • assertions are not checked
    • user-defined targets are not tracked
    • virtual sources do not control inports
  3. Name of the currently chosen objective library. This is a standard Simulink library file (e.g. an .slx file) containing the Simulink / Stateflow subsystems that will be wired into a controller model as Validator diagram objectives. This file is constructed in the usual manner using The MathWorks’ Simulink and Stateflow editors. Note that the actual wiring is established and maintained by Reactis, so the controller model being validated need not be modified at all. The wiring information is stored by Reactis in the .rsi file associated with a model.
  4. Clicking here opens a file selection dialog for choosing an objective library.
  5. System. This tree represents the structure of the library specified in window item 3. Selecting an item in the tree indicates the Simulink subsystem from the library that encodes the objective.
  6. Parameters. If the system chosen above is a masked subsystem, the values for the mask parameters may be entered here.
  7. Inputs. This section enables the user to specify the wiring between input ports of the diagram objective and the blocks in the subsystem of the model where the objective resides. Labels on the right represent the input ports of the diagram objective. The menus to the left list data items from the model that may be wired to the inputs of the objective. Note that an entry in the second column of menus only appears if the item selected in the first column is a block with more than one output. The second column menu selects one of the multiple outputs of the block in column one.

    Note that you can leave any of these selections empty and then later connect them via drag-and-drop in the main panel (see Section 9.3.5).

    When the dialog is dismissed, the wiring specified here may be viewed from the main panel of the Reactis window by hovering over the diagram objective. The wiring will be shown as blue lines connecting inputs of the objective to data items that flow into the inputs from the model being instrumented for validation.

  8. Outputs. This portion of the dialog is only present when creating virtual sources. The pull-down menus enable you to specify the model inports that the virtual source controls. Each output of the virtual source may control one input of the model. Virtual source outputs left unconnected may be monitored by assertions.

    When the dialog is dismissed, the wiring specified here may be viewed from the main panel of the Reactis window by hovering over the diagram objective. The wiring will be shown as blue lines connecting each outport of the virtual source to the model inport that it controls. Alternatively, you may hover over a controlled model inport to display the blue line connected to the virtual source that controls it.

9.3.5  Wiring Validator Objectives within the Reactis Main Panel


images/objWiring_web.png
Figure 9.8: Connecting the output of “Relational Operator1” to the “active” input of Validator objective “SpdCheck”.

Selecting the proper wiring in the Validator objective property dialogs shown in the previous sections can sometimes be difficult, especially if the blocks in a system do not have descriptive names. As an alternative, you can wire up objectives within Simulink systems via drag-and-drop in the main panel. When creating a new objective, just leave the entries in the inputs wiring table unconnected when dismissing the dialog. For expression objectives, you do not need to manually add the variables to the wiring table. Reactis will automatically add any missing variables.

After creating an objective and clicking “Ok” in the dialog, you can use drag-and-drop to connect the inputs of the objective:

  1. Left-click on a signal line or a block with only one output port and hold the button down.
  2. Reactis will draw a dashed line from the output port of the block that feeds the signal line to the current mouse position.
  3. While holding the button, move the mouse to the Validator objective to which you want to connect the signal.
  4. Let go of the mouse button. Reactis will select the objective and show a menu listing all of its inputs.
  5. Select the desired input in the menu. This sets the wiring. Note that if some other signal was connected to the input, it will be replaced with the new connection.
  6. You can inspect the resulting wiring by hovering over the objective.

9.4  Linking to Requirements Documents

As described above, Validator helps you check if a model satisfies its requirements. Often, requirements for a system are specified using natural language in a requirements document. For requirements documents implemented in Microsoft Word or Microsoft Excel, Reactis offers a facility to establish and manage links between a natural language requirement and Reactis Validator objectives (assertions and user-defined targets) that check for violations of the requirement. This section describes that mechanism for linking to requirements documents.

For each assertion or user-defined target that you insert into a model, you can also establish a link which points to a particular location within a requirements document. Once this link is configured for an objective, you can right-click on the objective in the main Reactis panel and select Go to linked document to see the natural language requirement from which the objective was derived. If you have configured a bidirectional link, you can also go in the other direction from the natural language requirement to Validator objectives which check for compliance of the model to the requirement.

Each dialog for inserting or modifying a Validator objective includes a tab named Document Link, as shown in Figure 9.9. This example shows a link between the assertion LowSpeedInactive and a natural language requirement located at a bookmark named LowSpeedInactive in the Microsoft Word document cruise_requirements.docx. This link can be established with the following steps:

  1. Right-click in the main Reactis panel and select Add Assertion -> Expression and in the Settings tab of the resulting dialog, specify the assertion. Alternatively, a link to a requirement may be added to an existing Validator objective by right-clicking on the objective and selecting Edit Properties....
  2. Select the Document Link tab.
  3. Select the Microsoft Word radio button to indicate the requirement is in a Word document.
  4. Click the file selection button to the right of the Document text entry box.
  5. In the resulting file selection dialog, select cruise_requirements.docx.
  6. In the Location in Document section of the dialog, select radio button Bookmark and enter the bookmark name LowSpeedInactive. Note that in this scenario, this bookmark did not yet exist in the Word document. If a bookmark at the location to which you wish to link already exists, you can skip the next three steps, which create a new bookmark in the Word document.
  7. Open cruise_requirements.docx in Microsoft Word.
  8. Select the text that specifies the requirement to which you want to link.
  9. Return to the Reactis Document Link dialog and click the second Create button to create a bookmark to the selected text and establish a bidirectional link. This button click does the following:
    • Creates a new bookmark in cruise_requirements.docx named LowSpeedInactive.
    • Inserts a red R icon at the bookmark. In Word, if you ctrl-click on the R, the LowSpeedInactive assertion will flash in yellow in the main Reactis panel. This is a forward link from the natural language requirement to the assertion in the model. Note that the model must be open in Reactis for this highlighting to occur.
    • Establishes the reverse link from the assertion in the model to the bookmark in the Word document. Subsequently, clicking the Go to link button in the dialog will cause Word to open cruise_requirements.docx and display the new bookmark.
  10. Click the Ok button to save the changes to the document link and dismiss the dialog.

images/reqDocLink_web.png
Figure 9.9: The dialog for establishing and maintaining a link between a Validator objective and a natural language requirement. This example shows a link to a bookmark in a Microsoft Word document.

The different elements of the Document Link tab are labeled in Figure 9.10 and work as follows:

  1. The Document section lets you specify the document to which you wish to link. First use the radio buttons to select the program that manages the requirements document (Microsoft Word or Excel), then specify the document.
  2. When linking to a Word document, the document name goes here.
  3. When linking to a Word document, this button can be clicked to open a file-selection dialog to choose a Word document.
  4. When linking to an Excel document, the document name goes here.
  5. When linking to an Excel document, this button can be clicked to open a file-selection dialog to choose an Excel document.
  6. The Location in Document section lets you specify a location within a requirements document. Two alternative mechanisms for specifying a location within a document are offered:
    • Bookmarks (called named ranges in Excel).
    • A search for a match against a specified text string.
    The radio selector lets you choose the mechanism.
  7. The name of a bookmark (named range) to which to link goes here. If a bookmark already exists in your document, you can simply specify the name here. If you wish to create a new bookmark, you can use buttons 8 and 9.
  8. Clicking this button creates a bookmark (named range) in the Word (Excel) document that is specified in the Document section. The location of the bookmark is the currently selected text (for Word) or range of cells (for Excel). If a bookmark name is given in field 7, then that will be the name of the new bookmark. A link from the current Validator objective to the new bookmark will be established such that clicking the Go to link button (item 12) will cause the document to be opened and positioned to the new bookmark. If no name is specified in field 7, then a name will be auto-generated.
  9. Clicking this button does everything that is done when button 8 is clicked and also inserts a link from the new bookmark to the Validator objective. This link appears as a red R in the document and when you ctrl-click on it in Word/Excel, the linked Validator objective flashes yellow in the main Reactis panel. Note that the model must be open in Reactis for this highlighting to work.
  10. When finding a location within the document using a text search, the search string goes here.
  11. If searching in an Excel document, you must also specify here the worksheet in which to search.
  12. Clicking this button opens the document specified in the Document section to the location specified with the Location in Document section.

images/reqDocLinkA_web.png
Figure 9.10: The dialog for establishing and maintaining a link between a Validator objective and a natural language requirement.

9.5  Running Reactis Validator

After adding assertions and user-defined targets to a model, Validator may be invoked by selecting the Validate -> Check Assertions... menu item to determine whether or not assertion violations can be found. Figure 9.11 contains an annotated screen shot of the Validator launch dialog; the labeled items are described in the next subsection. It should first be noted, however, that the Validator launch dialog is very similar to the Tester launch dialog described in Section 8.1. This is because, conceptually, Validator works by generating test data using the Tester test-generation algorithm and then running the tests on the instrumented model to check for assertion violations. For this reason, many of the features of the Validator launch screen are identical to those of Tester. When this is the case, the descriptions below are brief; more detail may be found in Section 8.1.


images/validatorWinB_web.png
Figure 9.11: The launch dialog for Validator.

9.5.1  Labeled Window Items

  1. Specify how long Validator should run. There are three options to choose from:
    • A fixed amount of time
    • A fixed number of steps
    • A specified number of random and targeted tests/steps.
    If 100% coverage of all targets is reached prior to the specified run time, Validator will terminate early. Note that for the purpose of this early termination, an assertion counts as uncovered as long as it has not been violated; therefore, early termination will not occur while any assertion remains unviolated.
  2. A list of .rst files containing test suites to be preloaded.
  3. When the Prune check-box to the right of a filename is checked, unnecessary steps (those that do not increase the level of coverage) will be pruned from the preloaded test suite. When the check-box is not checked, no pruning of the suite will occur. Note that assertions are checked for all steps of the preloaded tests; if a violation is found in a step, that step will not be pruned.
  4. If the Use Virtual Sources check-box is checked for an .rst file, then when executing the tests in the preloaded suite, values produced by enabled virtual sources will be used for controlled inputs instead of the input values from the test suite for those inputs. When this item is not checked, input values from the test suite will be used for all inputs.
  5. Clicking this button invokes a file-selection dialog that enables the user to specify an .rst file to be added to the preload list.
  6. Clicking this button removes the currently selected .rst file from the list of test suites to be preloaded.
  7. When running Validator for a fixed length of time, the number of hours and minutes is entered here.
  8. When running Validator for a fixed number of steps, the number of steps is entered here. Validator will decide how many of these steps will be random or targeted. Because of pruning, the number of steps in the final test suite will typically be less than the number entered here.
  9. When running Validator for a specified number of random and targeted steps, the number of tests in the random phase is entered here. Because of pruning that occurs at the end of the random phase, some tests may be eliminated entirely, leading to a smaller number of tests at the end of the random phase than what is specified here.
  10. When running Validator for a specified number of random and targeted steps, the number of steps to take while constructing each test of the random phase is entered here. Upon completion of the random phase, unimportant steps are pruned from the tests, so the lengths of the final tests will usually be shorter than the length specified here.
    NOTE: Specifying too many steps in the random phase can cause Reactis to run out of memory. The upper bound on the number of steps possible depends on model size and available RAM, but in general much more time should be spent in the targeted phase which is more optimized for memory usage.
  11. When running Validator for a specified number of random and targeted steps, the number of execution steps to take during the targeted phase is entered here. The targeted phase uses sophisticated strategies to guide the simulation to exercise parts of the model not visited during the preload or random phases. The value entered specifies an upper bound on the number of simulation steps executed during the targeted phase.
  12. This entry box enables the user to pass one or more of the following parameters to Validator:
    • -a1 turns inputs abstraction on, -a0 turns inputs abstraction off. Inputs abstraction usually improves the performance of Validator and should be left on (default). In rare cases, turning it off may improve coverage. If coverage problems are encountered with inputs abstraction on, it may be beneficial to take a test suite produced with abstraction on, preload it into Validator, turn abstraction off, and then run Validator again.
    • -c n sets the maximum number of input variables that may change during an execution step to n, which must be a positive integer. The default is that every input variable can change at every step. Restricting the number of input variables that can change can lead to easier-to-understand test suites.
    • -C n directs Reactis to use n cores during test generation. Currently supported values for n are 1 and 2. Leveraging multi-core architectures speeds up test-generation for many models.
    • -s randomSeed sets the seed for the random number generator. This is useful for replaying a previous run of Validator. The random seed used to create an .rst file can be found in the test-suite log (which may be viewed in the Test Suite Browser described in Chapter 11), after the “-s” in the “Created by Tester:” line.
  13. The name of the .rst file to be generated.
  14. Clicking this button opens a file-selection dialog for specifying the name of the .rst file to be generated.
  15. Clicking this button displays Validator help.
  16. Clicking this button opens a file selection dialog to specify an .rtp file from which to load Validator launch parameters. Reactis may be configured from the Settings dialog to generate an .rtp file for each Tester or Validator run.
  17. Reset the Validator parameters to their default values.
  18. Scroll backward in the parameter history.
  19. Scroll forward in the parameter history.
  20. Clicking this button starts a Validator run.
  21. Clicking this button closes the Validator window.

The Progress Dialog displayed while Validator is running is the same as the Progress Dialog for Tester. For more information see Section 8.2.

9.6  Validator Menus in the Reactis Top-Level Window

Chapters 4 and 7 describe most of the menu entries of the menu bar in the Reactis Top-Level Window. We now describe the entries which are related to Validator.

Edit menu.

This menu includes entries used to manipulate Validator objectives, which are stored in .rsi files. Note that .rsi files may be modified only when Simulator is disabled; therefore, when Simulator is active, these items are disabled.

Undo.
Undo an operation (add, edit, remove, move) on a Validator objective.
Redo.
Redo last undone operation (add, edit, remove, move) on a Validator objective.
Cut.
Cut the currently selected Validator objective.
Copy.
Copy the currently selected Validator objective to the clipboard.
Paste.
Paste a Validator objective from the clipboard to the current subsystem. To paste an objective to a specific position, right-click on that position in your model and select Paste from the context menu.
Validate menu.
The menu items include:
Add Assertion.
Add a new Validator assertion at a default location of the currently selected subsystem.
Add User-Defined Target.
Add a new Validator target at a default location of the currently selected subsystem.
Add Virtual Source.
Add a new virtual source at a default location of the currently selected subsystem.
Edit Objective...
Edit the currently selected objective.
Remove Objective.
Remove the currently selected objective.
Enable/Disable Objective.
Enable or disable the currently selected objective.
Check Assertions...
Start a search for tests that violate assertions or cover user-defined targets.

1. The expression language for Validator objectives employs the C-language convention of representing false as the numeric value zero and true as any non-zero value.