
Chapter 9  Reactis Validator

Reactis Validator searches for defects and inconsistencies in models. The tool enables you to formulate a property that model behavior should have as an assertion, attach the assertion to a model, and perform an automated search for a violation of the assertion. If Validator finds an assertion violation, it returns a test that leads to the problem. This test may then be executed in Reactis Simulator to gain an understanding of the sequence of events that leads to the problem.

Validator also lets you specify particular test scenarios that you want exercised, using one of two alternative mechanisms. User-defined targets enable you to specify an abstract test scenario, whereas virtual sources give you a way to easily specify a concrete scenario.

User-defined targets (Section 9.1.2) may be seen as extensions to the built-in coverage metrics supported by Reactis: in generating test data, Validator (and Tester) will attempt to generate tests that satisfy the indicated scenarios. Like assertions, user-defined targets are virtually attached to models within Reactis, so your model is not modified.

Virtual sources (Section 9.1.3) are placed at the top level of a model to control one or more top-level inports as a model executes in Reactis. That is, you can specify a sequence of values to be consumed by an inport during simulation or test-generation. Virtual sources can be easily enabled and disabled. When enabled, a virtual source controls a set of inports; when disabled, those inports are treated by Reactis as normal top-level inports.

Validator is particularly useful in requirements validation. Given a list of requirements on model behavior, you can formulate assertions (to check whether a requirement is being satisfied) and user-defined targets or virtual sources (to define test scenarios intended to “stress” the requirement).

Engineers use Validator as follows. First, a model is instrumented with assertions to be checked, user-defined targets, and virtual sources. We refer to assertions, user-defined targets, and virtual sources as Validator objectives. The tool is then invoked on the instrumented model to search for assertion violations and paths leading to the specified user-defined targets. The output of a Validator run is a test suite that includes tests leading to objectives found during the analysis.

Assertions and user-defined targets may be added to any Simulink system or Stateflow diagram in a model, whereas virtual sources may only be added at the top level of the model. When adding Validator objectives to a model, three mechanisms for formulating objectives are supported (only the first two are available for virtual sources):

Expression objectives
are C-like Boolean expressions.
Diagram objectives
are references to subsystems in standard Simulink/Stateflow libraries.
Timer objectives
are directives that tag a model data item as a timer or counter.

An expression objective is a C-like Boolean expression whose variables are wired to data items from the context of the model in which the objective is inserted. Expression objectives are easily attached and modified from the Reactis GUI. For more information on valid expressions see Sections 9.3.1 and 9.3.2.

Diagram objectives give users the full power of Simulink / Stateflow to formulate assertions, user-defined targets, and virtual sources. The objectives may use any Simulink blocks supported by Reactis, including Stateflow diagrams. Diagram objectives are attached to a model using the Reactis GUI to select a Simulink system from a library and “wire” it into the model. The diagrams are created using Simulink and Stateflow in the same way standard models are built. After adding a diagram objective to a model, the diagram is included in the model’s hierarchy tree, as are other library links in a model.

A timer objective indicates that a model element acts as a timer or counter. A timer target is covered when the value of the selected data item changes from the start value to the end value (typically by repeatedly adding the step value to the data item, although this is not strictly necessary — the timer target will still be covered if some or all of the intermediate values are skipped). For some models, adding timer targets will help Reactis Tester attain higher levels of coverage. Conceptually, this is because the timer target guides Reactis to generate tests which continue stepping until a timer expires or a counter reaches its final value. A timer assertion is considered violated if the conditions that would cover a timer target with the same parameters occur.
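The coverage rule for a timer target can be pictured as a small piece of bookkeeping attached to the tracked data item. The following C sketch illustrates that rule as described above; the type, function, and variable names are purely illustrative and are not part of Reactis:

  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative bookkeeping for one timer target. The fields mirror the
   * dialog parameters (start value, step size, end value); the step size is
   * shown only for completeness, since coverage does not require every
   * intermediate value to be observed. */
  typedef struct {
      double start, step, end;
      bool   armed;     /* the data item has been seen at the start value */
      bool   covered;   /* ... and has subsequently reached the end value */
  } TimerTarget;

  static void timer_observe(TimerTarget *t, double value) {
      if (value == t->start)
          t->armed = true;                  /* timer (re)started */
      else if (t->armed && value == t->end)
          t->covered = true;                /* start-to-end transition complete */
  }

  int main(void) {
      TimerTarget t = { .start = 0, .step = 1, .end = 5 };
      double trace[] = { 0, 1, 3, 5 };      /* some intermediate values skipped */
      for (int i = 0; i < 4; i++)
          timer_observe(&t, trace[i]);
      printf("covered: %s\n", t.covered ? "yes" : "no");   /* prints "yes" */
      return 0;
  }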

9.1  The Meaning of Validator Objectives

We now explain in more detail how Reactis interprets Validator objectives. Coverage of these objectives may be tracked in the same manner as built-in coverage targets (e.g. MC/DC); namely through highlighting and hovering in the main Reactis panel, through the Coverage-Report Browser, and through reports generated by Simulate > Fast Run with Report.

9.1.1  Assertions

Assertions specify properties that should always be true for a model. For a given assertion, Validator searches a model for a simulation run that leads to a state in which the assertion does not hold. An expression assertion fails to hold if the given expression evaluates to false.1 A timer assertion fails to hold if the data item it tracks ever changes from its start value to the end value by a specified increment. A diagram assertion fails to hold if any output port assumes the boolean value “false” or a numeric value of zero. Note that if a diagram has more than one output port, the individual outports are treated as distinct assertions. That is, each individual outport is listed in the objective list of the Info File Editor. Also, the individual ports are highlighted independently in the main panel according to whether or not they are violated.

An assertion is considered “covered” when a violation is found. So, in contrast to targets, where being covered is good, covering an assertion is bad. Covered assertions are therefore highlighted in red and yellow, whereas for targets it is the uncovered ones that are highlighted in red.

9.1.2  User-Defined Coverage Targets

User-defined coverage targets extend the Reactis built-in targets described in Chapter 6 (branch coverage, state coverage, condition-action coverage, MC/DC, etc). If a user-defined target is specified as an expression, Reactis will treat it as covered when the expression evaluates to a non-zero numeric value. If a user-defined target is specified as a timer, the target is considered covered after the data item tracked by the timer changes from its start value to its end value by its specified increment. If a user-defined target is specified as a diagram, Reactis will treat it as covered when all output ports of the associated subsystem have assumed either a boolean “true” value or any non-zero numeric value. Note that if a diagram has more than one output port, the individual port names are listed in the target list and highlighted independently according to whether they are covered or not.

9.1.3  Virtual Sources

Virtual sources control top-level inports of a model. If a virtual source is an expression, then at each simulation step, the expression is evaluated to compute a value that will be fed into the top-level inport controlled by the virtual source. If a virtual source is specified as a diagram, then each outport of the diagram may be fed into a top-level inport of the model. Some outports of the virtual source diagram may be left unconnected and monitored by an assertion.

9.2  Use Cases of Validator Objectives

9.2.1  Checking a Requirement with an Expression Assertion and an Expression User-Defined Target

Consider a cruise control application that has the following requirement:

The cruise control shall remain inactive if the speed of the vehicle is less than 30 mph.

To properly test this requirement we need to execute a test in which an attempt is made to activate the cruise control while the speed of the vehicle is less than 30 mph. We can capture this as a simple expression-based user-defined target:

on && activate && (speed < 30)

This expression would be "wired in" such that on monitors whether the cruise control is turned on, activate monitors whether an attempt has been made to activate the cruise control, and speed is the speed of the vehicle. When this expression becomes true, we have found a test to check our requirement.

We can test whether the application's response to this test is correct with the following expression assertion:

!(active && (speed < 30))

This expression is wired in so that active monitors whether the cruise control is active. If this expression ever becomes false, then the requirement is violated. Alternatively, this requirement could be specified with an equivalent expression that uses the logical implication operator:

active -> (speed >= 30)
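
Because the expression language follows the C convention for truth values, active -> (speed >= 30) behaves like !active || (speed >= 30). The following C sketch, with illustrative names only, shows how such an assertion would be evaluated at a single simulation step:

  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative one-step check of the low-speed requirement.
   * In C, "a -> b" corresponds to "!a || b". */
  static bool assertion_holds(bool active, double speed) {
      return !active || speed >= 30.0;      /* active -> (speed >= 30) */
  }

  int main(void) {
      printf("%d\n", assertion_holds(true,  25.0));   /* 0: requirement violated   */
      printf("%d\n", assertion_holds(false, 25.0));   /* 1: holds while inactive   */
      printf("%d\n", assertion_holds(true,  55.0));   /* 1: holds above 30 mph     */
      return 0;
  }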

9.2.2  Checking a Requirement with a Diagram Assertion

Consider another requirement for a cruise control:

When active, the cruise control shall not permit the actual and desired speeds to differ by more than 1 mph for more than 3 seconds.

This requirement can be checked by a diagram assertion that monitors the vehicle speed, whether the cruise control is active, and the speed at which the driver has set the cruise control. The top-level interface of the Simulink subsystem (in this case a Stateflow diagram) is shown in Figure 9.1.


Figure 9.1: The diagram to check the speed maintenance requirement monitors three data items in the model and raises a flag on its output if the requirement is violated.
images/speedChkTop_web.png

The simple Stateflow diagram that implements the assertion is shown in Figure 9.2. It works as follows:

  1. Initially the diagram is in the Inactive state.
  2. When the cruise control becomes active it enters state Active and child state OkDiff.
  3. When in OkDiff, the diagram computes the difference between the vehicle speed and the speed at which the driver has set the cruise control.
    1. If the difference exceeds the tolerance, then go to state BigDiff.
    2. If the difference is less than the tolerance, then stay in OkDiff.
  4. When entering BigDiff, start a counter.
    1. If the speed difference is corrected within 3 steps, then return to OkDiff.
    2. If the speed difference is not corrected within 3 steps, then flag an error by outputting the value 0 on its outport.

Figure 9.2: The implementation of the speed maintenance assertion.
images/speedChk_web.png
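
The same monitoring logic can also be sketched in C to make the state and counter behavior explicit. This sketch only illustrates the behavior described above; the names and the step-based tolerance window are assumptions made for illustration, and the actual assertion is the Stateflow diagram of Figure 9.2:

  #include <math.h>
  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative C sketch of the speed-maintenance monitor described above.
   * Following the diagram-assertion convention, a return value of 0 flags a
   * violation of the requirement. */
  typedef enum { INACTIVE, OK_DIFF, BIG_DIFF } MonState;

  typedef struct {
      MonState state;
      int      count;    /* steps spent in BigDiff */
  } Monitor;

  static int monitor_step(Monitor *m, bool active, double speed,
                          double setSpeed, double tol, int maxSteps) {
      if (!active) { m->state = INACTIVE; return 1; }

      double diff = fabs(speed - setSpeed);
      switch (m->state) {
      case INACTIVE:
          m->state = OK_DIFF;          /* cruise control just became active */
          /* fall through to evaluate the difference this step */
      case OK_DIFF:
          if (diff > tol) { m->state = BIG_DIFF; m->count = 1; }
          break;
      case BIG_DIFF:
          if (diff <= tol)                m->state = OK_DIFF;
          else if (++m->count > maxSteps) return 0;      /* violation */
          break;
      }
      return 1;
  }

  int main(void) {
      Monitor m = { INACTIVE, 0 };
      /* Desired speed 50 mph, actual speed stuck at 45 mph while active. */
      for (int step = 0; step < 6; step++)
          printf("step %d: %s\n", step,
                 monitor_step(&m, true, 45.0, 50.0, 1.0, 3) ? "ok" : "VIOLATION");
      return 0;
  }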

9.2.3  Creating a Functional Test Using a Virtual Source

Section 3.4.6 showed how you can create a functional test for the cruise control example using the user-guided simulation feature of Reactis Simulator. Virtual sources offer an alternate mechanism for capturing functional tests. Consider the need to test that the cruise control activates and deactivates as expected when executing the following scenario:

     Input                      Expected Output of Active
 1)  Turn cruise control on     Off
 2)  Press set                  On
 3)  Press cancel               Off
 4)  Press set                  On
 5)  Press brake                Off
 6)  Press resume               On
 7)  Press gas                  Off

This scenario can be captured using the virtual source shown in Figure 9.3. The Signal Builder block shown implements the virtual source by capturing the scenario described above. The first six signals are fed into inputs of the model while the final output is monitored by a simple assertion that compares the expected value for active against the actual value produced by the model.


Figure 9.3: Virtual sources offer an easy way to implement functional tests. Any Simulink / Stateflow construct supported by Reactis may be used to implement the virtual source. Signal Builder blocks are especially useful. A Validator assertion can monitor an output of the virtual source, relieving the user of the tedious task of checking expected values against actual responses of the model.
images/activeChkVS_web.png
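
For comparison, the same scenario can be written down as a table-driven check in C. The sketch below is only an illustration of what the virtual source and its companion assertion accomplish; the cruise_control_step stub is a hypothetical stand-in for the model, which in Reactis is exercised directly:

  #include <stdbool.h>
  #include <stdio.h>
  #include <string.h>

  /* One row of the functional test: an input event and the value of the
   * model's "active" output expected after that event is processed. */
  typedef struct {
      const char *event;
      bool        expected;
  } ScenarioStep;

  /* Hypothetical stand-in for the cruise-control model, just detailed enough
   * to satisfy this scenario. */
  static bool cruise_control_step(const char *event) {
      static bool on = false, active = false;
      if      (strcmp(event, "on")     == 0) on = true;
      else if (strcmp(event, "set")    == 0) active = on;
      else if (strcmp(event, "resume") == 0) active = on;
      else                                   active = false;  /* cancel, brake, gas */
      return active;
  }

  int main(void) {
      const ScenarioStep scenario[] = {
          { "on", false }, { "set", true }, { "cancel", false }, { "set", true },
          { "brake", false }, { "resume", true }, { "gas", false },
      };
      int failures = 0;
      for (size_t i = 0; i < sizeof scenario / sizeof scenario[0]; i++) {
          bool active = cruise_control_step(scenario[i].event);
          if (active != scenario[i].expected) {
              printf("step %zu (%s): expected %d, got %d\n",
                     i + 1, scenario[i].event, scenario[i].expected, active);
              failures++;
          }
      }
      return failures;   /* 0 when the scenario passes */
  }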

9.3  Adding, Editing, and Removing Objectives

Adding, editing, and removing objectives is only possible when Simulator is disabled. This is necessary because objectives are linked into a model when Simulator is invoked.

A Validator objective may be added to a model by first selecting in the hierarchy panel the Simulink subsystem or Stateflow state in which the objective should be inserted and then:

  • right-clicking in an empty space in the subsystem in the main panel of the Reactis window and selecting Add Assertion or Add User-Defined Target or Add Virtual Source; or
  • by selecting the Validate > Add Assertion or Validate > Add User-Defined Target or Validate > Add Virtual Source menu items from the main menu bar.

Usage of the resulting dialogs is explained in Section 9.3.1 (expression objectives within Simulink), Section 9.3.2 (expression objectives within Stateflow), Section 9.3.3 (timer objectives), and Section 9.3.4 (diagram objectives).

In Stateflow charts, only expression objectives are supported. Using a group of radio buttons in the wiring dialog, you can choose for the objective to be evaluated at state entry or exit, or every time the chart is triggered while the state is active. These correspond to the usual entry, exit, and during actions of Stateflow. Evaluation of the objective occurs at the end of the corresponding state actions (entry, during, exit).

To edit properties of an objective, either right-click on the objective and select Edit > Properties..., or click on the objective to select it and then choose the Validate > Edit Objective... menu item. Note that all the controls in the properties dialog are disabled if Simulator is currently active. To remove an objective, either:

  • right-click on the objective and select Remove, or
  • click on the objective to select it and then choose the Validate > Remove Objective menu item, or
  • click on the objective to select it and then press the delete key.

Standard cut, copy, and paste operations are available for objectives both from the top-level Edit menu and by right-clicking on an objective. Edit > Undo and Edit > Redo enable you to undo and redo these operations. Objectives may be moved by dragging them with the mouse.

9.3.1  The Simulink Expression Objective Dialog


Figure 9.4: The dialog for editing expression objectives located within Simulink subsystems.
images/exprObj_web.png

The numbers below refer to the labels in Figure 9.4.

  1. Objective name. The name must be unique with respect to the subsystem where the objective resides.
  2. Enable or disable the objective. When disabled:
    • assertions are not checked
    • user-defined targets are not tracked
    • virtual sources do not control inports
  3. Expression. Reactis evaluates this expression at each simulation step and interprets the scalar result of numeric type as follows:
    Assertions.
    A zero result means a fail has occurred.
    User-Defined Targets.
    A non-zero result means the target is covered.
    Virtual Sources.
    The result is fed into the inport controlled by the virtual source.
    Free variables in the expression are declared in the Inputs section (window item 5) and wired to data items (Simulink blocks or Stateflow variables) visible at the place in the model where the objective resides. Valid operators and functions are listed in Table 9.1.

     Symbol/Name                         Operation
     +, -, *, /                          Add, Subtract, Multiply, Divide
     <, >                                Less than, Greater than
     <=, >=                              Less than or equal, Greater than or equal
     ==                                  Equal
     !=                                  Not equal
     ->                                  Logical Implication
     !                                   Logical Not
     ||                                  Logical Or
     &&                                  Logical And
     []                                  Vector element access
     a.b                                 Access element b of bus a
     abs(), fabs()                       Absolute value
     x^y, pow(x,y), power(x,y)           x to the power y
     exp()                               Exponent
     ln(), log(), log10()                Logarithm
     sqrt()                              Square root
     rem()                               Division remainder
     inf                                 Infinity value
     floor(), ceil()                     Rounding
     cos(), sin(), tan()                 Trigonometric functions
     cosh(), sinh(), tanh()              Hyperbolic trigonometric functions
     acos(), asin(), atan(), atan2()     Inverse trigonometric functions
     pre()                               Access values from previous simulation steps (see Section 9.3.5).
     if expr1 then expr2 else expr3      If expr1 is true, evaluate expr2. Otherwise evaluate expr3.
     Table 9.1: Simulink expression objective operators and functions.


  4. # steps to hold. This is a non-negative integer value. For assertions, this entry specifies the number of simulation steps that the expression must remain false before an error is flagged. For user-defined targets, it specifies the number of steps that the expression must remain true before the target is considered covered. (A small sketch of this bookkeeping appears at the end of this subsection.)
  5. Inputs. This is where variables used to construct the expression are declared. Each variable can be viewed as an input to the objective. Clicking the Add Variable button adds a row to the section. The rightmost entry box of a row is where you enter the name of a variable. The column(s) to the left of the variable name specify which data item from the model is wired to the variable. Clicking the X button to the right of a row deletes the variable declaration.

    Note that you can leave any of these selections empty and then later connect them via drag-and-drop in the main panel (see Section 9.3.6). Any variables referenced in the expression that are not declared in this section will be automatically added to the list (with empty wiring).

    When the dialog is dismissed, the wiring specified here may be viewed from the main panel of the Reactis window by hovering over the objective. The wiring will be shown as blue lines which can be thought of as virtual wiring. For each input of the objective, a blue line will be drawn from a data item in the model to the input to indicate how the input is wired.

  6. Auto-Wire. Clicking on this causes Reactis to attempt to automatically select the data items from the model that should feed into the inputs of the objective. For an inport X of an objective O, the pairing algorithm selects a data item D if:
    • D is a Simulink block named X, or
    • D is an outport named X of a subsystem with the same parent as O
    If there are no matches or more than one possible match, the auto-wiring attempt will fail for X and the inport will need to be manually bound to a data item.

For virtual sources an additional pull-down menu at the bottom of the dialog enables you to specify the model inport that the virtual source controls.
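
The # steps to hold setting (window item 4) can be pictured as a consecutive-step counter. The C sketch below illustrates that bookkeeping for an assertion; the names are illustrative, and the exact boundary handling (whether the violation is flagged on the Nth false step or one step later) is an assumption rather than a statement about the Reactis implementation:

  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative bookkeeping for an expression assertion with a non-zero
   * "# steps to hold" value: an error is flagged only after the expression
   * has remained false for holdSteps consecutive simulation steps. */
  typedef struct {
      int holdSteps;    /* value entered in the dialog */
      int falseRun;     /* consecutive steps the expression has been false */
  } HoldAssertion;

  static bool assertion_violated(HoldAssertion *a, bool exprValue) {
      if (exprValue) { a->falseRun = 0; return false; }
      return ++a->falseRun >= a->holdSteps;   /* boundary handling assumed */
  }

  int main(void) {
      HoldAssertion a = { .holdSteps = 2, .falseRun = 0 };
      bool expr[] = { true, false, false, false, true };
      for (int i = 0; i < 5; i++)
          printf("step %d: %s\n", i,
                 assertion_violated(&a, expr[i]) ? "VIOLATION" : "ok");
      return 0;
  }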

9.3.2  The Stateflow Expression Objective Dialog


Figure 9.5: The dialog for editing expression objectives located within Stateflow charts.
images/sfExprTargetEditor_web.png

The numbers below refer to the labels in Figure 9.5.

  1. Objective name. The name must be unique with respect to the objectives in the state where the objective resides.
  2. Enable or disable the objective. When disabled:
    • assertions are not checked
    • user-defined targets are not tracked
  3. Expression. Reactis evaluates this expression at the point determined by window item 6 and interprets the scalar result of numeric type as follows:
    Assertions.
    A zero result means a fail has occurred.
    User-Defined Targets.
    A non-zero result means the target is covered.
    Valid operators and functions are listed in Table 9.2.

     Symbol/Name                         Operation
     +, -, *, /                          Add, Subtract, Multiply, Divide
     <, >                                Less than, Greater than
     <=, >=                              Less than or equal, Greater than or equal
     ==                                  Equal
     !=                                  Not equal
     ->                                  Logical Implication
     !                                   Logical Not
     ||                                  Logical Or
     &&                                  Logical And
     []                                  Vector element access
     a.b                                 Access element b of bus a
     abs(), fabs()                       Absolute value
     x^y, pow(x,y)                       x to the power y
     exp()                               Exponent
     log(), log10()                      Logarithm
     sqrt()                              Square root
     floor(), ceil()                     Rounding
     cos(), sin(), tan()                 Trigonometric functions
     cosh(), sinh(), tanh()              Hyperbolic trigonometric functions
     acos(), asin(), atan(), atan2()     Inverse trigonometric functions
     double(), single(), boolean(), int8(), int16(),
       int32(), uint8(), uint16(), uint32()            Type cast functions
     pre()                               Access values from previous simulation steps (see Section 9.3.5).
     if expr1 then expr2 else expr3      If expr1 is true, evaluate expr2. Otherwise evaluate expr3.
     Table 9.2: Stateflow expression objective operators and functions.


  4. # steps to hold. This is a non-negative integer value. For assertions, this entry specifies the number of simulation steps that the expression must remain false before an error is flagged. For user-defined targets, it specifies the number of steps that the expression must remain true before the target is considered covered.
  5. Visible variables and constants. This lists the variables and constants visible in the scope where the objective is located. Double-clicking on an item in this list will insert it into the expression text box (window item 3).
  6. Specifies the point during execution when this objective is checked:
    When executing state entry actions
    The objective is checked after the “entry” actions of the state in which it is located execute.
    When executing state during actions
    The objective is checked after the “during” actions of the state in which it is located execute.
    When executing state exit actions
     The objective is checked after the “exit” actions of the state in which it is located execute.
    On chart entry
    The objective is checked every time the chart becomes active, before any events are processed within the chart.
    On chart exit
    The objective is checked every time the chart becomes inactive, after all events have been processed within the chart.

9.3.3  The Timer Objective Dialog


Figure 9.6: The dialog for inserting/modifying timer objectives.
images/timerObjDialog_web.png

The timer objective dialog is used to add a timer target to a model or edit the parameters of an existing timer target. A timer target can be added by right clicking within the main panel and selecting Add User-Defined Target > Timer. Timer targets can be added within blank areas of Simulink subsystems, or anywhere within a Stateflow chart (including inside a state). A timer objective dialog is shown in Figure 9.6. The numbers below refer to the labels in that Figure.

  1. Name. The name must be unique within the subsystem where the objective resides:
    • If a timer objective is located in Simulink, then its name must not be the same as any block or other Validator objective residing in the same subsystem.
    • If a timer objective is located in Stateflow, then its name must not be the same as any other Validator objective residing in the same state.
  2. Timer. The data item from the model to be tracked as a timer. The pull-down menu lists all possible data items in the subsystem/state where the objective will reside.
  3. Enable or disable the objective. When disabled the timer target is not tracked.
  4. Start value. The initial value of the timer.
  5. Step size. The amount the timer increases or decreases during a step.
  6. End value. The end value of the timer.

Figure 9.7: The timer target Sustained Fault is covered when the output of the Product block steps from 0 to 110.
images/timerTargetExample_web.png

A typical use-case for a timer target is shown in Figure 9.7. It shows a subsystem which signals a fault when a sensor input is negative for 100 consecutive simulation steps. The timer target Sustained Fault is covered when the output value of the Product block reaches 110 (at which point the output fault will have been true for 10 consecutive steps). Adding this target to the model helps Tester rapidly reach 100% coverage.

One important note: in Figure 9.7, the timer target does not use the output of the Count block as the counter; instead, it uses the input to the Count block. When a unit delay block is used as a timer/counter, Tester generally achieves better coverage of timer targets that track the input to the unit delay block rather than its output, so basing such targets on the block's input is the recommended practice.
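
The counting behavior of Figure 9.7, and the reason the target parameters are 0, 1, and 110, can be sketched in C as follows. Here delayIn plays the role of the Product block output (the input to the Count unit delay) and delayOut the role of the block's output; the exact block arithmetic is an assumption made for illustration:

  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative sketch of the sustained-fault counter of Figure 9.7.
   * The timer target tracks delayIn (the Unit Delay input), with start
   * value 0, step size 1, and end value 110. */
  int main(void) {
      double delayOut = 0.0;                  /* state of the Count (Unit Delay) block */
      for (int step = 0; step < 200; step++) {
          double sensor  = -1.0;                                   /* sensor stays negative */
          double delayIn = (sensor < 0.0) ? delayOut + 1.0 : 0.0;  /* count or reset        */
          bool   fault   = delayOut >= 100.0;                      /* sustained fault flag  */
          if (delayIn >= 110.0) {             /* timer target reaches its end value         */
              printf("timer target covered at step %d (fault=%d)\n", step, fault);
              break;                          /* fault has been true for 10 steps by now    */
          }
          delayOut = delayIn;                 /* unit delay update for the next step        */
      }
      return 0;
  }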

9.3.4  The Diagram Objective Dialog

The Diagram Objective Dialog is the vehicle for adding diagram-based assertions and user-defined targets. It is also used to modify existing diagram objectives. You invoke it by right-clicking in the main panel in white space within a Simulink subsystem and selecting either:

  • Add User-Defined Target > Diagram, or
  • Add Assertion > Diagram

The numbers below refer to the labels in Figure 9.8.


Figure 9.8: The dialog for inserting/modifying diagram objectives.
images/diagramObjDialogB_web.png

  1. Objective name. The name must not be the same as any block or Validator objective within the subsystem where the objective resides.
  2. Enable or disable the objective. When disabled:
    • assertions are not checked
    • user-defined targets are not tracked
    • virtual sources do not control inports
  3. Name of the currently chosen objective library. This is a standard .slx or .mdl file storing a Simulink library containing the Simulink / Stateflow subsystems that will be wired into a controller model as Validator diagram objectives. This file is constructed in the usual manner using The MathWorks’ Simulink and Stateflow editors. Note that the actual wiring is established and maintained by Reactis, so the controller model being validated need not be modified at all. The wiring information is stored by Reactis in the .rsi file associated with a model.
  4. Clicking here opens a file selection dialog for choosing a Validator objective library.
  5. System. This tree represents the structure of the library specified in window item 3. Selecting an item in the tree indicates the Simulink subsystem from the library that encodes the objective.
  6. Parameters. If the system chosen above is a masked subsystem, the values for the mask parameters may be entered here.
  7. Inputs. This section enables the user to specify the wiring between input ports of the diagram objective and the blocks in the subsystem of the model where the objective resides. Labels on the right represent the input ports of the diagram objective. The menus to the left list data items from the model that may be wired to the inputs of the objective. Note that an entry in the second column of menus only appears if the item selected in the first column is a block with more than one output. The second column menu selects one of the multiple outputs of the block in column one.

    Note that you can leave any of these selections empty and then later connect them via drag-and-drop in the main panel (see Section 9.3.6).

    When the dialog is dismissed, the wiring specified here may be viewed from the main panel of the Reactis window by hovering over the diagram objective. The wiring will be shown as blue lines connecting inputs of the objective to data items that flow into the inputs from the model being instrumented for validation.

If the diagram objective is a virtual source, the dialog will have an additional Outputs section at the bottom. This section contains pull-down menus that enable you to specify the model inports that the virtual source controls. Each output of the virtual source may control one input of the model. Virtual source outputs left unconnected may be monitored by assertions.

When the dialog is dismissed, the wiring specified here may be viewed from the main panel of the Reactis window by hovering over the diagram objective. The wiring will be shown as blue lines connecting each outport of the virtual source to the model inport that it controls. Alternatively, you may hover over a controlled model inport to display the blue line connected to the virtual source that controls it.

9.3.5  Accessing Values from Previous Simulation Steps

Some Validator objectives may require access to values from previous simulation steps. For example, testing whether a block’s output has changed by more than a certain amount requires subtracting the previous output value from the current output value. In diagram objectives, values from previous steps are accessed via Simulink’s Unit Delay block. In expression objectives, prior values are accessed by using the pre() function.

The pre() function can take either one or two arguments. The first argument is an expression which will be evaluated in the context of a previous simulation step. For example, pre(x+y) will return the sum of the values held by x and y during the previous simulation step. The optional second argument to the pre() function specifies the number of simulation steps to go back, so pre(x+y,2) will return the sum of x and y from two steps ago, pre(x+y,3) will return the sum of x and y from three steps ago, and so on.

If the pre() function is used to access a value prior to the first simulation step (e.g., when pre() is called during the first simulation step), the returned value will be zero. If this produces undesirable results, an if-then-else clause should be used to avoid invoking pre() until a sufficient number of simulation steps have been taken.

For example, to create an assertion that checks whether a variable x never changes by more than 0.5 per simulation step, use the following expression:

if t==0 then 1 else abs(x-pre(x))<=0.5

This expression returns true on the first simulation step. On all subsequent steps, it returns true if the difference between the current and previous values of x does not exceed 0.5.
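
The following C sketch mirrors that expression step by step. The prevX variable plays the role of pre(x); as with pre() before the first simulation step, it starts at zero, which is why the first step is special-cased (the trace values are made up for illustration):

  #include <math.h>
  #include <stdbool.h>
  #include <stdio.h>

  /* Illustrative equivalent of:  if t==0 then 1 else abs(x - pre(x)) <= 0.5 */
  int main(void) {
      double trace[] = { 10.0, 10.3, 10.6, 11.4, 11.5 };   /* values of x per step */
      double prevX = 0.0;                                   /* pre(x); 0 before step 0 */
      for (int t = 0; t < 5; t++) {
          double x = trace[t];
          bool holds = (t == 0) ? true : fabs(x - prevX) <= 0.5;
          printf("step %d: x=%.1f  assertion %s\n", t, x, holds ? "holds" : "VIOLATED");
          prevX = x;                                        /* becomes pre(x) next step */
      }
      return 0;
  }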

9.3.6  Wiring Validator Objectives Within the Reactis Main Panel


Figure 9.9: Connecting the output of “Relational Operator1” to the “active” input of Validator objective “SpdCheck”.
images/objWiring_web.png

Selecting the proper wiring in the Validator objective property dialogs shown in the previous sections can sometimes be difficult, especially if the blocks in a system do not have descriptive names. As an alternative, you can wire up objectives within Simulink systems via drag-and-drop in the main panel. When creating a new objective, just leave the entries in the inputs wiring table unconnected when dismissing the dialog. For expression objectives, you do not need to manually add the variables to the wiring table. Reactis will automatically add any missing variables.

After creating an objective and clicking “Ok” in the dialog, you can now use drag-and-drop to connect the inputs of the objective:

  1. Left-click on a signal line or a block with only one output port and hold the mouse button down.
  2. Reactis will draw a dashed line from the output port of the block that feeds the signal line to the current mouse position.
  3. While holding the button, move the mouse cursor to the Validator objective to which you want to connect the signal.
  4. While over the objective, release the mouse button. Reactis will select the objective and show a menu listing all of its inputs.
  5. Select the desired input in the menu. This sets the wiring. Note that if some other signal was connected to the input, it will be replaced with the new connection.
  6. You can inspect the proper wiring by hovering over the objective.

9.4  Linking to Requirements Documents

As described above, Validator helps you check if a model satisfies its requirements. Often, requirements for a system are specified using natural language in a requirements document. For requirements documents implemented in Microsoft Word or Microsoft Excel, Reactis offers a facility to establish and manage links between a natural language requirement and Reactis Validator objectives (assertions and user-defined targets) that check for violations of the requirement. This section describes that mechanism for linking to requirements documents.

For each assertion or user-defined target that you insert into a model, you can also establish a link which points to a particular location within a requirements document. Once this link is configured for an objective, you can right-click on the objective in the main Reactis panel and select Go to linked document to see the natural language requirement from which the objective was derived. If you have configured a bidirectional link, you can also go in the other direction from the natural language requirement to Validator objectives which check for compliance of the model to the requirement.

Each dialog for inserting or modifying a Validator objective includes a tab named Document Link, as shown in Figure 9.10. This example shows a link between the assertion LowSpeedInactive and a natural language requirement located at a bookmark named LowSpeedInactive in the Microsoft Word document cruise_requirements.docx. This link can be established with the following steps:

  1. Right-click in the main Reactis panel and select Add Assertion > Expression and in the Settings tab of the resulting dialog, specify the assertion. Alternatively, a link to a requirement may be added to an existing Validator objective by right-clicking on the objective and selecting Edit Properties....
  2. Select the Document Link tab.
  3. Select the Microsoft Word radio button to indicate the requirement is in a Word document.
  4. Click the file selection button to the right of the Document text entry box.
  5. In the resulting file selection dialog, select cruise_requirements.docx.
  6. In the Location in Document section of the dialog, select radio button Bookmark and enter the bookmark name LowSpeedInactive. Note that in this scenario, this bookmark did not yet exist in the Word document. If a bookmark already exists at the location to which you wish to link, you can skip the next three steps, which create a new bookmark in the Word document.
  7. Open cruise_requirements.docx in Microsoft Word.
  8. Select the text that specifies the requirement to which you want to link.
  9. Return to the Reactis Document Link dialog and click the second Create button to create a bookmark to the selected text and establish a bidirectional link. This button click does the following:
    • Creates a new bookmark in cruise_requirements.docx named LowSpeedInactive.
    • Inserts a red R icon at the bookmark. In Word, if you ctrl+click on the R, the LowSpeedInactive assertion will flash in yellow in the main Reactis panel. This is a forward link from the natural language requirement to the assertion in the model. Note that the model must be open in Reactis for this highlighting to occur.
    • Establishes the reverse link from the assertion in the model to the bookmark in the Word document. Subsequently, clicking the Go to link button in the dialog will cause Word to open cruise_requirements.docx and display the new bookmark.
  10. Click the Ok button to save the changes to the document link and dismiss the dialog.

Figure 9.10: The dialog for establishing and maintaining a link between a Validator objective and a natural language requirement. This example shows a link to a bookmark in a Microsoft Word document.
images/reqDocLink_web.png

The different elements of the Document Link tab are labeled in Figure 9.11 and work as follows:

  1. The Document section lets you specify the document to which you wish to link. First use the radio buttons to select the program that manages the requirements document (Microsoft Word or Excel), then specify the document.
  2. When linking to a Word document, the document name goes here.
  3. When linking to a Word document, this button can be clicked to open a file-selection dialog to choose a Word document.
  4. When linking to an Excel document, the document name goes here.
  5. When linking to an Excel document, this button can be clicked to open a file-selection dialog to choose an Excel document.
  6. The Location in Document section lets you specify a location within a requirements document. Two alternative mechanisms for specifying a location within a document are offered:
    • Bookmarks (called named ranges in Excel).
    • A search for a match against a specified text string.
     The radio selector lets you choose the mechanism.
  7. The name of a bookmark (named range) to which to link goes here. If a bookmark already exists in your document, you can simply specify the name here. If you wish to create a new bookmark, you can use buttons 8 and 9.
  8. Clicking this button creates a bookmark (named range) in the Word (Excel) document that is specified in the Document section. The location of the bookmark is the currently selected text (for Word) or range of cells (for Excel). If a bookmark name is given in field 7, then that will be the name of the new bookmark. A link from the current Validator objective to the new bookmark will be established such that clicking the Go to link button (item 12) will cause the document to be opened and positioned to the new bookmark. If no name is specified in field 7, then a name will be auto-generated.
  9. Clicking this button does everything that is done when button 8 is clicked and also inserts a link from the new bookmark to the Validator objective. This link appears as a red R in the document and when you ctrl+click on it in Word/Excel, the linked Validator objective flashes yellow in the main Reactis panel. Note that the model must be open in Reactis for this highlighting to work.
  10. When finding a location within the document using a text search, the search string goes here.
  11. If searching in an Excel document, you must also specify the Worksheet in which to search. Specify that here.
  12. Clicking this button opens the document specified in the Document section to the location specified with the Location in Document section.

Figure 9.11: The dialog for establishing and maintaining a link between a Validator objective and a natural language requirement.
images/reqDocLinkA_web.png

9.5  Running Reactis Validator

After adding assertions and user-defined targets to a model, Validator may be invoked by selecting the Validate > Check Assertions... menu item to determine whether or not assertion violations can be found. Figure 9.12 contains an annotated screen shot of the Validator launch dialog; the labeled items are described in the next subsection. It should first be noted, however, that the Validator launch dialog is very similar to the Tester launch dialog described in Section 8.1. This similarity is due to the fact that, conceptually, Validator works by generating test data using the Tester test-generation algorithm and then running the tests on the instrumented model to check for assertion violations. For this reason, many of the features of the Validator launch screen are identical to those of Tester. When this is the case, the descriptions below are very brief; more detail may be found in Section 8.1.


Figure 9.12: The launch dialog for Validator.
images/validatorWinB_web.png

9.5.1  Labeled Window Items

  1. Specify how long Validator should run. There are three options to choose from:
    • A fixed amount of time
    • A fixed number of steps
    • A specified number of random and targeted tests/steps.
     If 100% coverage of all targets is reached prior to the specified run time, Validator will terminate early. Note that, for the purpose of this early termination, an assertion is considered uncovered until it has been violated; therefore, early termination will not occur while any assertion remains unviolated.
  2. A list of .rst files containing test suites to be preloaded.
  3. When the Prune check-box to the right of a filename is checked, unnecessary steps (those that do not increase the level of coverage) will be pruned from the preloaded test suite. When the check-box is not checked, no pruning of the suite will occur. Note that assertions are checked for all steps of the preloaded tests; if a violation is found in a step, that step will not be pruned.
  4. If the Use Virtual Sources check-box is checked for an .rst file, then when executing the tests in the preloaded suite, values produced by enabled virtual sources will be used for controlled inputs instead of the input values from the test suite for those inputs. When this item is not checked, input values from the test suite will be used for all inputs.
  5. Clicking this button invokes a file-selection dialog that enables the user to specify an .rst file to be added to the preload list.
  6. Clicking this button removes the currently selected .rst file from the list of test suites to be preloaded.
  7. When running Validator for a fixed length of time, the number of hours and minutes is entered here.
  8. When running Validator for a fixed number of steps, the number of steps is entered here. Validator will decide how many of these steps will be random or targeted. Because of pruning, the number of steps in the final test suite will typically be less than the number entered here.
  9. When running Validator for a specified number of random and targeted steps, the number of tests in the random phase is entered here. Because of pruning that occurs at the end of the random phase, some tests may be eliminated entirely, leading to a smaller number of tests at the end of the random phase than what is specified here.
  10. When running Validator for a specified number of random and targeted steps, the number of steps to take while constructing each test of the random phase is entered here. Upon completion of the random phase, unimportant steps are pruned from the tests, so the lengths of the final tests will usually be shorter than the length specified here.
    NOTE: Specifying too many steps in the random phase can cause Reactis to run out of memory. The upper bound on the number of steps possible depends on model size and available RAM, but in general much more time should be spent in the targeted phase which is more optimized for memory usage.
  11. When running Validator for a specified number of random and targeted steps, the number of execution steps to take during the targeted phase is entered here. The targeted phase uses sophisticated strategies to guide the simulation to exercise parts of the model not visited during the preload or random phases. The value entered specifies an upper bound on the number of simulation steps executed during the targeted phase.
  12. This entry box enables the user to pass one or more of the following parameters to Validator:
    • -a1 turns inputs abstraction on, -a0 turns inputs abstraction off. Inputs abstraction usually improves the performance of Validator and should be left on (default). In rare cases, turning it off may improve coverage. If coverage problems are encountered with inputs abstraction on, it may be beneficial to take a test suite produced with abstraction on, preload it into Validator, turn abstraction off, and then run Validator again.
    • -c n sets the maximum number of input variables that may change during an execution step to n, which must be a positive integer. The default is that every input variable can change at every step. Restricting the number of input variables that can change can lead to easier-to-understand test suites.
    • -C n directs Reactis to use n cores during test generation. Currently supported values for n are 1 and 2. Leveraging multi-core architectures speeds up test-generation for many models.
    • -s randomSeed seed for the random number generator. This is useful for replaying a previous run of Validator. The random seed used to create a .rst file can be found in the test-suite log (which may be viewed in the Test Suite Browser described in Chapter 11), after the “-s” in the “Created by Tester:” line.
  13. The name of the .rst file to be generated.
  14. Clicking this button opens a file-selection dialog for specifying the name of the .rst file to be generated.
  15. Clicking this button displays Validator help.
  16. Clicking this button opens a file selection dialog to specify an .rtp file from which to load Validator launch parameters. Reactis may be configured from the Settings dialog to generate an .rtp file for each Tester or Validator run.
  17. Reset the Validator parameters to their default values.
  18. Scroll backward in the parameter history.
  19. Scroll forward in the parameter history.
  20. Clicking this button starts a Validator run.
  21. Clicking this button closes the Validator window.

The Progress Dialog displayed while Validator is running is the same as the Progress Dialog for Tester. For more information see Section 8.2.

9.6  Validator Menus in the Reactis Top-Level Window

Chapters 4 and 7 describe most of the menu entries of the menu bar in the Reactis Top-Level Window. We now describe the entries which are related to Validator.

Edit menu.

This menu includes entries used to manipulate Validator objectives, which are stored in .rsi files. Note that .rsi files may be modified only when Simulator is disabled; therefore, these items are disabled while Simulator is active.

Undo.
Undo an operation (add, edit, remove, move) on a Validator objective.
Redo.
Redo last undone operation (add, edit, remove, move) on a Validator objective.
Cut.
Cut the currently selected Validator objective.
Copy.
Copy the currently selected Validator objective to the clipboard.
Paste.
Paste a Validator objective from the clipboard to the current subsystem. To paste an objective to a specific position, right-click on that position in your model and select Paste from the context menu.
Validate menu.
The menu items include:
Add Assertion.
Add a new Validator assertion at a default location of the currently selected subsystem.
Add User-Defined Target.
Add a new Validator target at a default location of the currently selected subsystem.
Add Virtual Source.
Add a new virtual source at a default location of the currently selected subsystem.
Edit Objective...
Edit the currently selected objective.
Remove Objective.
Remove the currently selected objective.
Enable/Disable Objective.
Enable or Disable the objective.
Check Assertions...
Start a search for tests that violate assertions or cover user-defined targets.

9.7  Tracking Coverage Within Validator Objectives

Reactis tracks any coverage targets located within diagram objectives, but does not track targets within expression objectives. Since a diagram objective is implemented as a Simulink / Stateflow model, it can contain any of the coverage targets that Reactis tracks (e.g. branches, decisions, MC/DC targets, etc.). By default, Reactis treats these targets exactly as it treats targets within your model, i.e. they are included in all coverage reporting and Reactis Tester will try to exercise them. By examining which coverage targets within an objective were exercised, you can determine how close testing came to violating an assertion or covering a user-defined target.

However, in some cases you might want to turn off this tracking of targets within objectives. For example, when generating a final report, you may prefer to see only the statistics for covered targets within your model. You can turn off this tracking from the Coverage Metrics pane of the Info File Editor with the setting named Track coverage for contents of Validator Objectives.


1 The expression language for Validator objectives employs the C language convention of representing false as the numeric value zero and true by any non-zero value.