Test Cases tab on the Activity form

The Test Cases tab displays either automated test cases or unit test cases, depending on your configuration. This topic describes how to view and create unit test cases when you have the AutomatedTesting privilege and are viewing rules in the Automated Unit Testing framework.

Recording test cases and viewing invalid test cases

Using the buttons on the Test Cases tab, you can record new test cases or request a report that alerts you to test cases that might need to be updated or re-recorded:

Record New Test Case

Click to record a new test case for unit testing this activity.

  1. Select the thread you want to run the test case in from the Data context list.
  2. Select one of the following options:
    • Empty – Do not pre-populate the main test page with values.
    • Create or reset Test Page – Pre-populate the test page with data. You can select a data transform, or the system applies the default data transform of the rule's Applies To class (the pyDefault rule for that class).

    • Run Against a Saved Test Case – Select to run against a saved test case if the rule has saved test cases. Next, select a saved test case from the list.

      If the results of running the current state of the rule differ from the saved test case, the differences are displayed, and you can either overwrite the saved test case or ignore the differences in future test runs.

  3. If the activity uses parameters, the system presents an input form for parameters. Enter constants for input parameters as prompted. (Some input parameters may be optional.)

  4. Click Execute to start the activity in your requestor session.

  5. Click Save Test Case.

To set a white list of pages or properties for an activity test case, first record and save the test case, and then open the test case's rule form. Use the fields on the Results tab of the Test Case rule form to set the white list.
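
Conceptually, the recorded inputs and the saved clipboard snapshot together make up the test case; playback re-runs the activity and compares the new clipboard values against the snapshot. The following plain-Java sketch illustrates that record-and-replay pattern. The class, page, and property names are hypothetical and are not part of the Pega engine API.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// Hypothetical sketch (not the Pega engine API): a saved test case pairs
// the recorded inputs with a snapshot of the resulting clipboard values;
// playback re-runs the activity and diffs the new values against it.
public class RecordAndReplaySketch {
    public static void main(String[] args) {
        // Inputs entered at record time, and the clipboard snapshot that was saved.
        Map<String, String> inputs = Map.of("CustomerID", "C-100");
        Map<String, String> snapshot = Map.of("MyStepPage.Status", "Approved");

        // Stand-in for the activity under test: inputs in, clipboard values out.
        UnaryOperator<Map<String, String>> activity =
                in -> Map.of("MyStepPage.Status", "Rejected"); // rule changed since recording

        // Playback: re-run and compare against the snapshot.
        Map<String, String> actual = activity.apply(inputs);
        Map<String, String> differences = new LinkedHashMap<>();
        snapshot.forEach((key, expected) -> {
            String got = actual.get(key);
            if (!expected.equals(got)) {
                differences.put(key, "expected=" + expected + ", actual=" + got);
            }
        });

        // An empty map means the run still matches the recording.
        System.out.println(differences);
    }
}
```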

Clear Clipboard and Record

Click to clear the clipboard and then record a new test case for unit testing this activity. Follow the procedure described for the Record New Test Case button in the preceding table entry.

Note: The best practice is to record an activity test case with the Clear Clipboard and Record button rather than the Record New Test Case button. When you run an activity in order to save that run as a test case, the clipboard must be in a known, clear state before the run starts. Otherwise, the clipboard could contain pages from an earlier run that you might not want saved as part of the test case, or that present an inaccurate picture of what the activity's steps do.
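
The same principle applies in any unit-testing framework: a recording that starts from a dirty state captures data the activity never produced. In this illustrative sketch (not Pega code; the Map stands in for the clipboard, and all names are hypothetical), a stale page contaminates the snapshot unless the state is cleared first.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: why recording against a dirty "clipboard" is risky.
public class CleanStateDemo {
    public static void main(String[] args) {
        Map<String, String> clipboard = new HashMap<>();

        // Residue from an earlier run that has nothing to do with this activity.
        clipboard.put("StalePage.Status", "Resolved-Completed");

        // Recording without clearing first captures the stale page too, so every
        // future playback asserts on data the activity never produced.
        Map<String, String> dirtySnapshot = new HashMap<>(runActivity(clipboard));
        System.out.println("Dirty snapshot: " + dirtySnapshot);

        // Clearing first (the Clear Clipboard and Record button) captures
        // only what this activity's steps actually created.
        clipboard.clear();
        Map<String, String> cleanSnapshot = new HashMap<>(runActivity(clipboard));
        System.out.println("Clean snapshot: " + cleanSnapshot);
    }

    // Stand-in for the activity under test: it writes one page and returns the clipboard.
    private static Map<String, String> runActivity(Map<String, String> clipboard) {
        clipboard.put("MyStepPage.Result", "Success");
        return clipboard;
    }
}
```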

Invalid Test Cases

Click to get a report of any test cases saved for this activity that might need to be updated or re-recorded. If rules have changed, test cases that were recorded prior to the changes might not return the expected results. Examine the Invalid Test Cases report to see which test cases are affected.

When you click this button, the Invalid Test Cases window opens, and you can choose to run the report immediately or to run it as a background process and have the results emailed to you. If you run the report immediately, the Test Case window opens and displays the list of test cases.
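
As a rough illustration of what such a report might check (the product's actual criteria are internal; the types and dates below are hypothetical), a test case recorded before the last change to the rule it exercises is a candidate for re-recording:

```java
import java.time.Instant;
import java.util.List;

// Hypothetical sketch of a staleness report: flag test cases recorded
// before the rule they exercise was last changed.
public class InvalidTestCaseReportSketch {
    record TestCase(String name, Instant recordedOn) {}

    public static void main(String[] args) {
        Instant ruleLastUpdated = Instant.parse("2024-05-01T00:00:00Z");
        List<TestCase> saved = List.of(
                new TestCase("VerifyApproval_1", Instant.parse("2024-03-15T00:00:00Z")),
                new TestCase("VerifyApproval_2", Instant.parse("2024-06-20T00:00:00Z")));

        for (TestCase tc : saved) {
            if (tc.recordedOn().isBefore(ruleLastUpdated)) {
                // Recorded before the rule changed: results may no longer match.
                System.out.println(tc.name() + " may need to be re-recorded");
            }
        }
    }
}
```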

Refresh

Click to update the list of saved test cases displayed on this tab.

Test cases

If any test cases are saved for this activity, they are listed in the table in the center of this tab. You can run an individual test case by selecting its name.

Name

Name of the previously recorded and saved test case. To play back that test case, click its name in this list. The Run Rule window opens and runs that test case.

Created By

Operator who created the saved test case.

Created On

When the saved test case was created.

Last Run On

When the saved test case was last run.

Status of Last Run

The test case's status from the last time it was run.

Saved Results

Not used.

Playing back saved test cases

To play back a saved test case, click its name in the Name field.

The Run Rule window opens and the system runs the test case. If differences are found between the results of running the current state of the rule and the saved test case, they are displayed in the Run Rule window with the following options.

If a white list exists for the test case, only differences for white-listed pages and properties are displayed in the Run Rule window after the test case runs. If differences are reported for white-listed pages or properties, you can remove a property or page from the white list: select the corresponding radio button (for a property) or check box (for a page) in the Remove White List Entry For column, and then click Save Ignores. The test case is saved and re-run with the updated selections applied.
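
As a rough sketch of this filtering (hypothetical names and logic, not the product's implementation), playback reports only differences whose page or property is on the white list and that have not been marked as ignored:

```java
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: filter playback differences through a white list
// and an ignore list. Keys use "Page.Property" form; an ignore entry can
// be saved via the Save Ignores button in the real product.
public class DifferenceFilterSketch {
    public static void main(String[] args) {
        Map<String, String> differences = Map.of(
                "OrderPage.Total", "expected=100, actual=120",
                "OrderPage.Status", "expected=Open, actual=Closed",
                "AuditPage.Timestamp", "expected=old, actual=new");

        Set<String> whiteList = Set.of("OrderPage.Total", "OrderPage.Status");
        Set<String> ignored = Set.of("OrderPage.Status"); // previously saved ignore

        differences.forEach((key, detail) -> {
            // A white-list entry may name a full "Page.Property" key or a whole page.
            boolean whiteListed = whiteList.contains(key)
                    || whiteList.contains(key.substring(0, key.indexOf('.')));
            if (whiteListed && !ignored.contains(key)) {
                System.out.println("Report difference: " + key + " (" + detail + ")");
            }
        });
    }
}
```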

Note: If running the activity results in the display of an XML stream, that stream is not displayed during test case playback.

Save Ignores

The differences encountered are displayed in the lower part of the window. To indicate that a difference should be ignored in future runs, select whether the system ignores it for this test case only or for all test cases, and then click Save Ignores to save your choices.

You can also indicate that differences involving an entire page should be ignored in future runs of this specific test case by selecting the check box corresponding to that page name. If you ignore a page for this test case, all differences found for properties on that page are ignored each time this test case runs.

Overwrite Test Case

Click to overwrite this test case with the current run state.