ITK/Procedure for Adding a Test (itk 3.x)

From KitwarePublic


Appropriate testing is one of the most important aspects of writing software.

How to Add a Test to ITK

NOTE: ITKv4 has a new, simpler procedure for adding tests. ITKv3's procedure is described here.

The general procedure involves the following steps:

  • Write the test
  • Add the Test to the CMakeLists.txt file
  • Run the Test locally
  • Submit an Experimental build
  • Commit the Test

Write the test

Naming the test file

Unit tests should be named with the name of the class they are testing and the word Test at the end.

For example:

  • class itkBMPImageIO will have a test called itkBMPImageIOTest.cxx
  • class itkIndex will have a test called itkIndexTest.cxx

Sometimes it may be necessary to have multiple tests, in which case a number will be added after the "Test" string of the name.

For example:

  • itkBMPImageIOTest.cxx
  • itkBMPImageIOTest2.cxx
  • itkBMPImageIOTest3.cxx

Note that we skip the "1" entry. It is implicitly the first test.

Writing the content

The test file must contain a function that has the EXACT same name as the file (without the extension).

For example, the file itkBMPImageIOTest.cxx must contain a function

    int itkBMPImageIOTest( int argc, char * argv [] )

The function must return one of the following two values:

  • EXIT_FAILURE
  • EXIT_SUCCESS
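A minimal test skeleton therefore looks like the following sketch. The class exercised and the check performed here are placeholders, not an actual ITK test; in the real build the generated test driver supplies main() and calls this function by name.

```cpp
#include <cstdlib>   // EXIT_SUCCESS / EXIT_FAILURE
#include <iostream>

// The function name matches the file name itkIndexTest.cxx (minus extension).
int itkIndexTest( int argc, char * argv [] )
{
  (void)argc;
  (void)argv;  // this sketch takes no command-line arguments

  // Exercise the class under test; a trivial placeholder check stands in here.
  const int expected = 3;
  const int actual   = 1 + 2;
  if ( actual != expected )
    {
    std::cerr << "Test failed: " << actual << " != " << expected << std::endl;
    return EXIT_FAILURE;
    }
  return EXIT_SUCCESS;
}
```

Note that the function returns EXIT_SUCCESS or EXIT_FAILURE rather than calling exit(), so the test driver can report the result to ctest.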

NOTE: Input for tests should reside in Testing/Data/Input or Examples/Data. Tests should not depend on the output of another test; not all tests are necessarily run in the same session, so each test should be able to run with a clean Testing/Temporary directory.

Test Drivers

ITK uses test drivers to manage its very large number of unit tests. A test driver aggregates many tests into a single executable by registering each of them as a function.

Prior to ITKv4, test drivers resided in the source tree and it was the responsibility of the developer to maintain these files. ITKv4 uses the CMake test driver generator. The generated test driver is created by CMake in the binary tree of the test directory.

As a general guideline, there is one test driver per major directory, and the test driver is named after the directory that it is testing.

For example, the classes in the directory

   Insight/Code/Algorithms

are tested by files in the directory

   Insight/Testing/Code/Algorithms

and will use a test driver file called

   AlgorithmsCxxTests.cxx
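The mechanism a test driver implements can be sketched in plain C++: each test function is registered under its name, and the driver looks the requested test up by its first command-line argument and forwards the remaining arguments to it. This is a simplified, self-contained stand-in for ITK's actual driver machinery, not the real ITK code; the test names below are only examples.

```cpp
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

// Two stand-in unit tests, using the driver calling convention.
int itkBMPImageIOTest( int, char *[] ) { return EXIT_SUCCESS; }
int itkIndexTest( int, char *[] )      { return EXIT_SUCCESS; }

typedef int (*TestFunction)( int, char *[] );

// Look the requested test up by name and forward the arguments to it.
// In a real driver, main() calls this with argv[1] as the test name and
// argv+1 as the argument vector seen by the test.
int RunRegisteredTest( const std::string & name, int argc, char * argv [] )
{
  std::map<std::string, TestFunction> registry;
  registry["itkBMPImageIOTest"] = itkBMPImageIOTest;
  registry["itkIndexTest"]      = itkIndexTest;

  std::map<std::string, TestFunction>::const_iterator it = registry.find( name );
  if ( it == registry.end() )
    {
    std::cerr << "No test registered with name: " << name << std::endl;
    return EXIT_FAILURE;
    }
  return it->second( argc, argv );
}
```

This registration-and-dispatch pattern is why a single driver executable can serve an entire directory of tests, and why the test name must be passed to the driver on the command line (as shown in the ADD_TEST examples below).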

How to Add a Test

NOTE: This procedure has changed since ITKv3.

Add the Test to the List in the CMakeLists.txt file

Tests are added by listing them in the CMakeLists.txt file of the Testing/Code/KIT directory. Each Testing CMakeLists.txt file follows a convention.

For example, the CMakeLists.txt file in Testing/Code/BasicFilters has this structure:

set(KIT BasicFilters)
set(KIT_LIBS ITKIO ITKNumerics ITKStatistics)

Add new test source files to this list.

set(KitTests
  itkAbsImageFilterAndAdaptorTest
  itkAbsoluteValueDifferenceImageFilterTest
...
  itkZeroCrossingImageFilterTest
)

The macro

CreateTestDriver(${KIT} "${KIT_LIBS}" "${KitTests}" TestDriver)

generates the C++ code for this test driver and places it in the binary directory for the tests (in this example, ITK_BINARY_DIR/Testing/Code/BasicFilters/BasicFiltersCxxTests.cxx).

ADD_TEST

Minimum Case

Add a line like the following

 ADD_TEST(itkBMPImageIOTest5
    ${TestDriver} 
    itkBMPImageIOTest2 
    )

where

  • itkBMPImageIOTest5 is a symbolic name for the test. This is the name that ctest will use, and the one that will appear on a Dashboard.
  • ${TestDriver} is the CMake variable containing the executable for that directory.
  • itkBMPImageIOTest2 is the name of the test itself, and it must match the test filename and the name of the function inside the test file.

Passing arguments

Some tests may require command line arguments. In that case, these arguments can be added after the name of the tests function. For example

 ADD_TEST(itkBMPImageIOTest5
    ${TestDriver}
    itkBMPImageIOTest2
    ${ITK_DATA_ROOT}/Input/image_color.bmp
    ${ITK_TEST_OUTPUT_DIR}/image_color.bmp
    )

will pass an input BMP file name and an output BMP file name as arguments to the test.

Adding regression Testing

Some tests may produce images as output, in which case we should add regression testing instructions that make it possible to compare the test output against a baseline image.

A typical case will look like:

ADD_TEST(itkBMPImageIOTest5
    ${TestDriver}
    --compare ${ITK_DATA_ROOT}/Baseline/IO/image_color.bmp
              ${ITK_TEST_OUTPUT_DIR}/image_color.bmp
    itkBMPImageIOTest2
    ${ITK_DATA_ROOT}/Input/image_color.bmp
    ${ITK_TEST_OUTPUT_DIR}/image_color.bmp
    )

Note that the "--compare" string goes just after the name of the executable, and it is followed by the filename of the baseline image and the filename of the test output.

Since different platforms may produce slightly different, but still acceptable, results, the regression testing system allows users to define a tolerance for the comparison. Three different tolerances are available:

  • --compareIntensityTolerance
  • --compareRadiusTolerance
  • --compareNumberOfPixelsTolerance


compareIntensityTolerance sets the intensity difference beyond which two pixels are considered to be different. For example, if this tolerance is set to 5, then a pixel with value 123 is considered to be the same as pixels with values in the range [118,128].

compareRadiusTolerance defines a Manhattan-type neighborhood around a pixel. When comparing pixel A from one image to pixel B in the other image, all pixels in the neighborhood of pixel B are compared against pixel A. The most similar one is selected, and its intensity difference from pixel A is tested against compareIntensityTolerance. If the difference is larger, the pixel "FAILS" the comparison.

compareNumberOfPixelsTolerance defines how many pixels can be accepted to fail while still considering the two images to be the same. If more than compareNumberOfPixelsTolerance pixels fail the comparison, the test will FAIL and a difference image between the test output and the baseline will be produced and posted to the Dashboard.
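The combined effect of the three tolerances can be sketched as follows. This is a simplified, one-dimensional stand-in for ITK's actual image comparison (difference-image generation is omitted and the neighborhood search is reduced to 1-D), intended only to make the pass/fail logic concrete.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdlib>
#include <vector>

// Simplified 1-D sketch of the regression comparison: for each test pixel,
// find the closest baseline value within `radius` positions, flag the pixel
// as failing when that best difference exceeds `intensityTolerance`, and
// accept the images as matching while no more than `pixelTolerance` pixels
// fail.
bool ImagesMatch( const std::vector<int> & baseline,
                  const std::vector<int> & test,
                  int intensityTolerance,   // --compareIntensityTolerance
                  int radius,               // --compareRadiusTolerance
                  std::size_t pixelTolerance ) // --compareNumberOfPixelsTolerance
{
  std::size_t failures = 0;
  const int n = static_cast<int>( test.size() );
  for ( int i = 0; i < n; ++i )
    {
    // Search the neighborhood of the baseline pixel for the closest match.
    int best = std::abs( test[i] - baseline[i] );
    for ( int r = -radius; r <= radius; ++r )
      {
      const int j = i + r;
      if ( j >= 0 && j < n )
        {
        best = std::min( best, std::abs( test[i] - baseline[j] ) );
        }
      }
    if ( best > intensityTolerance )
      {
      ++failures;
      }
    }
  return failures <= pixelTolerance;
}
```

For example, with an intensity tolerance of 5, a radius of 0, and a pixel tolerance of 0, a single pixel that differs by 10 units is enough to make the comparison fail, while raising the pixel tolerance to 1 lets that same image pair pass.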

WARNING: Tolerances must be used sparingly. Before adding a tolerance, you must exhaust the options for making sure that there is not a bug in the test, or a bug in the class, that is being revealed by the test failure in that platform. Once you demonstrate that a tolerance is needed, add just enough of a tolerance to make the test pass. Do not over-relax the test, because then it will lose its capacity for detecting real failures in the future, and it will end up providing a false sense of security.

A test with regression verification and tolerances will look like


ADD_TEST(itkBMPImageIOTest5
    ${TestDriver}
    --compare ${ITK_DATA_ROOT}/Baseline/IO/image_color.bmp
              ${ITK_TEST_OUTPUT_DIR}/image_color.bmp
    --compareIntensityTolerance 5
    --compareRadiusTolerance 1
    --compareNumberOfPixelsTolerance 25
    itkBMPImageIOTest2
    ${ITK_DATA_ROOT}/Input/image_color.bmp
    ${ITK_TEST_OUTPUT_DIR}/image_color.bmp
    )

This test will consider two pixels to be different only if their intensity values differ by more than 5 units. It will compare each pixel with the 3x3 neighborhood of the corresponding pixel (in 2D), and it will tolerate up to 25 pixel failures. More pixel failures than that will result in the test failing, and a difference image being posted to the Dashboard.

Tests that output temporary data

Often tests will produce an image that is compared to a baseline using the --compare method described above. For the image to end up in ${ITK_TEST_OUTPUT_DIR}, you must pass either that directory or the full file name of the output file to the test. As an example, say that itkMyFilterTest produces an image called MyFilterTest.png that should be compared to a baseline image of the same name. The statement below, along with using argv[1] as the output file name in the test, causes the output image to be produced in the correct directory for comparison.

ADD_TEST(itkMyFilterTest
    ${TestDriver}
    --compare ${ITK_DATA_ROOT}/Baseline/Filtering/MyFilterTest.png
              ${ITK_TEST_OUTPUT_DIR}/MyFilterTest.png
    itkMyFilterTest
    ${ITK_TEST_OUTPUT_DIR}/MyFilterTest.png
    )
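On the C++ side, such a test treats argv[1] as the output file name. The sketch below illustrates the idea only: itkMyFilterTest is a hypothetical test, and writing a plain text file stands in for running the real filter and image writer.

```cpp
#include <cstdlib>
#include <fstream>
#include <iostream>

// Sketch of itkMyFilterTest: argv[1] is the output file name supplied by
// ADD_TEST, so the result lands in ${ITK_TEST_OUTPUT_DIR}, exactly where
// --compare expects to find it.
int itkMyFilterTest( int argc, char * argv [] )
{
  if ( argc < 2 )
    {
    std::cerr << "Usage: itkMyFilterTest outputFileName" << std::endl;
    return EXIT_FAILURE;
    }
  std::ofstream output( argv[1] );
  if ( !output )
    {
    return EXIT_FAILURE;  // the output directory must exist and be writable
    }
  // A real test would run the filter and write the image via an ITK writer.
  output << "filter result placeholder" << std::endl;
  return EXIT_SUCCESS;
}
```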

Run the Test locally

Rerun CMake on your project in order to include the new test in the project configuration of your native build system.

Run the test by calling

  ctest -V -R TestName

where "TestName" is a regular expression that identifies the test or tests that you want to run.


Submit an Experimental build

You can do this by calling

   ctest -D Experimental

or

   make Experimental

or, on Windows in Visual Studio, by selecting the Experimental project and invoking "Build" on it.

Adding testing baseline data

Please see this page for instructions on adding baseline images.