ITK/Release 4/Modularization/Add tests
The safe assumption behind testing is that "if it is not tested, it is broken".
Non-regression tests
These are added the same way as before, except that the "add_test" command now uses the newer NAME/COMMAND syntax:
add_executable(itkBinaryThresholdFilterTest itkBinaryThresholdFilterTest.cxx)
target_link_libraries(itkBinaryThresholdFilterTest ${ITK-BinaryThreshold_LIBRARIES})
# use the modern CMake signature: NAME <name> COMMAND <command> [args...]
itk_add_test(NAME itkBinaryThresholdFilterTest
COMMAND itkBinaryThresholdFilterTest input.png output.png)
Regression tests
Approach A
Using the test driver macro
The CMake macro "CreateTestDriver", defined in
Modules/Core/TestKernel/CreateTestDriver.cmake,
is designed for grouped regression tests in ITK.
All the tests in one module share one test driver.
set(ITKFooFilterTests
itkFooFilterTest1.cxx
itkFooFilterTest2.cxx)
CreateTestDriver(ITKFooFilter "${ITKFooFilter-Test_LIBRARIES}" "${ITKFooFilterTests}")
itk_add_test(NAME itkFooFilterTest1
COMMAND ITKFooFilterTestDriver itkFooFilterTest1)
itk_add_test(NAME itkFooFilterTest2
COMMAND ITKFooFilterTestDriver itkFooFilterTest2)
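Note that with this approach each test source does not define main(): the driver generated by CreateTestDriver provides main() and dispatches to a function named after the source file. A minimal sketch of what the (hypothetical) itkFooFilterTest1.cxx above would contain:

// itkFooFilterTest1.cxx -- test source compiled into the shared driver.
// The generated ITKFooFilterTestDriver supplies main() and calls this
// function when invoked as "ITKFooFilterTestDriver itkFooFilterTest1 ...".
#include <iostream>
#include <cstdlib>

int itkFooFilterTest1( int argc, char * argv[] )
{
  if( argc < 3 )
  {
    std::cerr << "Usage: " << argv[0] << " inputImage outputImage" << std::endl;
    return EXIT_FAILURE;
  }

  // ... construct and exercise the filter under test here ...

  return EXIT_SUCCESS;
}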
You can find Approach A used in most of the ITK modules,
for example in
ITK/Modules/IO/BMP/test/CMakeLists.txt
Approach B
Stand-alone regression tests
There is also a test driver executable ("itkTestDriver") designed for stand-alone regression tests. For instance, all the tests in Examples are stand-alone programs, each with its own main function; in that case Approach A is not appropriate.
add_executable(itkFooTest itkFooTest.cxx)
target_link_libraries(itkFooTest ${ITK_LIBRARIES})
itk_add_test(NAME itkFooTest
COMMAND itkTestDriver
--compare outputBaseline.png output.png
$<TARGET_FILE:itkFooTest> input.png output.png)
The new syntax
$<TARGET_FILE:xxx>
in the add_test command triggers CMake to locate the executable automatically, so there is no need to specify the path to the executable here.
You can find Approach B used in Examples,
for example in
ITK/Examples/Filtering/test/CMakeLists.txt
Add testing data to ITK
Please refer to: Add Data
Writing tests
Unit tests
Writing a unit test helps ensure that the class being exercised
- works as expected,
- works consistently over time,
- works consistently across different platforms.
It also helps in detecting potential issues, and checking how a given class is affected by changes introduced in the code.
A direct consequence of writing and executing a test is that the code coverage is improved. When writing a unit test you should consider the following principles:
- Make sure that all methods of the class being tested are exercised.
- Make sure that if the class is templated, it is instantiated with a number of different template arguments so that the class is thoroughly tested.
- Make sure your test returns an EXIT_SUCCESS value when succeeding and an EXIT_FAILURE value when failing.
- When a test checks multiple features that individually may return either an EXIT_SUCCESS or an EXIT_FAILURE code, make sure that the whole test is run and the accumulated status code is returned at the end. This helps locate more issues in a single run, as in the following example:
// appropriate include files
int main( int, char *[] )
{
int testPassStatus = EXIT_SUCCESS;
//
// ...
//
if( itk::Math::FloatDifferenceULPs( floatRepresentationfx1.asFloat, floatRepresentationfx2.asFloat ) != -1 )
{
std::cout << "Unexpected float distance." << std::endl;
testPassStatus = EXIT_FAILURE;
}
if( itk::Math::FloatAlmostEqual( floatRepresentationfx1.asFloat, floatRepresentationfx2.asFloat ) )
{
std::cout << "floatRepresentationfx1 is almost equal to floatRepresentationfx2\n" << std::endl;
}
else
{
std::cout << "floatRepresentationfx1 is NOT almost equal to floatRepresentationfx2\n" << std::endl;
testPassStatus = EXIT_FAILURE;
}
return testPassStatus;
}
instead of
// appropriate include files
int main( int, char *[] )
{
int testPassStatus = EXIT_SUCCESS;
//
// ...
//
if( itk::Math::FloatDifferenceULPs( floatRepresentationfx1.asFloat, floatRepresentationfx2.asFloat ) != -1 )
{
std::cout << "Unexpected float distance." << std::endl;
return EXIT_FAILURE;
}
if( itk::Math::FloatAlmostEqual( floatRepresentationfx1.asFloat, floatRepresentationfx2.asFloat ) )
{
std::cout << "floatRepresentationfx1 is almost equal to floatRepresentationfx2\n" << std::endl;
}
else
{
std::cout << "floatRepresentationfx1 is NOT almost equal to floatRepresentationfx2\n" << std::endl;
return EXIT_FAILURE;
}
return testPassStatus;
}
- When an exception can be thrown by a class, write a test case (in a try/catch block) that reproduces the condition under which the exception is thrown (see the sketch after this list).
- When appropriate, exercise a given method a number of times (with different arguments) so that the method is thoroughly tested.
- When required, print messages to the standard output (or error output) so that any errors that occur can be checked by a human reader.
- Whenever possible, test against a baseline (ground truth): if an object is not tested against a baseline, we will not know whether it is working as expected, even if its code coverage is 100%.
- Use the testing macros defined in itkTestingMacros.h. In particular, make sure that the EXERCISE_BASIC_OBJECT_METHODS macro is called right after the creation of the object whose class is being tested.
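As an illustration of the exception guideline above, a test can deliberately trigger the failing condition and verify that the exception is raised. A minimal sketch, reusing the testPassStatus variable from the earlier example and a hypothetical "filter" object whose required input is intentionally left unset:

bool caughtExpectedException = false;
try
{
  // Calling Update() before setting the required input is expected to throw.
  filter->Update();
}
catch( itk::ExceptionObject & excp )
{
  std::cout << "Caught expected exception: " << excp << std::endl;
  caughtExpectedException = true;
}
if( !caughtExpectedException )
{
  std::cerr << "Failed to catch the expected exception." << std::endl;
  testPassStatus = EXIT_FAILURE;
}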