Proposals: Increasing ITK Code Coverage
= Motivation =
ITK currently (Dec 20th, 2008) has 80.5% code coverage:

http://www.cdash.org/CDash/viewCoverage.php?buildid=240378

A sloccount report on the number of lines of code in the Insight/Code directory returns:

 158,928 lines of code

This means that about 31,000 lines of code (158,928 × (1 − 0.805) ≈ 30,991) are '''not tested'''.
We could significantly increase the code coverage of the toolkit, and in the process reduce the number of hidden bugs, by asking volunteers to adopt particular classes and write additional tests to increase their code coverage. This could be done in the style of the "Adopt a Bug" program.
= Infrastructure =
It has been pointed out that the current testing infrastructure of ITK imposes a high threshold of effort on contributors of new tests.
== Requirements ==

There are some existing unit test harnesses that might decrease the effort and provide additional functionality. A unit testing package for ITK must meet the following requirements:

# It must have an ITK-compatible license.
# We must be able to distribute it with ITK.
# It must support '''all''' ITK platforms.
# It must fit within the ITK test harness facility. Recall that we try to minimize the number of executables by combining large numbers of tests into FooTests.cxx files (a minimal sketch of this pattern follows the list).
# It must be compatible with CMake/CTest/CDash. For example, a test must be able to "return EXIT_SUCCESS" and "return EXIT_FAILURE".
# It must not add complexity to an already complex testing process.
# It must be compatible with ITK's strict attention to compile warnings and dynamic memory analysis. In other words, it must not produce warnings or Purify defects.
# It should have a minimal source footprint.
# It should allow testing the Examples without inserting instrumented code in them (so they still look like normal Examples).
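For reference, below is a minimal sketch of the FooTests.cxx pattern that any new harness would have to coexist with. The test names and the hand-written dispatch are illustrative only; the real files use the registration machinery in itkTestMain.h.

<pre>
// Illustrative FooTests.cxx: many tests compiled into one executable,
// selected by name on the command line, each returning EXIT_SUCCESS or
// EXIT_FAILURE so that ctest can record pass/fail from the exit status.
#include <cstdlib>
#include <cstring>
#include <iostream>

int itkFooTest( int, char *[] ) { /* exercise itk::Foo here */ return EXIT_SUCCESS; }
int itkBarTest( int, char *[] ) { /* exercise itk::Bar here */ return EXIT_SUCCESS; }

int main( int argc, char *argv[] )
{
  if( argc < 2 )
    {
    std::cerr << "Usage: " << argv[0] << " <testName> [arguments]" << std::endl;
    return EXIT_FAILURE;
    }
  // ctest invokes this executable once per add_test() entry,
  // passing the registered test name as the first argument.
  if( std::strcmp( argv[1], "itkFooTest" ) == 0 )
    {
    return itkFooTest( argc - 1, argv + 1 );
    }
  if( std::strcmp( argv[1], "itkBarTest" ) == 0 )
    {
    return itkBarTest( argc - 1, argv + 1 );
    }
  std::cerr << "Unknown test: " << argv[1] << std::endl;
  return EXIT_FAILURE;
}
</pre>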
Suggestions for improving the testing system, to make it easier for contributors to introduce new tests, include the following.
== Boost Test ==

* http://www.boost.org/doc/libs/1_37_0/libs/test/doc/html/index.html

Suggested by Steve Robbins:

* http://www.itk.org/mailman/private/insight-developers/2008-December/011421.html
=== How it could work ===

<pre>
--------------------- itkImageRegionTest.cxx ---------------------------------
#define BOOST_AUTO_TEST_MAIN
#include <boost/test/auto_unit_test.hpp>

#include "itkImageRegion.h"

template< unsigned int VImageDimension >
struct Fixture
{
  typedef itk::ImageRegion<VImageDimension> RegionType;
  typedef typename RegionType::IndexType    IndexType;
  typedef typename RegionType::SizeType     SizeType;

  RegionType mRegion;
};

struct Fixture1 : public Fixture<1>
{
  Fixture1( int start0,
            int size0 )
  {
    IndexType start = {{ start0 }};
    SizeType  size  = {{ size0 }};
    mRegion = RegionType( start, size );
  }
};

struct Fixture2 : public Fixture<2>
{
  Fixture2( int start0, int start1,
            int size0, int size1 )
  {
    IndexType start = {{ start0, start1 }};
    SizeType  size  = {{ size0, size1 }};
    mRegion = RegionType( start, size );
  }
};

struct Fixture3 : public Fixture<3>
{
  Fixture3( int start0, int start1, int start2,
            int size0, int size1, int size2 )
  {
    IndexType start = {{ start0, start1, start2 }};
    SizeType  size  = {{ size0, size1, size2 }};
    mRegion = RegionType( start, size );
  }
};

BOOST_AUTO_TEST_CASE( testSlice )
{
  Fixture3 volume( 12, 12, 12, 10, 20, 30 );
  Fixture2 slice0( 12, 12, 20, 30 );
  Fixture2 slice1( 12, 12, 10, 30 );
  Fixture2 slice2( 12, 12, 10, 20 );

  BOOST_CHECK_EQUAL( slice0.mRegion, volume.mRegion.Slice( 0 ) );
  BOOST_CHECK_EQUAL( slice1.mRegion, volume.mRegion.Slice( 1 ) );
  BOOST_CHECK_EQUAL( slice2.mRegion, volume.mRegion.Slice( 2 ) );
}

BOOST_AUTO_TEST_CASE( testSliceOutOfBounds )
{
  Fixture3 volume( 12, 12, 12, 10, 20, 30 );

  BOOST_CHECK_THROW( volume.mRegion.Slice( -1 ), std::exception );
  BOOST_CHECK_THROW( volume.mRegion.Slice( 3 ), std::exception );
}

BOOST_AUTO_TEST_CASE( testVolumeIsInside )
{
  Fixture3 volumeA( 12, 12, 12, 10, 20, 30 );
  Fixture3 volumeB( 14, 14, 14, 5, 10, 15 );

  BOOST_CHECK( volumeA.mRegion.IsInside( volumeB.mRegion ) );
  BOOST_CHECK( ! volumeB.mRegion.IsInside( volumeA.mRegion ) );
}
--------------------- itkImageRegionTest.cxx ---------------------------------
</pre>
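Hooking such a file into the existing CMake/CTest setup should be straightforward, since the main() generated by BOOST_AUTO_TEST_MAIN returns a non-zero exit status on failure, which is all CTest needs. A sketch, assuming Boost is found on the system and the test links against ITKCommon (the target and library names here are illustrative):

<pre>
# Locate Boost's unit test framework library.
find_package( Boost REQUIRED COMPONENTS unit_test_framework )
include_directories( ${Boost_INCLUDE_DIRS} )

add_executable( itkImageRegionTest itkImageRegionTest.cxx )
target_link_libraries( itkImageRegionTest ITKCommon ${Boost_LIBRARIES} )

# One ctest entry for the whole suite; individual cases can be
# selected at run time with Boost.Test's --run_test=<name> option.
add_test( itkImageRegionTest ${EXECUTABLE_OUTPUT_PATH}/itkImageRegionTest )
</pre>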
== Google Test ==

The [http://code.google.com/p/googletest/ Google Test framework] is very similar to the Boost test harness. GTest is essentially a set of macros that assist the developer in writing concise tests. The framework makes use of neither exceptions nor templates, and is supported on all major platforms and some minor ones, e.g., Cygwin, Windows CE, and Symbian. The code is available under the [http://www.opensource.org/licenses/bsd-license.php BSD license]. The framework supports many of the features already in place through ctest, e.g., running every Nth test, running all matching tests, etc. The source code and includes are ~600K and compile trivially using CMake without configuration or modification.
<pre>
# Build Google Testing
set ( GTestSource
  Utilities/gtest-1.2.1/src/gtest.cc
  Utilities/gtest-1.2.1/src/gtest-death-test.cc
  Utilities/gtest-1.2.1/src/gtest-filepath.cc
  Utilities/gtest-1.2.1/src/gtest-port.cc
  Utilities/gtest-1.2.1/src/gtest-test-part.cc
  Utilities/gtest-1.2.1/src/gtest-typed-test.cc
)
include_directories ( ${MI3CLib_SOURCE_DIR}/Testing/Utilities/gtest-1.2.1/include )
# BUILD_SHARED_LIBS selects a shared or static library automatically.
add_library(gtest ${GTestSource})
</pre>
=== Test Driver ===

The test driver is a very simple main function:

<pre>
#include <gtest/gtest.h>

int main(int argc, char* argv[])
{
  testing::InitGoogleTest ( &argc, argv );
  return RUN_ALL_TESTS();
}
</pre>
=== Types of Tests ===

There are two basic types of tests: simple tests using the TEST macro, and test fixtures using the TEST_F macro. Many different macros are available when writing tests, ranging from string comparisons and expected exceptions to floating-point comparisons. The basic framework is [http://code.google.com/p/googletest/wiki/GoogleTestPrimer well documented], with [http://code.google.com/p/googletest/wiki/GoogleTestAdvancedGuide advanced guidance] for those who dig deeper. Below is example code from an internal project that demonstrates how to write a test. Test fixtures run as methods in a subclass of the fixture and have access to all public and protected ivars of the fixture. All test macros function as stream operators, with any text directed into them appearing in the output. NB: in this example an [http://en.wikipedia.org/wiki/MD5 MD5 hash] is used to verify correct output rather than a comparison to a known good image.
<pre>
TEST(IO, LoadCT) {
  mi3c::ImageLoader loader;
  mi3c::Image::Pointer image = loader.setFilename ( dataFinder.getFile ( "CT.hdr" ) ).execute();
  ASSERT_EQ ( "c1d43aaa5b991431a9daa1dc4b55dbb1", image->getMD5() ) << " failed to load the expected image data";
}

class ImageDataTest : public testing::Test {
public:
  ImageDataTest () {
    image = NULL;
    floatImage = NULL;
  }
  virtual void SetUp() {
    mi3c::ImageLoader loader;
    try {
      image = loader.setFilename ( dataFinder.getFile ( "MRA.hdr" ) ).execute();
      mi3c::ConvertDataType convert ( mi3c::mi3cFLOAT );
      floatImage = convert.execute ( image );
    } catch ( itk::ImageFileReaderException & e ) {
      FAIL(); // Couldn't load, so fail this test before we go any further with bad data.
    }
  }
  virtual void TearDown() {
    image = NULL;
    floatImage = NULL;
  }
  mi3c::Image::Pointer image;
  mi3c::Image::Pointer floatImage;
};

TEST_F(ImageDataTest, DiscreteGaussianFilter) {
  mi3c::DiscreteGaussianFilter filter;
  mi3c::Image::Pointer o = filter.execute ( image );
  EXPECT_EQ ( "6adeb490bda64b47e9c1bd6c547e570e", o->getMD5() ) << " Filtered with a gaussian";
  EXPECT_EQ ( "300c7ee796d1b3c2b49a7649789bbf55", filter.execute ( floatImage )->getMD5() ) << " Filtered with a gaussian";
}

TEST_F(ImageDataTest, MeanFilter) {
  mi3c::MeanFilter filter;
  filter.setRadius ( 1 );
  mi3c::Image::Pointer o = filter.execute ( image );
  EXPECT_EQ ( "8b7235e1f8497b0a7fb84eb5c94af00b", o->getMD5() ) << " Mean filtered";
  EXPECT_EQ ( "069a6670309db5c03a79af11a9c6e526", filter.execute ( floatImage )->getMD5() ) << " Mean filtered";
}
</pre>
=== Running the tests ===

Test status is reported (in color) when running the tests, and the final status is reported as the exit status, much like current ITK testing.
<pre>
[blezek@mi3bld04 MI3CLib-linux86-gcc]$ bin/NoOp /mi3c/projects/Source/MI3CTestData
[==========] Running 9 tests from 3 test cases.
[----------] Global test environment set-up.
[----------] 4 tests from ImageDataTest
[ RUN      ] ImageDataTest.MD5
[       OK ] ImageDataTest.MD5
[ RUN      ] ImageDataTest.Threshold
[       OK ] ImageDataTest.Threshold
[ RUN      ] ImageDataTest.DiscreteGaussianFilter
[       OK ] ImageDataTest.DiscreteGaussianFilter
[ RUN      ] ImageDataTest.MeanFilter
[       OK ] ImageDataTest.MeanFilter
[----------] 2 tests from IO
[ RUN      ] IO.LoadCT
[       OK ] IO.LoadCT
[ RUN      ] IO.LoadInvalidFile
[       OK ] IO.LoadInvalidFile
[----------] 3 tests from Image
[ RUN      ] Image.InstantiateImage
[       OK ] Image.InstantiateImage
[ RUN      ] Image.InstantiateImage2
[       OK ] Image.InstantiateImage2
[ RUN      ] Image.TestHash
[       OK ] Image.TestHash
[----------] Global test environment tear-down
[==========] 9 tests from 3 test cases ran.
[  PASSED  ] 9 tests.
[blezek@mi3bld04 MI3CLib-linux86-gcc]$ echo $?
0
</pre>
=== CMake Integration ===

With a slightly clever CMake macro, and a regular expression or two, Google tests are trivially integrated into CMake projects. Here, all TEST and TEST_F macros found in the source code are added as tests to the project. Each test is run as:

 NoOp --gtest_filter=TestGroup.TestName

where TestGroup is the first argument to the TEST macro, and TestName is the second.
<pre>
# C++ tests
set ( mi3cTestSource
  Source/NoOp.cxx
  Source/ImageTests.cxx
  Source/IOTests.cxx
  Source/FilterTests.cxx
)
add_executable(NoOp ${mi3cTestSource})

macro(ADD_GOOGLE_TESTS executable)
  foreach ( source ${ARGN} )
    file(READ "${source}" contents)
    string(REGEX MATCHALL "TEST_?F?\\(([A-Za-z_0-9 ,]+)\\)" found_tests ${contents})
    foreach(hit ${found_tests})
      string(REGEX REPLACE ".*\\(([A-Za-z_0-9]+)[, ]*([A-Za-z_0-9]+)\\).*" "\\1.\\2" test_name ${hit})
      add_test(${test_name} ${executable} --gtest_filter=${test_name} ${MI3CTestingDir})
    endforeach(hit)
  endforeach()
endmacro()

# Add all tests found in the source code, calling the executable to run them
add_google_tests ( ${EXECUTABLE_OUTPUT_PATH}/NoOp ${mi3cTestSource})
</pre>
== UnitTestCpp ==

* http://unittest-cpp.sourceforge.net/

Suggested by Mathieu Malaterre:

* http://www.itk.org/mailman/private/insight-developers/2008-December/011423.html

This package is distributed under the MIT License:

* https://unittest-cpp.svn.sourceforge.net/svnroot/unittest-cpp/UnitTest++/COPYING
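No worked example was posted to the list; the sketch below shows what the itk::ImageRegion checks from the Boost section might look like in UnitTest++ (the header name and macros follow the UnitTest++ documentation; the expected Slice() behavior mirrors the Boost example above):

<pre>
#include <UnitTest++.h>
#include "itkImageRegion.h"

namespace
{
// Shared setup: a 3-D region starting at {12,12,12} with size {10,20,30}.
itk::ImageRegion<3> MakeVolume()
{
  itk::ImageRegion<3>::IndexType start = {{ 12, 12, 12 }};
  itk::ImageRegion<3>::SizeType  size  = {{ 10, 20, 30 }};
  return itk::ImageRegion<3>( start, size );
}
}

TEST( SliceDropsTheRequestedDimension )
{
  itk::ImageRegion<3> volume = MakeVolume();
  // Slicing away dimension 0 leaves a 2-D region of size 20x30.
  itk::ImageRegion<2> slice = volume.Slice( 0 );
  CHECK_EQUAL( 20u, slice.GetSize()[0] );
  CHECK_EQUAL( 30u, slice.GetSize()[1] );
}

TEST( SliceOutOfBoundsThrows )
{
  itk::ImageRegion<3> volume = MakeVolume();
  CHECK_THROW( volume.Slice( 3 ), std::exception );
}

// RunAllTests() returns the number of failures, so the exit status is
// non-zero on failure, exactly as the CTest integration requires.
int main( int, char *[] )
{
  return UnitTest::RunAllTests();
}
</pre>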
== Custom XML Testing Framework for CTest/CMake/CDash Integration ==

This proposal has been submitted as an Insight Journal article:

http://hdl.handle.net/10380/3116

If we look at the XML for CDash ([http://public.kitware.com/Wiki/CDash:XML CDash-XML]), specifically the file which describes the valid XML for a test (ValidationSchemata/Test.xsd), we see the NamedMeasurement tag. This is the field that is displayed for the test. It can show a variety of types via the "type" attribute. Previously, Dart defined the following types: "numeric/integer", "numeric/float", "numeric/double", "numeric/boolean", "text/string", "text/plain", "image/png", "image/jpeg".

This testing system works as follows: each test produces XML output of "NamedMeasurements", which is then compared against an XML baseline; any differences are reported and sent to CDash via CTest. To produce a baseline, the output of the test should be carefully verified by hand, and attributes then added to the XML tags to describe how to do the comparisons.
Sample test:

<pre>
#include "itkImage.h"
#include "itkImageFileReader.h"
// itk::Regression is provided by the proposed testing framework (itktf).

class ImageFileReaderInfoTest:
  public itk::Regression
{
protected:
  virtual int Test(int argc, char* argv[] )
  {
    if( argc < 2 )
      {
      std::cerr << "Usage: " << argv[0] << " inputFile" << std::endl;
      return EXIT_FAILURE;
      }

    typedef unsigned char                    PixelType;
    typedef itk::Image<PixelType,3>          ImageType;
    typedef itk::ImageFileReader<ImageType>  ReaderType;

    ReaderType::Pointer reader = ReaderType::New();
    reader->SetFileName( argv[1] );
    reader->UpdateOutputInformation();

    ImageType::Pointer image = reader->GetOutput();
    ImageType::RegionType region = image->GetLargestPossibleRegion();
    itk::ImageIOBase::Pointer imageIO = reader->GetImageIO();

    this->MeasurementInsightSize( region.GetSize(), "ImageSize" );
    this->MeasurementInsightVector( image->GetSpacing(), "ImageSpacing" );
    this->MeasurementInsightPoint( image->GetOrigin(), "ImageOrigin" );
    this->MeasurementInsightMatrix( image->GetDirection(), "ImageDirection" );
    this->MeasurementTextString( imageIO->GetFileTypeAsString( imageIO->GetFileType() ), "FileTypeAsString" );
    this->MeasurementTextString( imageIO->GetByteOrderAsString( imageIO->GetByteOrder() ), "ByteOrderAsString" );
    this->MeasurementTextString( imageIO->GetPixelTypeAsString( imageIO->GetPixelType() ), "PixelTypeAsString" );
    this->MeasurementTextString( imageIO->GetComponentTypeAsString( imageIO->GetComponentType() ), "ComponentTypeAsString" );
    this->MeasurementNumericInteger( imageIO->GetNumberOfComponents( ), "NumberOfComponents" );

    return EXIT_SUCCESS;
  }
};
</pre>
The baseline XML:

<pre>
<?xml version="1.0" encoding="US-ASCII"?>
<!-- created on victoria at Fri Jul 24 17:22:33 2009 -->
<output>
  <DartMeasurement name="ImageSize_0" type="numeric/integer">34</DartMeasurement>
  <DartMeasurement name="ImageSize_1" type="numeric/integer">34</DartMeasurement>
  <DartMeasurement name="ImageSize_2" type="numeric/integer">141</DartMeasurement>
  <DartMeasurement name="ImageSpacing_0" type="numeric/double">62</DartMeasurement>
  <DartMeasurement name="ImageSpacing_1" type="numeric/double">62</DartMeasurement>
  <DartMeasurement name="ImageSpacing_2" type="numeric/double">1</DartMeasurement>
  <DartMeasurement name="ImageOrigin_0" type="numeric/double">30</DartMeasurement>
  <DartMeasurement name="ImageOrigin_1" type="numeric/double">30</DartMeasurement>
  <DartMeasurement name="ImageOrigin_2" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_0_0" type="numeric/double">1</DartMeasurement>
  <DartMeasurement name="ImageDirection_0_1" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_0_2" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_1_0" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_1_1" type="numeric/double">1</DartMeasurement>
  <DartMeasurement name="ImageDirection_1_2" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_2_0" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_2_1" type="numeric/double">0</DartMeasurement>
  <DartMeasurement name="ImageDirection_2_2" type="numeric/double">1</DartMeasurement>
  <DartMeasurement name="FileTypeAsString" type="text/string">Binary</DartMeasurement>
  <DartMeasurement name="ByteOrderAsString" type="text/string">LittleEndian</DartMeasurement>
  <DartMeasurement name="PixelTypeAsString" type="text/string">scalar</DartMeasurement>
  <DartMeasurement name="ComponentTypeAsString" type="text/string">unsigned_char</DartMeasurement>
  <DartMeasurement name="NumberOfComponents" type="numeric/integer">1</DartMeasurement>
</output>
</pre>
The strength of this approach is that it separates the execution from the validation (or verification), making the test code itself smaller. It integrates easily with the CDash/CMake infrastructure (as it was designed to). Many existing tests which print text could easily be migrated to this approach, so that the output of the program is also validated and we will know when it changes. It could easily be expanded to compare new types. A single executable could be run with multiple arguments for multiple tests, and each test could have a different baseline. On the downside, this approach may require the most work to get running.
=== Downloading and Documentation ===

The latest version can be downloaded via SVN:

 svn co http://erie.nlm.nih.gov/svn/Repository/blowekamp/utilities/trunk/
The following little CMakeLists.txt file is needed to make a working project out of the 3 utility libraries:
<pre>
PROJECT( TF_DRIVER )
CMAKE_MINIMUM_REQUIRED ( VERSION 2.6 )
CMAKE_POLICY( SET CMP0012 NEW )

ENABLE_TESTING()
INCLUDE(CTest)

OPTION(BUILD_SHARED_LIBS "Build with shared libraries." OFF)

INCLUDE_DIRECTORIES ( ${PROJECT_SOURCE_DIR} )

SET (CMAKE_RUNTIME_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/bin CACHE INTERNAL "Single output directory for building all executables.")

ADD_SUBDIRECTORY ( FileUtilities )
ADD_SUBDIRECTORY ( TestingFramework )
ADD_SUBDIRECTORY ( itktf )
</pre>
Documentation can be found here:

http://erie.nlm.nih.gov/~blowek1/itkCommandLineTools/classitk_1_1Regression.html
== Other C++ Unit Test Frameworks ==

* Exploring the C++ Unit Testing Framework Jungle, http://gamesfromwithin.com/?p=29