CMake/Testing With CTest


Introduction

CTest is a testing tool distributed as a part of CMake. It can be used to automate updating (using CVS, for example), configuring, building, testing, performing memory checking, collecting coverage, and submitting results to a CDash or Dart dashboard system. This tutorial introduces testing with CTest.

Simple Testing

The easiest way to create CTest input files is with CMake, which has built-in support for adding tests to a project. To do so, insert the following command in the CMakeLists.txt file:

ENABLE_TESTING()

From that point on, you can add tests to the project using the ADD_TEST command:

ADD_TEST(SimpleTest ${EXECUTABLE_OUTPUT_PATH}/SimpleTest Hello)

After building the project, you should be able to run CTest on it.
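
Putting the pieces together, a minimal CMakeLists.txt might look like the following sketch (the project name, source file, and test name are made up for illustration):

PROJECT(SimpleTest)

# put the executable in a known location so the ADD_TEST line can find it
SET(EXECUTABLE_OUTPUT_PATH ${PROJECT_BINARY_DIR}/bin)

ADD_EXECUTABLE(SimpleTest simpleTest.cxx)

# a test passes when the command it runs returns 0
ENABLE_TESTING()
ADD_TEST(SimpleTest ${EXECUTABLE_OUTPUT_PATH}/SimpleTest Hello)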

1. When using Makefile generators, such as Unix Makefiles, Borland Makefiles, or NMake Makefiles, the tests can be run simply by running:

make test

2. In GUI development environments such as Visual Studio, building the RUN_TESTS target will run the project's tests.

For more information about ENABLE_TESTING and ADD_TEST, look at the CMake documentation or run:

cmake --help
cmake --help-full
cmake --help-command-list

or

cmake --help-command ENABLE_TESTING
cmake --help-command ADD_TEST

Internally "make test" or "RUN_TESTS" runs CTest. CTest is a program that actually performs the testing. The default behavior of CTest is to read "DartTestfile.txt" and runs tests that are in the file.

Dashboards

Testing dashboards are web pages that display an overview of a project's testing. Testing clients update, configure, and build the project, as well as run some number of tests. The results of these operations are then submitted to a central server, which prepares the overview pages. Examples of testing dashboards are those of VTK, ITK, and ParaView. More dashboards can be found on www.cdash.org and my.cdash.org.

There are three types of dashboard submissions. An Experimental submission reflects the current state of the project. It can be performed at any time and will appear on the dashboard at the next roll-up. A Nightly submission is similar to an Experimental one, except that an extra update is performed using the most recent nightly start time; this way all nightly dashboard submissions correspond to the state of the project at the same point in time. Finally, Continuous submissions are similar to Experimental submissions, except that an update is performed first, and the full dashboard testing is run only if any files were modified.

Dashboard Preparation Using CMake

To enable submission to a Dart testing dashboard, include the following in CMakeLists.txt:

INCLUDE(CTest)

Including the CTest module adds a BUILD_TESTING option, which is enabled by default. When it is enabled, the module sets up everything required for testing and also calls ENABLE_TESTING(), so you do not have to do this in your CMake files.
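
A common pattern (shown here as a minimal sketch; the test name and executable are placeholders) is to guard the tests with this option so that testing can be switched off:

INCLUDE(CTest)

IF(BUILD_TESTING)
  # MyTest is a placeholder name for a test added elsewhere in the project
  ADD_TEST(MyTest ${EXECUTABLE_OUTPUT_PATH}/MyTest)
ENDIF(BUILD_TESTING)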

By default, the settings will submit to Kitware's public dashboard. In order to submit to some other dashboard, create a file "CTestConfig.cmake" in the top-level source directory and set the dashboard preferences there. Instead of creating this file manually, you can also download a project-specific CTestConfig.cmake from the respective project page on CDash.

A CTestConfig.cmake file is not necessary if all you want is to submit to the public dashboard.

An example of this file is:

# The dashboard is open for submissions for a 24 hour period starting at
# the specified NIGHTLY_START_TIME. Time is specified in 24 hour format.
SET (CTEST_NIGHTLY_START_TIME "23:00:00 EDT")

# Dart server to submit results (used by client)
IF(CTEST_DROP_METHOD MATCHES http)
  SET (CTEST_DROP_SITE "public.kitware.com")
  SET (CTEST_DROP_LOCATION "/cgi-bin/HTTPUploadDartFile.cgi")
ELSE(CTEST_DROP_METHOD MATCHES http)
  SET (CTEST_DROP_SITE "public.kitware.com")
  SET (CTEST_DROP_LOCATION "/incoming")
  SET (CTEST_DROP_SITE_USER "ftpuser")
  SET (CTEST_DROP_SITE_PASSWORD "public")
ENDIF(CTEST_DROP_METHOD MATCHES http)

SET (CTEST_TRIGGER_SITE 
       "http://${CTEST_DROP_SITE}/cgi-bin/Submit-vtk-TestingResults.pl")

This will submit testing results to the VTK dashboard.

CTest - Client for Dart Dashboard Server

CTest started as a simple tool to run some number of tests in a project, but it has evolved into a full Dart-compatible client. It can perform the simple task of running a set of tests, but it can also generate and submit Dart-compatible dashboard results. The good thing about CTest is that it is self-contained: all you need to do testing is CTest, and if you use CMake, you already have it. A detailed description of CTest options can be seen by running:

ctest --help

or

ctest --help-full

A simple way to submit an Experimental dashboard is:

ctest -D Experimental

This will configure the project, build it and check for any warnings or errors, run tests if any are specified, collect coverage if any coverage files are present, and submit the results to the specified Dart server.

To convert an existing Dart client run for the project, find lines like:

 cd ProjectNightlyBuildDirectory
 tclsh /location/of/Dart/Source/Client/DashboardManager.tcl DartConfiguration.tcl \
	Nightly Start Update Configure Build Test Submit

and convert them to CTest style:

cd ProjectNightlyBuildDirectory
ctest -D Nightly

The dashboard can also be generated in stages. This way partial testing results can be submitted and seen before long operations complete:

cd ProjectNightlyBuildDirectory
ctest -D NightlyStart
ctest -D NightlyUpdate
ctest -D NightlyConfigure
ctest -D NightlyBuild
ctest -D NightlySubmit
ctest -D NightlyTest
ctest -D NightlyCoverage
ctest -D NightlySubmit
ctest -D NightlyMemCheck
ctest -D NightlySubmit

To show what targets are available, run:

ctest -D Help

In fact, Help is not special: any word that is not a valid target will produce the same listing.

Advanced CTest

CTest has several additional features, including:

  1. FTP/HTTP/SCP/XMLRPC submission support
  2. Run individual tests, subset of tests, exclude tests, etc.
  3. Dynamic analysis using Valgrind or Purify
  4. Customization of the testing by providing:
    • Custom build error/warning regular expressions
    • Ability to exclude some tests from testing or memory checking, and to run a subset of tests
    • Ability to run commands before and after tests are run
  5. Ability to run the whole testing process from a single script

Submission Of Tests

CTest currently supports four submission methods directly and others indirectly. The direct methods are HTTP, FTP, SCP, and XML-RPC. Both the HTTP and FTP methods require an extra trigger mechanism, while the SCP method relies on the files simply ending up in the right place. To select the submission method, set the CTEST_DROP_METHOD variable in CTestConfig.cmake.

An example for HTTP submission would be:

 SET (CTEST_DROP_METHOD http)
 SET (CTEST_DROP_SITE "public.kitware.com")
 SET (CTEST_DROP_LOCATION "/cgi-bin/HTTPUploadDartFile.cgi")
 SET (CTEST_TRIGGER_SITE 
      "http://${CTEST_DROP_SITE}/cgi-bin/Submit-CMake-TestingResults.pl")

where http://public.kitware.com/cgi-bin/HTTPUploadDartFile.cgi is a submit script and http://public.kitware.com/cgi-bin/Submit-CMake-TestingResults.pl is a trigger script.

For FTP submission:

  SET (CTEST_DROP_METHOD ftp) 
  SET (CTEST_DROP_SITE "public.kitware.com")
  SET (CTEST_DROP_LOCATION "/incoming")
  SET (CTEST_DROP_SITE_USER "ftpuser")
  SET (CTEST_DROP_SITE_PASSWORD "public")
  SET (CTEST_TRIGGER_SITE 
      "http://${CTEST_DROP_SITE}/cgi-bin/Submit-CMake-TestingResults.pl")

where /incoming is a location on the FTP site public.kitware.com, accessed with user ftpuser and password public. The trigger script is the same as with the HTTP submission.

For XML-RPC submission (Dart2):

  SET (CTEST_DROP_METHOD xmlrpc) 
  SET (CTEST_DROP_SITE "www.na-mic.org:8081")
  SET (CTEST_DROP_LOCATION "PublicDashboard")

where the XML-RPC submission goes to the server www.na-mic.org on port 8081 and the project name is PublicDashboard. XML-RPC submission does not require a trigger script.

Running Individual Tests

CTest supports two different ways of specifying a subset of tests to run.

The first way is to specify regular expressions with the -R and -E options: -R selects the tests to include, while -E selects the tests to exclude. For example, when running ctest in show-only mode, where no tests are actually run, we may see something like:

Test project
  1/ 13 Testing PythonDataDesc                
  2/ 13 Testing VTKTest                       
  3/ 13 Testing SystemInformation             
  4/ 13 Testing TestVTKWriters                
  5/ 13 Testing TestVTKPython                 
  6/ 13 Testing VTKPythonMultiGrid            
  7/ 13 Testing IronImage                     
  8/ 13 Testing IronImageMagic                
  9/ 13 Testing IronImageStrideMagic          
 10/ 13 Testing IronRectMagic                 
 11/ 13 Testing IronRectStrideMagic           
 12/ 13 Testing IronStructMagic               
 13/ 13 Testing IronStructStrideMagic

If we now run

 ctest -R Python

we will see only the tests whose names contain the string Python:

Test project
  1/  3 Testing PythonDataDesc                
  2/  3 Testing TestVTKPython                 
  3/  3 Testing VTKPythonMultiGrid

We can also exclude tests using -E, for example:

 ctest -E Iron

will produce:

Test project
  1/  6 Testing PythonDataDesc                
  2/  6 Testing VTKTest                       
  3/  6 Testing SystemInformation             
  4/  6 Testing TestVTKWriters                
  5/  6 Testing TestVTKPython                 
  6/  6 Testing VTKPythonMultiGrid

Both -R and -E can be used at the same time.

To determine what tests are available, you can always run:

 ctest -N

which will display the list of tests but not actually run them.

The second way of specifying tests is with the explicit test-number option -I:

 ctest -I 3,5

will run tests:

Test project
Running tests: 3 4 5 
  3/ 13 Testing SystemInformation             
  4/ 13 Testing TestVTKWriters                
  5/ 13 Testing TestVTKPython

We can also specify a stride:

 ctest -I ,,3

will run tests:

Test project
Running tests: 1 4 7 10 13 
  1/ 13 Testing PythonDataDesc                
  4/ 13 Testing TestVTKWriters                
  7/ 13 Testing IronImage                     
 10/ 13 Testing IronRectMagic                 
 13/ 13 Testing IronStructStrideMagic

Or run individual tests:

 ctest -I 4,4,,4,7,13

will run tests:

Test project
Running tests: 4 7 13
  4/ 13 Testing TestVTKWriters
  7/ 13 Testing IronImage
 13/ 13 Testing IronStructStrideMagic

The general form of the option is -I Start,End,Stride,test#,test#,...; when listing individual tests this way, make sure that the first and second arguments (Start and End) are both the index of the first test in the list.

Dynamic Analysis

Software development can be significantly hindered when memory leaks are introduced into the code. Both Purify and Valgrind can catch most of them, and setting up either one is extremely easy.

For example, to set up Purify, all you have to do is add:

PURIFYCOMMAND:FILEPATH=c:/Progra~1/Rational/common/purify.exe

to your CMake cache. In the same way, to set up Valgrind, add:

MEMORYCHECK_COMMAND:FILEPATH=/home/kitware/local/bin/valgrind

You can add additional options by specifying MEMORYCHECK_COMMAND_OPTIONS and MEMORYCHECK_SUPPRESSIONS_FILE.
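
For example, the corresponding cache entries might look like the following (a sketch only; the Valgrind options and the suppressions file path are illustrative and need to be adapted to your setup):

MEMORYCHECK_COMMAND_OPTIONS:STRING=--leak-check=full --show-reachable=yes
MEMORYCHECK_SUPPRESSIONS_FILE:FILEPATH=/home/kitware/local/share/valgrind-suppressions.txt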

To actually perform the memory checking as part of a dashboard, run:

ctest -D NightlyMemoryCheck

or

ctest -D NightlyStart
ctest -D NightlyUpdate
ctest -D NightlyConfigure
ctest -D NightlyBuild
ctest -D NightlyTest
ctest -D NightlyMemCheck
ctest -D NightlySubmit

Customizing CTest

CTest can be customized by providing a CTestCustom.ctest or CTestCustom.cmake file in the build tree. If both files exist, CTestCustom.cmake is preferred. If the CTestCustom.cmake/.ctest file is distributed with the sources of the project, CONFIGURE_FILE() can be used to put it in the build tree (see the sketch after the variable list below). The file may contain a SET command for any CMake variable, but only the following will be used by CTest:

CTEST_CUSTOM_ERROR_MATCH - Regular expression for errors during build process
CTEST_CUSTOM_ERROR_EXCEPTION - Regular expression for error exceptions during build process
CTEST_CUSTOM_WARNING_MATCH - Regular expression for warnings during build process
CTEST_CUSTOM_WARNING_EXCEPTION - Regular expression for warning exceptions during build process
CTEST_CUSTOM_MAXIMUM_NUMBER_OF_ERRORS - Maximum number of errors to display
CTEST_CUSTOM_MAXIMUM_NUMBER_OF_WARNINGS - Maximum number of warnings to display
CTEST_CUSTOM_TESTS_IGNORE - List of tests to ignore during the Test stage
CTEST_CUSTOM_MEMCHECK_IGNORE - List of tests to ignore during the MemCheck stage
CTEST_CUSTOM_PRE_TEST - Command to execute before any tests are run during the Test stage
CTEST_CUSTOM_POST_TEST - Command to execute after any tests are run during the Test stage
CTEST_CUSTOM_MAXIMUM_PASSED_TEST_OUTPUT_SIZE - Maximum size of passed test output
CTEST_CUSTOM_MAXIMUM_FAILED_TEST_OUTPUT_SIZE - Maximum size of failed test output
CTEST_CUSTOM_PRE_MEMCHECK - Command to execute before any tests are run during the MemCheck stage
CTEST_CUSTOM_POST_MEMCHECK - Command to execute after any tests are run during the MemCheck stage
CTEST_CUSTOM_COVERAGE_EXCLUDE - Regular expression for excluding files from coverage testing
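
If the customization file is kept in the source tree, a minimal sketch of copying it into the build tree could look like this (the input file name CTestCustom.cmake.in is only an illustration; @ONLY substitution also lets the file use @VARIABLE@ references such as the @CMAKE_SYSTEM@ check in the example below):

# copy the customization file from the source tree into the build tree,
# replacing @VARIABLE@ references with their values at configure time
CONFIGURE_FILE(
  ${CMAKE_SOURCE_DIR}/CTestCustom.cmake.in
  ${CMAKE_BINARY_DIR}/CTestCustom.cmake
  @ONLY)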

An example of a CTestCustom.cmake file would be:

SET(CTEST_CUSTOM_MEMCHECK_IGNORE
  ${CTEST_CUSTOM_MEMCHECK_IGNORE}
  TestSetGet
  otherPrint-ParaView
  Example-vtkLocal
  Example-vtkMy

  # These tests do not actually run any VTK code
  HeaderTesting-Common
  HeaderTesting-Filtering
  HeaderTesting-Graphics
  HeaderTesting-Imaging
  HeaderTesting-IO

  # this one runs python, which then runs two
  # programs, so no memory checking there
  Sockets-image
  )

SET(CTEST_CUSTOM_WARNING_MATCH
  ${CTEST_CUSTOM_WARNING_MATCH}
  "{standard input}:[0-9][0-9]*: Warning: "
  )

IF("@CMAKE_SYSTEM@" MATCHES "OSF")
  SET(CTEST_CUSTOM_WARNING_EXCEPTION
    ${CTEST_CUSTOM_WARNING_EXCEPTION}
    "XdmfDOM"
    "XdmfExpr"
    "vtkKWApplication"
    "vtkKWObject"
    )
ENDIF("@CMAKE_SYSTEM@" MATCHES "OSF")

SET(CTEST_CUSTOM_WARNING_EXCEPTION
  ${CTEST_CUSTOM_WARNING_EXCEPTION}
  "tcl8.4.5/[^/]+/../[^/]+/[^.]+.c[:\"]"
  "tk8.4.5/[^/]+/[^/]+.c[:\"]"
  "VTK/Utilities/vtktiff/"
  "Utilities/vtkmpeg2/"
  "Utilities/hdf5/"
  "xtree.[0-9]+. : warning C4702: unreachable code"
  "warning LNK4221"
  "variable .var_args[2]*. is used before its value is set"
  )

CTest Scripting

For an example of how CTest can run the whole testing process from a single script, look at how the CMake dashboards themselves are created with a CTest -S script.
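
A rough sketch of such a script is shown below (the directories, generator, and script name are placeholders, and the real CMake dashboard scripts are considerably more elaborate):

SET(CTEST_SOURCE_DIRECTORY "/home/dashboard/MyProject")
SET(CTEST_BINARY_DIRECTORY "/home/dashboard/MyProject-build")
SET(CTEST_CMAKE_GENERATOR "Unix Makefiles")

# start from a clean build tree and run the usual Nightly stages;
# CTEST_UPDATE() could be added before CTEST_CONFIGURE() if an
# update command (CVS, for example) is configured
CTEST_EMPTY_BINARY_DIRECTORY(${CTEST_BINARY_DIRECTORY})
CTEST_START(Nightly)
CTEST_CONFIGURE()
CTEST_BUILD()
CTEST_TEST()
CTEST_SUBMIT()

The script would then be run with a command such as "ctest -S my_dashboard.cmake".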

Conclusion

Performing tests on a project is a great software development practice and can result in a significant improvement in the quality of the project. CTest provides a simple and reliable way of performing nightly, continuous, and experimental tests.

More information about CTest can be found in Mastering CMake.


