ITK/Release 4 Planning
- Support ND images in N+1 dimensions
- 2D image can have an origin specified in 3D, thus a series of 2D images is not always Z-aligned
- Support ND images in M dimensions where M > N.
- All images are oriented - remove concept of an un-oriented image
- Check use of orientation throughout ITK
- Support re-orientation of ND oriented images
- Using anything other than 3D images won't compile with itkOrientedImageFilter
- Spatial Objects
- Allow the use of strides that are not equal to the image width
- Would ease collaboration between ITK and OpenCV
- Would allow the use of SSE operations
- Might be considered redundant with correct use of image regions, but is not: GetLargestPossibleRegion should correspond to the image width, not its stride
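A minimal sketch (plain C++, not the ITK API; all names are illustrative) of what decoupling stride from width means: pixel addressing uses the stride, while the logical region keeps the width, so rows can be padded for alignment or to match an external buffer such as an OpenCV matrix:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical buffer whose rows are padded: the logical width is 5,
// but each row occupies 8 elements so rows start at aligned addresses.
struct StridedBuffer2D {
  std::size_t width, height, stride; // invariant: stride >= width
  std::vector<float> data;           // size = height * stride

  StridedBuffer2D(std::size_t w, std::size_t h, std::size_t s)
    : width(w), height(h), stride(s), data(h * s, 0.0f) {}

  // Pixel (x, y) lives at y * stride + x, not y * width + x.
  float& at(std::size_t x, std::size_t y) { return data[y * stride + x]; }
};
```

The region reported to the pipeline would still be width x height; only the low-level addressing changes.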
- Drop the itk::Image::GetBufferPointer() method
- This method has often been described as an obstacle to implementing new image layouts.
- As expressed above, we nevertheless need to be able to use the memory held by ITK images within other libraries. This could potentially be done by making itk::Image a base class that has no knowledge of the memory layout, and by implementing different image subclasses.
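A rough sketch of that idea (plain C++, not the actual ITK class hierarchy; names are illustrative): the base class offers only per-pixel access, and only a contiguous subclass exposes a raw buffer for interoperability:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Layout-agnostic base: no assumption about how pixels are stored.
class ImageBase {
public:
  virtual ~ImageBase() = default;
  virtual float GetPixel(std::size_t i) const = 0;
  virtual void SetPixel(std::size_t i, float v) = 0;
};

// Only this subclass promises a contiguous buffer, so only it can
// safely hand out a raw pointer to other libraries.
class ContiguousImage : public ImageBase {
public:
  explicit ContiguousImage(std::size_t n) : m_Buffer(n, 0.0f) {}
  float GetPixel(std::size_t i) const override { return m_Buffer[i]; }
  void SetPixel(std::size_t i, float v) override { m_Buffer[i] = v; }
  float* GetBufferPointer() { return m_Buffer.data(); }
private:
  std::vector<float> m_Buffer;
};
```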
- Consider replacing ImportImageContainer by std::vector or using std::vector to implement it
- This would give STL iterators that operate on the whole image literally for free and make it easy to use a lot of algorithms implemented in STL and BOOST
- Boost.GIL also offers a compelling alternative for memory management of images. Unfortunately, it still seems to be focused on 2D
- See Alternative Memory Models for ITK Images on the Insight Journal for an initial implementation of such ideas
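To illustrate what a std::vector-backed container buys (a sketch only; ThresholdAbove is a hypothetical helper, not an ITK filter): the whole image is iterable with standard iterators, so STL algorithms replace hand-written pixel loops:

```cpp
#include <algorithm>
#include <cassert>
#include <numeric>
#include <vector>

// Hypothetical whole-image operation written entirely with STL
// algorithms: set every pixel above `limit` to zero.
std::vector<short> ThresholdAbove(std::vector<short> pixels, short limit) {
  std::replace_if(pixels.begin(), pixels.end(),
                  [limit](short p) { return p > limit; }, short{0});
  return pixels;
}
```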
- Discuss a proper way of handling dynamic images (2D+t is not really 3D and 3D+t is difficult in terms of memory management)
- Complete statistics refactoring (see NAMIC sandbox)
- Consolidate FEM Meshes and ITK Meshes
Backward compatibility and cleanup
- Clean up CMake variables
- See proposal HERE.
- Remove Deprecated Features
- Functions that have been deprecated (and appropriately marked as such) for more than 3 releases should be removed.
- Modify the itkSetMacro to use a const reference argument, i.e. #define itkSetMacro(name,type) virtual void Set##name (const type & _arg)
- This cannot be done in ITK 3.x because of backward compatibility issues
- Make the semantics of the ITK containers match those of the STL
- See this bug report
- Set the default options values to provide the highest result quality
- Some filters have default option values chosen to produce quick results rather than high quality results. This is the case for the distance map filters, which produce squared distances and don't use the image spacing by default. This behavior is desirable in some conditions, but shouldn't be the default one.
- Supported compilers
- We should reconsider the list of supported compilers. ITK 4.0 might be a good time to drop, for example, MSVC 6.0 that only implements a subset of modern C++.
- Define a transition period during which developments need not be backward compatible
- Such a period could be defined in terms of a number of "beta" releases
- Set up the infrastructure to ease the implementation of modern optimization schemes for image registration
- Requires Hessian or pseudo-Hessians of the cost function
- Requires several types of update rules (additive, compositional, inverse compositional, etc.)
- References: "Lucas-Kanade 20 years on" by Baker et al.; "Homography-based 2D Visual Tracking and Servoing" by Benhimane and Malis, "Groupwise Geometric and Photometric Direct Image Registration" by Bartoli; etc.
- Allow the use of regularization terms that depend on the spatial transformation. See elastix for an example implementation.
- Clean up the use of parameter scaling in the optimizers
- One possibility would be for the optimizers to only perform unscaled minimization. It would then be up to a cost function wrapper to do the rescaling, and potentially to return the negated cost function value. This is similar to how vnl optimizers are used in ITK
- See also elastix for another example implementation.
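The wrapper idea above could look roughly like this (a hypothetical sketch, not ITK or vnl code): the optimizer sees only the wrapped function, which undoes the per-parameter scaling and can negate the value so the optimizer always minimizes:

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <vector>

// Illustrative cost-function wrapper: rescales parameters before
// evaluating, and flips the sign when the user wants to maximize.
struct ScaledCostWrapper {
  std::function<double(const std::vector<double>&)> cost;
  std::vector<double> scales;   // one scale per parameter
  bool maximize = false;

  double operator()(const std::vector<double>& scaledParams) const {
    std::vector<double> p(scaledParams.size());
    for (std::size_t i = 0; i < p.size(); ++i)
      p[i] = scaledParams[i] / scales[i];  // undo the scaling
    const double v = cost(p);
    return maximize ? -v : v;              // optimizer always minimizes
  }
};
```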
- Optimizers should return the best visited value
- See Bug 3205
- Modify transforms to support a consistent API across transform types
- Modify order of parameters to be consistent across transforms.
- Modify the base class for optimizers to support key optimizer API calls such as SetMaximize and SetNumberOfIterations or SetMaximumIteration
- Make the registration framework work with vector images natively.
- Currently, several ITK filters/functions assume that the pixel is of scalar type. This prevents using the registration framework with vector images.
- Several filters/functions useful for registration are specialized for vectors even though this is often unnecessary. It is often quite easy to adapt the filters that assume scalar pixels to make them work with vector pixels. For example, there is a VectorInterpolateImageFunction, but the regular InterpolateImageFunction should do just fine. In fact, there is even a unit test checking that LinearInterpolateImageFunction correctly handles vector images. Below is a list of filters that could potentially be removed:
- VectorCastImageFilter: could be reworked if we provide a conversion operator in the vector pixel class
- In cases where the implementation has to be slightly different for vector pixels, we should consider using partial template specialization.
- This would require dropping support for Visual C++ 6.
- An initial simple implementation of vector image registration can be found on the NAMIC SandBox.
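To show what partial template specialization would buy here (a toy sketch with illustrative names, not ITK classes): one primary template handles scalar pixels and a partial specialization handles vector pixels behind the same interface, instead of maintaining a parallel Vector* filter:

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <cstddef>

// Primary template: scalar pixels, "norm" is the absolute value.
template <typename TPixel>
struct PixelNorm {
  static double Evaluate(TPixel p) { return p < 0 ? -double(p) : double(p); }
};

// Partial specialization: same interface, vector-aware implementation.
template <typename T, std::size_t N>
struct PixelNorm<std::array<T, N>> {
  static double Evaluate(const std::array<T, N>& p) {
    double s = 0.0;
    for (std::size_t i = 0; i < N; ++i) s += double(p[i]) * double(p[i]);
    return std::sqrt(s);
  }
};
```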
- Define a composite transform which can contain any number of transforms, composed.
- Only expose the parameters of the last transform for optimization (default)
- Used in multivariate atlas formation (DTI reg with T1 reg with atlas)
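A hypothetical 1-D sketch of the composite idea (illustrative only; real transforms map ND points and would expose only the last transform's parameters to the optimizer, as noted above): the composite holds any number of transforms and applies them in order:

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Toy composite transform over 1-D points: composition is just
// applying each contained transform in sequence.
struct CompositeTransform {
  std::vector<std::function<double(double)>> transforms;

  double TransformPoint(double x) const {
    for (const auto& t : transforms) x = t(x); // apply each in turn
    return x;
  }
};
```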
- Remove all of the Centered transforms
- See Insight Journal Papers:
Architecture and Software engineering
- Implement a pure virtual base class for each API to support instantiation of templated filters at run time with different dimensions. Many classes in ITK are templated, for example over spatial dimension and pixel type, or over images that are themselves templated over spatial dimension and pixel type. However, many of the operations that are carried out do not depend on the spatial dimension or pixel type. A pure virtual base class for a particular filter, such as itk::ResampleImageFilter, would define the API of the filter without implementing any of the functions that depend on TInputImage, TOutputImage or TInterpolatorPrecisionType. A pointer to the virtual base class could then be manipulated in code, while a specialized implementation with a particular TInputImage, TOutputImage and TInterpolatorPrecisionType is instantiated at run time. An image could thus be read in, its dimension and pixel type established at run time, and an appropriate specialized class instantiated and used, rather than the current practice of fixing the dimension and pixel type at compile time. For example, a single program written against the virtual base class API could instantiate a 2D filter for floating point pixels if the input is 2D with floating point pixels, and a 3D filter for unsigned short pixels if the input is 3D with unsigned short pixels.
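A condensed sketch of the proposal (all names illustrative, not the actual ITK API): a non-templated pure virtual base defines the filter API, templated subclasses implement it, and a factory picks the dimension/pixel combination from properties read at run time:

```cpp
#include <cassert>
#include <memory>

// Non-templated API: code can manipulate any instantiation through it.
class ResampleFilterBase {
public:
  virtual ~ResampleFilterBase() = default;
  virtual unsigned int GetDimension() const = 0;
};

// Templated implementation, one instantiation per dimension/pixel type.
template <unsigned int VDim, typename TPixel>
class ResampleFilter : public ResampleFilterBase {
public:
  unsigned int GetDimension() const override { return VDim; }
};

// Run-time dispatch on properties read from the input file.
std::unique_ptr<ResampleFilterBase> MakeResampleFilter(unsigned int dim) {
  if (dim == 2) return std::make_unique<ResampleFilter<2, float>>();
  return std::make_unique<ResampleFilter<3, unsigned short>>();
}
```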
- Add interfaces to the algorithms that turn incomplete initialization into compile time error for "linear" environments or enable some kind of validation instead of throwing an exception in "dynamic" environments. In both cases, the entry points to doing real work of the algorithm should then be guarded by assertions regarding the required parameters, not exceptions - since ending up there without proper initialization would always be a programming error.
- By a "linear" environment I mean an implementation where the parameters and the input to an algorithm are completely determined by the program. In this case, an error in initialization (e.g. a missed SetXXX call) is usually a programming error. Adding an initialization method or constructor that takes all required parameters would enable the developer to move this error from run time to compile time.
- By a "dynamic" environment I mean e.g. a GUI program, where the user can set the parameters of an algorithm dynamically. Here, a missing SetXXX is not a programming error but a user error. However, since more than one parameter might be missing, exceptions are not a good way to report the problem. Instead, it should be possible to call some validation function that reports all the missing parameters to the user.
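Both styles can be sketched on a hypothetical filter (illustrative names only): a constructor taking all required parameters makes incomplete "linear" initialization a compile-time error, while a Validate() method reports every missing parameter at once for "dynamic" use instead of throwing on the first one:

```cpp
#include <cassert>
#include <optional>
#include <string>
#include <vector>

class ThresholdFilter {
public:
  // "Linear" use: impossible to forget a required parameter.
  ThresholdFilter(double lower, double upper)
    : m_Lower(lower), m_Upper(upper) {}

  // "Dynamic" use: default-construct, set parameters one by one...
  ThresholdFilter() = default;
  void SetLower(double v) { m_Lower = v; }
  void SetUpper(double v) { m_Upper = v; }

  // ...then collect *all* missing parameters, not just the first.
  std::vector<std::string> Validate() const {
    std::vector<std::string> missing;
    if (!m_Lower) missing.push_back("Lower");
    if (!m_Upper) missing.push_back("Upper");
    return missing;
  }

private:
  std::optional<double> m_Lower, m_Upper;
};
```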
- Allow partial template specialization (which would imply dropping support for VC 6.0).
- Discuss whether to move to TR1. Portability might be achieved through the boost TR1 wrapper library.
- SmartPointer< T > should be implicitly convertible to SmartPointer< U > whenever T* can be implicitly converted to U*.
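A minimal sketch of the requested conversion (not the real itk::SmartPointer, which also manages the reference count; a production version would SFINAE-constrain the constructor): a templated converting constructor lets SmartPointer&lt;Derived&gt; be passed where SmartPointer&lt;Base&gt; is expected, exactly when Derived* converts to Base*:

```cpp
#include <cassert>

template <typename T>
class SmartPointer {
public:
  SmartPointer(T* p = nullptr) : m_Ptr(p) {}

  // Converting constructor: compiles exactly when U* converts to T*.
  template <typename U>
  SmartPointer(const SmartPointer<U>& other) : m_Ptr(other.Get()) {}

  T* Get() const { return m_Ptr; }

private:
  T* m_Ptr; // a real implementation would also handle Register/UnRegister
};

// Illustrative hierarchy for the usage example below.
struct Base { virtual ~Base() = default; };
struct Derived : Base {};
```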
- Testing framework
- Add a decent testing framework, e.g. based on Boost.Test or Google Test; see the discussion on the itk-developers mailing list
- Code Revision Control
- Migrate to Subversion
- Portability issues
- Discuss the use of fixed-width types to enhance portability and interoperability. This can be done by using cstdint from Boost.
- Allow the use of unicode file names, see this bug report
Proper resampling/consistency in IndexToPhysicalPoint, ContinuousIndexToPhysicalPoint, Point*
- Refactor all the interpolators
- See Proposals:Refactoring Index Point Coordinate System
- See ITK Bug 6558
- Fix bug 0005335 - transform initializer computes geometric center incorrectly
- Move the framework from the IJ paper:
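For reference, the mapping these methods must keep consistent is physical = origin + direction * diag(spacing) * index. A minimal 2-D sketch with an identity direction matrix (illustrative only, not the ITK signature):

```cpp
#include <array>
#include <cassert>

// Simplified index-to-physical mapping in 2-D, identity direction matrix:
//   physical[i] = origin[i] + spacing[i] * index[i]
std::array<double, 2> IndexToPhysicalPoint(const std::array<double, 2>& origin,
                                           const std::array<double, 2>& spacing,
                                           const std::array<double, 2>& index) {
  return { origin[0] + spacing[0] * index[0],
           origin[1] + spacing[1] * index[1] };
}
```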
Make as many filters as possible able to run in place
In-place computation is a great way to avoid running out of memory when updating a pipeline. We should review all the existing filters to find those which could be implemented that way, and use InPlaceImageFilter as their base class. Also, a global setting to control the default in-place/not-in-place behavior would be great.
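The core idea can be sketched as a filter that tolerates aliased input and output buffers (a toy example, not the InPlaceImageFilter API): when the caller passes the same buffer twice, no second image is allocated:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy "filter" that negates every pixel. It works whether or not
// `output` aliases `input`; in the aliased case no copy is made.
void NegateFilter(const std::vector<float>* input,
                  std::vector<float>* output) {
  if (output != input)
    output->assign(input->begin(), input->end()); // out-of-place: copy first
  for (auto& v : *output) v = -v;
}
```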
Make the boundary conditions usage consistent across the toolkit
At the moment, some filters let the user provide a boundary condition, some don't but use one internally, and some just don't use them at all. This should be made consistent across the toolkit, and where it makes sense, the boundary condition should be changeable by the user. Boundary conditions also make some filters hard to enhance with much more efficient algorithms - see BoxMeanImageFilter for an example.
Replace the current implementation of Marching Cubes and add a 4D version
The itkBinaryMask3DMeshSource filter currently provides the closest functionality to the Marching Cubes algorithm in ITK. However, the code of this filter has to be rewritten in order to match the quality standards of the rest of the toolkit. As part of this rewrite we should provide implementations for 2D (marching squares), 3D (marching cubes) and a 4D version that could be used for segmenting 3D+time datasets.
Normalize the Binary/Label/Grayscale usage in code and in the class names
Arbitrary precision types
For reconstruction and geometry processing, one might want to use arbitrary precision types. Boost has one, and GMP is now LGPL. This could also be a feature of the numerical library, and the solvers could then use it directly if needed.
Inspired by the exact and filtered kernels in CGAL.
Exact geometrical tests (point-in-circle =&gt; Delaunay)
If we cannot go for arbitrary precision types, in some cases it is sufficient to support exact geometrical predicates for a few operations. This is mandatory for a robust Delaunay implementation. A public domain implementation of the point-in-circle predicate, which is necessary and sufficient for exact 2D Delaunay, is available.
Note that arbitrary precision would allow for any exact geometrical predicate.
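For context, the point-in-circle predicate is the sign of a 3x3 determinant over coordinates lifted relative to the query point. The naive double-precision version below illustrates the test; the robust public domain version referred to above evaluates the same determinant with adaptive precision:

```cpp
#include <cassert>

// Sign of the incircle determinant for triangle (a, b, c) given in
// counter-clockwise order and query point d: positive means d lies
// strictly inside the circle through a, b, c; negative means outside.
double InCircle(double ax, double ay, double bx, double by,
                double cx, double cy, double dx, double dy) {
  const double adx = ax - dx, ady = ay - dy;
  const double bdx = bx - dx, bdy = by - dy;
  const double cdx = cx - dx, cdy = cy - dy;
  const double ad2 = adx * adx + ady * ady;
  const double bd2 = bdx * bdx + bdy * bdy;
  const double cd2 = cdx * cdx + cdy * cdy;
  // Expansion of | adx ady ad2 ; bdx bdy bd2 ; cdx cdy cd2 |
  return adx * (bdy * cd2 - bd2 * cdy)
       - ady * (bdx * cd2 - bd2 * cdx)
       + ad2 * (bdx * cdy - bdy * cdx);
}
```

In floating point this naive form can return the wrong sign for nearly co-circular points, which is exactly why the exact version matters for Delaunay.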
3rd Party Libraries
- Outdated libraries
- Many 3rd party libraries (e.g. libTIFF) are years out of date. One possibility is to update them to their newest official releases. Another is to remove them and require developers to use their own versions (i.e. USE_SYSTEM_TIFF).
- Linear algebra package
- The current linear algebra package used by ITK is VNL. Its performance and robustness are not very good, it is not actively maintained, and it cannot use a vendor back-end such as MKL. We should therefore discuss alternative possibilities. Below is a list of potential linear algebra libraries:
- Boost uBLAS with bindings for LAPACK
- MTL or MTL4
- Eigen seems nice. It has quite a few linear algebra operations embedded and seems very fast.
- Unify with the underlying routines of Numpy/Scipy 
- Some uBLAS/numpy bindings are available from pyUlas.
- Numerical analysis package
- The current numerical analysis package used by ITK is VNL. Its performance and robustness are not very good, and it is not actively maintained. We should therefore discuss alternative possibilities. Below is a list of potential alternatives:
- The main numerical analysis tools we use from vnl are the optimizers. Most of these optimizers have an alternative quasi-ITK implementation in elastix.