https://public.kitware.com/Wiki/api.php?action=feedcontributions&user=Dcthomp&feedformat=atomKitwarePublic - User contributions [en]2024-03-29T12:57:46ZUser contributionsMediaWiki 1.38.6https://public.kitware.com/Wiki/index.php?title=ParaView/ParaView_Readers_and_Parallel_Data_Distribution&diff=51718ParaView/ParaView Readers and Parallel Data Distribution2013-03-07T16:06:09Z<p>Dcthomp: </p>
<hr />
<div><center>'''<font color="green">Under Development. Editors, try to keep this list sorted by the extension </font>'''</center><br />
<br />
<br />
{| cellspacing="0" cellpadding="5" border="1" align="center" style="text-align:center;"<br />
| '''File Extension'''<br />
| '''Format Description'''<br />
| '''Reader'''<br />
| '''Notes on Data Parallelism'''<br />
| '''Time Support'''<br />
|-<br />
| *.ex2, *.exo<br />
| Exodus Files<br />
| [http://paraview.org/OnlineHelpCurrent/ExodusReader.html ExodusII reader]<br />
|<br />
This file format supports parallel distribution of data by splitting data across many files. It also supports mesh adaptation by splitting files at simulation times where adaptation occurs. Hence ParaView does the following:<br />
<br />
The root node scans the directory for files in the set and reads metadata (blocks and the variables defined on them) from a single file in the set. It then broadcasts this information to all processes. Each process then reads a different subset of the files.<br />
<br />
|<br />
Time steps may be contained in a file as long as the mesh topology does not change. Time may also be split across a file series.<br />
|-<br />
| *.vtk<br />
| Legacy VTK Files<br />
| [http://paraview.org/OnlineHelpCurrent/LegacyVTKFileReader.html Legacy VTK reader]<br />
|<br />
This file format has no support for saving parallel distribution of data. Hence ParaView does the following:<br />
<br />
'''Structured Data'''<br />
----<br />
For structured data, this reader reads the entire file on all processes and then '''crops''' the structured extents on each process, so that the downstream filters on each process operate only on that process's block of the structured data.<br />
<br />
'''Unstructured Data'''<br />
----<br />
The root node reads the entire file, and the data is then distributed to all processes (using MPI) by an internal, fairly naive partitioning algorithm (vtkTransmitUnstructuredGridPiece or vtkTransmitPolyDataPiece).<br />
<br />
|<br />
Time is supported only as a file series.<br />
|-<br />
|}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView/DesignOfStatistics&diff=33020ParaView/DesignOfStatistics2010-10-21T18:01:38Z<p>Dcthomp: </p>
<hr />
<div>Now on the [http://www.paraview.org/ParaView3/index.php/DesignOfStatistics ParaView Developer wiki]</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView/DesignOfStatistics&diff=33019ParaView/DesignOfStatistics2010-10-21T18:00:45Z<p>Dcthomp: Blanked the page</p>
<hr />
<div></div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView/DesignOfStatistics&diff=33013ParaView/DesignOfStatistics2010-10-21T17:36:40Z<p>Dcthomp: /* Features */</p>
<hr />
<div>Things to do:<br />
<br />
== Bugs ==<br />
A number of bugs must be fixed.<br />
* [http://www.paraview.org/Bug/view.php?id=11347 Datasets with only cell or row data] (Kitware)<br />
* [http://www.paraview.org/Bug/view.php?id=11349 Models are now multiblocks of tables] (Sandia)<br />
<br />
== Features ==<br />
Among the considered features:<br />
<br />
=== Integral ===<br />
These features are related to making statistics pervasive throughout the user interface.<br />
* Find data (find values that are extremal, or in a particular inter-quantile interval, etc.) (SNL: analysis, KW: consult on GUI)<br />
* Calculator (additional statistics buttons for some descriptive statistics) (SNL: analysis, KW: consult on GUI)<br />
* Automatic statistics filter (decides what to do for the user given a dataset and variables of interest on it) (SNL)<br />
<br />
=== Advanced Filters ===<br />
This is more or less what is currently in ParaView, plus the following:<br />
* Hypothesis testing (SNL)<br />
* Linked selection for the model tables (KW, SNL: consult)<br />
* Access to more (or all) parameters in a generic way (SNL, KW: some GUI work may be required)</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK/Modularization_Proposal&diff=28636VTK/Modularization Proposal2010-08-16T16:43:20Z<p>Dcthomp: /* Proposed Work */</p>
<hr />
<div>== Abstract ==<br />
<br />
As VTK continues to grow, the granularity that was originally chosen for the kits is becoming too coarse. For example, the Graphics kit contains 556 files, and it depends on the Common and Filtering kits, which contain 942 files between them. Therefore, to use a single filter from the Graphics kit, the developer has to link against three large libraries. This is especially problematic when linking against shared libraries. Wrapping tends to make things worse because it brings in all symbols from all libraries. Another issue is that there are optional classes in many of the kits. This means that there are many ways of building the kits, which makes life hard for package managers. We propose to rearrange VTK's kits to address these problems.<br />
<br />
== Use Cases ==<br />
<br />
=== Ability to Link Against a (Somewhat) Minimal Subset of VTK === <br />
<br />
It should be possible to link against a reasonable subset of VTK, depending on which parts of VTK are used. Some people want to use only VTK's data model. Others want to use it as a rendering library. In the first case, it should be possible to link against a common (core?) library and the data model library without bringing in the execution model and a bunch of unnecessary utility classes. In the second case, it should be possible to link against the common, data model, and execution model libraries - no need to bring in a bunch of filters.<br />
<br />
=== Ability to Include a Minimal Subset of VTK ===<br />
<br />
This is somewhat different from the use case above. More and more, application codes want to embed a subset of VTK. Currently, the only way to do this is to pull in the whole toolkit. It should be possible to pick and choose kits (as long as their dependencies are satisfied) to embed in an application. This requires that the build system handle the absence of some of the kits.<br />
<br />
=== Making Sense of VTK Kits ===<br />
<br />
Currently, VTK's kits are poorly named. Filtering contains only the execution model and filter superclasses but no concrete filters; Graphics contains feature-extraction algorithms, computational-geometry algorithms, and more. The developer should be able to make a good guess about what is in a kit from its name.<br />
<br />
== Proposed Work ==<br />
<br />
=== Reorganize VTK ===<br />
<br />
* Rename existing kits such that the names make sense<br />
* Create new kits<br />
* Clean up dependencies<br />
<br />
=== Clean up VTK's CMake files ===<br />
<br />
VTK does not use most of the newer CMake functionality. We should modernize VTK's CMake files while doing the reorganization so that they are more easily maintained by non-CMake-experts (i.e., those other than Bill, Brad, and Dave Cole).<br />
<br />
::<font color="green">I know that changing the include directives to require the kit name in front of the class name has been discussed before. If possible, I would like to make that change as the kit names are changed and the CMake files updated. With more intuitive kit names, this should not be a burden, and it will reduce the size of the compiler command line, especially in the face of having even more kits.</font> --[[User:Dcthomp|Dcthomp]]<br />
<br />
=== Allow for the compilation of subsets of VTK ===<br />
<br />
Change VTK's CMake files such that they gracefully handle the absence of kits, as long as the dependencies are satisfied. It should be possible to delete a bunch of subdirectories from VTK and still have what is left configure and compile properly.<br />
<br />
<font color="green"><br />
=== Allow for instantiator to be turned off? ===<br />
<br />
Any chance these changes could include the ability to turn off the vtkInstantiator on a per-kit or per-class basis? Perhaps a single CMake variable pointing to a file that lists the classes for which instantiators should be generated would be OK.</font> --[[User:Dcthomp|Dcthomp]]<br />
<br />
== Issues ==<br />
<br />
=== Backwards Compatibility ===<br />
<br />
Renaming and reorganizing the kits will require that all consumers of VTK change their link commands.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK/Charts&diff=17217VTK/Charts2009-11-02T23:44:44Z<p>Dcthomp: /* Goals */</p>
<hr />
<div>== Goals ==<br />
<br />
* Render 2D charts using OpenGL/VTK<br />
* Scalable to large data sets<br />
* Simple, flexible API<br />
* Enable server side rendering of charts with VTK compositing<br />
* Proper handling of IEEE Not-A-Number (NaN) values in plots of both experimental and simulation data.<br />
<br />
== Open Questions ==<br />
<br />
* Alternate backend to produce publication quality output?<br />
* Possibility to extend the API to 3D charts in the future?<br />
* Maybe real-time charting? (just-in-time visualization of data sets)<br />
<br />
== Further Details on API ==<br />
<br />
Please see [[VTK/Charts/API]] for a discussion about the API. There is also more detail about the [[VTK/Charts/2DAPI]] and the [[VTK/Charts/ChartAPI]] along with proposed class relationships.<br />
<br />
== Chart Types ==<br />
<br />
* XY plot<br />
* Scatter<br />
* Bar chart<br />
* Histogram<br />
* Stack chart<br />
* Pie chart<br />
* Parallel axes<br />
* Tree map<br />
* Bubble chart<br />
<br />
== Existing Applications/Libraries ==<br />
<br />
Below is a summary of different libraries or applications that produce 2D charts and plots. Those listed either provide both screen and publication quality rendering, or just screen rendering.<br />
<br />
=== Optimized for Screen Rendering ===<br />
<br />
* [http://www.prefuse.org/ prefuse]<br />
* [http://www.tableausoftware.com/ tableau]<br />
* [http://manyeyes.alphaworks.ibm.com/manyeyes/ Many Eyes]<br />
* [http://code.google.com/apis/chart/ Google Chart]<br />
* [http://qwt.sourceforge.net/ Qwt]<br />
* [http://qwtplot3d.sourceforge.net/ QwtPlot3D]<br />
* [http://anaphe.web.cern.ch/anaphe/qplotter.html QPlotter]<br />
* [http://www.ggobi.org/ ggobi]<br />
* [http://www.zedgraph.org/ ZedGraph]<br />
* [http://www.steema.com/ TeeChart]<br />
* [http://www.iocomp.com/ Iocomp]<br />
* [http://www.codeproject.com/KB/miscctrl/xgraph.aspx/ Scientific charting control]<br />
* [http://www.advsofteng.com/ ChartDirector]<br />
* [http://www.dundas.com/ Dundas]<br />
* [http://www.visifire.com/ Visifire]<br />
* [http://code.google.com/p/core-plot/ core-plot]<br />
* [http://vis.stanford.edu/protovis/ Protoviz]<br />
<br />
=== Multiple Backends (Publication Quality) ===<br />
<br />
* [http://plplot.sourceforge.net/ PLPlot]<br />
* [http://www.gnuplot.info/ gnuplot]<br />
* [http://matplotlib.sourceforge.net/ matplotlib]<br />
* [http://home.gna.org/veusz/ Veusz]<br />
* [http://plasma-gate.weizmann.ac.il/Grace/ Grace]<br />
* [http://glx.sourceforge.net/ GLE]</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK/Charts/API&diff=16911VTK/Charts/API2009-10-20T16:36:57Z<p>Dcthomp: One more comment.</p>
<hr />
<div>After looking at various other frameworks, tools, and products out there, here are some thoughts on the API. Employing a factory-based interface seems like it would give us the most intuitive API. So a chart could be instantiated in the following way:<br />
<br />
<pre>// Instantiate a new chart<br />
vtkChart *chart = vtkChart::New();<br />
// Add a bar plot<br />
vtkPlot *barPlot = chart->AddPlot(Chart::BAR);<br />
// Set the data source to table, column 0 is x, column 2 is y<br />
barPlot->SetData(table, 0, 2);<br />
// Set the width of the bars<br />
barPlot->SetWidth(20);<br />
// Add a line plot<br />
vtkPlot *linePlot = chart->AddPlot(Chart::LINE);<br />
// Set the data source to table, column 0 is x, column 2 is y<br />
linePlot->SetData(table, 0, 2);<br />
// Set the line width<br />
linePlot->SetWidth(2);</pre><br />
<br />
I could also overload the New() function to accept an argument, such as vtkChart::New(vtkChart::LINE), which would create a new chart object containing a line chart. The chart can iterate over the plots and paint each one in turn; this allows an intuitive interface where multiple plot types can be plotted on the same axes.<br />
<br />
::<font color="green">The biggest problem I have with this is that many plot/chart settings will be specific to one particular subclass. For instance, a line plot might have line thickness but no glyph size while a scatterplot would have glyph size but no line thickness. If you only get a pointer to a vtkPlot you won't be able to call either unless every possible property is defined on the base class. This could get '''very''' ugly. So, rather than using the traditional SetXXX() calls, I would argue that charts should make heavy use of a Set() method that takes some property object or string along with the value.</font><br />
<br />
::<font color="green">Also, AddPlot() or the vtkChart::New() variant that takes an argument should take strings and not enums. That will allow run-time addition of new plot types. A few simple functions would make it easy to create new types of plots (or even just shortcuts for plots with some style defaults set differently).</font><br />
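The string-keyed Set() idea suggested above can be sketched in plain C++. The <tt>Plot</tt>/<tt>LinePlot</tt> classes and property names below are hypothetical illustrations, not the VTK API:<br />

```cpp
#include <map>
#include <string>

// Hypothetical sketch: subclass-specific settings live behind a generic,
// string-keyed Set(), so callers only ever need the base-class pointer.
class Plot {
public:
  virtual ~Plot() = default;
  // Returns false when this plot type does not understand the property.
  virtual bool Set(const std::string& name, double value) {
    this->Properties[name] = value;
    return true;
  }
  double Get(const std::string& name) const {
    auto it = this->Properties.find(name);
    return it == this->Properties.end() ? 0.0 : it->second;
  }
private:
  std::map<std::string, double> Properties;
};

class LinePlot : public Plot {
public:
  bool Set(const std::string& name, double value) override {
    // A line plot has a line thickness but no glyph size.
    if (name != "LineThickness") return false;
    return this->Plot::Set(name, value);
  }
};
```

With this shape, a scatter plot could accept "GlyphSize" while a line plot rejects it, and no property needs to be defined on the base class.<br />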
<br />
There has been a lot of talk about the API. Do we want a declarative API? This would allow constructs such as:<br />
<br />
<pre>vtkChart *chart = vtkChart::New();<br />
chart->SetWidth(200)<br />
->SetHeight(200);<br />
<br />
chart->AddPlot(vtkChart::BAR)<br />
->SetData(table, 1, 2)<br />
->SetWidth(20)<br />
->SetLabel("My Data");</pre><br />
<br />
This would require setter functions that return a pointer to the object, i.e.,<br />
<br />
<pre>vtkChart * vtkChart::SetWidth(int width);<br />
vtkPlot * vtkChart::AddPlot(enum);<br />
vtkPlot * vtkPlot::SetLabel(const char *label);</pre><br />
<br />
One nice thing about this approach is that ignoring the returned pointer allows traditional VTK syntax to be used.<br />
<br />
<pre>vtkPlot *plot = chart->AddPlot(vtkChart::BAR);<br />
plot->SetData(table, 1, 2);<br />
plot->SetWidth(20);<br />
plot->SetLabel("My Data");</pre><br />
<br />
Admittedly this is not in keeping with the current pattern used in VTK, but there are numerous examples of its use in other libraries and frameworks. It would probably need to be a global change to the library API rather than one isolated to one or two components.<br />
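The pointer-returning setters discussed above can be sketched in plain C++; <tt>Chart</tt> and <tt>Plot</tt> here are hypothetical stand-ins, not the actual VTK classes:<br />

```cpp
#include <string>

// Minimal sketch of chainable setters: each setter returns a pointer to
// the object it was called on, so calls can be strung together.
class Plot {
public:
  Plot* SetWidth(int w) { this->Width = w; return this; }
  Plot* SetLabel(const std::string& l) { this->Label = l; return this; }
  int GetWidth() const { return this->Width; }
  const std::string& GetLabel() const { return this->Label; }
private:
  int Width = 1;
  std::string Label;
};

class Chart {
public:
  Chart* SetWidth(int w) { this->Width = w; return this; }
  Chart* SetHeight(int h) { this->Height = h; return this; }
  Plot* AddPlot() { return &this->SinglePlot; }  // one plot slot, for brevity
  int GetWidth() const { return this->Width; }
  int GetHeight() const { return this->Height; }
private:
  int Width = 0;
  int Height = 0;
  Plot SinglePlot;
};
```

Chaining works because every setter returns the receiver, e.g. <tt>chart.SetWidth(200)->SetHeight(200);</tt> and ignoring the returned pointer recovers the traditional one-call-per-line syntax.<br />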
::<font color="green">I am not sure declarative properties are that useful. However, I think there's no harm (other than a lot of work) in changing vtkSetMacro and other SetXXX() methods to return "this" instead of void. Note that<br />
<br />
find /path/to/ParaView -name "*.h" -exec grep "void Set[A-Z].*(" {} \; | grep -v "static " | wc -l<br />
<br />
::returned 5262 SetXXX() methods that were declared directly instead of using vtkSetMacro (many may involve pointers to vtkObject subclasses).</font></div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:PV_Vis09_Tut_Statistics_Movie.zip&diff=16827File:PV Vis09 Tut Statistics Movie.zip2009-10-14T19:36:29Z<p>Dcthomp: Fix link in description.</p>
<hr />
<div>Movie of ParaView statistics from slide 9 of the [[IEEE Vis09 ParaView Tutorial|IEEE VisWeek ParaView Statistics Tutorial]].</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=IEEE_Vis09_ParaView_Tutorial&diff=16826IEEE Vis09 ParaView Tutorial2009-10-14T19:32:59Z<p>Dcthomp: </p>
<hr />
<div>Here are the slides for the Advanced ParaView Visualization tutorial given at IEEE Vis09. This tutorial comprises a collection of advanced topics presented by a group of ParaView developers from various organizations. Most of the topics are intended for visualization experts and those already familiar with ParaView.<br />
<br />
Click one of the links in the agenda below to retrieve the slides for that presentation.<br />
<br />
{|<br />
| [[Media:PV Vis09 Tut IntroWhatsNew.pdf | Introduction / What's New]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:PV Vis09 Tut Plugins.pdf | Plugins]] ([[Media:PV Vis09 Tut Plugins Examples.tar.gz |examples]])<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:PV Vis09 Tut Python.pdf | Python Scripting]]<br />
| David DeMarle<br />
| Kitware, Inc.<br />
|-<br />
| [[Media:Manta_adaptive_paraview.pdf | Petascale Distance (Manta and Adaptive) Visualization]]<br />
| Jonathan Woodring<br />
| Los Alamos National Laboratory<br />
|-<br />
| [[Media:PV Vis09 Tut InSitu1.pdf | ''In-Situ'' Visualization: Integration]] ([[Media:PV Vis09 Tut InSitu Examples.tar.gz |examples]])<br />
| David Thompson<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:PV Vis09 Tut InSitu2.pdf | ''In-Situ'' Visualization: Bridging]]<br />
| Nathan Fabian<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:PV Vis09 Tut Statistics.pdf | Statistics]] ([[Media:PV_Vis09_Tut_Statistics_Movie.zip|Movie]] from slide 9)<br />
| Philippe Pebay<br />
| Sandia National Laboratories<br />
|}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:PV_Vis09_Tut_InSitu1.pdf&diff=16820File:PV Vis09 Tut InSitu1.pdf2009-10-14T19:15:18Z<p>Dcthomp: uploaded a new version of "Image:PV Vis09 Tut InSitu1.pdf": Version as presented at VisWeek09.</p>
<hr />
<div></div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:PV_Vis09_Tut_Statistics.pdf&diff=16819File:PV Vis09 Tut Statistics.pdf2009-10-14T19:13:08Z<p>Dcthomp: uploaded a new version of "Image:PV Vis09 Tut Statistics.pdf": Version as presented at VisWeek09.</p>
<hr />
<div></div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=Statistical_analysis&diff=16587Statistical analysis2009-09-23T05:57:09Z<p>Dcthomp: No longer a plugin, but part of ParaView.</p>
<hr />
<div>== ParaView Statistics Filters ==<br />
<br />
The ParaView trunk and the upcoming 3.6.2 release include [http://kitware.com/InfovisWiki/index.php/Statistics_Engines statistics] filters.<br />
These filters provide a way to use<br />
[http://www.vtk.org/doc/nightly/html/classvtkStatisticsAlgorithm.html vtkStatisticsAlgorithm]<br />
subclasses from within ParaView.<br />
<br />
Once ParaView is started, you should see a new submenu in the Filters menu bar named ''Statistics'' that contains<br />
<br />
* Contingency Statistics<br />
* Descriptive Statistics<br />
* K-Means<br />
* Multicorrelative Statistics<br />
* Principal Component Analysis<br />
<br />
In the near future, there will also be a bivariate correlative statistics filter that provides more summary information than the multicorrelative version.<br />
<br />
== Using the statistics filters ==<br />
<br />
In the simplest use case, just<br />
select a dataset in ParaView's pipeline browser,<br />
create a statistics filter from the ''Filter&rarr;Statistics'' menu,<br />
hit return to accept the default empty second filter input,<br />
select the arrays you are interested in,<br />
and click ''Apply''.<br />
<br />
[[Image:DescriptiveStatisticsObjectInspector.png|thumb|Descriptive statistics filter setup.]]<br />
<br />
The default task for all of the filters (labeled "''Model and assess the same data''")<br />
is to use a ''small, random'' portion of your dataset to create a statistical model and<br />
then use that model to evaluate ''all'' of the data.<br />
There are four different tasks that these filters can perform:<br />
<br />
# "''Statistics of all the data''," which creates an output table (or tables) summarizing the entire input dataset;<br />
# "''Model a subset of the data''," which creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset;<br />
# "''Assess the data with a model''," which adds attributes to the first input dataset using a model provided on the second input port; and<br />
# "''Model and assess the same data''," which is really just the 2 operations above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.<br />
<br />
When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training.<br />
You should avoid using a large fraction of the input data for training as you will then not be able to detect [[wikipedia:Overfitting|overfitting]].<br />
The ''Training fraction'' setting will be ignored for tasks 1 and 3.<br />
<br />
The first output of statistics filters is always the model table(s).<br />
The model may be newly-created (tasks 1, 2, or 4) or a copy of the input model (task 3).<br />
The second output will either be empty (tasks 1 and 2) or<br />
a copy of the input dataset with additional attribute arrays (tasks 3 and 4).<br />
<br />
== Caveats ==<br />
<br />
<font color="#aa3333">'''Warning''': When computing statistics on point arrays ''and'' running pvserver with data distributed across more than a single process, the statistics will be skewed</font> because points stored on multiple processes (due to cells that neighbor each other on different processes) will be counted once for each process they appear in. We are working to resolve this issue, but without forcing a redistribution of the data it is not simple.<br />
<br />
== Filter-specific options ==<br />
<br />
=== Contingency Statistics ===<br />
<br />
This filter computes contingency tables between pairs of attributes.<br />
These contingency tables are empirical joint probability distributions;<br />
given a pair of attribute values, the observed frequency per observation is returned.<br />
Thus the result of analysis is a tabular bivariate probability distribution.<br />
This table serves as a Bayesian-style prior model when assessing a set of observations.<br />
Data is assessed by computing<br />
* the probability of observing both variables simultaneously;<br />
* the probability of each variable conditioned on the other (the two values need not be identical); and<br />
* the [[wikipedia:Pointwise mutual information|pointwise mutual information (PMI)]].<br />
Finally, the summary statistics include the [[wikipedia:Information entropy|information entropy]] of the observations.<br />
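The quantities above can be sketched with NumPy; the contingency table below is illustrative only, not ParaView's actual table layout:<br />

```python
import numpy as np

# Hypothetical contingency table: co-occurrence counts of two discrete
# variables X (rows) and Y (columns); the numbers are illustrative only.
counts = np.array([[10.0, 5.0],
                   [ 2.0, 8.0]])
n = counts.sum()

joint = counts / n                       # empirical joint probability P(x, y)
px = joint.sum(axis=1, keepdims=True)    # marginal P(x)
py = joint.sum(axis=0, keepdims=True)    # marginal P(y)

cond_y_given_x = joint / px              # P(y | x)
cond_x_given_y = joint / py              # P(x | y)

# Pointwise mutual information: log2 of P(x, y) / (P(x) P(y)).
pmi = np.log2(joint / (px * py))

# Information entropy of the joint distribution, in bits.
entropy = -(joint * np.log2(joint)).sum()
```

A quick sanity check on the conditionals: each row of P(y | x) and each column of P(x | y) sums to 1.<br />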
<br />
=== Descriptive Statistics ===<br />
<br />
This filter computes the min, max, mean, raw moments M2&ndash;M4, standard deviation, skewness, and kurtosis<br />
for each array you select.<br />
[[Image:DescriptiveStatisticsExample.png|thumb|Descriptive statistics in action. Notice that the filter has 2 outputs: the assessed dataset at the top right and the summary statistics in the bottom pane.]]<br />
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided.<br />
Data is assessed using this model by detrending the data (i.e., subtracting the mean) and<br />
then dividing by the standard deviation.<br />
Thus the assessment is an array whose entries are the number of standard deviations from the mean at which each input value lies.<br />
The ''Signed Deviations'' option allows you to control whether the reported number of deviations will always be positive or<br />
whether the sign encodes if the input point was above or below the mean.<br />
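As a rough sketch of the assessment, using NumPy rather than ParaView's implementation:<br />

```python
import numpy as np

data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # illustrative samples

mean = data.mean()
std = data.std(ddof=1)                       # sample standard deviation

# Assessment: how many standard deviations each sample lies from the mean.
deviations = (data - mean) / std

# With "Signed Deviations" turned off, only the magnitude is reported.
unsigned = np.abs(deviations)
```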
<br />
=== K-Means ===<br />
<br />
This filter iteratively computes the centers of k clusters in a space whose coordinates are specified by the arrays you select.<br />
The clusters are chosen as local minima of the sum of squared Euclidean distances from each point to its nearest cluster center.<br />
The model is then a set of cluster centers.<br />
Data is assessed by assigning to each point in the input dataset its nearest cluster center and the distance to that center.<br />
<br />
The ''K'' option lets you specify the number of clusters.<br />
The ''Max Iterations'' option lets you specify the maximum number of iterations before the search for cluster centers terminates.<br />
The ''Tolerance'' option lets you specify the relative tolerance on cluster center coordinate changes between iterations before<br />
the search for cluster centers terminates.<br />
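The iteration can be sketched as follows. This is a generic Lloyd's-algorithm sketch in NumPy, not the VTK implementation; the initialization and exact stopping rule here are assumptions:<br />

```python
import numpy as np

def kmeans(points, k, max_iterations=100, tolerance=1e-4, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and center
    updates until the centers stop moving (relative to the tolerance)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(max_iterations):
        # Squared Euclidean distance from every point to every center.
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        new_centers = np.array([points[labels == j].mean(axis=0)
                                for j in range(k)])
        shift = np.linalg.norm(new_centers - centers)
        centers = new_centers
        if shift <= tolerance * max(np.linalg.norm(centers), 1.0):
            break
    # Assessment: each point gets its cluster id and distance to its center.
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    distances = np.sqrt(d2[np.arange(len(points)), labels])
    return centers, labels, distances

points = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
centers, labels, distances = kmeans(points, k=2)
```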
<br />
=== Multicorrelative Statistics ===<br />
<br />
This filter computes the covariance matrix for all the arrays you select plus the mean of each array.<br />
The model is thus a multivariate Gaussian distribution with the mean vector and covariance matrix provided.<br />
Data is assessed using this model by computing the [[wikipedia:Mahalanobis distance|Mahalanobis distance]] for each input point.<br />
This distance will always be positive.<br />
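The assessment amounts to the following computation, shown here as a NumPy sketch with illustrative data rather than the VTK code:<br />

```python
import numpy as np

# Illustrative observations: one row per sample, one column per selected array.
x = np.random.default_rng(1).normal(size=(200, 3))

mean = x.mean(axis=0)
cov = np.cov(x, rowvar=False)             # covariance matrix of the arrays

# Mahalanobis distance of every sample from the mean of the model:
# sqrt((x - mean) cov^{-1} (x - mean)^T), evaluated row by row.
centered = x - mean
d = np.sqrt(np.einsum('ij,jk,ik->i', centered, np.linalg.inv(cov), centered))
```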
<br />
The learned model output format is rather dense and can be confusing, so it is discussed here.<br />
The first filter output is a multiblock dataset consisting of 2 tables.<br />
* Raw covariance data.<br />
* Covariance matrix and its Cholesky decomposition.<br />
<br />
==== Raw covariances ====<br />
<br />
The first table has 3 meaningful columns: 2 titled "Column1" and "Column2" whose entries generally refer to the N arrays you selected when preparing the filter and 1 column titled "Entries" that contains numeric values. The first row will always contain the number of observations in the statistical analysis. The next N rows contain the mean for each of the N arrays you selected. The remaining rows contain covariances of pairs of arrays.<br />
<br />
==== Correlations ====<br />
<br />
[[Image:MulticorrelativeDerivedData.png|thumb|Storage for multicorrelative models.]]<br />
The second table contains information derived from the raw covariance data of the first table. The first N rows of the first column contain the name of one array you selected for analysis. These rows are followed by a single entry labeled "Cholesky" for a total of N+1 rows. The second column, ''Mean'', contains the mean of each variable in the first N entries and the number of observations processed in the final (N+1) row.<br />
<br />
The remaining columns (there are N, one for each array) contain 2 matrices in triangular format.<br />
The upper right triangle contains the covariance matrix (which is symmetric, so its lower triangle may be inferred).<br />
The lower left triangle contains the Cholesky decomposition of the covariance matrix (which is triangular, so its upper triangle is zero).<br />
Because the diagonal must be stored for both matrices, an additional row is required &mdash; hence the N+1 rows and the final entry of the column named "Column".<br />
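One plausible reading of this packed layout, sketched in NumPy (the exact vtkTable cell ordering may differ):<br />

```python
import numpy as np

# Illustrative symmetric positive-definite covariance matrix for N = 3 arrays.
cov = np.array([[4.0, 2.0, 0.6],
                [2.0, 5.0, 1.0],
                [0.6, 1.0, 3.0]])
L = np.linalg.cholesky(cov)        # lower-triangular factor, cov = L @ L.T
n = cov.shape[0]

# Pack both matrices into an (N+1) x N block: the covariance occupies the
# upper right triangle (rows 0..N-1), and the Cholesky factor is shifted
# down one row so that both diagonals fit in the same block.
packed = np.zeros((n + 1, n))
for i in range(n):
    for j in range(i, n):
        packed[i, j] = cov[i, j]   # upper right triangle: covariance
    for j in range(i + 1):
        packed[i + 1, j] = L[i, j] # lower left triangle: Cholesky factor
```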
<br />
=== Principal Component Analysis ===<br />
<br />
This filter performs additional analysis above and beyond the multicorrelative filter.<br />
It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter.<br />
Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.<br />
For more information see the Wikipedia entry on [[wikipedia:Principal component analysis|principal component analysis (PCA)]].<br />
<br />
The ''Normalization Scheme'' option allows you to choose between no normalization &mdash; in which case<br />
each variable of interest is assumed to be interchangeable (i.e., of the same dimension and units) &mdash;<br />
and diagonal covariance normalization &mdash; in which case each ''(i,j)''-entry of the covariance matrix<br />
is normalized by sqrt(cov''(i,i)'' cov''(j,j)'') before the eigenvector decomposition is performed.<br />
This is useful when variables of interest are not comparable but their variances are expected to be<br />
useful indications of their full range, and their full ranges are expected to be useful normalization factors.<br />
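Diagonal covariance normalization amounts to converting the covariance matrix into a correlation matrix, for example:<br />

```python
import numpy as np

# Illustrative covariance matrix with very different variances.
cov = np.array([[ 4.0,  2.0],
                [ 2.0, 25.0]])

# Divide each (i, j) entry by sqrt(cov_ii * cov_jj) before the
# eigenvector decomposition; the diagonal becomes all ones.
d = np.sqrt(np.diag(cov))
normalized = cov / np.outer(d, d)
```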
<br />
As PCA is frequently used for projecting tuples into a lower-dimensional space that preserves as much information as possible,<br />
several settings are available to control the assessment output.<br />
The ''Basis Scheme'' allows you to control how projection to a lower dimension is performed.<br />
Either no projection is performed (i.e., the output assessment has the same dimension as the number of variables of interest), or<br />
projection is performed onto the first N eigenvectors, or<br />
projection is performed onto the first several eigenvectors such that the "information energy" of the<br />
projection will be above some specified amount E.<br />
<br />
The ''Basis Size'' setting specifies N, the dimension of the projected space when the ''Basis Scheme'' is set to "''Fixed-size basis''".<br />
The ''Basis Energy'' setting specifies E, the minimum "information energy" when the ''Basis Scheme'' is set to "''Fixed-energy basis''".<br />
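Assuming "information energy" means the fraction of the total eigenvalue sum captured by the leading eigenvalues (an interpretation, not stated on this page), choosing the fixed-energy basis size can be sketched as:<br />

```python
import numpy as np

eigenvalues = np.array([5.0, 3.0, 1.5, 0.5])   # sorted descending (illustrative)

energy_target = 0.8   # E: keep enough components to reach this fraction

# Cumulative fraction of total "energy" captured by the leading eigenvalues.
fraction = np.cumsum(eigenvalues) / eigenvalues.sum()

# Smallest basis size whose captured fraction reaches the target.
basis_size = int(np.searchsorted(fraction, energy_target) + 1)
```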
<br />
[[Image:PCADerivedData.png|thumb|Storage for PCA model results.]]<br />
Since the PCA filter uses the multicorrelative filter's analysis, it shares the same raw covariance table specified above.<br />
The second table in the multiblock dataset comprising the model output is an expanded version of the multicorrelative filter's derived-data table.<br />
<br />
==== PCA Derived Data Output ====<br />
<br />
As above, the second model table contains the mean values, the upper-triangular portion of the symmetric covariance matrix, and the non-zero lower-triangular portion of the Cholesky decomposition of the covariance matrix.<br />
Below these entries are the eigenvalues of the covariance matrix (in the column labeled "Mean") and the eigenvectors (as row vectors) in an additional NxN matrix.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=Statistical_analysis&diff=16550Statistical analysis2009-09-16T06:05:42Z<p>Dcthomp: /* Contingency Statistics */ Correction</p>
<hr />
<div></div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:MulticorrelativeDerivedData.png&diff=16543File:MulticorrelativeDerivedData.png2009-09-16T00:14:38Z<p>Dcthomp: This figure shows how information is stored in the output of a multicorrelative analysis on a set of 4 arrays: "BrownianVectors_0", "BrownianVectors_1", "BrownianVectors_2", and "Result". A total of 1587 observations were used to obtain the model.</p>
<hr />
<div>This figure shows how information is stored in the output of a multicorrelative analysis on a set of 4 arrays: "BrownianVectors_0", "BrownianVectors_1", "BrownianVectors_2", and "Result". A total of 1587 observations were used to obtain the model.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=Statistical_analysis&diff=16542Statistical analysis2009-09-16T00:13:25Z<p>Dcthomp: /* Multicorrelative Statistics */</p>
<hr />
<div>== ParaView Statistics Plugin ==<br />
<br />
The ParaView trunk and the upcoming 3.6.2 release include a [http://kitware.com/InfovisWiki/index.php/Statistics_Engines statistics] plugin.<br />
This plugin provides a way to use<br />
[http://www.vtk.org/doc/nightly/html/classvtkStatisticsAlgorithm.html vtkStatisticsAlgorithm]<br />
subclasses from within ParaView.<br />
<br />
Once the plugin is loaded you should see a new submenu in the Filters menu bar named ''Statistics'' that contains<br />
<br />
* Contingency Statistics<br />
* Descriptive Statistics<br />
* K-Means<br />
* Multicorrelative Statistics<br />
* Principal Component Analysis<br />
<br />
In the near future, there will also be a bivariate correlative statistics filter that provides more summary information than the multicorrelative version.<br />
<br />
== Using the plugin ==<br />
<br />
In the simplest use case, just<br />
select a dataset in ParaView's pipeline browser,<br />
create a statistics filter from the ''Filter&rarr;Statistics'' menu,<br />
hit return to accept the default empty second filter input,<br />
select the arrays you are interested in,<br />
and click ''Apply''.<br />
<br />
[[Image:DescriptiveStatisticsObjectInspector.png|thumb|Descriptive statistics filter setup.]]<br />
<br />
The default task for all of the filters (labeled "''Model and assess the same data''")<br />
is to use a ''small, random'' portion of your dataset to create a statistical model and<br />
then use that model to evaluate ''all'' of the data.<br />
There are 4 different tasks that filters can perform:<br />
<br />
# "''Statistics of all the data''," which creates an output table (or tables) summarizing the entire input dataset;<br />
# "''Model a subset of the data''," which creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset;<br />
# "''Assess the data with a model''," which adds attributes to the first input dataset using a model provided on the second input port; and<br />
# "''Model and assess the same data''," which is really just the 2 operations above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.<br />
<br />
When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.<br />
You should avoid using a large fraction of the input data for training as you will then not be able to detect [[wikipedia:Overfitting|overfitting]].<br />
The ''Training fraction'' setting will be ignored for tasks 1 and 3.<br />
<br />
The first output of statistics filters is always the model table(s).<br />
The model may be newly-created (tasks 1, 2, or 4) or a copy of the input model (task 3).<br />
The second output will either be empty (tasks 1 and 2) or<br />
a copy of the input dataset with additional attribute arrays (tasks 3 and 4).<br />
<br />
== Caveats ==<br />
<br />
<font color="#aa3333">'''Warning''': When computing statistics on point arrays ''and'' running pvserver with data distributed across more than a single process, the statistics will be skewed</font> because points stored on both processes (due to cells that neighbor each other on different processes) will be counted once for each process they appear in. We are working to resolve this issue but without forcing a redistribution of the data, it is not simple.<br />
<br />
== Filter-specific options ==<br />
<br />
=== Contingency Statistics ===<br />
<br />
This filter computes contingency tables between pairs of attributes (a process known as ''marginalization'').<br />
This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.<br />
Data is assessed by computing<br />
* the probability of observing both variables simultaneously;<br />
* the probability of each variable conditioned on the other (the two values need not be identical); and<br />
* the [[wikipedia:Pointwise mutual information|pointwise mutual information (PMI)]].<br />
Finally, the summary statistics include the [[wikipedia:Information entropy|information entropy]] of the observations.<br />
<br />
=== Descriptive Statistics ===<br />
<br />
This filter computes the min, max, mean, raw moments M2&mdash;M4, standard deviation, skewness, and kurtosis<br />
for each array you select.<br />
[[Image:DescriptiveStatisticsExample.png|thumb|Descriptive statistics in action. Notice that the filter has 2 outputs: the assessed dataset at the top right and the summary statistics in the bottom pane.]]<br />
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided.<br />
Data is assessed using this model by detrending the data (i.e., subtracting the mean) and<br />
then dividing by the standard deviation.<br />
Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br />
The ''Signed Deviations'' option allows you to control whether the reported number of deviations will always be positive or<br />
whether the sign encodes if the input point was above or below the mean.<br />
<br />
=== K-Means ===<br />
<br />
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select.<br />
The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center.<br />
The model is then a set of cluster centers.<br />
Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br />
<br />
The ''K'' option lets you specify the number of clusters.<br />
The ''Max Iterations'' option lets you specify the maximum number of iterations before the search for cluster centers terminates.<br />
The ''Tolerance'' option lets you specify the relative tolerance on cluster center coordinate changes between iterations before<br />
the search for cluster centers terminates.<br />
<br />
=== Multicorrelative Statistics ===<br />
<br />
This filter computes the covariance matrix for all the arrays you select plus the mean of each array.<br />
The model is thus a multivariate Gaussian distribution with the mean vector and variances provided.<br />
Data is assessed using this model by computing the [[wikipedia:Mahalanobis distance|Mahalanobis distance]] for each input point.<br />
This distance will always be positive.<br />
<br />
The learned model output format is rather dense and can be confusing, so it is discussed here.<br />
The first filter output is a multiblock dataset consisting of 2 tables.<br />
* Raw covariance data.<br />
* Covariance matrix and its Cholesky decomposition.<br />
<br />
==== Raw covariances ====<br />
<br />
The first table has 3 meaningful columns: 2 titled "Column1" and "Column2" whose entries generally refer to the N arrays you selected when preparing the filter and 1 column titled "Entries" that contains numeric values. The first row will always contain the number of observations in the statistical analysis. The next N rows contain the mean for each of the N arrays you selected. The remaining rows contain covariances of pairs of arrays.<br />
<br />
==== Correlations ====<br />
<br />
[[Image:MulticorrelativeDerivedData.png|thumb|Storage for multicorrelative models.]]<br />
The second table contains information derived from the raw covariance data of the first table. The first N rows of the first column contain the name of one array you selected for analysis. These rows are followed by a single entry labeled "Cholesky" for a total of N+1 rows. The second column, ''Mean'' contains the mean of each variable in the first N entries and the number of observations processed in the final (N+1) row.<br />
<br />
The remaining columns (there are N, one for each array) contain 2 matrices in triangular format.<br />
The upper right triangle contains the covariance matrix (which is symmetric, so its lower triangle may be inferred).<br />
The lower left triangle contains the Cholesky decomposition of the covariance matrix (which is triangular, so its upper triangle is zero).<br />
Because the diagonal must be stored for both matrices, an additional row is required &mdash; hence the N+1 rows and the final entry of the column named "Column".<br />
<br />
=== Principal Component Analysis ===<br />
<br />
This filter performs additional analysis above and beyond the multicorrelative filter.<br />
It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter.<br />
Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.<br />
For more information see the Wikipedia entry on [[wikipedia:Principal component analysis|principal component analysis (PCA)]].<br />
<br />
The ''Normalization Scheme'' option allows you to choose between no normalization &mdash; in which case<br />
each variable of interest is assumed to be interchangeable (i.e., of the same dimension and units) &mdash;<br />
or diagonal covariance normalization &mdash; in which case each ''(i,j)''-entry of the covariance matrix<br />
is normalized by sqrt(cov''(i,i)'' cov''(j,j)'') before the eigenvector decomposition is performed.<br />
This is useful when variables of interest are not comparable but their variances are expected to be<br />
useful indications of their full range, and their full ranges are expected to be useful normalization factors.<br />
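As a sketch (plain NumPy, not the filter's own code), diagonal covariance normalization amounts to converting the covariance matrix into a correlation matrix before the eigen-decomposition; the numbers here are made up for illustration:

```python
import numpy as np

# Hypothetical 2x2 covariance matrix for two variables with very
# different scales (variances 4 and 9).
cov = np.array([[4.0, 2.0],
                [2.0, 9.0]])

d = np.sqrt(np.diag(cov))          # sqrt(cov(i,i)) for each variable
normalized = cov / np.outer(d, d)  # entry (i,j) divided by sqrt(cov(i,i) cov(j,j))

# After normalization the diagonal is exactly 1 and the off-diagonal
# entries are correlation coefficients in [-1, 1], so variables with
# incomparable units contribute on an equal footing.
```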
<br />
As PCA is frequently used for projecting tuples into a lower-dimensional space that preserves as much information as possible,<br />
several settings are available to control the assessment output.<br />
The ''Basis Scheme'' allows you to control how projection to a lower dimension is performed.<br />
Either no projection is performed (i.e., the output assessment has the same dimension as the number of variables of interest), or<br />
projection is performed onto the first N eigenvectors, or<br />
projection is performed onto the leading eigenvectors whose combined "information energy" is at least some specified fraction E.<br />
<br />
The ''Basis Size'' setting specifies N, the dimension of the projected space when the ''Basis Scheme'' is set to "''Fixed-size basis''".<br />
The ''Basis Energy'' setting specifies E, the minimum "information energy" when the ''Basis Scheme'' is set to "''Fixed-energy basis''".<br />
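A minimal NumPy sketch of how a fixed-energy basis could be chosen and applied (again an illustration of the idea, not the filter's implementation; the covariance matrix, data, and threshold E are hypothetical):

```python
import numpy as np

# Hypothetical covariance matrix of 3 variables of interest.
cov = np.array([[6.0, 2.0, 0.0],
                [2.0, 3.0, 0.0],
                [0.0, 0.0, 0.5]])
evals, evecs = np.linalg.eigh(cov)           # eigh returns ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]   # re-sort to descending

# Fixed-energy basis: keep the fewest leading eigenvectors whose
# eigenvalue sum reaches the fraction E of the total "information energy".
E = 0.9
energy = np.cumsum(evals) / np.sum(evals)
basis_size = int(np.searchsorted(energy, E)) + 1

# Assessment: project mean-centered tuples onto the retained eigenvectors.
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 3))
projected = (data - data.mean(axis=0)) @ evecs[:, :basis_size]
```

A fixed-size basis would instead set `basis_size` directly to the ''Basis Size'' value N.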
<br />
[[Image:PCADerivedData.png|thumb|Storage for PCA model results.]]<br />
Since the PCA filter uses the multicorrelative filter's analysis, it shares the same raw covariance table described above.<br />
The second table in the multiblock dataset comprising the model output is an expanded version of the multicorrelative filter's second table.<br />
<br />
==== PCA Derived Data Output ====<br />
<br />
As above, the second model table contains the mean values, the upper-triangular portion of the symmetric covariance matrix, and the non-zero lower-triangular portion of the Cholesky decomposition of the covariance matrix.<br />
Below these entries are the eigenvalues of the covariance matrix (in the column labeled "Mean") and the eigenvectors (as row vectors) in an additional NxN matrix.</div>
<hr />
<div>This figure shows how information is stored in the output of a principal component analysis on a set of 4 arrays: "BrownianVectors_0", "BrownianVectors_1", "BrownianVectors_2", and "Result". A total of 1587 observations were used to obtain the model.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=Statistical_analysis&diff=16540Statistical analysis2009-09-16T00:09:10Z<p>Dcthomp: /* Principal Component Analysis */</p>
<hr />
<div>== ParaView Statistics Plugin ==<br />
<br />
The ParaView trunk and the upcoming 3.6.2 release include a [http://kitware.com/InfovisWiki/index.php/Statistics_Engines statistics] plugin.<br />
This plugin provides a way to use<br />
[http://www.vtk.org/doc/nightly/html/classvtkStatisticsAlgorithm.html vtkStatisticsAlgorithm]<br />
subclasses from within ParaView.<br />
<br />
Once the plugin is loaded you should see a new submenu in the Filters menu bar named ''Statistics'' that contains<br />
<br />
* Contingency Statistics<br />
* Descriptive Statistics<br />
* K-Means<br />
* Multicorrelative Statistics<br />
* Principal Component Analysis<br />
<br />
In the near future, there will also be a bivariate correlative statistics filter that provides more summary information than the multicorrelative version.<br />
<br />
== Using the plugin ==<br />
<br />
In the simplest use case, just<br />
select a dataset in ParaView's pipeline browser,<br />
create a statistics filter from the ''Filter&rarr;Statistics'' menu,<br />
hit return to accept the default empty second filter input,<br />
select the arrays you are interested in,<br />
and click ''Apply''.<br />
<br />
[[Image:DescriptiveStatisticsObjectInspector.png|thumb|Descriptive statistics filter setup.]]<br />
<br />
The default task for all of the filters (labeled "''Model and assess the same data''")<br />
is to use a ''small, random'' portion of your dataset to create a statistical model and<br />
then use that model to evaluate ''all'' of the data.<br />
There are 4 different tasks that filters can perform:<br />
<br />
# "''Statistics of all the data''," which creates an output table (or tables) summarizing the entire input dataset;<br />
# "''Model a subset of the data''," which creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset;<br />
# "''Assess the data with a model''," which adds attributes to the first input dataset using a model provided on the second input port; and<br />
# "''Model and assess the same data''," which is really just the 2 operations above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.<br />
<br />
When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.<br />
You should avoid using a large fraction of the input data for training as you will then not be able to detect [[wikipedia:Overfitting|overfitting]].<br />
The ''Training fraction'' setting will be ignored for tasks 1 and 3.<br />
<br />
The first output of statistics filters is always the model table(s).<br />
The model may be newly-created (tasks 1, 2, or 4) or a copy of the input model (task 3).<br />
The second output will either be empty (tasks 1 and 2) or<br />
a copy of the input dataset with additional attribute arrays (tasks 3 and 4).<br />
<br />
== Caveats ==<br />
<br />
<font color="#aa3333">'''Warning''': When computing statistics on point arrays ''and'' running pvserver with data distributed across more than a single process, the statistics will be skewed</font> because points stored on both processes (due to cells that neighbor each other on different processes) will be counted once for each process they appear in. We are working to resolve this issue but without forcing a redistribution of the data, it is not simple.<br />
<br />
== Filter-specific options ==<br />
<br />
=== Contingency Statistics ===<br />
<br />
This filter computes contingency tables between pairs of attributes (a process known as ''marginalization'').<br />
This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.<br />
Data is assessed by computing<br />
* the probability of observing both variables simultaneously;<br />
* the probability of each variable conditioned on the other (the two values need not be identical); and<br />
* the [[wikipedia:Pointwise mutual information|pointwise mutual information (PMI)]].<br />
Finally, the summary statistics include the [[wikipedia:Information entropy|information entropy]] of the observations.<br />
<br />
=== Descriptive Statistics ===<br />
<br />
This filter computes the min, max, mean, raw moments M2&mdash;M4, standard deviation, skewness, and kurtosis<br />
for each array you select.<br />
[[Image:DescriptiveStatisticsExample.png|thumb|Descriptive statistics in action. Notice that the filter has 2 outputs: the assessed dataset at the top right and the summary statistics in the bottom pane.]]<br />
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided.<br />
Data is assessed using this model by detrending the data (i.e., subtracting the mean) and<br />
then dividing by the standard deviation.<br />
Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br />
The ''Signed Deviations'' option allows you to control whether the reported number of deviations will always be positive or<br />
whether the sign encodes if the input point was above or below the mean.<br />
<br />
=== K-Means ===<br />
<br />
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select.<br />
The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center.<br />
The model is then a set of cluster centers.<br />
Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br />
<br />
The ''K'' option lets you specify the number of clusters.<br />
The ''Max Iterations'' option lets you specify the maximum number of iterations before the search for cluster centers terminates.<br />
The ''Tolerance'' option lets you specify the relative tolerance on cluster center coordinate changes between iterations before<br />
the search for cluster centers terminates.<br />
<br />
=== Multicorrelative Statistics ===<br />
<br />
This filter computes the covariance matrix for all the arrays you select plus the mean of each array.<br />
The model is thus a multivariate Gaussian distribution with the mean vector and variances provided.<br />
Data is assessed using this model by computing the [[wikipedia:Mahalanobis distance|Mahalanobis distance]] for each input point.<br />
This distance will always be positive.<br />
<br />
The learned model output format is rather dense and can be confusing, so it is discussed here.<br />
The first filter output is a multiblock dataset consisting of 2 tables.<br />
* Raw covariance data.<br />
* Covariance matrix and its Cholesky decomposition.<br />
<br />
==== Raw covariances ====<br />
<br />
The first table has 3 meaningful columns: 2 titled "Column1" and "Column2" whose entries generally refer to the N arrays you selected when preparing the filter and 1 column titled "Entries" that contains numeric values. The first row will always contain the number of observations in the statistical analysis. The next N rows contain the mean for each of the N arrays you selected. The remaining rows contain covariances of pairs of arrays.<br />
<br />
==== Correlations ====<br />
<br />
The second table contains information derived from the raw covariance data of the first table. The first N rows of the first column contain the name of one array you selected for analysis. These rows are followed by a single entry labeled "Cholesky" for a total of N+1 rows. The second column, ''Mean'' contains the mean of each variable in the first N entries and the number of observations processed in the final (N+1) row.<br />
<br />
The remaining columns (there are N, one for each array) contain 2 matrices in triangular format.<br />
The upper right triangle contains the covariance matrix (which is symmetric, so its lower triangle may be inferred).<br />
The lower left triangle contains the Cholesky decomposition of the covariance matrix (which is triangular, so its upper triangle is zero).<br />
Because the diagonal must be stored for both matrices, an additional row is required &mdash; hence the N+1 rows and the final entry of the column named "Column".<br />
<br />
=== Principal Component Analysis ===<br />
<br />
This filter performs additional analysis above and beyond the multicorrelative filter.<br />
It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter.<br />
Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.<br />
For more information see the Wikipedia entry on [[wikipedia:Principal component analysis|principal component analysis (PCA)]].<br />
<br />
The ''Normalization Scheme'' option allows you to choose between no normalization &mdash; in which case<br />
each variable of interest is assumed to be interchangeable (i.e., of the same dimension and units) &mdash;<br />
or diagonal covariance normalization &mdash; in which case each ''(i,j)''-entry of the covariance matrix<br />
is normalized by sqrt(cov''(i,i)'' cov''(j,j)'') before the eigenvector decomposition is performed.<br />
This is useful when variables of interest are not comparable but their variances are expected to be<br />
useful indications of their full range, and their full ranges are expected to be useful normalization factors.<br />
<br />
As PCA is frequently used for projecting tuples into a lower-dimensional space that preserves as much information as possible,<br />
several settings are available to control the assessment output.<br />
The ''Basis Scheme'' allows you to control how projection to a lower dimension is performed.<br />
Either no projection is performed (i.e., the output assessment has the same dimension as the number of variables of interest), or<br />
projection is performed using the first N entries of each eigenvector, or<br />
projection is performed using the first several entries of each eigenvector such that the "information energy" of the<br />
projection will be above some specified amount E.<br />
<br />
The ''Basis Size'' setting specifies N, the dimension of the projected space when the ''Basis Scheme'' is set to "''Fixed-size basis''".<br />
The ''Basis Energy'' setting specifies E, the minimum "information energy" when the ''Basis Scheme'' is set to "''Fixed-energy basis''".<br />
<br />
[[Image:PCADerivedData.png|thumb|Storage for PCA model results.]]<br />
Since the PCA filter uses the multicorrelative filter's analysis, it shares the same raw covariance table specified above.<br />
The second table in the multiblock dataset comprising the model output is an expanded version of the multicorrelative version.<br />
<br />
==== PCA Derived Data Output ====<br />
<br />
As above, the second model table contains the mean values, the upper-triangular portion of the symmetric covariance matrix, and the non-zero lower-triangular portion of the Cholesky decomposition of the covariance matrix.<br />
Below these entries are the eigenvalues of the covariance matrix (in the column labeled "Mean") and the eigenvectors (as row vectors) in an additional NxN matrix.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=Statistical_analysis&diff=16537Statistical analysis2009-09-15T23:23:10Z<p>Dcthomp: /* Multicorrelative Statistics */ Describe output table format.</p>
<hr />
<div>== ParaView Statistics Plugin ==<br />
<br />
The ParaView trunk and the upcoming 3.6.2 release include a [http://kitware.com/InfovisWiki/index.php/Statistics_Engines statistics] plugin.<br />
This plugin provides a way to use<br />
[http://www.vtk.org/doc/nightly/html/classvtkStatisticsAlgorithm.html vtkStatisticsAlgorithm]<br />
subclasses from within ParaView.<br />
<br />
Once the plugin is loaded you should see a new submenu in the Filters menu bar named ''Statistics'' that contains<br />
<br />
* Contingency Statistics<br />
* Descriptive Statistics<br />
* K-Means<br />
* Multicorrelative Statistics<br />
* Principal Component Analysis<br />
<br />
In the near future, there will also be a bivariate correlative statistics filter that provides more summary information than the multicorrelative version.<br />
<br />
== Using the plugin ==<br />
<br />
In the simplest use case, just<br />
select a dataset in ParaView's pipeline browser,<br />
create a statistics filter from the ''Filter&rarr;Statistics'' menu,<br />
hit return to accept the default empty second filter input,<br />
select the arrays you are interested in,<br />
and click ''Apply''.<br />
<br />
[[Image:DescriptiveStatisticsObjectInspector.png|thumb|Descriptive statistics filter setup.]]<br />
<br />
The default task for all of the filters (labeled "''Model and assess the same data''")<br />
is to use a ''small, random'' portion of your dataset to create a statistical model and<br />
then use that model to evaluate ''all'' of the data.<br />
There are 4 different tasks that filters can perform:<br />
<br />
# "''Statistics of all the data''," which creates an output table (or tables) summarizing the entire input dataset;<br />
# "''Model a subset of the data''," which creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset;<br />
# "''Assess the data with a model''," which adds attributes to the first input dataset using a model provided on the second input port; and<br />
# "''Model and assess the same data''," which is really just the 2 operations above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.<br />
<br />
When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.<br />
You should avoid using a large fraction of the input data for training as you will then not be able to detect [[wikipedia:Overfitting|overfitting]].<br />
The ''Training fraction'' setting will be ignored for tasks 1 and 3.<br />
<br />
The first output of statistics filters is always the model table(s).<br />
The model may be newly-created (tasks 1, 2, or 4) or a copy of the input model (task 3).<br />
The second output will either be empty (tasks 1 and 2) or<br />
a copy of the input dataset with additional attribute arrays (tasks 3 and 4).<br />
<br />
== Caveats ==<br />
<br />
<font color="#aa3333">'''Warning''': When computing statistics on point arrays ''and'' running pvserver with data distributed across more than a single process, the statistics will be skewed</font> because points stored on both processes (due to cells that neighbor each other on different processes) will be counted once for each process they appear in. We are working to resolve this issue but without forcing a redistribution of the data, it is not simple.<br />
<br />
== Filter-specific options ==<br />
<br />
=== Contingency Statistics ===<br />
<br />
This filter computes contingency tables between pairs of attributes (a process known as ''marginalization'').<br />
This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.<br />
Data is assessed by computing<br />
* the probability of observing both variables simultaneously;<br />
* the probability of each variable conditioned on the other (the two values need not be identical); and<br />
* the [[wikipedia:Pointwise mutual information|pointwise mutual information (PMI)]].<br />
Finally, the summary statistics include the [[wikipedia:Information entropy|information entropy]] of the observations.<br />
<br />
=== Descriptive Statistics ===<br />
<br />
This filter computes the min, max, mean, raw moments M2&mdash;M4, standard deviation, skewness, and kurtosis<br />
for each array you select.<br />
[[Image:DescriptiveStatisticsExample.png|thumb|Descriptive statistics in action. Notice that the filter has 2 outputs: the assessed dataset at the top right and the summary statistics in the bottom pane.]]<br />
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided.<br />
Data is assessed using this model by detrending the data (i.e., subtracting the mean) and<br />
then dividing by the standard deviation.<br />
Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br />
The ''Signed Deviations'' option allows you to control whether the reported number of deviations will always be positive or<br />
whether the sign encodes if the input point was above or below the mean.<br />
<br />
=== K-Means ===<br />
<br />
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select.<br />
The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center.<br />
The model is then a set of cluster centers.<br />
Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br />
<br />
The ''K'' option lets you specify the number of clusters.<br />
The ''Max Iterations'' option lets you specify the maximum number of iterations before the search for cluster centers terminates.<br />
The ''Tolerance'' option lets you specify the relative tolerance on cluster center coordinate changes between iterations before<br />
the search for cluster centers terminates.<br />
<br />
=== Multicorrelative Statistics ===<br />
<br />
This filter computes the covariance matrix for all the arrays you select plus the mean of each array.<br />
The model is thus a multivariate Gaussian distribution with the mean vector and variances provided.<br />
Data is assessed using this model by computing the [[wikipedia:Mahalanobis distance|Mahalanobis distance]] for each input point.<br />
This distance will always be positive.<br />
<br />
The learned model output format is rather dense and can be confusing, so it is discussed here.<br />
The first filter output is a multiblock dataset consisting of 2 tables.<br />
* Raw covariance data.<br />
* Covariance matrix and its Cholesky decomposition.<br />
<br />
==== Raw covariances ====<br />
<br />
The first table has 3 meaningful columns: 2 titled "Column1" and "Column2" whose entries generally refer to the N arrays you selected when preparing the filter and 1 column titled "Entries" that contains numeric values. The first row will always contain the number of observations in the statistical analysis. The next N rows contain the mean for each of the N arrays you selected. The remaining rows contain covariances of pairs of arrays.<br />
<br />
==== Correlations ====<br />
<br />
The second table contains information derived from the raw covariance data of the first table. The first N rows of the first column contain the name of one array you selected for analysis. These rows are followed by a single entry labeled "Cholesky" for a total of N+1 rows. The second column, ''Mean'' contains the mean of each variable in the first N entries and the number of observations processed in the final (N+1) row.<br />
<br />
The remaining columns (there are N, one for each array) contain 2 matrices in triangular format.<br />
The upper right triangle contains the covariance matrix (which is symmetric, so its lower triangle may be inferred).<br />
The lower left triangle contains the Cholesky decomposition of the covariance matrix (which is triangular, so its upper triangle is zero).<br />
Because the diagonal must be stored for both matrices, an additional row is required &mdash; hence the N+1 rows and the final entry of the column named "Column".<br />
<br />
=== Principal Component Analysis ===<br />
<br />
This filter performs additional analysis above and beyond the multicorrelative filter.<br />
It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter.<br />
Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.<br />
For more information see the Wikipedia entry on [[wikipedia:Principal component analysis|principal component analysis (PCA)]].<br />
<br />
The ''Normalization Scheme'' option allows you to choose between no normalization &mdash; in which case<br />
each variable of interest is assumed to be interchangeable (i.e., of the same dimension and units) &mdash;<br />
or diagonal covariance normalization &mdash; in which case each ''(i,j)''-entry of the covariance matrix<br />
is normalized by sqrt(cov''(i,i)'' cov''(j,j)'') before the eigenvector decomposition is performed.<br />
This is useful when variables of interest are not comparable but their variances are expected to be<br />
useful indications of their full range, and their full ranges are expected to be useful normalization factors.<br />
<br />
As PCA is frequently used for projecting tuples into a lower-dimensional space that preserves as much information as possible,<br />
several settings are available to control the assessment output.<br />
The ''Basis Scheme'' allows you to control how projection to a lower dimension is performed.<br />
Either no projection is performed (i.e., the output assessment has the same dimension as the number of variables of interest), or<br />
projection is performed using the first N entries of each eigenvector, or<br />
projection is performed using the first several entries of each eigenvector such that the "information energy" of the<br />
projection will be above some specified amount E.<br />
<br />
The ''Basis Size'' setting specifies N, the dimension of the projected space when the ''Basis Scheme'' is set to "''Fixed-size basis''".<br />
The ''Basis Energy'' setting specifies E, the minimum "information energy" when the ''Basis Scheme'' is set to "''Fixed-energy basis''".</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=Statistical_analysis&diff=16529Statistical analysis2009-09-15T20:42:39Z<p>Dcthomp: /* ParaView Statistics Plugin */ Add a link to the InfovisWiki with the SAND reports.</p>
<hr />
<div>== ParaView Statistics Plugin ==<br />
<br />
The ParaView trunk and the upcoming 3.6.2 release include a [http://kitware.com/InfovisWiki/index.php/Statistics_Engines statistics] plugin.<br />
This plugin provides a way to use<br />
[http://www.vtk.org/doc/nightly/html/classvtkStatisticsAlgorithm.html vtkStatisticsAlgorithm]<br />
subclasses from within ParaView.<br />
<br />
Once the plugin is loaded you should see a new submenu in the Filters menu bar named ''Statistics'' that contains<br />
<br />
* Contingency Statistics<br />
* Descriptive Statistics<br />
* K-Means<br />
* Multicorrelative Statistics<br />
* Principal Component Analysis<br />
<br />
In the near future, there will also be a bivariate correlative statistics filter that provides more summary information than the multicorrelative version.<br />
<br />
== Using the plugin ==<br />
<br />
In the simplest use case, just<br />
select a dataset in ParaView's pipeline browser,<br />
create a statistics filter from the ''Filter&rarr;Statistics'' menu,<br />
hit return to accept the default empty second filter input,<br />
select the arrays you are interested in,<br />
and click ''Apply''.<br />
<br />
[[Image:DescriptiveStatisticsObjectInspector.png|thumb|Descriptive statistics filter setup.]]<br />
<br />
The default task for all of the filters (labeled "''Model and assess the same data''")<br />
is to use a ''small, random'' portion of your dataset to create a statistical model and<br />
then use that model to evaluate ''all'' of the data.<br />
There are 4 different tasks that filters can perform:<br />
<br />
# "''Statistics of all the data''," which creates an output table (or tables) summarizing the entire input dataset;<br />
# "''Model a subset of the data''," which creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset;<br />
# "''Assess the data with a model''," which adds attributes to the first input dataset using a model provided on the second input port; and<br />
# "''Model and assess the same data''," which is really just the 2 operations above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.<br />
<br />
When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.<br />
You should avoid using a large fraction of the input data for training as you will then not be able to detect [[wikipedia:Overfitting|overfitting]].<br />
The ''Training fraction'' setting will be ignored for tasks 1 and 3.<br />
<br />
The first output of statistics filters is always the model table(s).<br />
The model may be newly-created (tasks 1, 2, or 4) or a copy of the input model (task 3).<br />
The second output will either be empty (tasks 1 and 2) or<br />
a copy of the input dataset with additional attribute arrays (tasks 3 and 4).<br />
<br />
== Caveats ==<br />
<br />
<font color="#aa3333">'''Warning''': When computing statistics on point arrays ''and'' running pvserver with data distributed across more than a single process, the statistics will be skewed</font> because points stored on both processes (due to cells that neighbor each other on different processes) will be counted once for each process they appear in. We are working to resolve this issue but without forcing a redistribution of the data, it is not simple.<br />
<br />
== Filter-specific options ==<br />
<br />
=== Contingency Statistics ===<br />
<br />
This filter computes contingency tables between pairs of attributes (a process known as ''marginalization'').<br />
This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.<br />
Data is assessed by computing<br />
* the probability of observing both variables simultaneously;<br />
* the probability of each variable conditioned on the other (the two values need not be identical); and<br />
* the [[wikipedia:Pointwise mutual information|pointwise mutual information (PMI)]].<br />
Finally, the summary statistics include the [[wikipedia:Information entropy|information entropy]] of the observations.<br />
<br />
=== Descriptive Statistics ===<br />
<br />
This filter computes the min, max, mean, raw moments M2&mdash;M4, standard deviation, skewness, and kurtosis<br />
for each array you select.<br />
[[Image:DescriptiveStatisticsExample.png|thumb|Descriptive statistics in action. Notice that the filter has 2 outputs: the assessed dataset at the top right and the summary statistics in the bottom pane.]]<br />
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided.<br />
Data is assessed using this model by detrending the data (i.e., subtracting the mean) and<br />
then dividing by the standard deviation.<br />
Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br />
The ''Signed Deviations'' option allows you to control whether the reported number of deviations will always be positive or<br />
whether the sign encodes if the input point was above or below the mean.<br />
<br />
=== K-Means ===<br />
<br />
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select.<br />
The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center.<br />
The model is then a set of cluster centers.<br />
Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br />
<br />
The ''K'' option lets you specify the number of clusters.<br />
The ''Max Iterations'' option lets you specify the maximum number of iterations before the search for cluster centers terminates.<br />
The ''Tolerance'' option lets you specify the relative tolerance on cluster center coordinate changes between iterations before<br />
the search for cluster centers terminates.<br />
<br />
=== Multicorrelative Statistics ===<br />
<br />
This filter computes the covariance matrix for all the arrays you select plus the mean of each array.<br />
The model is thus a multivariate Gaussian distribution with the mean vector and variances provided.<br />
Data is assessed using this model by computing the [[wikipedia:Mahalanobis distance|Mahalanobis distance]] for each input point.<br />
This distance will always be positive.<br />
<br />
=== Principal Component Analysis ===<br />
<br />
This filter performs additional analysis above and beyond the multicorrelative filter.<br />
It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter.<br />
Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.<br />
For more information see the Wikipedia entry on [[wikipedia:Principal component analysis|principal component analysis (PCA)]].<br />
<br />
The ''Normalization Scheme'' option allows you to choose between no normalization &mdash; in which case<br />
each variable of interest is assumed to be comparable (i.e., of the same dimension and units) &mdash;<br />
and diagonal covariance normalization &mdash; in which case each ''(i,j)''-entry of the covariance matrix<br />
is normalized by sqrt(cov''(i,i)'' cov''(j,j)'') before the eigenvector decomposition is performed.<br />
This is useful when variables of interest are not comparable but their variances are expected to be<br />
useful indications of their full range, and their full ranges are expected to be useful normalization factors.<br />
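Diagonal covariance normalization turns the covariance matrix into the correlation matrix. A minimal NumPy sketch of the normalization (illustrative values, not ParaView's implementation):

```python
import numpy as np

cov = np.array([[4.0, 3.0],
                [3.0, 9.0]])  # covariance of two incommensurate variables

# Diagonal covariance normalization: divide each (i, j) entry by
# sqrt(cov(i, i) * cov(j, j)), yielding the correlation matrix.
scale = np.sqrt(np.diag(cov))
normalized = cov / np.outer(scale, scale)

# The eigenvector decomposition is then performed on the normalized matrix.
eigenvalues, eigenvectors = np.linalg.eigh(normalized)
```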
<br />
As PCA is frequently used for projecting tuples into a lower-dimensional space that preserves as much information as possible,<br />
several settings are available to control the assessment output.<br />
The ''Basis Scheme'' allows you to control how projection to a lower dimension is performed.<br />
Either no projection is performed (i.e., the output assessment has the same dimension as the number of variables of interest),<br />
projection is performed onto the first N eigenvectors, or<br />
projection is performed onto the fewest leading eigenvectors such that the "information energy" of the<br />
projection is at least some specified fraction E.<br />
<br />
The ''Basis Size'' setting specifies N, the dimension of the projected space when the ''Basis Scheme'' is set to "''Fixed-size basis''".<br />
The ''Basis Energy'' setting specifies E, the minimum "information energy" when the ''Basis Scheme'' is set to "''Fixed-energy basis''".</div>
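The fixed-energy scheme amounts to keeping the smallest number of leading eigenvectors whose eigenvalues account for at least the fraction E of the total variance. A minimal NumPy sketch (the function name is hypothetical; eigenvalues are assumed sorted in decreasing order, as produced by PCA):

```python
import numpy as np

def basis_size_for_energy(eigenvalues, energy):
    """Smallest N such that the leading N eigenvalues hold at least
    the fraction `energy` of the total 'information energy' (variance)."""
    fractions = np.cumsum(eigenvalues) / np.sum(eigenvalues)
    return int(np.searchsorted(fractions, energy) + 1)

# Eigenvalues in decreasing order.
eigenvalues = np.array([6.0, 2.0, 1.0, 1.0])
n = basis_size_for_energy(eigenvalues, 0.75)  # 6/10 < 0.75 <= 8/10, so n == 2
```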
<hr />
<div>== ParaView Statistics Plugin ==<br />
<br />
The ParaView trunk and the upcoming 3.6.2 release include a statistics plugin.<br />
This plugin provides a way to use<br />
[http://www.vtk.org/doc/nightly/html/classvtkStatisticsAlgorithm.html vtkStatisticsAlgorithm]<br />
subclasses from within ParaView.<br />
<br />
Once the plugin is loaded you should see a new submenu in the Filters menu bar named ''Statistics'' that contains<br />
<br />
* Contingency Statistics<br />
* Descriptive Statistics<br />
* K-Means<br />
* Multicorrelative Statistics<br />
* Principal Component Analysis<br />
<br />
In the near future, there will also be a bivariate correlative statistics filter that provides more summary information than the multicorrelative version.<br />
<br />
== Using the plugin ==<br />
<br />
In the simplest use case, just<br />
select a dataset in ParaView's pipeline browser,<br />
create a statistics filter from the ''Filter&rarr;Statistics'' menu,<br />
hit return to accept the default empty second filter input,<br />
select the arrays you are interested in,<br />
and click ''Apply''.<br />
<br />
[[Image:DescriptiveStatisticsObjectInspector.png|thumb|Descriptive statistics filter setup.]]<br />
<br />
The default task for all of the filters (labeled "''Model and assess the same data''")<br />
is to use a ''small, random'' portion of your dataset to create a statistical model and<br />
then use that model to evaluate ''all'' of the data.<br />
There are 4 different tasks that filters can perform:<br />
<br />
# "''Statistics of all the data''," which creates an output table (or tables) summarizing the entire input dataset;<br />
# "''Model a subset of the data''," which creates an output table (or tables) summarizing a randomly-chosen subset of the input dataset;<br />
# "''Assess the data with a model''," which adds attributes to the first input dataset using a model provided on the second input port; and<br />
# "''Model and assess the same data''," which is really just the 2 operations above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.<br />
<br />
When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.<br />
You should avoid using a large fraction of the input data for training as you will then not be able to detect [[wikipedia:Overfitting|overfitting]].<br />
The ''Training fraction'' setting will be ignored for tasks 1 and 3.<br />
<br />
The first output of statistics filters is always the model table(s).<br />
The model may be newly-created (tasks 1, 2, or 4) or a copy of the input model (task 3).<br />
The second output will either be empty (tasks 1 and 2) or<br />
a copy of the input dataset with additional attribute arrays (tasks 3 and 4).<br />
<br />
== Filter-specific options ==<br />
<br />
=== Contingency Statistics ===<br />
<br />
This filter computes contingency tables between pairs of attributes (a process known as ''marginalization'').<br />
This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.<br />
Data is assessed by computing<br />
* the probability of observing both variables simultaneously;<br />
* the probability of each variable conditioned on the other (the two values need not be identical); and<br />
* the [[wikipedia:Pointwise mutual information|pointwise mutual information (PMI)]].<br />
Finally, the summary statistics include the [[wikipedia:Information entropy|information entropy]] of the observations.<br />
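As a concrete sketch of the quantities above (illustrative only &mdash; this is not ParaView's implementation, and the function name and sample data are made up), the joint and conditional probabilities, the PMI, and the entropy of the observations can be estimated from a list of observed (x, y) pairs using only the Python standard library:<br />

```python
import math
from collections import Counter

def contingency_assessment(pairs):
    """Estimate joint/conditional probabilities and pointwise mutual
    information for each observed (x, y) pair, plus the information
    entropy of the joint observations."""
    n = len(pairs)
    joint = Counter(pairs)                       # counts of (x, y) pairs
    px = Counter(x for x, _ in pairs)            # marginal counts of x
    py = Counter(y for _, y in pairs)            # marginal counts of y
    assessed = {}
    for (x, y), count in joint.items():
        p_xy = count / n
        assessed[(x, y)] = {
            'p_joint': p_xy,
            'p_x_given_y': p_xy / (py[y] / n),
            'p_y_given_x': p_xy / (px[x] / n),
            'pmi': math.log2(p_xy / ((px[x] / n) * (py[y] / n))),
        }
    entropy = -sum((c / n) * math.log2(c / n) for c in joint.values())
    return assessed, entropy

assessed, entropy = contingency_assessment([('a', 1), ('a', 1), ('b', 2), ('b', 1)])
```

A positive PMI means the pair occurs more often than the marginal distributions alone would predict.<br />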
<br />
=== Descriptive Statistics ===<br />
<br />
This filter computes the min, max, mean, raw moments M2&ndash;M4, standard deviation, skewness, and kurtosis<br />
for each array you select.<br />
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided.<br />
Data is assessed using this model by detrending the data (i.e., subtracting the mean) and<br />
then dividing by the standard deviation.<br />
Thus the assessment is an array whose entries give, for each input point, the number of standard deviations it lies from the mean.<br />
The ''Signed Deviations'' option allows you to control whether the reported number of deviations will always be positive or<br />
whether the sign encodes if the input point was above or below the mean.<br />
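The detrend-and-scale assessment can be sketched in a few lines of Python (a hedged illustration using only the standard library, not ParaView's actual code; the function name is invented):<br />

```python
import statistics

def assess_descriptive(values, signed_deviations=True):
    """For each value, report how many standard deviations it lies from
    the mean; optionally drop the sign."""
    mean = statistics.fmean(values)
    std = statistics.stdev(values)               # sample standard deviation
    z = [(v - mean) / std for v in values]       # detrend, then scale
    return z if signed_deviations else [abs(d) for d in z]

# The middle value of a symmetric sample sits exactly on the mean.
z = assess_descriptive([1.0, 2.0, 3.0, 4.0, 5.0])
```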
<br />
[[Image:DescriptiveStatisticsExample.png]]<br />
<br />
=== K-Means ===<br />
<br />
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select.<br />
The clusters are chosen as local minima of the sum of squared Euclidean distances from each point to its nearest cluster center.<br />
The model is then a set of cluster centers.<br />
Data is assessed by assigning to each point in the input dataset its nearest cluster center and the distance to that center.<br />
<br />
The ''K'' option lets you specify the number of clusters.<br />
The ''Max Iterations'' option lets you specify the maximum number of iterations before the search for cluster centers terminates.<br />
The ''Tolerance'' option lets you specify the relative tolerance on cluster center coordinate changes between iterations before<br />
the search for cluster centers terminates.<br />
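A minimal plain-Python sketch of the iteration described above (Lloyd's algorithm; illustrative only and simplified &mdash; for instance, the tolerance here is absolute rather than relative, and the initialization is a naive random sample):<br />

```python
import math
import random

def kmeans(points, k, max_iterations=100, tolerance=1e-4, seed=0):
    """Toy k-means: returns the cluster centers plus, for each input
    point, its assigned center index and distance (the 'assessment')."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(max_iterations):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[nearest].append(p)
        # Update step: each center moves to the mean of its cluster.
        new_centers = [
            tuple(sum(coord) / len(cluster) for coord in zip(*cluster))
            if cluster else centers[i]
            for i, cluster in enumerate(clusters)
        ]
        shift = max(math.dist(a, b) for a, b in zip(centers, new_centers))
        centers = new_centers
        if shift < tolerance:
            break
    # Assessment: nearest center index and distance for every input point.
    assessment = []
    for p in points:
        dists = [math.dist(p, c) for c in centers]
        i = dists.index(min(dists))
        assessment.append((i, dists[i]))
    return centers, assessment

points = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
          (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
centers, assessment = kmeans(points, k=2)
```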
<br />
=== Multicorrelative Statistics ===<br />
<br />
This filter computes the covariance matrix for all the arrays you select plus the mean of each array.<br />
The model is thus a multivariate Gaussian distribution with the mean vector and covariance matrix provided.<br />
Data is assessed using this model by computing the [[wikipedia:Mahalanobis distance|Mahalanobis distance]] for each input point.<br />
This distance will always be positive.<br />
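Assuming NumPy is available, the assessment step can be sketched as follows (an illustration of the Mahalanobis formula, not ParaView's implementation; the function name is invented):<br />

```python
import numpy as np

def mahalanobis_assessment(data):
    """For each observation (row), compute its Mahalanobis distance from
    the sample mean under the sample covariance."""
    data = np.asarray(data, dtype=float)        # shape: (n_points, n_vars)
    mean = data.mean(axis=0)
    cov = np.cov(data, rowvar=False)            # covariance of the columns
    cov_inv = np.linalg.inv(cov)
    centered = data - mean
    # d(x)^2 = (x - mu)^T  Sigma^{-1}  (x - mu), evaluated row by row.
    d_squared = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)
    return np.sqrt(d_squared)

# The four corners of a square are all equally far from its center,
# and the symmetric covariance leaves them equally distant here too.
d = mahalanobis_assessment([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
```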
<br />
=== Principal Component Analysis ===<br />
<br />
This filter performs additional analysis above and beyond the multicorrelative filter.<br />
It computes the eigenvalues and eigenvectors of the covariance matrix from the multicorrelative filter.<br />
Data is then assessed by projecting the original tuples into a possibly lower-dimensional space.<br />
For more information see the Wikipedia entry on [[wikipedia:Principal component analysis|principal component analysis (PCA)]].<br />
<br />
The ''Normalization Scheme'' option allows you to choose between no normalization &mdash; in which case<br />
each variable of interest is assumed to be interchangeable (i.e., of the same dimension and units) &mdash;<br />
or diagonal covariance normalization &mdash; in which case each <math>(i,j)</math>-entry of the covariance matrix<br />
is normalized by <math>\sqrt{\mathrm{cov}(i,i)\mathrm{cov}(j,j)}</math><br />
before the eigenvector decomposition is performed.<br />
This is useful when variables of interest are not comparable but their variances are expected to be<br />
useful indications of their full range, and their full ranges are expected to be useful normalization factors.<br />
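The diagonal normalization amounts to converting the covariance matrix into the correlation matrix; a short sketch assuming NumPy (illustrative only, with an invented function name):<br />

```python
import numpy as np

def normalized_covariance(data):
    """Divide each (i, j) entry of the covariance matrix by
    sqrt(cov(i, i) * cov(j, j)); the result is the correlation matrix."""
    cov = np.cov(np.asarray(data, dtype=float), rowvar=False)
    diag = np.diag(cov)
    return cov / np.sqrt(np.outer(diag, diag))

# Two variables in very different units still get unit diagonal entries.
r = normalized_covariance([[1.0, 10.0], [2.0, 40.0], [3.0, 20.0], [4.0, 30.0]])
```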
<br />
As PCA is frequently used for projecting tuples into a lower-dimensional space that preserves as much information as possible,<br />
several settings are available to control the assessment output.<br />
The ''Basis Scheme'' allows you to control how projection to a lower dimension is performed.<br />
Either no projection is performed (i.e., the output assessment has the same dimension as the number of variables of interest), or<br />
projection is performed using the first N entries of each eigenvector, or<br />
projection is performed using the first several entries of each eigenvector such that the "information energy" of the<br />
projection will be above some specified amount E.<br />
<br />
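The fixed-energy rule can be sketched as follows, assuming NumPy and treating the "information energy" of a basis as the fraction of total variance captured by its eigenvalues (an interpretation for illustration, not ParaView's exact definition):<br />

```python
import numpy as np

def energy_basis_size(eigenvalues, energy):
    """Smallest number of leading eigenvectors whose eigenvalues account
    for at least the fraction `energy` of the total variance."""
    ev = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]   # descending
    cumulative = np.cumsum(ev) / ev.sum()
    return int(np.searchsorted(cumulative, energy) + 1)

# With eigenvalues 4, 3, 2, 1 the leading three capture 90% of the variance.
n = energy_basis_size([4.0, 3.0, 2.0, 1.0], energy=0.9)
```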
The ''Basis Size'' setting specifies N, the dimension of the projected space when the ''Basis Scheme'' is set to "''Fixed-size basis''".<br />
The ''Basis Energy'' setting specifies E, the minimum "information energy" when the ''Basis Scheme'' is set to "''Fixed-energy basis''".</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:DescriptiveStatisticsObjectInspector.png&diff=16517File:DescriptiveStatisticsObjectInspector.png2009-09-15T19:36:11Z<p>Dcthomp: Options for computing descriptive statistics of a dataset.</p>
<hr />
<div>Options for computing descriptive statistics of a dataset.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView&diff=16514ParaView2009-09-15T18:58:28Z<p>Dcthomp: /* Other Features */</p>
<hr />
<div><center>[[image:pvsplash1.png]]</center><br />
<br />
<br />
<br />
ParaView is an open-source, multi-platform application designed to visualize data sets ranging in size from small to very large. The goals of the ParaView project include developing an open-source, multi-platform visualization application that supports distributed computational models to process large data sets. It has an open, flexible, and intuitive user interface. Furthermore, ParaView is built on an extensible architecture based on open standards. ParaView runs on distributed- and shared-memory parallel systems as well as single-processor systems and has been successfully tested on Windows, Linux, Mac OS X, IBM Blue Gene, Cray XT3 and various Unix workstations and clusters. Under the hood, ParaView uses the Visualization Toolkit as the data processing and rendering engine and has a user interface written using the Qt cross-platform application framework.<br />
<br />
The goal of this Wiki is to provide up-to-date documentation maintained by the developer and user communities. As such, we welcome volunteers that would like to contribute. If you are interested in contributing, please contact us on the ParaView mailing list http://public.kitware.com/mailman/listinfo/paraview.<br />
<br />
You can find more information about ParaView on the ParaView web site: http://paraview.org. For more help, check out http://paraview.org/New/help.html.<br />
<br />
==Real world concept -> ParaView terminology map==<br />
Often new users may say "Surely ParaView can do X... but I can't find it!". This [[terminology map]] should help!<br />
<br />
==Complete list/description of ParaView filters==<br />
If you're looking through the list of filters in ParaView, you may want to know what they all do! Here is a [http://paraview.org/OnlineHelpCurrent/ParaViewFilters.html complete list].<br />
<br />
==Useful Programmable Filters==<br />
Here are some [[Programmable Filters]] that are easy to copy, paste, and apply. Maybe someday they will become real ParaView filters.<br />
<br />
==ParaView In Use==<br />
* [[ParaView In Action]]<br />
: Some examples of how ParaView is used<br />
<br />
* [http://flickr.com/groups/paraview/pool/ ParaView Screenshots]<br />
: Screenshots generated by ParaView<br />
<br />
== Documentation ==<br />
{| border="0" align="center" width="98%" valign="top" cellspacing="7" cellpadding="2"<br />
|-<br />
! width="33%"|<br />
! |<br />
! width="33%"|<br />
! |<br />
! width="33%"|<br />
|- <br />
|valign="top"|<br />
<br />
===Compile/Install===<br />
* [http://www.paraview.org/New/download.html Download ParaView]<br />
: Instructions for downloading source as well as pre-compiled binaries for common platforms.<br />
* [[ParaView:Build And Install|Building and Installation instructions]]<br />
: Compiling and installing ParaView from source.<br />
<br />
===Server Setup===<br />
* [[Setting up a ParaView Server| ParaView Server Setup]]<br />
:Configuring your cluster to act as a ParaView server.<br />
* [[Starting the server| ParaView Server Startup Using GUI]]<br />
:Using the ParaView client to start the servers.<br />
* [[ParaView:Server Configuration| Server Configuration]] <br />
:Customizing server startup and connection processes using XML-based configuration scripts.<br />
<br />
===Generating Data===<br />
* [[Generating data]]<br />
:How to write out data in a format that Paraview understands<br />
<br />
===Python Scripting===<br />
* [[ParaView/Python Scripting|Python Scripting]] <font color="green">* updated to 3.6</font><br />
: Scripting ParaView using Python<br />
* [[Python Programmable Filter]]<br />
: Generating/processing data using Python.<br />
* [[Python GUI Tools]] <font color="green">* new in 3.6</font><br />
: Using the Python shell interface in ParaView<br />
* [[Python recipes]] for ParaView<br />
* [[SNL ParaView 3 Python Tutorials]]<br />
: Beginning and advanced tutorial sets, each presented as 2 hour classes by Sandia National Laboratories<br />
<br />
===Animation===<br />
* [[Animating legacy VTK file series]]<br />
: Animating file series.<br />
* [[Disconnecting from server while still saving an animation|Unattended saving of animation]]<br />
: Saving animations on the server without client connection.<br />
* [[Animation View]]<br />
: Using ''Animation View'' to setup animations.<br />
* [[Animating the Camera]] <font color="green">* new in 3.6</font><br />
: Creating animations involving camera movements.<br />
<br />
===Plugins===<br />
* [[Plugin HowTo | Extending ParaView Using Plugins]]<br />
:Using and writing new plugins to extend ParaView's functionality.<br />
* [[Extending ParaView at Compile Time]]<br />
:Including extensions into ParaView at compile time.<br />
* [http://pluginwizard.mirarco.org/ Plugin Wizard]<br />
:A simple wizard application developed by MIRARCO that provides boilerplate code for some of the most common plugin types.<br />
<br />
|bgcolor="#CCCCCC"|<br />
|valign="top"|<br />
<br />
===Other Features===<br />
* [[Color Palettes]] <font color="green">* new in 3.6</font><br />
: Creating visualizations for Print and Screen.<br />
* [[Manually Creating a Colormap]]<br />
: Specify a colormap and save it as an xml file for later use.<br />
* [[Camera and Property Linking]]<br />
: Synchronizing filters, clip planes, camera etc.<br />
* [[Restarted Simulation Readers]]<br />
: Loading restarted data for different file formats.<br />
* [[Custom Filters]]<br />
: Packaging pipelines into a single composite.<br />
* [[Data Selection]]<br />
: Selecting and focusing on subset of a dataset.<br />
* [[Exporting Scenes]]<br />
: Exporting scenes as VRML, X3D etc.<br />
* [[Backwards compatibility in state files]]<br />
: Backwards compatibility for ParaView state files (*.pvsm).<br />
* [[ParaView Settings Files]]<br />
: The locations where ParaView saves settings.<br />
* [[Statistical analysis]] <font color="green">* coming in 3.6.2</font><br />
: Computing statistics and using them to assess datasets.<br />
<br />
=== Books and Tutorials ===<br />
* [http://www.kitware.com/products/books.html The ParaView Guide]<br />
: The official ParaView guide available from Kitware.<br />
* [[IEEE Vis09 ParaView Tutorial]]<br />
: Slides for the advanced topics tutorial by Sandia, Kitware, and LANL.<br />
* [[IEEE Cluster 2009 ParaView Tutorial]]<br />
: Slides on topics for installing and using ParaView on visualization clusters.<br />
* [[SNL ParaView 3 Tutorials]]<br />
: Beginning and advanced tutorial sets, each presented as 2 hour classes by Sandia National Laboratories<br />
* [[SC08 ParaView Tutorial]]<br />
: Material and notes from the Supercomputing '08 tutorial by Sandia and Kitware.<br />
* [[IEEE Vis08 ParaView Tutorial]]<br />
: Slides for the advanced topics tutorial by Sandia, Kitware, and CSCS.<br />
* [[SC07 ParaView Tutorial]]<br />
: Material and notes from the Supercomputing '07 tutorial by Sandia.<br />
* [https://visualization.hpc.mil/paraview HPCMP DAAC - Information & Tutorials on ParaView]<br />
: This Wiki is full of useful information and tutorials about ParaView.<br />
* [[Related Publications]]<br />
: ParaView related books, articles and papers<br />
* [[ParaView 2 Tutorials]]<br />
<br />
<br />
|bgcolor="#CCCCCC"|<br />
|valign="top"|<br />
<br />
===Design & Implementation===<br />
* [[Testing design]]<br />
: ParaView GUI Testing framework.<br />
* [[Block Hierarchy Meta Data]]<br />
: Providing details about blocks, hierarchies, assemblies etc. to the client.<br />
* [[Multiple views]]<br />
: Details on handling multiple views in client-server framework.<br />
* [[Composite Datasets in VTK|Composite Datasets]]<br />
: Dealing with composite datasets in VTK.<br />
* [[Representations and Views]]<br />
: Understanding ParaView's views and representations.<br />
* [[Time in ParaView]]<br />
: Understanding Time implementation.<br />
* [[Cross compiling ParaView3 and VTK|Cross-compiling ParaView]]<br />
: Compiling ParaView and VTK on BlueGene and Cray Xt3/Catamount.<br />
* [[Selection Implementation in VTK and ParaView III]]<br />
* [[Suggested online help documentation changes]]<br />
: Suggestions for online help documentation changes.<br />
<br />
===Miscellaneous===<br />
* [http://kitware.com/products/thesource.html The Kitware Source]<br />
: Quarterly newsletter for developers designed to deliver detailed technical articles related to Kitware's open source products including ParaView.<br />
* [[Book Errata]]<br />
* [http://paraview.org/New/help.html More information about ParaView]<br />
<br />
|}<br />
<br />
{{ParaView/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK/FAQ&diff=16476VTK/FAQ2009-09-13T07:56:35Z<p>Dcthomp: /* Can I use STL with VTK? */ Move some documentation to the coding guidelines.</p>
<hr />
<div>== General information and availability ==<br />
<br />
=== What is the Visualization Toolkit? ===<br />
<br />
The '''Visualization ToolKit (vtk)''' is a software system for 3D Computer<br />
Graphics and Visualization.<br />
<br />
VTK includes a textbook published by Kitware Inc. ([http://www.kitware.com/products/vtktextbook.html The Visualization<br />
Toolkit, An Object-Oriented Approach to 3D Graphics]),<br />
a C++ class library, and Tcl, Python and Java implementations based on<br />
the class library.<br />
<br />
For more information, see http://www.vtk.org and http://www.kitware.com.<br />
<br />
=== What is the current release? ===<br />
<br />
The current release of vtk is 5.4.0 (released on 2009-3-26). This release is available for download from:<br />
<br />
http://www.vtk.org/VTK/resources/software.html<br />
<br />
Nightly development releases are available at:<br />
<br />
http://www.vtk.org/files/nightly<br />
<br />
=== Can I contribute code or bug fixes? ===<br />
<br />
We encourage people to contribute bug fixes as well as new contributions<br />
to the code. We will try to incorporate these into future releases so<br />
that the entire user community will benefit from them.<br />
<br />
See http://www.vtk.org/contribute.php for information on contributing to<br />
VTK.<br />
<br />
For some ideas take a look at some of the entries in the "Changes to the<br />
VTK API" FAQ section, for example: <br />
[[VTK_FAQ#Roadmap:_What_changes_are_being_considered_for_VTK|What changes are being considered for VTK]]<br />
<br />
We now have a bug tracker for keeping track of any bug you find. See [http://www.vtk.org/Bug BugTracker].<br />
You'll need an email address to report a bug.<br />
To improve the chances of a bug being fixed, do not hesitate to add as many details as possible; a small demo (sample code plus sample data) is always a good idea.<br />
Providing a patch almost guarantees that it will be incorporated into VTK.<br />
<br />
=== Can I contribute money? ===<br />
<br />
Please don't send money. Not that we think you're going to send in<br />
unsolicited money. But if you were thinking about it, stop. It would<br />
just complicate our lives and make for all sorts of tax problems.<br />
<br />
(Note: if you are a company or funding institution, and would like to fund<br />
features or development, please contact Kitware http://www.kitware.com .)<br />
<br />
=== Is there a mailing list or Usenet newsgroup for VTK? ===<br />
<br />
There is a mailing list: vtkusers@vtk.org<br />
<br />
To subscribe or unsubscribe to the mailing list, go to:<br />
http://www.vtk.org/mailman/listinfo/vtkusers<br />
<br />
To search the list archives go to: http://www.kitware.com/search.html<br />
<br />
There is also a newsgroup that mirrors the mailing list. At this point the<br />
mirror seems to be down. Mail to the mailing list used to be posted to the<br />
newsgroup, but posts on the newsgroup were not sent to the mailing list.<br />
The newsgroup was located at:<br />
news://scully.esat.kuleuven.ac.be/vtk.mailinglist<br />
<br />
http://www.gmane.org is a bidirectional mail-to-news gateway that carries the vtkusers mailing list. It is located at news://news.gmane.org/gmane.comp.lib.vtk.user or http://news.gmane.org/gmane.comp.lib.vtk.user. vtkusers mails have been archived since April 2002 and never expire. You can read and send mail to the vtkusers list, but sent mail will bounce unless you have subscribed to the list first.<br />
<br />
=== Is the VTK mailing list archived anywhere? ===<br />
<br />
The mailing list is archived at:<br />
http://www.vtk.org/pipermail/vtkusers/<br />
<br />
You can search the archive at: http://www.kitware.com/search.html<br />
<br />
=== Are answers for the exercises in the VTK book available? ===<br />
<br />
Not anymore.<br />
<br />
The answers to the exercises of the textbook used to be maintained by<br />
Martin Stoufer (kudos), and will be made available by Kitware in the<br />
near future.<br />
<br />
=== Is VTK regression tested on a regular basis? Can I help? ===<br />
<br />
Yes, it is.<br />
<br />
You can view the current regression test results at:<br />
http://public.kitware.com/dashboard.php?name=vtk<br />
<br />
VTK uses Dart to perform builds, run tests, and generate dashboards. You<br />
can find more information about Dart at: http://public.kitware.com/Dart/<br />
<br />
You can help improve the quality of VTK by supplying the authors with<br />
Tcl scripts that can be used as or turned into regression tests. A good<br />
regression test will:<br />
<br />
# Cover code that is not already covered.<br />
# Illustrate a bug that is occurring now or that has occurred in the past.<br />
# Use data that is on the 2nd Edition book CDROM or use "small" data files or use no data at all.<br />
# Optionally, produce an interesting result. <br />
<br />
Currently almost all regression tests are written in Tcl.<br />
<br />
Please send your Tcl regression tests to:<br />
mailto:wlorens1@mail.nycap.rr.com<br />
<br />
Bill will evaluate them for applicability and integrate them into the<br />
nightly test process.<br />
<br />
=== What's the best way to learn VTK? ===<br />
<br />
There are five things you might want to try:<br />
<br />
# Purchase the book [http://www.kitware.com/products/vtktextbook.html The Visualization Toolkit] from Kitware Inc.<br />
# Purchase the book [http://www.kitware.com/products/vtkguide.html VTK Users Guide] from Kitware Inc. <br />
# [http://www.vtk.org/get-software.php Download the source code and/or binaries] (available on Windows) and work through the examples (there are 400-500 examples). <br />
# To learn the innards of VTK, you can attend a [http://www.kitware.com/products/proftrain.html#VTKCourse VTK course] or [http://www.kitware.com/products/proftrain.html sponsor a VTK course at your site] through Kitware. http://www.kitware.com/products/index.html<br />
# Buy Bill a beer and get him talking about VTK<br />
<br />
=== How should I ask questions on the mailing lists? ===<br />
<br />
The best online resource for this question is Eric S. Raymond's<br />
excellent guide on the topic titled [http://www.catb.org/~esr/faqs/smart-questions.html How to ask questions the smart way]. [http://www.mikeash.com/getting_answers.html Getting Answers] is a good starting point too.<br />
<br />
Please do read it and follow his advice. Thanks!<br />
<br />
Please also remember the following when you post your messages to the<br />
VTK mailing lists.<br />
<br />
* Mention the version of VTK you are using and the version of the compiler or scripting language you are using.<br />
<br />
* Mention your platform, OS and their versions.<br />
<br />
* Include hardware details if relevant.<br />
<br />
* Include all relevant error messages (appropriately trimmed of course).<br />
<br />
* The lists have a very large number of subscribers (in the thousands), so please keep messages to the point.<br />
<br />
* Avoid HTML emails.<br />
<br />
* Use a sensible and descriptive subject line.<br />
<br />
* Do NOT post large data files or images to the list. Instead put them in your web page and mention the URLs.<br />
<br />
* Quote the messages you reply to appropriately. Remove unnecessary details.<br />
<br />
When asking a question or reporting a problem try to include a small<br />
example program that demonstrates the problem. Make sure that this<br />
example program is as small as you can make it, simple (and uses VTK<br />
alone), complete and demonstrates the problem adequately. Doing this<br />
will go a *long way* towards getting a quick and meaningful response.<br />
<br />
Sometimes you might not get any acceptable response. This happens<br />
because the others think the question has either been answered<br />
elsewhere (the archives, FAQ and Google are your friends), or believe<br />
that you have not done enough homework to warrant their attention, or<br />
they don't know the answer or simply don't have the time to answer.<br />
Please do be patient and understanding. Most questions are answered by<br />
people volunteering their time to help you.<br />
<br />
Happy posting!<br />
<br />
=== How NOT to go about a programming assignment ===<br />
<br />
This is a link you really should read before posting to the mailing list.<br />
[This article is an attempt to show these irrational attitudes in an ironical way, <br />
intending to make our students aware of bad habits without admonishing them.]<br />
<br />
http://www.di.uniovi.es/~cernuda/noprog_ENG.html<br />
<br />
=== Accessing VTK CVS from behind a firewall ===<br />
<br />
Use the sourceforge project:<br />
<br />
http://cvsgrab.sourceforge.net/<br />
<br />
Just download the script and type something like:<br />
<br />
cvsgrab -rootUrl http://public.kitware.com/cgi-bin/cvsweb.cgi/ -packagePath VTK -destDir . <br />
 -proxyUser xxx -proxyPassword xxx -proxyHost xxx -proxyPort xxx<br />
<br />
(Thanks to Ingo H. de Boer)<br />
<br />
cvsgrab also supports the following option to access a particular branch:<br />
<br />
-tag <version tag> [optional] The version tag of the files to download<br />
<br />
For example to get the latest 4.4 branch:<br />
<br />
cvsgrab -rootUrl http://public.kitware.com/cgi-bin/cvsweb.cgi/ -packagePath VTK -destDir . <br />
-proxyUser xxx -proxyPassword xxx -proxyHost xxx -proxyPort xxx<br />
-tag release-4-4<br />
<br />
=== Where can I obtain test and sample datasets? ===<br />
<br />
See [[VTK Datasets|this page]] for details on downloading datasets that VTK can read.<br />
<br />
== Language bindings ==<br />
<br />
=== Are there bindings to languages other than Tcl? ===<br />
<br />
Aside from C++ (in which it is written) and Tcl, VTK also has bindings for<br />
Java (as of JDK 1.1) and Python 1.5, 1.6 and 2.X. All of the<br />
Tcl/Java/Python wrapper code is generated from some LEX and YACC code<br />
that parses our classes and extracts the required information to<br />
generate the wrapper code.<br />
<br />
=== What version of Tcl/Tk should I use with VTK? ===<br />
<br />
Currently we recommend that you use Tcl/Tk 8.2.3 with VTK. This is the<br />
best-supported version combination at this time.<br />
<br />
VTK has also been tested with Tcl/Tk 8.3.2 and works well.<br />
<br />
Tcl/Tk 8.3.4 has been tested to a limited extent but seems to have more<br />
memory leaks than Tcl/Tk 8.3.2.<br />
<br />
Tcl/Tk 8.4.x seems to work well with VTK too, but you might have to<br />
change a couple of configuration settings depending on the version of<br />
VTK you are using; see [[VTK_FAQ#Does_VTK_support_Tcl.2FTk_8.4_.3F|Does VTK support Tcl/Tk 8.4?]] for details.<br />
<br />
=== Where can I find Python 2.x binaries? ===<br />
<br />
All of the Python binaries available on the kitware site are built for<br />
Python 1.5.2. This includes the official release VTK3.2 and the nightly<br />
builds (as at 2001-07-16).<br />
<br />
For Python 2.x binaries, you will have to compile your own from source.<br />
It is worth checking the mailing list archives for comments by others<br />
who have been through this process.<br />
<br />
There are some user-contributed binaries available at other sites. Check<br />
the mailing list archives for possible leads. Some win32 binaries for<br />
Python 2.1 are available at:<br />
<br />
http://basic.netmeg.net/godzilla/<br />
<br />
YMMV...<br />
<br />
=== Why do I get the Python error -- ValueError: method requires a VTK object? ===<br />
<br />
You just built VTK with Python support and everything went smoothly.<br />
After you install everything and try running a Python-VTK script you get<br />
a traceback with this error:<br />
<br />
ValueError: method requires a VTK object.<br />
<br />
This error occurs if you have two copies of the VTK libraries on your<br />
system. These copies need not be in your linker's path. The VTK libraries<br />
are usually built with an rpath flag (under *nix). This is necessary to<br />
be able to test the build in place. When you install VTK into another<br />
directory in your linker's path and then run a Python script, the Python<br />
modules remember the old path and load the libraries in the build<br />
directory as well. This triggers the above error since the object you<br />
passed the method was instantiated from the other copy.<br />
<br />
So how do you fix it? The easiest solution is to simply delete the copy<br />
of the libraries inside your build directory or move the build directory<br />
to another place. For example, if you build the libraries in VTK/bin<br />
then move VTK/bin to VTK/bin1 or remove all the VTK/bin/*.so files. The<br />
error should no longer occur.<br />
<br />
Another way to fix the error is to turn the CMAKE_SKIP_RPATH boolean to<br />
ON in your CMakeCache.txt file and then rebuild VTK. You shouldn't have<br />
to rebuild all of VTK, just delete the libraries (*.so files) and then<br />
re-run cmake and make. The only trouble with this approach is that you<br />
cannot have BUILD_TESTING to ON when you do this.<br />
<br />
Alternatively, starting with recent VTK CVS versions (post Dec. 6, 2002)<br />
and with VTK versions greater than 4.1 (i.e. 4.2 and beyond) there is a<br />
special VTK-Python interpreter built as part of VTK called 'vtkpython'<br />
that should eliminate this problem. Simply use vtkpython in place of the<br />
usual python interpreter when you use VTK-Python scripts and the problem<br />
should not occur. This is because vtkpython uses the libraries inside<br />
the build directory.<br />
<br />
2002 by Prabhu Ramachandran<br />
<br />
=== Does VTK support Tcl/Tk 8.4 ? ===<br />
<br />
Short answer: yes, but it might require some adjustments, depending on<br />
the VTK and CMake versions you are using.<br />
<br />
# The VTK 4.x CVS nightly/development distribution supports Tcl/Tk 8.4 as long as you use a release version of CMake > 1.4.5. Since VTK 4.2 will require CMake 1.6, the next release version will support Tcl/Tk 8.4.<br />
# The VTK 4.0 release distribution does not support Tcl/Tk 8.4 out-of-the-box.<br />
<br />
In either case, the following solutions will address the problem. This<br />
basically involves setting two definition symbols that make Tcl/Tk<br />
8.4 backward compatible with previous versions of Tcl/Tk (i.e. discard<br />
the "const correctness" and Tk_PhotoPutBlock compositing rule features):<br />
<br />
a) Edit your C/C++ flags:<br />
<br />
Run your favorite CMake cache editor (i.e. CMakeSetup, or ccmake),<br />
display the advanced values and add the USE_NON_CONST and<br />
USE_COMPOSITELESS_PHOTO_PUT_BLOCK definition symbols to the end of any<br />
of the following CMake variables (if they exist): CMAKE_CXX_FLAGS,<br />
CMAKE_C_FLAGS.<br />
<br />
Example: On Unix your CMAKE_CXX_FLAGS will probably look like:<br />
<br />
-g -O2 -DUSE_NON_CONST -DUSE_COMPOSITELESS_PHOTO_PUT_BLOCK<br />
<br />
On Windows (Microsoft MSDev nmake mode):<br />
<br />
/W3 /Zm1000 /GX /GR /YX /DUSE_NON_CONST /DUSE_COMPOSITELESS_PHOTO_PUT_BLOCK<br />
<br />
b) or a more intrusive solution:<br />
<br />
Edit the top-level VTK/CMakeLists.txt file and add the following lines '''at the<br />
top''' of this file:<br />
<br />
ADD_DEFINITIONS(<br />
-DUSE_NON_CONST<br />
-DUSE_COMPOSITELESS_PHOTO_PUT_BLOCK<br />
)<br />
<br />
<br />
=== When I try to run my program with Java-wrapped VTK, why do I get "java.lang.NoClassDefFoundError: vtk/vtkSomeClassName"? ===<br />
The file '''vtk.jar''' is not in your CLASSPATH in your execution environment.<br />
<br />
<br />
=== When I try to run my program with Java-wrapped VTK, why do I get "java.lang.UnsatisfiedLinkError: no vtkSomeLibraryName"? ===<br />
Some or all of the library (e.g., dll) files cannot be found. Make sure the files exist and that the PATH environment variable of your execution environment points to them.<br />
<br />
<br />
=== When I try to run my program with Java-wrapped VTK, why do I get Exception in thread "main" java.lang.UnsatisfiedLinkError: GetOutput_2 at vtk.vtkPolyDataAlgorithm.GetOutput_2(Native Method) ? ===<br />
<br />
== Using VTK ==<br />
<br />
=== The C++ compiler cannot convert some pointer type to another pointer type in my little program ===<br />
<br />
For instance, the C++ compiler cannot convert a <b><tt>vtkDataSet *</tt></b> type to a <b><tt>vtkImageData *</tt></b> type.<br />
<br />
It means the compiler does not know the relationship between a <b><tt>vtkDataSet</tt></b> and a <b><tt>vtkImageData</tt></b>. This relationship is actually inheritance: <b><tt>vtkImageData</tt></b> is a subclass of <b><tt>vtkDataSet</tt></b>. The only way for the compiler to know this relationship is to include the header file of the subclass, that is:<br />
<br />
#include "vtkImageData.h"<br />
<br />
If you wonder why the compiler did not complain about an unknown type, it is because somewhere (probably in a filter header file) there is a forward class declaration, like:<br />
<br />
class vtkImageData;<br />
<br />
=== Accessing a pointer in Python ===<br />
<br />
If you have your own C++ code that uses VTK and need to pass VTK data to it from Python, there are two approaches to wrap your code:<br />
# you can use the VTK wrapper (the one already used to wrap VTK itself)<br />
# you can use SWIG, which results in a light-weight module.<br />
<br />
In the second case, you will need to convert some VTK data, say a vtkPolyData, to a void pointer (no, it is not sufficient to just pass the object). For that, you can use the __this__ member variable in Python for the VTK data - see mailing archives:<br />
<br />
* [http://public.kitware.com/pipermail/vtkusers/2003-October/070054.html vtk, Python and SWIG - 'state of the union']<br />
<br />
=== What object/filter should I use to do ??? ===<br />
<br />
Frequently when starting out with a large visualization system people<br />
are not sure what object to use to achieve a desired effect.<br />
<br />
The most up-to-date information can be found in the VTK User's Guide<br />
(http://www.kitware.com/products/vtkguide.html).<br />
<br />
Alternative sources for information are the appendix of the book which<br />
has nice one line descriptions of what the different objects do and the<br />
VTK man pages (http://www.vtk.org/doc/nightly/html/classes.html).<br />
<br />
Additionally, the VTK man pages feature a "Related" section that provide<br />
links from each class to all the examples or tests using that class<br />
(http://www.vtk.org/doc/nightly/html/pages.html). This information is<br />
also provided in each class man page under the "Tests" or "Examples"<br />
sub-section.<br />
<br />
Some useful books are listed at http://www.vtk.org/buy-books.php<br />
<br />
=== What 3D file formats can VTK import and export? ===<br />
<br />
The following table identifies the file formats that VTK can read and<br />
write. Importer and Exporter classes move full scene information into or<br />
out of VTK. Reader and Writer classes move just geometry.<br />
<br />
{| border="1" cellpadding="2" cellspacing="0"<br />
|- bgcolor="#abcdef"<br />
! File Format !! Read !! Write<br />
|-<br />
| 3D Studio || vtk3DSImporter || <br />
|-<br />
| AVS "UCD" format || vtkAVSucdReader || <br />
|-<br />
| Movie BYU || vtkBYUReader || vtkBYUWriter<br />
|-<br />
| Renderman || || vtkRIBExporter<br />
|-<br />
| Open Inventor 2.0 || || vtkIVExporter/vtkIVWriter<br />
|-<br />
| CAD STL || vtkSTLReader || vtkSTLWriter<br />
|-<br />
| Fluent GAMBIT ASCII || vtkGAMBITReader || <br />
|-<br />
| Unigraphics Facet Files || vtkUGFacetReader || <br />
|-<br />
| Marching Cubes || vtkMCubesReader || vtkMCubesWriter<br />
|-<br />
| Wavefront OBJ || || vtkOBJExporter<br />
|-<br />
| VRML 2.0 || || vtkVRMLExporter<br />
|-<br />
| VTK Structured Grid &dagger; || vtkStructuredGridReader || vtkStructuredGridWriter<br />
|-<br />
| VTK Poly Data &dagger; || vtkPolyDataReader || vtkPolyDataWriter<br />
|-<br />
| PLOT3D || vtkPLOT3DReader || <br />
|-<br />
| CGM || || vtkCGMWriter<br />
|-<br />
| OBJ || vtkOBJReader || <br />
|-<br />
| Particle || vtkParticleReader || <br />
|-<br />
| PDB || vtkPDBReader || <br />
|-<br />
| PLY || vtkPLYReader || vtkPLYWriter<br />
|-<br />
| Gaussian || vtkGaussianCubeReader || <br />
|-<br />
| Facet || vtkFacetReader || vtkFacetWriter<br />
|-<br />
| XYZ || vtkXYZMolReader || <br />
|-<br />
| Ensight &Dagger; || vtkGenericEnSightReader || <br />
|}<br />
<br />
&dagger; See the books [http://www.kitware.com/products/vtktextbook.html The<br />
Visualization Toolkit, An Object-Oriented Approach to 3D Graphics] or<br />
[http://www.kitware.com/products/vtkguide.html the User's Guide] for details<br />
about structured grid and poly data file formats.<br />
<br />
&Dagger; The class vtkGenericEnSightReader allows the user to read an EnSight<br />
data set without a priori knowledge of what type of EnSight data set it<br />
is (among vtkEnSight6BinaryReader, vtkEnSight6Reader,<br />
vtkEnSightGoldBinaryReader, vtkEnSightGoldReader,<br />
vtkEnSightMasterServerReader, vtkEnSightReader).<br />
<br />
For any other file format you may want to search for a converter to a<br />
known VTK file format, more info on:<br />
http://www.tech-edv.co.at/lunix/UTILlinks.html<br />
<br />
=== Why can't I find vtktcl (vtktcl.c)? ===<br />
<br />
In versions of VTK prior to 4.0 VTK Tcl scripts would require a:<br />
<br />
catch {load vtktcl} <br />
<br />
so that they could be executed directly from wish. In VTK 4.0 the<br />
correct mechanism is to use:<br />
<br />
package require vtk<br />
<br />
For people using versions earlier than 4.0, vtktcl is a shared library<br />
that is built only on the PC. Most examples used the "catch" notation so<br />
that they will work on UNIX and on the PC. On UNIX you must use the vtk<br />
executable/shell which should be in vtk/tcl/vtk.<br />
<br />
=== Why does this filter not produce any output? eg. GetPoints()==0 ===<br />
<br />
This is a very common question from VTK users. VTK uses a demand-driven pipeline mechanism for rendering, which has multiple benefits, including the fact that filters whose output is not needed do not execute. This means that calling a function such as x->GetOutput()->GetPoints() will return 0 if the filter has not yet executed. Just call x->Update() beforehand to make the pipeline update everything up to that point, and it should work. -timh<br />
<br />
=== Problems with vtkDecimate and vtkDecimatePro ===<br />
<br />
''vtkDecimate'' and ''vtkDecimatePro'' have been tested fairly heavily so<br />
all known bugs have been removed. However, there are three situations<br />
where you can encounter weird behavior:<br />
<br />
# The mesh is not all triangles. Solution: use ''vtkTriangleFilter'' to triangulate polygons.<br />
# The mesh consists of independent triangles (i.e., not joined at vertices - no decimation occurs). Solution: use ''vtkCleanPolyData'' to link triangles.<br />
# Bad triangles are present: e.g., triangles with duplicate vertices such as (1,2,1) or (100,100,112), or (57,57,57), and so on. Solution: use ''vtkCleanPolyData''.<br />
<br />
=== How can I read DICOM files ? ===<br />
<br />
Starting with VTK 4.4, you can use the [http://www.vtk.org/doc/nightly/html/classvtkDICOMImageReader.html vtkDICOMImageReader class] to read DICOM files. Note however that DICOM is a huge protocol, and vtkDICOMImageReader is not able to read every DICOM file out there. If it does not meet your needs, we suggest you look for an existing converter before coding your own. Some of them are listed in the [http://www.dclunie.com/medical-image-faq/html/part8.html The Medical Image Format FAQ (Part 8)].<br />
<br />
==== GDCM ====<br />
<br />
For a more elaborate DICOM library that supports more image format, you might try [http://gdcm.sourceforge.net GDCM].<br />
Specifically: [http://gdcm.sourceforge.net/html/classvtkGDCMImageReader.html vtkGDCMImageReader] & [http://gdcm.sourceforge.net/html/classvtkGDCMImageWriter.html vtkGDCMImageWriter]<br />
<br />
Grassroots DiCoM is a C++ library for DICOM medical files. It is automatically wrapped to Python/C#/Java (using SWIG). It supports RAW, JPEG (lossy/lossless), J2K, JPEG-LS, RLE and deflated transfer syntaxes. It also comes with DICOM Parts 3, 6 & 7 of the standard as XML files.<br />
<br />
If GDCM is too complex to integrate in your environment, you can also consider simply using the command-line converter [http://apps.sourceforge.net/mediawiki/gdcm/index.php?title=Gdcmconv gdcmconv] to convert an unsupported DICOM file into something that vtkDICOMImageReader can read. Typically you would want:<br />
<br />
gdcmconv --raw compressed_input.dcom uncompressed_output.dcom<br />
<br />
==== dicom2 ====<br />
<br />
Sebastien BARRE wrote a free DICOM converter, named dicom2, that can be<br />
used to convert medical images to raw format. This tool is a command<br />
line program and does not provide any GUI at the moment.<br />
http://dicom2.barre.nom.fr/<br />
<br />
There is a special section dedicated to the VTK:<br />
http://dicom2.barre.nom.fr/how-to.html, then "Convert to raw (vtk)"<br />
<br />
The following page also provide links to several other DICOM converters:<br />
http://www.barre.nom.fr/medical/samples/index.html#links<br />
<br />
==== vtkVolume16Reader ====<br />
<br />
When searching the vtkusers mailing list you will find a lot of posts that still use vtkVolume16Reader to read DICOM files. It will work only in the following case:<br />
* You know the dimension (cols & rows) of your image<br />
* You know the spacing of your image<br />
* You know the pixel type (pixel type & #components) of your image<br />
* You know Pixel Data (7fe0,0010) is the last element in the image<br />
* You know Pixel Data (7fe0,0010) was sent in uncompressed format (not encapsulated)<br />
<br />
All of these requirements are stricter than those of vtkDICOMImageReader, so you are encouraged to use vtkDICOMImageReader instead.<br />
<br />
==== The spacing in my DICOM files are wrong ====<br />
<br />
Image Position (Patient) (0020,0032) is the only attribute that can be relied on to determine the "reconstruction interval" or "space between the center of slices".<br />
<br />
If the distance between Image Position (Patient) (0020,0032) of two parallel slices along the normal to Image Orientation (Patient) (0020,0037) is not the same as whatever happens to be in the DICOM Spacing Between Slices (0018,0088) attribute, then (0018,0088) is incorrect, without question.<br />
<br />
This is a known bug in some scanners.<br />
<br />
When Slice Thickness (0018,0050) + Spacing Between Slices (0018,0088) equals the computed reconstruction interval, then chances are the modality implementor has made the obvious mistake of misinterpreting the definition of<br />
(0018,0088) to mean the distance between edges (gap) rather than the distance between centers.<br />
<br />
Further, one should never use Slice Location (0020,1041) either, an optional and purely annotative attribute, though chances are that the distance between the Slice Location (0020,1041) values of two slices will match the distance along the<br />
normal to the orientation derived from the position.<br />
<br />
The GDCM library simply discards any information present in the (0018,0088) tag and instead recomputes the spacing as the distance between two consecutive slices (along the normal).<br />
<br />
GDCM 1.x:<br />
typedef std::vector<gdcm::File *> FileList;<br />
FileList l;<br />
gdcm::SerieHelper sh;<br />
sh.OrderFileList(l); // calls ImagePositionPatientOrdering()<br />
zspacing = sh.GetZSpacing();<br />
<br />
GDCM 2.x:<br />
IPPSorter ipp;<br />
ipp.Sort( filenames );<br />
zspacing = ipp.GetZSpacing();<br />
<br />
=== How to handle large data sets in VTK ===<br />
<br />
One of the challenges in VTK is to efficiently handle large datasets. By<br />
default VTK is tuned towards smaller datasets. For large datasets there<br />
are a couple of changes you can make that should yield a much smaller<br />
memory footprint (less swapping) and also improve rendering performance.<br />
The solution is to:<br />
<br />
# Use ReleaseDataFlag,<br />
# Turn on ImmediateModeRendering<br />
# Use triangle strips via vtkStripper<br />
# Use a different filter or mapper<br />
<br />
Each of these will be discussed below.<br />
<br />
==== Using ReleaseDataFlag ====<br />
<br />
By default VTK keeps a copy of all intermediate results between filters<br />
in a pipeline. For a pipeline with five filters this can result in<br />
having six copies of the data in memory at once. This can be controlled<br />
using ReleaseDataFlag and GlobalReleaseDataFlag. If ReleaseDataFlag is<br />
set to one on a data object, then once a filter has finished using that<br />
data object, it will release its memory. Likewise, if<br />
GlobalReleaseDataFlag is set on ANY data object, all data objects will<br />
release their memory once their dependent filter has finished executing.<br />
For example in Tcl and C++<br />
<br />
# Tcl<br />
vtkPolyDataReader reader<br />
[reader GetOutput] ReleaseDataFlagOn<br />
<br />
// C++<br />
vtkPolyDataReader *reader = vtkPolyDataReader::New();<br />
reader->GetOutput()->ReleaseDataFlagOn();<br />
<br />
or<br />
<br />
// C++<br />
vtkPolyDataReader *reader = vtkPolyDataReader::New();<br />
reader->GetOutput()->GlobalReleaseDataFlagOn();<br />
<br />
While turning on the ReleaseDataFlag will reduce your memory footprint,<br />
the disadvantage is that none of the intermediate results are kept in<br />
memory. So if you interactively change a parameter of a filter (such as<br />
the isosurface value), all the filters will have to re-execute to<br />
produce the new result. When the intermediate results are stored in<br />
memory, only the downstream filters would have to re-execute.<br />
<br />
One hint for good interactive performance. If only one stage of the<br />
pipeline can have its parameters changed interactively (such as the<br />
target reduction in a decimation filter), only retain the data just<br />
prior to that step (which is the default) and turn ReleaseDataFlag on<br />
for all other steps.<br />
<br />
==== Use ImmediateModeRendering ====<br />
<br />
By default, VTK uses OpenGL display lists which results in another copy<br />
of the data being stored in memory. For most large datasets you will be<br />
better off saving memory by not using display lists. You can turn off<br />
display lists by turning on ImmediateModeRendering. This can be<br />
controlled on a mapper by mapper basis using ImmediateModeRendering, or<br />
globally for all mappers in a process by using<br />
GlobalImmediateModeRendering. For example:<br />
<br />
# Tcl<br />
vtkPolyDataMapper mapper<br />
mapper ImmediateModeRenderingOn<br />
<br />
// C++<br />
vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();<br />
mapper->ImmediateModeRenderingOn();<br />
<br />
or<br />
<br />
// C++<br />
vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();<br />
mapper->GlobalImmediateModeRenderingOn();<br />
<br />
The disadvantage to using ImmediateModeRendering is that if memory is<br />
not a problem, your rendering rates will typically be slower with<br />
ImmediateModeRendering turned on.<br />
<br />
==== Use triangle strips via vtkStripper. ====<br />
<br />
Most filters in VTK produce independent triangles or polygons which are<br />
not the most compact or efficient to render. To create triangle strips<br />
from polydata you can first use vtkTriangleFilter to convert any<br />
polygons to triangles (not required if you only have triangles to start<br />
with) and then run the result through a vtkStripper to convert the triangles into<br />
triangle strips. For example in C++<br />
<br />
vtkPolyDataReader *reader = vtkPolyDataReader::New();<br />
reader->SetFileName("yourdatafile.vtk");<br />
reader->GetOutput()->ReleaseDataFlagOn();<br />
<br />
vtkTriangleFilter *tris = vtkTriangleFilter::New();<br />
tris->SetInput(reader->GetOutput());<br />
tris->GetOutput()->ReleaseDataFlagOn();<br />
<br />
vtkStripper *strip = vtkStripper::New();<br />
strip->SetInput(tris->GetOutput());<br />
strip->GetOutput()->ReleaseDataFlagOn();<br />
<br />
vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();<br />
mapper->ImmediateModeRenderingOn();<br />
 mapper->SetInput(strip->GetOutput());<br />
<br />
The only disadvantage to using triangle strips is that they require time<br />
to compute, so if your data is changing every time you render, it could<br />
actually be slower.<br />
<br />
==== Use a different filter or mapper ====<br />
<br />
This is a tough issue. In VTK there are typically a couple of ways to<br />
solve any problem. For example an image can be rendered as a polygon for<br />
each pixel, or it can be rendered as a single polygon with a texture map<br />
on it. For almost all cases the second approach will be much faster than<br />
the first even though VTK supports both. There isn't a single good<br />
answer for how to find the best approach. If you suspect that it is<br />
running more slowly than it should, try posting to the mailing list or<br />
looking for other ways to achieve the same result.<br />
<br />
=== VTK is slow, what is wrong? ===<br />
<br />
We have heard people say that VTK is really slow. In many of these<br />
cases, changing a few parameters can make a huge difference in performance.<br />
<br />
If you find that VTK is slower than other visualization systems running<br />
the same problem first take a look at the FAQ section dealing with large<br />
data: [[VTK_FAQ#How_to_handle_large_data_sets_in_VTK|How to handle large data sets in VTK]]. Many of its suggestions<br />
will improve VTK's performance significantly for many datasets.<br />
<br />
If you still find VTK slow, please let us know and send us an example<br />
(to mailto:kitware@kitware.com). In the past there<br />
have been some filters that simply were not written to be fast. When we<br />
come across one of these we frequently can make minor changes to the<br />
filter that will make it run much more quickly. In fact many changes in<br />
the past couple years have been this type of performance improvement.<br />
<br />
=== Is VTK thread-safe ? ===<br />
<br />
The short answer is no.<br />
<br />
Many VTK sources and filters cache information and will not perform as<br />
expected when used in multiple threads. When writing a multithreaded<br />
filter, the developer has to be very careful about how she accesses data.<br />
<br />
For example, GetXXX() methods which return a pointer should only be used<br />
to read. If the pointer returned by these methods are used to change<br />
data in multiple threads (without mutex locks), the result will most<br />
probably be wrong and unpredictable. In many cases, there are<br />
alternative methods which copy the data referred to by the pointer. For<br />
example:<br />
<br />
float* vtkDataArray::GetTuple(const vtkIdType i);<br />
<br />
is thread-safe only for reading whereas:<br />
<br />
void vtkDataArray::GetTuple (const vtkIdType i, float * tuple);<br />
<br />
copies the requested tuple and is thread safe even if tuple is modified<br />
afterwards (as long as the same pointer is not passed as the argument<br />
tuple simultaneously by different threads).<br />
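To illustrate the difference, here is a minimal stand-in (plain C++, not the real vtkDataArray API) contrasting the two accessor styles: a pointer-returning accessor that reuses shared scratch storage, and a copying accessor that is safe to call from several threads at once.<br />

```cpp
#include <cassert>
#include <cstddef>
#include <thread>
#include <vector>

// Hypothetical stand-in for a VTK-style data array, for illustration only.
class TupleArray
{
public:
  TupleArray(std::size_t nTuples, std::size_t nComps)
    : NumComps(nComps), Data(nTuples * nComps)
  {
    for (std::size_t i = 0; i < Data.size(); ++i)
      Data[i] = static_cast<float>(i);
  }

  // NOT thread-safe: every call reuses the same scratch buffer, so
  // concurrent callers overwrite each other's result.
  float* GetTuple(std::size_t i)
  {
    Scratch.assign(Data.begin() + i * NumComps,
                   Data.begin() + (i + 1) * NumComps);
    return Scratch.data();
  }

  // Thread-safe for reading: copies into caller-provided storage.
  void GetTuple(std::size_t i, float* tuple) const
  {
    for (std::size_t c = 0; c < NumComps; ++c)
      tuple[c] = Data[i * NumComps + c];
  }

private:
  std::size_t NumComps;
  std::vector<float> Data;
  std::vector<float> Scratch; // shared state makes the first variant unsafe
};

// Read every tuple from two threads at once through the copying accessor
// and report whether all values came back intact.
bool ParallelReadIsConsistent(const TupleArray& array, std::size_t n)
{
  bool ok1 = true, ok2 = true;
  auto reader = [&array](std::size_t begin, std::size_t end, bool* ok) {
    float tuple[3];
    for (std::size_t i = begin; i < end; ++i)
    {
      array.GetTuple(i, tuple); // safe: tuple is thread-local
      if (tuple[0] != static_cast<float>(3 * i))
        *ok = false;
    }
  };
  std::thread t1(reader, 0, n / 2, &ok1);
  std::thread t2(reader, n / 2, n, &ok2);
  t1.join();
  t2.join();
  return ok1 && ok2;
}
```

The copying overload works because each thread passes its own buffer; the pointer-returning overload would hand both threads the same scratch storage.<br />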
<br />
Unfortunately, only very few methods are clearly marked as<br />
thread-(un)safe and, in many situations, the developer has to dig into<br />
the source code to figure out whether an accessor is thread safe or not.<br />
<br />
''vtkDataSet'' and most of its sub-classes are well documented and almost<br />
all methods are marked thread-safe or not thread-safe. This might be a<br />
good place to start. Most of the filters in imaging and some filters in<br />
graphics (like ''vtkStreamer'') are good examples of how a multi-threaded<br />
filter can be written in VTK.<br />
<br />
However, if you are not interested in developing multithreaded filters<br />
but want to process some data in parallel using the same (or similar)<br />
pipeline, your job is much easier. To do this, create a different copy<br />
of the pipeline on each thread and execute them in parallel on a<br />
different piece of the data. This is best accomplished by using<br />
''vtkThreadedController'' (instead of ''vtkMultiThreader''). See the<br />
documentation of ''vtkMultiProcessController'' and ''vtkThreadedController''<br />
and the examples in the parallel directory for details on how this can<br />
be done.<br />
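The idea can be sketched with plain C++ threads (the controller classes above handle the piece bookkeeping for you; all names below are illustrative): each thread constructs its own pipeline copy and processes a disjoint piece of the data, so no locking is needed.<br />

```cpp
#include <cstddef>
#include <thread>
#include <vector>

// Stands in for an independent pipeline copy; nothing is shared between
// the copies, which is what makes this scheme safe.
struct Pipeline
{
  double Scale; // per-pipeline state
  void Execute(std::vector<double>& data, std::size_t begin, std::size_t end)
  {
    for (std::size_t i = begin; i < end; ++i)
      data[i] *= Scale;
  }
};

// Split the data into pieces and run one pipeline copy per thread.
std::vector<double> RunInPieces(std::vector<double> data, int numThreads)
{
  std::vector<std::thread> threads;
  std::size_t piece = data.size() / numThreads;
  for (int t = 0; t < numThreads; ++t)
  {
    std::size_t begin = t * piece;
    std::size_t end = (t == numThreads - 1) ? data.size() : begin + piece;
    threads.emplace_back([&data, begin, end]() {
      Pipeline p{2.0}; // each thread constructs its own pipeline copy
      p.Execute(data, begin, end);
    });
  }
  for (auto& t : threads)
    t.join();
  return data;
}
```

Each thread only touches its own `[begin, end)` range, so the shared vector never needs a mutex.<br />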
<br />
Also, note that most of the OpenGL libraries are not thread-safe.<br />
Therefore, if you are rendering to multiple render windows from<br />
different threads, you are likely to get in trouble, even if you have<br />
mutex locks around the render calls.<br />
<br />
=== Can I use STL with VTK? ===<br />
<br />
As of VTK version 4.2, you can use the STL.<br />
However, see the [[VTK Coding Standards]] for limitations.<br />
Here's an example (from vtkInterpolatedVelocityField):<br />
<br />
In the .h file, forward declare the implementation class (the PIMPL):<br />
<br />
class vtkInterpolatedVelocityFieldDataSetsType;<br />
//<br />
class VTK_COMMON_EXPORT vtkInterpolatedVelocityField : public vtkFunctionSet<br />
{<br />
private:<br />
vtkInterpolatedVelocityFieldDataSetsType* DataSets;<br />
};<br />
<br />
In the .cxx file define the class (here deriving from the STL vector<br />
container)<br />
<br />
# include <vtkstd/vector><br />
typedef vtkstd::vector< vtkSmartPointer<vtkDataSet> > DataSetsTypeBase;<br />
class vtkInterpolatedVelocityFieldDataSetsType: public DataSetsTypeBase<br />
{};<br />
<br />
In the .cxx file construct and destruct the class:<br />
<br />
vtkInterpolatedVelocityField::vtkInterpolatedVelocityField()<br />
{<br />
this->DataSets = new vtkInterpolatedVelocityFieldDataSetsType;<br />
}<br />
vtkInterpolatedVelocityField::~vtkInterpolatedVelocityField()<br />
{<br />
delete this->DataSets;<br />
}<br />
<br />
And in the .cxx file use the container as you would any STL container:<br />
<br />
for ( DataSetsTypeBase::iterator i = this->DataSets->begin();<br />
i != this->DataSets->end(); ++i)<br />
{<br />
ds = i->GetPointer();<br />
....<br />
}<br />
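For reference, here is a self-contained version of the same PIMPL idiom, using std::vector and plain ints in place of vtkstd::vector and vtkSmartPointer&lt;vtkDataSet&gt; (class names here are illustrative, not the real VTK ones):<br />

```cpp
#include <vector>

// --- what would go in the .h file ---
// Forward declaration only: no STL types appear in the header.
class VelocityFieldDataSets;

class InterpolatedVelocityField
{
public:
  InterpolatedVelocityField();
  ~InterpolatedVelocityField();
  void AddDataSet(int ds);
  int Count() const;
private:
  VelocityFieldDataSets* DataSets; // PIMPL pointer
};

// --- what would go in the .cxx file ---
// The implementation class derives from the STL container.
class VelocityFieldDataSets : public std::vector<int> {};

InterpolatedVelocityField::InterpolatedVelocityField()
  : DataSets(new VelocityFieldDataSets) {}

InterpolatedVelocityField::~InterpolatedVelocityField()
{
  delete this->DataSets;
}

void InterpolatedVelocityField::AddDataSet(int ds)
{
  this->DataSets->push_back(ds);
}

int InterpolatedVelocityField::Count() const
{
  return static_cast<int>(this->DataSets->size());
}
```

The payoff is that clients including the header never see the STL, so template instantiations stay inside one translation unit.<br />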
<br />
=== What image file formats can VTK read and write? ===<br />
<br />
The following table identifies the image file formats that VTK can read<br />
and write.<br />
<br />
{| border="1" cellpadding="2" cellspacing="0"<br />
|- bgcolor="#abcdef"<br />
! Image File !! Read !! Write<br />
|-<br />
| AVI || || vtkAVIWriter<br />
|-<br />
| Bitmap || vtkBMPReader || vtkBMPWriter<br />
|-<br />
| Digital Elevation Model (DEM) || vtkDEMReader || <br />
|-<br />
| DICOM || vtkDICOMImageReader || <br />
|-<br />
| GE Signal || vtkGESignaReader || <br />
|-<br />
| JPEG || vtkJPEGReader || vtkJPEGWriter<br />
|-<br />
| FFMPEG || || vtkFFMPEGWriter<br />
|-<br />
| MINC (1.1) || vtkMINCImageReader || vtkMINCImageWriter<br />
|-<br />
| MPEG2 || || vtkMPEG2Writer<br />
|-<br />
| Binary UNC meta image data || vtkMetaImageReader || vtkMetaImageWriter<br />
|-<br />
| PNG || vtkPNGReader || vtkPNGWriter<br />
|-<br />
| PNM || vtkPNMReader || vtkPNMWriter<br />
|-<br />
| PostScript || || vtkPostScriptWriter <br />
|-<br />
| SLC || vtkSLCReader || <br />
|-<br />
| TIFF || vtkTIFFReader || vtkTIFFWriter<br />
|-<br />
| RAW files &dagger; || vtkImageReader, vtkVolumeReader || <br />
|}<br />
<br />
&dagger; A typical example of use is:<br />
<br />
# Image pipeline<br />
reader = vtkImageReader()<br />
reader.SetDataByteOrderToBigEndian()<br />
reader.SetDataExtent(0,511,0,511,0,511)<br />
reader.SetFilePrefix("Ser397")<br />
reader.SetFilePattern("%s/I.%03d")<br />
reader.SetDataScalarTypeToUnsignedShort()<br />
reader.SetHeaderSize(5432)<br />
<br />
=== Printing an object ===<br />
<br />
Sometimes when debugging you need to print an object to a string, either<br />
for logging purposes, or in the case of windows applications, to a window.<br />
<br />
Here is a way to do this:<br />
<br />
std::ostringstream os;<br />
//<br />
// "SomeVTKObject" could be, for example, <br />
// declared somewhere as: vtkCamera *SomeVTKObject;<br />
//<br />
SomeVTKObject->Print(os);<br />
std::string str = os.str();<br />
//<br />
// Process the string as you want<br />
<br />
=== Writing a simple CMakeLists.txt ===<br />
<br />
If you get something that looks like:<br />
<br />
undefined reference to<br />
`__imp___ZN13vtkTIFFReader3NewEv'<br />
collect2: ld returned 1 exit status <br />
<br />
you most likely forgot to link a required library into your executable. The easiest way to fix this is to use a CMakeLists.txt file.<br />
<br />
For example the minimal project is:<br />
<br />
FIND_PACKAGE(VTK)<br />
IF (VTK_FOUND)<br />
INCLUDE (${VTK_USE_FILE})<br />
ENDIF (VTK_FOUND)<br />
ADD_EXECUTABLE(tiff tiff.cxx )<br />
TARGET_LINK_LIBRARIES (tiff<br />
vtkRendering<br />
)<br />
<br />
vtkRendering links against all of the other VTK libraries, so this is usually enough. However, if you build VTK with Hybrid or Parallel support, you need to explicitly specify which of those libraries you want to link against.<br />
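If you do need explicit linking, a sketch listing the kits by name looks like this (library names assume the monolithic, pre-6.0 VTK layout):<br />

```cmake
FIND_PACKAGE(VTK)
IF (VTK_FOUND)
  INCLUDE (${VTK_USE_FILE})
ENDIF (VTK_FOUND)
ADD_EXECUTABLE(tiff tiff.cxx)
# Explicit kit list instead of relying on vtkRendering alone:
TARGET_LINK_LIBRARIES(tiff
  vtkCommon
  vtkFiltering
  vtkImaging
  vtkGraphics
  vtkIO
  vtkRendering
  vtkHybrid      # only if VTK was built with Hybrid
)
```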
<br />
=== Testing for VTK within a configure script ===<br />
<br />
VTK uses CMake as its build tool, but if your VTK-based application uses autoconf and/or automake, you will find it useful to have an M4 macro file that lets your configure script detect the presence or absence of VTK on the user's system. VTK won't add such a file to the official distribution, but you can always write your own, as I did.<br />
Look in [[VTK_Autoconf]] page for more info.<br />
<br />
=== How do I get my C++ code editor to do VTK-style indentation? ===<br />
<br />
If you are writing code with VTK, you may want to follow the [[VTK Coding Standards]]. This is particularly important if you plan to contribute back to VTK. Most C++ code editors will help you with indenting, but the indenting may differ significantly from that prescribed by the [[VTK Coding Standards]]. Fortunately, most editors have enough options to allow you to change the indention enough to get at least close to the VTK-style indentation.<br />
<br />
Below is a list of C++ editors and some suggestions on getting the indentation VTK compliant. If you use a popular editor that is not listed here, please feel free to contribute.<br />
<br />
==== Microsoft Visual C++ .NET indentation ====<br />
<br />
Under the "Tools" menu, select "Options". Go to the options under "Text Editor" and then "C/C++". Click the "Tabs" options. Set "Indenting" to "Smart", "Indent Size" to 2, and select "Insert spaces". Click the "Formatting" options and enable "Indent braces".<br />
<br />
This will make most of the indentation correct. However, it will indent all of the braces. In VTK classes, most of the braces are indented, but those starting a class, method, or function are typically flush left. You will have to correct this on your own.<br />
<br />
==== Emacs indentation ====<br />
<br />
Place the [[Elisp Code for VTK-Style C Indentation]] in your .emacs file.<br />
<br />
==== Vim indentation ====<br />
<br />
[[user talk:Andy|Andy Cedilnik]] has some information on following the VTK coding guidelines using vim. You may place the following in your <code>~/.vimrc</code> file<br />
set tabstop=2 " Tabs are two characters<br />
set shiftwidth=2 " Indents are two characters too<br />
set expandtab " Do not use tabs<br />
set cinoptions={1s,:0,l1,g0,c0,(0,(s,m1<br />
"Keep tabs in makefiles as they are significant:<br />
:autocmd BufRead,BufNewFile [Mm]akefile :set noexpandtab<br />
<br />
=== How to display transparent objects? ===<br />
(keywords: alpha, correct, depth, geometry, object, opacity, opaque, order, ordering, peel, peeling, sorting, translucent, transparent.)<br />
<br />
When opaque geometry is rendered, there is no need to sort it: the depth buffer (or z-buffer) automatically keeps, at each pixel, the geometry closest to<br />
the viewpoint. (This is easy because it is a MAX/MIN calculation, not a real sort.)<br />
<br />
With translucent geometry, the final color of a pixel is the contribution of all the geometry primitives visible through that pixel. The color of the pixel is the result of <b>a</b> blending operation between the colors of all visible primitives. Blending operations themselves are usually order-dependent (i.e., not commutative). That is why depth sorting is required. There are two ways to fix the ordering in VTK:<br />
<br />
* 1. Append all your polygonal geometry with [http://www.vtk.org/doc/nightly/html/classvtkAppendPolyData.html vtkAppendPolyData] and pass it to [http://www.vtk.org/doc/nightly/html/classvtkDepthSortPolyData.html vtkDepthSortPolyData]. See this tcl [http://public.kitware.com/cgi-bin/viewcvs.cgi/*checkout*/Hybrid/Testing/Tcl/depthSort.tcl?root=VTK&content-type=text/plain example]. Depth sorting is done per centroid of geometry primitives, not per pixel. For this reason it is not exact, but it solves <b>most</b> of the ordering problems and usually gives good enough results.<br />
* 2. If the graphics card supports it, use "[[VTK/Depth_Peeling | depth peeling]]". It performs per pixel sorting (better result) but it is really slow.<br />
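As an illustration of option 1, this is roughly what centroid depth sorting amounts to (a self-contained sketch, not the vtkDepthSortPolyData API): compute each primitive's centroid, project it onto the view direction, and draw the farthest primitives first (the painter's algorithm).<br />

```cpp
#include <algorithm>
#include <array>
#include <vector>

struct Triangle
{
  std::array<std::array<double, 3>, 3> Points;
};

// Build a triangle lying in the plane z = constant (test helper).
static Triangle MakeTriangle(double z)
{
  Triangle t;
  for (int i = 0; i < 3; ++i)
    t.Points[i] = {static_cast<double>(i), 0.0, z};
  return t;
}

// Project the triangle's centroid onto the view direction.
static double CentroidDepth(const Triangle& tri,
                            const std::array<double, 3>& viewDir)
{
  double c[3] = {0.0, 0.0, 0.0};
  for (const auto& p : tri.Points)
    for (int i = 0; i < 3; ++i)
      c[i] += p[i] / 3.0;
  return c[0] * viewDir[0] + c[1] * viewDir[1] + c[2] * viewDir[2];
}

// Sort back-to-front: primitives farthest along viewDir are drawn first.
void DepthSort(std::vector<Triangle>& tris,
               const std::array<double, 3>& viewDir)
{
  std::sort(tris.begin(), tris.end(),
            [&viewDir](const Triangle& a, const Triangle& b) {
              return CentroidDepth(a, viewDir) > CentroidDepth(b, viewDir);
            });
}
```

Two triangles whose centroids have the same depth, or a single long triangle spanning another, can still blend in the wrong order; that is exactly the case depth peeling handles correctly.<br />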
<br />
== Platform-specific questions ==<br />
<br />
=== What platforms does VTK run on? ===<br />
<br />
VTK should compile and run on most versions of Unix, Linux, Windows, and Mac OS X. It has been tested on Suns, SGIs, HPs, Alphas, RS6000s and many Windows and Mac workstations.<br />
<br />
=== What Graphics Cards work with VTK? ===<br />
<br />
VTK uses OpenGL to perform almost all of its rendering and some graphics cards/drivers have better support for OpenGL than others. This is not a listing of what cards perform well. It is a listing of what cards actually produce correct results. Here is a list of cards and their status roughly in best to worst order.<br />
<br />
* Any Nvidia desktop card on Windows -- 100% compatible<br /> <br />
* Any ATI desktop cards on Windows -- 100% compatible<br /><br />
* Mesa -- most releases pass all VTK tests<br /><br />
* Microsoft Software OpenGL -- passes all VTK tests but does have a couple bugs<br /><br />
* Mac graphics cards -- these usually pass all VTK tests. Older cards may have some issues, for example, the ATI Rage 128 Pro does not support textures larger than 1024x1024.<br /><br />
* Non-linux UNIX cards (Sun HP SGI) -- These generally work<br /><br />
* Any Nvidia card under linux -- these usually pass all VTK tests but have some issues<br /><br />
* Any ATI card under linux -- these usually pass all VTK tests but have some issues<br /><br />
* Nvidia laptop graphics cards under Windows -- known to have some issues, newer cards pass all tests<br /><br />
* ATI laptop graphics cards under Windows -- known to have some issues, newer cards pass all tests (e.g. [http://public.kitware.com/pipermail/vtkusers/2004-August/075966.html ATI Mobility Radeon 9600])<br /><br />
* Intel Extreme Graphics -- fails some VTK tests<br /><br />
<br />
=== How do I build the examples on the PC running Windows? ===<br />
<br />
Since building the C++ examples on the PC isn't all that easy, here are<br />
some instructions from Jack McInerney.<br />
<br />
Steps for creating a VTK C++ project 8/14/96<br />
<br />
This is based on what I learned creating a project to run the Mace<br />
example. These steps allowed me to successfully build and run this example.<br />
<br />
# Create a console project (File, New, then select Console application).<br />
# Add the files of interest to the project. (e.g., Mace.cxx)<br />
# Under Build, select Update all Dependencies. A long list of .hh files will show up under dependencies<br /> For this to work, Visual C++ needs to know where to look to find the include files. In my case they are at C:\VTK\VTK12SRC\INCLUDE. To tell Visual C++ to look there, go to Tools, Options. Select the tab Directories. Under the list for Include files add: C:\VTK\VTK12SRC\INCLUDE<br />
# Compile the file Mace.cxx. This will lead to many warnings about data possibly being lost as double variables are converted to float variables. You can get rid of these by going to Build, Settings, and selecting the C++ tab. Under the General category, set Warning Level to 1 (instead of 3).<br />
# Before linking, some additional settings must be modified. Go to Build, Settings, and select the Link tab. In the General category, add the libraries opengl32.lib and glaux.lib to the Object/Library Modules. Put a space between each file name. Then select the C++ tab and the Category: Code Generation. Under Use Run-Time Library, select Debug Multithreaded DLL. Select OK to exit the dialog box. The above libraries are available from Microsoft's Web site at: http://www.microsoft.com/softlib/mslfiles/Opengl95.exe or ftp://ftp.microsoft.com/softlib/mslfiles/Opengl95.exe <br /> This is a self-extracting archive which contains these files. Simply place them in your Windows system directory.<br />
# Link the code by selecting Build, Build MaceProject.exe. I still get one warning when I do this, but it appears to be harmless<br /><br />
<br />
When you go to run the program, it will bomb out unless it can find 2<br />
DLLs: Opengl32.dll and Glu32.dll. These need to be located either in the<br />
project directory or the C:\WINDOWS directory. These files are supplied<br />
on the vtk CD-ROM (in the vtk\bin directory).<br />
<br />
=== How do I build the Java examples on the PC running Windows? ===<br />
One common issue building the examples is missing one or all of vtkPanel, vtkCanvas and AxesActor<br />
classes. For whatever reason these are not in the vtk.jar (at least for 4.2.2).<br />
But you can get them from the source distribution (just unzip the source and extract<br />
these needed .java files, and point your Java-compiler to them).<br />
<br />
Another common issue appears to be class loading dependency errors. Make sure the<br />
directory with the .dll files is in your classpath when you run (default location<br />
is C:\Program Files\vtk42\bin\). Yet this still seems insufficient for some of the<br />
libraries. One possible solution is to copy the Java awt.dll to this directory as<br />
well.<br />
<br />
=== 64-bit System Issues ===<br />
<br />
VTK builds on 64-bit systems, that is, systems where sizeof(void*) is 64 bits. However, parts of the VTK codebase are not 64-bit clean, so runtime problems are likely if that code is used.<br />
<br />
===== General =====<br />
VTK binary files are not compatible between 32-bit and 64-bit systems. For portability, use the default file type, ASCII, for vtkPolyDataWriter, etc. You may be able to write a binary file on a 64-bit system and read it back in.<br />
<br />
===== Mac OS X Specific =====<br />
Mac OS X 10.3 and earlier have no support for 64 bit. On Mac OS X 10.4, VTK cannot be built as 64 bit because it requires Carbon, Cocoa, or X11, none of which are available to 64 bit processes. On Mac OS X 10.5, Cocoa is available to 64 bit processes, but Carbon is not. VTK is known to work reasonably with 64 bit Cocoa.<br />
<br />
===== Windows Specific =====<br />
todo<br />
<br />
=== What size swap space should I use on a PC? ===<br />
<br />
Building vtk on the PC requires a significant amount of memory (at least<br />
when using Visual C++)... but the final product is nice and compact. To<br />
build vtk on the PC, we recommend setting the min/max swap space to at<br />
least 400MB/500MB (depending on how much RAM you have... the sum of RAM<br />
and swap space should be roughly 500+ MB).<br />
<br />
=== Are there any benchmarks of VTK and/or the hardware it runs on? ===<br />
<br />
Take a look at the "Simple Sphere Benchmark":<br />
<br />
http://www.barre.nom.fr/vtk/bench.html<br />
<br />
It is not a "real world" benchmark, but provides synthetic results<br />
comparing different hardware running VTK:<br />
<br />
http://purl.oclc.org/NET/rriv/vtk/sphere-bench<br />
<br />
=== Why is XtString undefined when using VTK+Python on Unix? ===<br />
<br />
This is a side effect of dynamic linking on (some?) Unix systems. It<br />
appears often on Linux with the Mesa libraries at least. The solution is<br />
to make sure your Mesa libraries are linked with the Xt library. One way<br />
to do this is to add "-lXt" to MESA_LIB in your user.make file.<br />
<br />
=== How do I get the Python bindings to work when building VTK with Borland C++? ===<br />
<br />
If you've built VTK with the freely downloadable Borland C++ 5.5 (or its<br />
commercial counterpart) and you're using the Python binaries from<br />
http://www.python.org/, you'll note that when you try to run a VTK<br />
Python example you get something similar to the following error message:<br />
<br />
from vtkCommonPython import * <br />
ImportError: dynamic module does not define init function<br />
(initvtkCommonPython)<br />
<br />
This is because BCC32 prepends an underscore ("_") to all exported<br />
functions, so (in this case) the vtkCommonPython.dll contains a symbol<br />
_initvtkCommonPython which Python does not find. All kits (e.g.<br />
Rendering, Filtering, Patented) will suffer from this problem.<br />
<br />
The solution is to create a Borland module definition (.def) file in the VTK binary<br />
(output) directory, in my case VTK/bin. You have to do this for all kits<br />
that you are planning to use in Python. Each .def file must have the<br />
same basename as the DLL, e.g. "vtkCommonPython.def" for<br />
vtkCommonPython.dll and it must be present at VTK link time. The def<br />
file contains an export alias, e.g.:<br />
<br />
EXPORTS<br />
initvtkCommonPython=_initvtkCommonPython<br />
<br />
The Borland compiler will create an underscore-less alias in the DLL<br />
file and Python will be able to load it as a module.<br />
<br />
=== How do I build Python bindings on AIX? ===<br />
<br />
There is a problem with dynamic loading on AIX. Old versions of AIX did not have<br />
dlopen/dlsym; they used a load mechanism instead, and Python still reflects this.<br />
VTK, however, is not compatible with the old load mechanism.<br />
<br />
The following patch to Python 2.2.2 makes python use dlopen/dlsym on AIX<br />
5 or greater.<br />
<br />
http://www.vtk.org/files/misc/python_aix.diff<br />
<br />
=== How to build VTK for offscreen rendering? ===<br />
<br />
<b>[This section is obsolete. Mangled Mesa is not supported anymore in VTK >= 5.2.]</b> (not sure about 5.0)<br />
<br />
I struggled for a few hours to get VTK to do offscreen rendering. I use it to<br />
batch process medical images: without actually producing output on the<br />
screen, I can still print the resulting images in a report to easily review the<br />
results of an experiment.<br />
<br />
Here is how I solved this problem for VTK version 4.2.2.<br />
<br />
1. Download Mesa-4.0.4 source<br />
<br />
Modify the following vars in the 'linux:' target of Mesa-4.0.4/Make-config:<br />
<br />
GL_LIB = libVTKMesaGL.so<br />
GLU_LIB = libVTKMesaGLU.so<br />
GLUT_LIB = libVTKMesaglut.so<br />
GLW_LIB = libVTKMesaGLw.so<br />
OSMESA_LIB = libOSVTKMesa.so<br />
<br />
In Mesa 6.2.1 you need to edit Mesa/configs/default instead:<br />
<br />
# Library names (base name)<br />
GL_LIB = VTKMesaGL<br />
GLU_LIB = VTKMesaGLU<br />
GLUT_LIB = VTKMesaglut<br />
GLW_LIB = VTKMesaGLw<br />
OSMESA_LIB = VTKMesaOSMesa<br />
<br />
<br />
And then export this env var:<br />
<br />
export CFLAGS="-O -g -ansi -pedantic -fPIC -ffast-math -DUSE_MGL_NAMESPACE -D_POSIX_SOURCE -D_POSIX_C_SOURCE=199309L -D_SVID_SOURCE -D_BSD_SOURCE -DUSE_XSHM -DPTHREADS -I/usr/X11R6/include"<br />
<br />
then<br />
<br />
For Mesa 4.0.4<br />
<br />
make -f Makefile.X11 linux <br />
cp Mesa-4.0.4/lib/* /data/usr/mesa404/lib/<br />
<br />
in Mesa 6.2.1:<br />
<br />
make linux-x86<br />
make install<br />
(I generally use /opt/VTKMesa/*)<br />
<br />
I use 'VTKMesa' name extension to avoid conflicts with my RH9.0 libs<br />
(especially OSMesa lib in XFree!). I'm using shared libraries, because<br />
that allows me to use dynamic libs from VTK and not the vtk program<br />
itself without explicitly having to load VTKMesaGL with my app. I copied<br />
the 'VTKMesa' libs in /data/usr/mesa404/lib/, but any odd place probably<br />
will work. Avoid /usr/lib /usr/local/lib for now.<br />
<br />
2. Follow normal instructions to get a proper working vtk, then<br />
<br />
ccmake <br />
<br />
with the following options:<br />
<br />
{| border="1" cellpadding="2" cellspacing="0"<br />
| VTK_USE_MANGLED_MESA || ON<br />
|-<br />
| MANGLED_MESA_INCLUDE_DIR || /data/usr/mesa404/include<br />
|-<br />
| MANGLED_MESA_LIBRARY || /data/usr/mesa404/lib/libVTKMesaGL.so<br />
|-<br />
| MANGLED_OSMESA_INCLUDE_DIR || /data/usr/mesa404/include<br />
|-<br />
| MANGLED_OSMESA_LIBRARY || /data/usr/mesa404/lib/libVTKMesaOSMesa.so<br />
|-<br />
| OPENGL_xmesa_INCLUDE_DIR || /data/usr/mesa404/include<br />
|}<br />
<br />
test using /data/prog/VTK-4.2.2/Examples/MangledMesa/Tcl scripts<br />
<br />
<br />
If you're doing things on UNIX, you should also look at [[VTK Classes]]. It has links to RenderWindow objects that are probably easier to use than rebuilding VTK with Mesa.<br />
<br />
=== How to get keyboard events working on Mac OS X? ===<br />
<br />
On Mac OS X, there are (at least) two kinds of executables:<br />
* [http://developer.apple.com/documentation/MacOSX/Conceptual/BPInternational/Articles/InternatSupport.html#//apple_ref/doc/uid/20000278-73764 Application Bundles]<br />
* plain UNIX executables<br />
<br />
For a program to be able to display a graphical interface (that is, display windows that allow mouse and keyboard interaction) it really should be an Application Bundle. If a plain UNIX executable tries to do so, various bugs appear, such as keyboard and mouse events not working reliably.<br />
<br />
Many, but not all, of the example VTK applications are built as plain UNIX executables, and thus have these problems. This is [http://www.vtk.org/Bug/bug.php?op=show&bugid=2025 VTK bug 2025].<br />
<br />
When you build your own VTK application, it is best to make it in the form of an Application Bundle. With CMake 2.0.5 or later, simply add the following to your CMakeLists.txt file:<br />
<br />
IF(APPLE)<br />
SET(EXECUTABLE_FLAG MACOSX_BUNDLE)<br />
ENDIF(APPLE)<br />
<br />
If for some reason you cannot build as an Application Bundle (perhaps because your app needs command line parameters) you might be able to avoid the above problems by adding an [http://developer.apple.com/documentation/MacOSX/Conceptual/BPRuntimeConfig/Articles/ConfigFiles.html#//apple_ref/doc/uid/20002091-SW1 __info_plist section] to your Mach-O executable. If you succeed, please post to the VTK list.<br />
<br />
=== Can VTK be built as a Universal Binary on Mac OS X? ===<br />
<br />
For VTK 5.0.4 and older, the short answer is "no".<br />
<br />
For VTK CVS the short answer is "mostly". You need to set CMAKE_OSX_ARCHITECTURES to the architectures you want and CMAKE_OSX_SYSROOT to a Mac OS X SDK that supports Universal builds. The usual settings are:<br />
<br />
CMAKE_OSX_ARCHITECTURES=ppc;i386 <br />
CMAKE_OSX_SYSROOT=/Developer/SDKs/MacOSX10.4u.sdk <br />
<br />
This will result in a Universal build. However, there may be runtime bugs due to VTK's use of TRY_RUN. Work is being done to improve this situation.<br />
<br />
=== How can I stop Java Swing or AWT components from flashing or bouncing between values? ===<br />
<br />
While not strictly a VTK problem, this comes up fairly often when using Java-wrapped VTK. Try the following two JRE arguments to stop the Swing/AWT components flashing:<br />
-Dsun.java2d.ddoffscreen=false -Dsun.java2d.gdiblit=false<br />
Note that these are classified as "unsupported properties," so they may not work on all platforms or installations (in particular, ddoffscreen refers to DirectDraw and, as such, is specific to Windows).<br />
<br />
=== How can a user process access more than 2 GB of ram in 32-bit Windows? ===<br />
<br />
By default on Windows, the most memory that a user process can access is 2 GB, no matter how much RAM you have installed in your system. With Windows XP Professional you can make it possible for a process to use up to 3 GB of memory by doing two things:<br />
<br />
1) Modify the boot parameters in boot.ini (on my 32 bit WinXP Pro machine, it's in: "C:\boot.ini") to tell the operating system that you want user processes to have access to up to 3GB of RAM (This is a really important file, and if you don't know what you are doing, stop reading this and go back to work!). This is done by adding the /3GB flag to the line of the file that tells the boot loader where the operating system is. My boot.ini file looks like:<br />
<br />
[boot loader]<br />
timeout=30<br />
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS<br />
[operating systems]<br />
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /3GB<br />
<br />
This is a very bad file to make mistakes in, so be careful: if you mess it up, it may be very difficult to get your computer to boot again. There is a nice description of this in the Microsoft article<br />
[http://www.microsoft.com/whdc/system/platform/server/PAE/PAEmem.mspx Memory Support and Windows Operating Systems].<br />
<br />
2) The other thing that you need to do is make your executable LARGEADDRESSAWARE. Assuming that you have a Windows binary that you want to try this on, you can use the 'editbin' utility that comes with Visual Studio to change the setting of one bit (the IMAGE_FILE_LARGE_ADDRESS_AWARE bit) in the image header of the executable. For a program 'prog.exe' you can make the change by<br />
<br />
editbin /LARGEADDRESSAWARE prog.exe<br />
<br />
Of course, depending on how your program handles memory you might find that it crashes when you try to use the extra memory, but that's a separate issue. If you are compiling your program with a version of Visual Studio you should be able to find the switch to make your program /LARGEADDRESSAWARE.<br />
<br />
=== Shared builds of VTK and debugging QVTKWidget using Visual Studio ===<br />
<br />
Assume that you have a shared build of VTK, and that the path to the<br />
release version of VTK may or may not be in your PATH statement.<br />
<br />
If you then debug a project that uses QVTKWidget, you will run into<br />
a problem: the debug build of your application depends upon the debug<br />
version of QVTK.dll, which in turn depends upon (and loads) QtGui4d.dll<br />
among others. But because the release version of QVTK.dll is in the<br />
path, QtGui4.dll will also be loaded, preventing the application from<br />
running. You will get a<br />
"QWidget: Must construct a QApplication before a QPaintDevice"<br />
message.<br />
<br />
The solution to this problem is to set the path to the correct build<br />
of VTK on the "'''Debugging'''" properties of your project. Right click on<br />
your project, bring up the properties dialog, and select "'''Debugging'''"<br />
from the list on the left. There should be an "'''Environment'''" line. You<br />
can add variables here using key=value pairs.<br />
For example, add the following line:<br />
PATH=<Path To VTK>\bin\$(OutDir);%PATH%<br />
You can then add the same line to other configurations, such as the release one, by selecting<br />
them from the top left drop down box labelled '''Configuration'''.<br />
<br />
$(OutDir) will be set by Visual Studio to either Debug or Release,<br />
depending upon what configuration you have selected. Make sure <br />
that ;%PATH% is appended so that Qt and other files can be appended <br />
to the PATH statement.<br />
<br />
<br />
== Changes to the VTK API ==<br />
<br />
=== What is the policy on Changes to the API ===<br />
<br />
Between patch releases, maintain the API unless there is a really strong reason not to.<br />
<br />
Between regular releases, maintain backwards compatibility with the API of prior VTK releases when doing so does not increase the complexity or hurt the readability of the current code, or when the benefits of breaking the API are negligible.<br />
<br />
Clearly these statements have a lot of wiggle room. For example, vtkLightKit used and released the names BackLight and Headlight. HeadLight (with a capital L) might make more sense, and would probably be easier for non-native English speakers, but is it worth breaking the API for? Probably not. Another factor is how long the API has been around and how widely it is used; these indicate how painful changing the API will be, which is half of the cost/benefit decision.<br />
<br />
=== Change to vtkIdList::IsId() ===<br />
<br />
vtkIdList::IsId(int id) used to return a 0 or 1 to indicate whether the<br />
specified id is in the list. Now it returns -1 if the id is not in the<br />
list; or a non-negative number indicating the position in the list.<br />
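The practical consequence is that code which treated the return value as a boolean must now compare against -1, since a valid position of 0 is falsy. A minimal stand-in for the new contract (a plain function, not the real vtkIdList API):<br />

```cpp
#include <cstddef>
#include <vector>

// Returns the position of id in the list, or -1 when it is absent,
// mirroring the new vtkIdList::IsId() behavior described above.
int IsId(const std::vector<int>& list, int id)
{
  for (std::size_t pos = 0; pos < list.size(); ++pos)
    if (list[pos] == id)
      return static_cast<int>(pos);
  return -1;
}
```

Old code such as <tt>if (list->IsId(id))</tt> must become <tt>if (list->IsId(id) >= 0)</tt>, because an id found at position 0 now yields 0, which tests as false.<br />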
<br />
=== Changes to vtkEdgeTable ===<br />
<br />
vtkEdgeTable had two changes. The constructor now takes no arguments,<br />
and you use InitEdgeInsertion() to tell the class how many points are in<br />
the dataset. Also, IsEdge(p1,p2) now returns -1 if the edge (defined<br />
by points p1,p2) is not present; otherwise a non-negative integer value<br />
is returned.<br />
<br />
These changes were made to support the association of attributes with<br />
edges.<br />
<br />
=== Changes between VTK 4.2 and VTK 4.4 (and how to update) ===<br />
<br />
We have removed the CVS date, revision, and language from the<br />
copyright header of all the files. This information wasn't being used much and<br />
it created extra work for developers. For example, if you edit vtkObject.h,<br />
rebuild all of VTK, and check in your change, you must then rebuild all of<br />
VTK again, because committing the header file causes it to be changed by<br />
CVS (the revision number changes). This change will also make it<br />
easier to compare different branches of VTK, since these revision number<br />
differences will no longer show up. The CVS revision number is still in<br />
the cxx file in the RevisionMacro. You don't need to make any changes to<br />
your code for this.<br />
<br />
The DataArray classes now use a templated intermediate class to share<br />
their implementation. Again there is no need for you to make changes to<br />
your code.<br />
<br />
Legacy code has been removed. Specifically, none of the old-style<br />
callbacks are supported; observers should be used instead. So where<br />
you used filter->SetStartMethod(myFunc), you should now do<br />
filter->AddObserver(vtkCommand::StartEvent, myCommand). Usually this will<br />
require you to create a small class for the observer.<br />
vtkImageOpenClose3D.cxx has an example of using an observer, and there<br />
are a few other examples in VTK. If you switch to using observers, your<br />
code should also work with versions of VTK from 3.2 onward, since<br />
observers have been in VTK since VTK 3.2.<br />
<br />
Many functions that previously took or returned float now take or return<br />
double. To make your code work with VTK 4.4 or later, you can simply<br />
replace float with double for the appropriate calls and variables. If<br />
you want your code to work with both old and new versions of VTK, you can<br />
use vtkFloatingPointType, which is defined to be double in VTK 4.4 and<br />
later and float in VTK 4.2.5. In versions of VTK prior to 4.2.5<br />
you can use something like:<br />
<br />
#ifndef vtkFloatingPointType<br />
#define vtkFloatingPointType vtkFloatingPointType<br />
typedef float vtkFloatingPointType;<br />
#endif<br />
<br />
at the beginning of your code. That will set it to the correct value for<br />
all versions of VTK old and new.<br />
<br />
=== Use of New() and Delete() now enforced (vs. new & delete) ===<br />
<br />
Constructors and destructors in VTK are now protected. This means you<br />
can no longer use little "new" or "delete" to create VTK instances.<br />
You'll have to use the methods ::New() and ::Delete() (as has been<br />
standard practice for some time).<br />
<br />
The reason for this is to enforce the use of New() and Delete(). Not<br />
using New() and Delete() can lead to bad mojo, mainly reference counting<br />
problems or not taking advantage of special procedures incorporated into<br />
the New() method (e.g., selecting the appropriate hardware interface<br />
during instance creation time).<br />
<br />
If you've used New() and Delete() in your code, these changes will not<br />
affect you at all. If you're using little "new" or "delete", your code<br />
will no longer compile and you'll have to switch to New() and Delete().<br />
<br />
=== Changes between VTK 4.4 and VTK 4.6 ===<br />
<br />
Collection Changes<br />
<br />
Collections have had some small changes (originally started by Chris<br />
Volpe) to better support reentrant iteration. Specifically, all the<br />
collections now have InitTraversal(sit) and GetNextFoobar(sit) methods<br />
(where Foobar is what the collection contains, for example<br />
GetNextActor(sit)). The argument to both of these methods is a<br />
vtkCollectionSimpleIterator. Most collection use in VTK has been<br />
modified to use these new methods. The advantage is that these new<br />
methods allow the same collection to be iterated in a<br />
reentrant-safe manner. In the past this was not true, which led to a number<br />
of problems. For future C++ class development, please use this<br />
approach to iterate through a collection. These changes are fully<br />
backwards compatible and no old APIs were harmed in the making of these<br />
changes. So, in summary, where you would previously have written:<br />
<br />
for (actors->InitTraversal();<br />
(actor = actors->GetNextActor());)<br />
<br />
you would now have:<br />
<br />
vtkCollectionSimpleIterator actorIt;<br />
for (actors->InitTraversal(actorIt);<br />
(actor = actors->GetNextActor(actorIt));)<br />
<br />
=== Changes in VTK between 3.2 and 4.0 ===<br />
<br />
* Changes to vtkDataSetAttributes, vtkFieldData and vtkDataArray: All attributes (scalars, vectors...) are now stored in the field data as vtkDataArray's. vtkDataSetAttributes became a sub-class of vtkFieldData. For backwards compatibility, the interface which allows setting/getting the attributes the old way (by passing in a sub-class of vtkAttributeData such as vtkScalars) is still supported, but it will be removed in the future. Therefore, developers should use the new interface, which requires passing in a vtkDataArray to set an attribute. vtkAttributeData and its sub-classes (vtkScalars, vtkVectors...) will be deprecated in the near future; developers should use vtkDataArray and its sub-classes instead. We are in the process of removing the use of these classes from VTK filters.<br />
<br />
* Subclasses of vtkAttributeData (vtkScalars, vtkVectors, vtkNormals, vtkTCoords, vtkTensors) were removed. As of VTK 4.0, vtkDataArray and its sub-classes should be used to represent attributes and fields. A detailed description of the changes and utilities for upgrading from 3.2 to 4.0 can be found in the package: http://www.vtk.org/files/misc/Upgrading.zip<br />
<br />
* Added special methods to data arrays to replace methods like<br />
<br />
tc SetTCoord i x y 0<br />
<br />
or<br />
<br />
vc SetVector i vx vy vz<br />
<br />
in interpreted languages (Tcl, Python, Java). Use:<br />
<br />
tc SetTuple2 i x y<br />
<br />
or<br />
<br />
vc SetTuple3 i vx vy vz<br />
<br />
* Improved support for parallel visualization: vtkMultiProcessController and its sub-classes have been re-structured and mostly re-written. The functionality of vtkMultiProcessController has been re-distributed between vtkMultiProcessController and vtkCommunicator. vtkCommunicator is responsible for sending/receiving messages, whereas vtkMultiProcessController (and its subclasses) is responsible for program flow/control (for example processing RMIs). New classes have been added to the Parallel directory. These include vtkCommunicator, vtkMPIGroup, vtkMPICommunicator, vtkSharedMemoryCommunicator, vtkMPIEventLog... There is now a Tcl interpreter which supports parallel scripts. It is called pvtk and can be built on Windows and Unix. Examples for both Tcl and C++ can be found in the examples directories.<br />
<br />
* vtkSocketCommunicator and vtkSocketController have been added. These support message passing via BSD sockets. Best used together with input-output ports.<br />
<br />
* Since it was causing very long compile times (it essentially included every VTK header file) and was hard to maintain (a line had to be added whenever a class was added to VTK), vtk.h was removed. You will have to identify the header files needed by your application and include them one by one.<br />
<br />
* vtkIterativeClosestPointTransform has been added. This class is an implementation of the iterative closest point (ICP) algorithm. It matches two surfaces by pairing each vertex in one surface with the closest surface point on the other, then applying the transformation that modifies one surface to best match the other (in a least-squares sense).<br />
<br />
* The SetFileName, SaveImageAsPPM and related methods in vtkRenderWindow have been removed. vtkWindowToImageFilter combined with any of the image writers provides greater functionality.<br />
<br />
* Support for reading and writing PGM and JPEG images has been included.<br />
<br />
* Methods with parameters of the form "type param[n]" are wrapped. Previously, these methods were only wrapped if the array was declared 'const'. The python wrappers will allow values to be returned in the array.<br />
<br />
* The directory structure was completely reorganized. There are now subdirectories for Common (core common classes), Filtering (superclasses for filtering operations), Imaging (filters and sources that produce images or structured points), Graphics (filters or sources that produce data types other than ImageData and StructuredPoints), IO (file IO classes that do not require Rendering support), Rendering (all actors, mappers, annotation and rendering classes), Hybrid (typically filters and sources that require support from Rendering, or from both Imaging and Graphics), Parallel (parallel visualization support classes), Patented (patented classes), Examples (documented examples), and Wrapping (support for the language wrappers). In many directories you will see a Testing subdirectory. The Testing subdirectories contain tests used to validate VTK's operation. Some tests may be useful as examples, but they are not well documented.<br />
<br />
* The build process for VTK now uses CMake (found at www.cmake.org). This replaces pcmaker on Windows and configure on UNIX. It resolves some longstanding problems and limitations we were having with pcmaker and configure, and unifies the build process in one place.<br />
<br />
=== Changes to VTK between 4.0 and 4.2 ===<br />
<br />
* Use of macros to support serialization, standardize the New method, and provide the Superclass typedef.<br />
<br />
* Subclassing of VTK classes in the python wrappers (virtual method hooks are not provided).<br />
<br />
* vtkImageWindow, vtkImager, vtkTkImageWindowWidget and their subclasses have been removed, to reduce duplicated code and enable interaction in image windows. People should now use vtkRenderer and vtkRenderWindow instead. vtkImageViewer still works as a turn-key image viewing class, although it now uses vtkRenderWindow and vtkRenderer internally instead of vtkImageWindow and vtkImager.<br />
<br />
* New class: vtkBandedPolyDataContourFilter. Creates solid colored bands (like you find on maps) of scalar value.<br />
<br />
* Event processing: Several new events to VTK were added (see vtkCommand.h). Also event processing can now be prioritized and aborted. This allows applications to manage who processes which events, and terminates the processing of a particular event if desired.<br />
<br />
* 3D Widgets: A new class vtkInteractorObserver was added to observe events on vtkRenderWindowInteractor. Using the new event processing infrastructure, multiple 3D widgets (subclasses of vtkInteractorObserver) can be used simultaneously to process interactions. Several new 3D widgets have been added including:<br />
** vtkLineWidget<br />
** vtkPlaneWidget<br />
** vtkImagePlaneWidget<br />
** vtkBoxWidget<br />
** vtkSphereWidget<br />
<br />
* Besides providing a representation, widgets also provide auxiliary functionality such as transforms, implicit functions, plane normals, sphere radius and center, etc.<br />
<br />
* New class: vtkInstantiator provides a means by which one can create an instance of a VTK class using only the name of the class as a string.<br />
<br />
* New class: vtkXMLParser provides a wrapper around the Expat XML parsing library. A new parser can be written by subclassing from vtkXMLParser and providing a few simple virtual method implementations.<br />
<br />
* The TIFF reader is now implemented using libtiff, which makes it capable of reading almost all available TIFF formats. libtiff is also available internally as vtktiff.<br />
<br />
* New method (all sub-classes of vtkObject): Added a virtual function called NewInstance to vtkTypeMacro. NewInstance creates and returns an object of the same type as the current one. It does not copy any properties. The returned pointer is of the same type as the pointer the method was invoked with. This method should replace all the MakeObject methods scattered through VTK.<br />
<br />
* vtkSetObjectMacro is deprecated for use inside VTK. It is still a valid construct in projects that use VTK. Inside VTK, use vtkCxxSetObjectMacro instead, which does the same thing.<br />
<br />
* vtkPLOT3DReader has been improved. It now supports:<br />
** multigrid (each block is one output)<br />
** ascii<br />
** fortran-style byte counts<br />
** little/big endian<br />
** i-blanking (partial)<br />
<br />
* A new vtkTextProperty class has been created, and duplicated text APIs have been obsoleted accordingly. Check the<br />
[[VTK_FAQ#Text_properties_in_VTK_4.2|Text properties in VTK 4.2]] FAQ entry for a full description of the change.<br />
<br />
=== How do I upgrade my existing C++ code from 3.2 to 4.x? ===<br />
<br />
This is (a corrected version of) an email that was posted to vtkusers.<br />
Please feel free to correct or add anything.<br />
<br />
{| cellspacing="3" <br />
|- valign="top"<br />
|width="55%" bgcolor="#f0f0ff" style="border:1px solid #ffc9c9;padding:1em;padding-top:0.5em;"|<br />
<br />
I've just ported my medium-sized (40K lines) application from vtk3.2 to<br />
vtk4.x. I thought I would share my experiences with you, in case there<br />
were people out there contemplating it but a bit scared.<br />
<br />
One source of information for upgrading code is:<br />
<br />
http://www.vtk.org/files/misc/Upgrading.zip<br />
<br />
I'm using VC++6 + MFC on Win2K and was unable/unwilling to run the<br />
script in the zip file.<br />
<br />
So,<br />
<br />
I switched all my include directories to the new VTK ones and<br />
recompiled. 337 errors, not unexpectedly. Most concerned vtkScalars and<br />
vtkTCoords which have both been removed. Where I was using single value<br />
scalars, like this:<br />
<br />
vtkScalars *scalars = vtkScalars::New();<br />
scalars->SetNumberOfScalars(N_POINTS);<br />
...<br />
polydata->GetPointData()->SetScalars(scalars);<br />
...<br />
scalars->SetScalar(i,2.3);<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkFloatArray *scalars = vtkFloatArray::New();<br />
scalars->SetNumberOfComponents(1);<br />
scalars->SetNumberOfTuples(N_POINTS);<br />
...<br />
polydata->GetPointData()->SetScalars(scalars);<br />
...<br />
scalars->SetTuple1(i,2.3);<br />
...<br />
<br />
OK so far, far fewer errors.<br />
<br />
Where I had 2D texture coordinates:<br />
<br />
vtkTCoords *tcoords = vtkTCoords::New();<br />
tcoords->SetNumberOfTCoords(N);<br />
...<br />
float p[3];<br />
p[0]=x; p[1]=y;<br />
tcoords->SetTCoord(i,p);<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkFloatArray *tcoords = vtkFloatArray::New();<br />
tcoords->SetNumberOfComponents(2);<br />
tcoords->SetNumberOfTuples(N);<br />
...<br />
float p[2];<br />
p[0]=x; p[1]=y;<br />
tcoords->SetTuple(i,p);<br />
....<br />
<br />
All well and good, still fewer errors. Make sure you call<br />
SetNumberOfComponents *before* SetNumberOfTuples else you'll get<br />
problems (I did!).<br />
<br />
Where I was creating 0-255 image data and had been using:<br />
<br />
vtkScalars* scalars = vtkScalars::New();<br />
scalars->SetDataTypeToUnsignedChar();<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkUnsignedCharArray *scalars = vtkUnsignedCharArray::New()<br />
...<br />
<br />
Going well!<br />
<br />
When creating RGB images, I had been using:<br />
<br />
vtkScalars *scalars = vtkScalars::New();<br />
scalars->SetDataTypeToUnsignedChar();<br />
scalars->SetNumberOfComponents(3);<br />
scalars->SetNumberOfScalars(X*Y);<br />
...<br />
scalars->SetActiveComponent(0);<br />
scalars->SetScalar(i,val1);<br />
scalars->SetActiveComponent(1);<br />
scalars->SetScalar(i,val2);<br />
scalars->SetActiveComponent(2);<br />
scalars->SetScalar(i,val3);<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkUnsignedCharArray *scalars = vtkUnsignedCharArray::New()<br />
scalars->SetNumberOfComponents(3);<br />
scalars->SetNumberOfTuples(X*Y);<br />
...<br />
scalars->SetComponent(i,0,val1);<br />
scalars->SetComponent(i,1,val2);<br />
scalars->SetComponent(i,2,val3);<br />
...<br />
<br />
My remaining errors concerned vtkWin32OffscreenRenderWindow that has<br />
been removed. Where I had been using:<br />
<br />
vtkWin32OffscreenRenderWindow *offscreen = vtkWin32OffscreenRenderWindow::New();<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkWin32OpenGLRenderWindow *offscreen = vtkWin32OpenGLRenderWindow::New();<br />
offscreen->SetOffScreenRendering(1);<br />
...<br />
<br />
All done. I'd had to throw in some #include "vtkFloatArray.h" and things<br />
like that of course. Zero compile errors.<br />
<br />
Had to remember to link against the new vtk lib files, so I removed<br />
<br />
vtkdll.lib <br />
<br />
and added<br />
<br />
vtkCommon.lib<br />
vtkGraphics.lib<br />
<br />
etc.<br />
<br />
Zero link errors. My program is up and running again, no apparent<br />
problems. Plus now I can use all the new features of vtk4. (And I'm sure<br />
it's faster but maybe that's my imagination.)<br />
<br />
All this took me about three hours.<br />
<br />
Bye!<br />
<br />
Tim.<br />
|}<br />
<br />
=== What is the release schedule for VTK? ===<br />
<br />
VTK has a formal release every eight to sixteen months. VTK 4.0 was cut in December 2001 and released in March 2002. VTK 4.2 was released in February 2003. VTK 4.4 (which was an interim release) was released at the end of 2003. VTK 5.0 was released in January 2006, 5.0.1 in July 2006, 5.0.2 in September 2006, 5.0.3 in March 2007, and 5.0.4 in January 2008.<br />
<br />
=== Roadmap: What changes are being considered for VTK? ===<br />
<br />
This is a list of changes that are being considered for inclusion into<br />
VTK. Some of these changes will happen while other changes we would like<br />
to see happen but may not due to funding or time issues. For each change<br />
we try to list what the change is, when we hope to complete it, if it is<br />
actively being developed. Detailed discussion on changes is limited to<br />
the vtk-developers mailing list.<br />
<br />
# Modify existing image filters to use the new vtkImageIterator etc. Most simple filters were modified to use the iterator in VTK 4.2. It would be nice to have some sort of efficient neighborhood iterators, but so far we haven't come up with any.<br />
# Rework the polydata and unstructured grid structures (vtkMesh ??). Related ideas include:<br />
#* Make UnstructuredGrid more compact by removing the cell point count from the vtkCellArray. This will reduce the storage required by each cell by 4 bytes.<br />
#* Make vtkPolyData an empty subclass of vtkUnstructuredGrid. There are a number of good reasons for this but it is a tricky task and backwards compatibility needs to be maintained.<br />
# More parallel support, including parallel compositing algorithms<br />
# Algorithms like LIC (http://www-courses.cs.uiuc.edu/~cs419/lic.pdf), maybe a couple terrain-decimation algorithms<br />
# Further integration of STL and other important C++ constructs (like templates)<br />
<br />
VTK 4.4 (intermediate release, end of 2003)<br />
<br />
* convert APIs to double (done)<br />
* remove old callbacks (done)<br />
* blanking<br />
* ref count observers (done)<br />
* switch collections to use iterators (done)<br />
* improve copyright (done)<br />
<br />
VTK 5.0 (major release, early 2005)<br />
<br />
* new pipeline mechanism (see [[Media:Pipeline.pdf|Pipeline.pdf]])<br />
* time support<br />
* true AMR support<br />
<br />
=== Changes to Interactors ===<br />
<br />
The Interactors have been updated to use the Command/Observer events of<br />
vtk. The vtkRenderWindowInteractor now has ivars for all the event<br />
information. There is a new class called<br />
vtkGenericRenderWindowInteractor that can be used to set up the bindings<br />
from other languages like Python, Java or Tcl.<br />
<br />
A new class vtkInteractorObserver was also added. It has a<br />
SetInteractor() method. It observes the keypress and delete events<br />
invoked by the render window interactor. The keypress activation value<br />
for a widget is now 'i' (although this can be programmed).<br />
vtkInteractorObserver has the state ivar Enabled. All subclasses must<br />
have the SetEnabled(int) method. Convenience methods like On(), Off(),<br />
EnabledOn(), and EnabledOff() are available. The state of the interactor<br />
observer is obtained using GetEnabled(). The SetEnabled(1) method adds<br />
observers to watch the interactor (appropriate to the particular<br />
interactor observer); SetEnabled(0) removes the observers. There are<br />
two new events, EnableEvent and DisableEvent, which are invoked by the<br />
SetEnabled() method.<br />
<br />
The events also support the idea of priority now. When you add an<br />
observer, you can specify a priority from 0 to 1. Higher values will be<br />
called back first. An observer can also tell the object not to call any<br />
more observers. This way you can handle an event, and stop further<br />
processing. In this way you can add handlers to InteractorStyles without<br />
sub-classing and from wrapped languages.<br />
<br />
For more information see: vtkGenericRenderWindowInteractor,<br />
vtkRenderWindowInteractor, vtkInteractorObserver.<br />
<br />
=== Header files and vtkSetObjectMacro ===<br />
<br />
On some platforms, such as MS Visual Studio .NET, the compiler cannot<br />
handle very large input files. Some VTK files, with all their includes,<br />
become big enough to overwhelm the compiler. The solution is to minimize<br />
the number of includes. This especially applies to header files, because<br />
their includes propagate to other files. Every class header file should include<br />
only the parent class header file. If there is no other alternative, you<br />
should put a comment next to the include explaining why the file has to<br />
be included.<br />
<br />
A related issue is with vtkSetObjectMacro. This macro calls some<br />
methods on an argument class, which implies that the argument class<br />
header file has to be included. The result is bloat in the header files.<br />
The solution is to use vtkCxxSetObjectMacro instead of vtkSetObjectMacro.<br />
The difference is that vtkCxxSetObjectMacro goes in the cxx file rather<br />
than in the header file.<br />
<br />
Example: Instead of<br />
<br />
#include "vtkBar.h"<br />
class vtkFoo : public vtkObject<br />
{ ...<br />
vtkSetObjectMacro(Bar, vtkBar);<br />
...<br />
};<br />
<br />
Do:<br />
<br />
class vtkBar;<br />
class vtkFoo : public vtkObject<br />
{<br />
...<br />
virtual void SetBar(vtkBar*);<br />
...<br />
};<br />
<br />
and add the following line to vtkFoo.cxx<br />
<br />
vtkCxxSetObjectMacro(vtkFoo,Bar,vtkBar);<br />
<br />
=== Text properties in VTK 4.2 ===<br />
<br />
A new<br />
[http://public.kitware.com/VTK/doc/nightly/html/classvtkTextProperty.html vtkTextProperty]<br />
class has been added to VTK 4.2.<br />
<br />
This class factorizes text attributes that used to be spread out and<br />
duplicated in many different classes (mostly 2D actors and text<br />
mappers). Among those attributes, font family, font size,<br />
bold/italic/shadow properties, horizontal and vertical justification,<br />
line spacing and offset have been retained, whereas new attributes like<br />
color and opacity have been introduced.<br />
<br />
We tried to make sure that you can use a vtkTextProperty to modify text<br />
properties in the same way a vtkProperty can be used to modify the<br />
surface properties of a geometric object. In that regard, you should be<br />
able to share a vtkTextProperty between different actors or assign the<br />
same vtkTextProperty to an actor that offers multiple vtkTextProperty<br />
attributes ([http://www.vtk.org/doc/nightly/html/classvtkXYPlotActor.html vtkXYPlot]<br />
for example).<br />
<br />
Here is a quick example:<br />
<br />
vtkTextActor *actor0 = vtkTextActor::New();<br />
actor0->GetTextProperty()->SetItalic(1);<br />
//<br />
vtkTextProperty *tprop = vtkTextProperty::New();<br />
tprop->SetBold(1);<br />
//<br />
vtkTextActor *actor1 = vtkTextActor::New();<br />
actor1->SetTextProperty(tprop);<br />
//<br />
vtkTextActor *actor2 = vtkTextActor::New();<br />
actor2->SetTextProperty(tprop);<br />
<br />
*Backward compatibility issues*:<br />
<br />
1) Color and Opacity:<br />
<br />
The text color and text opacity settings are now controlled by the<br />
vtkTextProperty Color and Opacity attributes instead of the<br />
corresponding actor's color and opacity attributes. In the following<br />
example, those settings were controlled by the attributes of the<br />
vtkProperty2D attached to the vtkActor2D (vtkTextActor). The<br />
vtkTextProperty attributes should be used instead:<br />
<br />
vtkTextActor *actor = vtkTextActor::New();<br />
actor->GetProperty()->SetColor(...);<br />
actor->GetProperty()->SetOpacity(...);<br />
<br />
becomes:<br />
<br />
actor->GetTextProperty()->SetColor(...);<br />
actor->GetTextProperty()->SetOpacity(...);<br />
<br />
To make migration easier for a while, we have set the vtkTextProperty<br />
default color to ''(-1.0, -1.0, -1.0)'' and the default opacity to ''-1.0''.<br />
These "magic" values are checked by the underlying text mappers at<br />
rendering time. If they are found, the color and opacity of the 2D<br />
actor's vtkProperty2D are used, just as it was in VTK 4.1.<br />
<br />
2) API (i.e. SetBold(), SetItalic(), etc)<br />
<br />
Most of the VTK classes involving text used to provide their own text<br />
attributes like Bold, Italic, Shadow, FontFamily. Thus, each of those<br />
classes would duplicate the vtkTextMapper API through methods like<br />
SetItalic(), SetBold(), SetFontFamily(), etc.<br />
<br />
Moreover, if one class had different text elements (say, for example,<br />
the title and the labels of a scalar bar), there was no way to modify<br />
the text properties of these elements separately.<br />
<br />
The vtkTextProperty class has been created to address both issues, by<br />
obsoleting those duplicated attributes and methods and providing a<br />
unified way to access text properties, and by allowing each class to<br />
associate different vtkTextProperty to different text elements.<br />
<br />
Migrating your code basically involves using the old API on your actor's<br />
vtkTextProperty instead of the actor itself. For example:<br />
<br />
actor->SetBold(1);<br />
<br />
becomes:<br />
<br />
actor->GetTextProperty()->SetBold(1);<br />
<br />
When a class provides different vtkTextProperty for different text<br />
elements, the TextProperty attribute is usually prefixed with that<br />
element type. Example: AxisTitleTextProperty, or AxisLabelTextProperty.<br />
This allows you to set a different appearance for each text element. If you<br />
want to use the same properties, you can either set the same values on<br />
each vtkTextProperty, or make both point to the same<br />
vtkTextProperty object. Example:<br />
<br />
actor->GetAxisLabelTextProperty()->SetBold(1);<br />
actor->GetAxisTitleTextProperty()->SetBold(1);<br />
<br />
or:<br />
<br />
vtkTextProperty *tprop = vtkTextProperty::New();<br />
tprop->SetBold(1);<br />
actor->SetAxisLabelTextProperty(tprop);<br />
actor->SetAxisTitleTextProperty(tprop);<br />
<br />
or:<br />
<br />
actor->SetAxisLabelTextProperty(actor->GetAxisTitleTextProperty());<br />
actor->GetAxisTitleTextProperty()->SetBold(1);<br />
<br />
The following list specifies the name of the text properties used in the<br />
VTK classes involving text.<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkTextMapper.html vtkTextMapper]:<br />
* you can still use the vtkTextMapper + vtkActor2D combination, but we would advise you to use a single vtkTextActor instead; this will give you maximum flexibility.<br />
* has 1 text prop: TextProperty, but although you have access to it, do not tweak it unless you are using vtkTextMapper with a vtkActor2D. In all other cases, use the text prop provided by the actor (see below).<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkTextActor.html vtkTextActor]:<br />
* has 1 text prop: TextProperty. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkLabeledDataMapper.html vtkLabeledDataMapper]:<br />
* has 1 text prop: LabelTextProperty. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkCaptionActor2D.html vtkCaptionActor2D]:<br />
* has 1 text prop: CaptionTextProperty. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkLegendBoxActor.html vtkLegendBoxActor]:<br />
* has 1 text prop: EntryTextProperty.<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkAxisActor2D.html vtkAxisActor2D],<br />
[http://www.vtk.org/doc/nightly/html/classvtkParallelCoordinatesActor.html vtkParallelCoordinatesActor], and<br />
[http://www.vtk.org/doc/nightly/html/classvtkScalarBarActor.html vtkScalarBarActor]:<br />
* have 2 text props: TitleTextProperty, LabelTextProperty.<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkXYPlotActor.html vtkXYPlotActor]:<br />
* has 3 text prop: TitleTextProperty (plot title), AxisTitleTextProperty, AxisLabelTextProperty (title and labels of all axes)<br />
* the legend box text prop (i.e. entry text prop) can be retrieved through actor->GetLegendBoxActor()->GetEntryTextProperty()<br />
* the X (or Y) axis text props (i.e. title and label text props) can be retrieved through actor->GetX/YAxisActor2D->GetTitle/LabelTextProperty(), and will override the corresponding AxisTitleTextProperty or AxisLabelTextProperty props as long as they remain untouched. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkCubeAxesActor2D.html vtkCubeAxesActor2D]:<br />
* has 2 text props: AxisTitleTextProperty, AxisLabelTextProperty (title and label of all axes)<br />
* the X (Y or Z) axis text props (i.e. title and label text props) can be retrieved through actor->GetX/Y/ZAxisActor2D->GetTitle/LabelTextProperty(), and will override the corresponding AxisTitleTextProperty or AxisLabelTextProperty props as long as they remain untouched.<br />
<br />
=== Forward declaration in VTK 4.x ===<br />
<br />
Since VTK 4.x, all classes have been carefully inspected to include only the necessary headers and to use 'forward declaration' for all other needed classes. Thus, when you compile a project using a filter that takes a dataset as input and you pass in an image data object, you need to explicitly include vtkImageData in your implementation file. This is true for all data types.<br />
<br />
For example, if you get this error:<br />
<br />
no matching function for call to `vtkContourFilter::SetInput(vtkImageData*)'<br />
VTK/Filtering/vtkDataSetToPolyDataFilter.h:44:<br />
candidates are: virtual void vtkDataSetToPolyDataFilter::SetInput(vtkDataSet*)<br />
<br />
This means you need to add to your code: #include "vtkImageData.h"<br />
<br />
=== Using Volume Rendering in VTK ===<br />
<br />
I recently updated my VTK CVS version, and my C++ code that used to work fine is now complaining about:<br />
<br />
undefined reference to `vtkUnstructuredGridAlgorithm::SetInput(vtkDataObject*)'<br />
undefined reference to `vtkUnstructuredGridAlgorithm::GetOutput()' <br />
<br />
There is now a new subdirectory and a new option to enable building the VolumeRendering library. You have to turn VTK_USE_VOLUMERENDERING to ON in order to use it. Also make sure that your executable links properly to this new library:<br />
<br />
ADD_EXECUTABLE(foo foo.cxx)<br />
TARGET_LINK_LIBRARIES(foo vtkVolumeRendering)<br />
<br />
=== API Changes in VTK 5.2 ===<br />
<br />
==== <tt>vtkProp::RenderTranslucentGeometry()</tt> is gone ====<br />
<br />
<tt>vtkProp::RenderTranslucentGeometry()</tt> is gone and has been broken down into 3 methods:<br />
* <tt>HasTranslucentPolygonalGeometry()</tt><br />
* <tt>RenderTranslucentPolygonalGeometry()</tt><br />
* <tt>RenderVolumetricGeometry()</tt><br />
<br />
Here is what to change in a vtkProp subclass:<br />
* If <tt>RenderTranslucentGeometry()</tt> was used to render translucent polygonal geometry only, override <tt>HasTranslucentPolygonalGeometry()</tt> and <tt>RenderTranslucentPolygonalGeometry()</tt>. <b>Just renaming <tt>RenderTranslucentGeometry()</tt> as <tt>RenderTranslucentPolygonalGeometry()</tt> is not enough!</b><br />
* If <tt>RenderTranslucentGeometry()</tt> was used to render translucent volumetric geometry only, override <tt>RenderVolumetricGeometry()</tt>. In this case, just renaming <tt>RenderTranslucentGeometry()</tt> as <tt>RenderVolumetricGeometry()</tt> is OK.<br />
* If <tt>RenderTranslucentGeometry()</tt> was used to render translucent polygonal geometry and translucent volumetric geometry, override all 3 methods.<br />
<br />
The reason for this change is that <tt>HasTranslucentPolygonalGeometry()</tt> is used to decide whether an expensive initialization of the new rendering algorithm for translucent polygonal geometry (depth peeling) is necessary. <tt>RenderTranslucentPolygonalGeometry()</tt> is called multiple times during the rendering of the translucent polygonal geometry of the scene. <tt>RenderVolumetricGeometry()</tt> is called in an additional pass, after depth peeling. For this reason, <b><tt>RenderTranslucentGeometry()</tt> could not just be marked as deprecated but had to be removed from the API</b>.<br />
<br />
<br />
<br />
==== <tt>vtkImagePlaneWidget</tt> has action names changed ====<br />
from:<br />
 enum<br />
 {<br />
   CURSOR_ACTION = 0,<br />
   SLICE_MOTION_ACTION = 1,<br />
   WINDOW_LEVEL_ACTION = 2<br />
 };<br />
to:<br />
 enum<br />
 {<br />
   VTK_CURSOR_ACTION = 0,<br />
   VTK_SLICE_MOTION_ACTION = 1,<br />
   VTK_WINDOW_LEVEL_ACTION = 2<br />
 };<br />
<br />
==== <tt>GetOutput()</tt> now returns <tt>vtkDataObject</tt> for some algorithms ====<br />
<br />
The following algorithms now work on <tt>vtkGraph</tt> as well as <tt>vtkDataSet</tt>, so <tt>GetOutput()</tt> no longer returns <tt>vtkDataSet</tt>. To obtain the dataset, use <tt>vtkDataSet::SafeDownCast(filter->GetOutput())</tt>.<br />
* <tt>vtkArrayCalculator</tt><br />
* <tt>vtkAssignAttribute</tt><br />
* <tt>vtkProgrammableFilter</tt><br />
<br />
=== API Changes in VTK 5.4 ===<br />
* empty right now.<br />
=== API Changes in VTK 5.5 ===<br />
<br />
* vtkStreamTracer<br />
Changed<br />
 enum Units<br />
 {<br />
   TIME_UNIT,<br />
   LENGTH_UNIT,<br />
   CELL_LENGTH_UNIT<br />
 }<br />
to<br />
 enum Units<br />
 {<br />
   LENGTH_UNIT = 1,<br />
   CELL_LENGTH_UNIT = 2<br />
 }<br />
<br />
<br />
Changed<br />
* OUT_OF_TIME = 4<br />
to<br />
* OUT_OF_LENGTH = 4<br />
in enum ''ReasonForTermination''<br />
<br />
<br />
Changed<br />
* LastUsedTimeStep<br />
to<br />
* LastUsedStepSize<br />
<br />
<br />
Changed<br />
* MaximumPropagation<br />
* MaximumIntegrationStep<br />
* MinimumIntegrationStep<br />
* InitialIntegrationStep <br />
from type ''IntervalInformation'' to type ''double''.<br />
<br />
<br />
Added a member variable to the class<br />
* int IntegrationStepUnit<br />
<br />
<br />
The following APIs were '''removed''' from the class:<br />
* void SetMaximumPropagation(int unit, double max)<br />
* void SetMaximumPropagationUnit(int unit)<br />
* int GetMaximumPropagationUnit()<br />
* void SetMaximumPropagationUnitToTimeUnit()<br />
* void SetMaximumPropagationUnitToLengthUnit()<br />
* void SetMaximumPropagationUnitToCellLengthUnit()<br />
* void SetMinimumIntegrationStep(int unit, double step)<br />
* void SetMinimumIntegrationStepUnit(int unit)<br />
* int GetMinimumIntegrationStepUnit()<br />
* void SetMinimumIntegrationStepUnitToTimeUnit()<br />
* void SetMinimumIntegrationStepUnitToLengthUnit()<br />
* void SetMinimumIntegrationStepUnitToCellLengthUnit()<br />
* void SetMaximumIntegrationStep(int unit, double step)<br />
* void SetMaximumIntegrationStepUnit(int unit)<br />
* int GetMaximumIntegrationStepUnit()<br />
* void SetMaximumIntegrationStepUnitToTimeUnit()<br />
* void SetMaximumIntegrationStepUnitToLengthUnit()<br />
* void SetMaximumIntegrationStepUnitToCellLengthUnit()<br />
* void SetInitialIntegrationStep(int unit, double step)<br />
* void SetInitialIntegrationStepUnit(int unit)<br />
* int GetInitialIntegrationStepUnit()<br />
* void SetInitialIntegrationStepUnitToTimeUnit()<br />
* void SetInitialIntegrationStepUnitToLengthUnit()<br />
* void SetInitialIntegrationStepUnitToCellLengthUnit()<br />
* void SetIntervalInformation(int unit, double interval, IntervalInformation& currentValues)<br />
* void SetIntervalInformation(int unit,IntervalInformation& currentValues)<br />
* void ConvertIntervals(double& step, double& minStep, double& maxStep, int direction, double cellLength, double speed)<br />
* static double ConvertToTime(IntervalInformation& interval, double cellLength, double speed)<br />
* static double ConvertToLength(IntervalInformation& interval, double cellLength, double speed)<br />
* static double ConvertToCellLength(IntervalInformation& interval, double cellLength, double speed)<br />
* static double ConvertToUnit(IntervalInformation& interval, int unit, double cellLength, double speed)<br />
<br />
<br />
The following APIs were added to the class:<br />
* int GetIntegrationStepUnit()<br />
* void SetIntegrationStepUnit(int unit)<br />
* void ConvertIntervals(double& step, double& minStep, double& maxStep, int direction, double cellLength)<br />
* static double ConvertToLength(double interval, int unit, double cellLength)<br />
* static double ConvertToLength(IntervalInformation& interval, double cellLength)<br />
<br />
<br />
* vtkInterpolatedVelocityField<br />
Added a new member variable and two associated functions:<br />
* bool NormalizeVector<br />
* vtkSetMacro(NormalizeVector, bool)<br />
* vtkGetMacro(NormalizeVector, bool)<br />
<br />
== OpenGL requirements ==<br />
<br />
=== Terminology ===<br />
<br />
* a software component using OpenGL (like VTK) <b>requires</b> some minimal version of OpenGL and some minimal set of OpenGL extensions at runtime. At compile time, it <b>requires</b> an OpenGL header file (<tt>gl.h</tt>) compatible with some minimal version of the OpenGL API.<br />
* an OpenGL implementation (software (like Mesa) or hardware (combination of a graphic card and its driver) ) <b>supports</b> some OpenGL versions and a set of extensions.<br />
<br />
=== How do I check which OpenGL versions or extensions are supported by my graphics card or OpenGL implementation? ===<br />
<br />
==== Linux/Unix ====<br />
<br />
Two ways:<br />
<br />
* General method<br />
<pre><br />
$ glxinfo<br />
</pre><br />
<br />
* vendor specific tool<br />
<br />
if you have an nVidia card and nvidia-settings installed on it, run it and go to the OpenGL/GLX Information item under the X Screen 0 item.<br />
<br />
==== Windows ====<br />
<br />
You can download and use GLview http://www.realtech-vr.com/glview<br />
<br />
==== Mac OS X ====<br />
<br />
With Xcode installed, Macintosh HD->Developer->Applications->Graphic Tools->OpenGL Driver Monitor.app->Monitors->Renderer Info-><name of the OpenGL driver>->OpenGL Extensions<br />
<br />
=== VTK 5.0 ===<br />
<br />
==== What is the minimal OpenGL version of the API required to compile VTK5.0? ====<br />
<br />
The <tt>gl.h</tt> file provided by your compiler/system/SDK has to define at least the OpenGL 1.1 API.<br />
<br />
(Note: the functions and macros defined in higher OpenGL API versions or in other OpenGL extensions are provided by <tt>glext.h</tt>, <tt>glxext.h</tt> and <tt>wglext.h</tt>. Those 3 files are official files taken from http://opengl.org/registry/ and already part of the VTK source tree).<br />
<br />
==== What is the minimal OpenGL version required by VTK5.0 at runtime? ====<br />
<br />
All the VTK classes using OpenGL require an OpenGL implementation (software or hardware) >=1.1 except for <tt>vtkVolumeTextureMapper3D</tt>.<br />
<br />
If you want to use <tt>vtkVolumeTextureMapper3D</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* extension <tt>GL_EXT_texture3D</tt> or OpenGL>=1.2<br />
and<br />
* extension <tt>GL_ARB_multitexture</tt> or OpenGL>=1.3<br />
and either:<br />
* extensions <tt>GL_ARB_fragment_program</tt> and <tt>GL_ARB_vertex_program</tt><br />
or:<br />
* extensions <tt>GL_NV_texture_shader2</tt> and <tt>GL_NV_register_combiners</tt> and <tt>GL_NV_register_combiners2</tt><br />
<br />
=== VTK 5.2 ===<br />
<br />
==== What is the minimal OpenGL version of the API required to compile VTK5.2? ====<br />
<br />
Same answer as for VTK 5.0.<br />
<br />
==== What is the minimal OpenGL version required by VTK5.2 at runtime? ====<br />
<br />
All the VTK classes using OpenGL require an OpenGL implementation (software or hardware) >=1.1 except for <tt>vtkVolumeTextureMapper3D</tt>, <tt>vtkHAVSVolumeMapper</tt>,<br />
<tt>vtkGLSLShaderProgram</tt>, depth peeling and some hardware offscreen rendering using framebuffer objects (FBO).<br />
<br />
If you want to use <tt>vtkVolumeTextureMapper3D</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* extension <tt>GL_EXT_texture3D</tt> or OpenGL>=1.2<br />
and<br />
* extension <tt>GL_ARB_multitexture</tt> or OpenGL>=1.3<br />
and either:<br />
* extensions <tt>GL_ARB_fragment_program</tt> and <tt>GL_ARB_vertex_program</tt><br />
or:<br />
* extensions <tt>GL_NV_texture_shader2</tt> and <tt>GL_NV_register_combiners</tt> and <tt>GL_NV_register_combiners2</tt><br />
<br />
If you want to use <tt>vtkHAVSVolumeMapper</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* OpenGL>=1.3<br />
* <tt>GL_ARB_draw_buffers</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_fragment_program</tt><br />
* <tt>GL_ARB_vertex_program</tt><br />
* <tt>GL_EXT_framebuffer_object</tt><br />
* either <tt>GL_ARB_texture_float</tt> or <tt>GL_ATI_texture_float</tt><br />
<br />
The following extension or OpenGL version is used by <tt>vtkHAVSVolumeMapper</tt> if provided (at runtime), but it is optional:<br />
* <tt>GL_ARB_vertex_buffer_object</tt> or OpenGL>=1.5<br />
<br />
If you want to use <tt>vtkGLSLShaderProgram</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* OpenGL>=1.3<br />
* <tt>GL_ARB_shading_language_100</tt> or OpenGL>=2.0,<br />
* <tt>GL_ARB_shader_objects</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_vertex_shader</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_fragment_shader</tt> or OpenGL>=2.0.<br />
<br />
Depth peeling ( see [[VTK/Depth_Peeling | VTK Depth Peeling]] for more information) requires (at runtime):<br />
* <tt>GL_ARB_depth_texture</tt> or OpenGL>=1.4<br />
* <tt>GL_ARB_shadow</tt> or OpenGL>=1.4<br />
* <tt>GL_EXT_shadow_funcs</tt> or OpenGL>=1.5<br />
* <tt>GL_ARB_vertex_shader</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_fragment_shader</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_shader_objects</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_occlusion_query</tt> or OpenGL>=1.5<br />
* <tt>GL_ARB_multitexture</tt> or OpenGL>=1.3<br />
* <tt>GL_ARB_texture_rectangle</tt><br />
* <tt>GL_SGIS_texture_edge_clamp</tt> or <tt>GL_EXT_texture_edge_clamp</tt> or OpenGL>=1.2<br />
<br />
Hardware-based offscreen rendering using framebuffer object (FBO) will be used as the default offscreen method if the following extensions or OpenGL version are available (at runtime):<br />
* <tt>GL_EXT_framebuffer_object</tt><br />
and either <br />
* <tt>GL_ARB_texture_non_power_of_two</tt> or OpenGL>=2.0<br />
or<br />
* <tt>GL_ARB_texture_rectangle</tt><br />
In addition, if the framebuffer needs a stencil buffer, the extension <tt>GL_EXT_packed_depth_stencil</tt> is required. Even if all those extensions are supported, the chosen FBO format might<br />
not be supported by the card; in this case, this method of offscreen rendering is not used.<br />
<br />
== Miscellaneous questions ==<br />
<br />
=== Can't you split up the data file? ===<br />
<br />
The data is now in one file that is about 15 megabytes. This is smaller<br />
than the original data files VTK used, and we hope that this size is no<br />
longer a problem. If it is, please let us know.<br />
<br />
=== Do you have any shared library tips? ===<br />
<br />
VTK version 4.0 and later supports both shared and static libraries on<br />
most platforms. For development we typically use shared libraries<br />
since they are faster to link when making small changes. You can control<br />
how VTK builds by setting the BUILD_SHARED_LIBS option in CMake.<br />
<br />
== Legal issues ==<br />
<br />
=== Is VTK FDA-Approved ? ===<br />
<br />
Given that VTK is a software toolkit, it cannot be the<br />
subject of FDA approval as a medical device. We have discussed<br />
this topic on several occasions and received advice from FDA<br />
representatives, which can be summarized as follows:<br />
<br />
<br />
VTK is to be considered as an off-the-shelf (OTS) product that<br />
is used for supporting a higher level medical application/product.<br />
The developer of such application/product will be responsible for<br />
performing the validation processes described in FDA published<br />
guidelines for the development of software-related medical devices.<br />
<br />
For more details see the page [[FDA Guidelines for Software Development]]<br />
<br />
=== What are the legal issues? ===<br />
<br />
The Visualization Toolkit software is provided under the following<br />
copyright. We think it's pretty reasonable. We do restrict the<br />
distribution of modified code. This is primarily a revision control<br />
issue. We don't want a bunch of renegade vtks running around without us<br />
having some idea why the changes were made and giving us a chance to<br />
incorporate them into the general release.<br />
<br />
The text of the VTK copyright is available [http://www.vtk.org/copyright.php here].<br />
<br />
=== What is the deal with the patents ===<br />
<br />
As the copyright mentions, there are some patents used in VTK. If you use<br />
any code in the Patented/ directory for a commercial application, you<br />
should contact the patent holder and obtain a license.<br />
<br />
As of VTK4.0 the following classes are known to use algorithms patented<br />
by General Electric Company: vtkDecimate, vtkMarchingCubes,<br />
vtkMarchingSquares, vtkDividingCubes, vtkSliceCubes and vtkSweptSurface.<br />
The GE contact is:<br />
<br />
Carl B. Horton<br />
Sr. Counsel, Intellectual Property<br />
3000 N. Grandview Blvd., W-710<br />
Waukesha, WI 53188<br />
Phone: (262) 513-4022<br />
E-Mail: mailto:Carl.Horton@med.ge.com<br />
<br />
As of VTK4.0 the following classes are known to use algorithms patented<br />
by Kitware, Inc.: vtkGridSynchronizedTemplates3D,<br />
vtkKitwareContourFilter, vtkSynchronizedTemplates2D, and<br />
vtkSynchronizedTemplates3D. The Kitware contact is:<br />
<br />
Ken Martin<br />
Kitware<br />
28 Corporate Drive, Suite 204,<br />
Clifton Park, NY 12065<br />
Phone:1-518-371-3971<br />
E-Mail: mailto:kitware@kitware.com<br />
<br />
=== Can VTK be used as part of a project distributed under a GPL License ? ===<br />
<br />
==== Short Answer ====<br />
<br />
Yes, it is fine to take VTK code and to include it in a project that is distributed under a GPL license.<br />
<br />
==== Long Answer ====<br />
<br />
===== Terms =====<br />
<br />
Let's call project X the larger project that:<br />
<br />
# Will include source code from VTK (in part or as a whole)<br />
# Will be distributed under GPL license<br />
<br />
Note in particular that:<br />
<br />
# The copyright notices in VTK files must be kept.<br />
# If VTK files are modified by the developers of project X, that fact must be clearly indicated.<br />
# Only the modifications of VTK files made by the developers of project X will be covered by a GPL license. The original VTK code remains covered by the VTK license.<br />
# The collection of copyrighted works (project X in this case), that includes VTK (in part or as a whole) and their software will be covered by a GPL license.<br />
<br />
===== Details =====<br />
<br />
As the [http://www.vtk.org/copyright.php VTK license] is a variation of the [http://www.opensource.org/licenses/bsd-license.php Modified BSD license], to which only the following term has been added:<br />
<br />
Modified source versions must be plainly marked as such, <br />
and must not be misrepresented as being the original software.<br />
<br />
and that the Modified BSD license is itself compatible with the GPL <br />
<br />
http://www.gnu.org/philosophy/license-list.html (Modified BSD license)<br />
<br />
the VTK license is also compatible with the GPL license, since the terms of the GPL do not preclude the additional term of the VTK license from being followed.<br />
<br />
NOTE: The licenses are only '''one way compatible'''.<br />
<br />
* You can use VTK code inside a GPL licensed project.<br />
* You '''can not''' use GPL licensed code inside VTK.<br />
<br />
That is the reason why there are no GPL third party libraries in VTK. Having GPL third party libraries in VTK would prevent closed source projects from being built against VTK.<br />
<br />
{{VTK/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK_Coding_Standards&diff=16475VTK Coding Standards2009-09-13T07:54:39Z<p>Dcthomp: Move some notes from the FAQ to the guidelines and add note on checking in STL header includes</p>
<hr />
<div>We only have a few coding standards but they have proved very useful.<br />
<br />
* We only put one public class per file. Some classes have helper classes that they use, but these are not accessible to the user.<br />
<br />
* Every class, macro, etc starts with either vtk or VTK, this avoids name clashes with other libraries. Classes should all start with vtk and macros or constants can start with either.<br />
<br />
* Class names and file names are the same. This makes it easier to find the correct file for a specific class.<br />
<br />
* We only use alphanumeric characters in names, [a-zA-Z0-9]. So names like Extract_Surface are not welcome. We use capitalization to indicate words within a name. For example, ExtractVectorTopology could be an instance variable. If it were a class it would be called vtkExtractVectorTopology. We capitalize the first letter of a name (excluding any preceding vtk). For local variables almost anything goes. Ideally we suggest using the same convention as instance variables, except starting their names with a lower case letter, e.g. extractVectorSurface.<br />
<br />
* We try to always spell out a name and not use abbreviations. This leads to longer names but it makes using the software easier because you know that the SetRasterFontRange method will always be called that, not SetRFRange or SetRFontRange or SetRFR. When the name includes a natural abbreviation such as OpenGL, we keep the abbreviation and capitalize the abbreviated letters.<br />
<br />
* We try to keep all instance variables protected. The user and application developer should access instance variables through Set/Get methods. To aid in this there are a number of macros defined in vtkSetGet.h that can be used. They expand into inline functions that Set/Get the instance variable and invoke a Modified() method if the value has changed.<br />
<br />
* Use "this" inside of methods even though C++ doesn't require you to. This really seems to make the code more readable because it disambiguates between instance variables and local or global variables. It also disambiguates between member functions and other functions.<br />
<br />
* Do not use default argument values for C++ method parameters. When the method is wrapped for Tcl, the method appears with however many parameters it has and is impossible to call without specifying the parameter values anyway. Rather, use method overloading to achieve the same effect, even in Tcl. With overloading, you can have the signature that has all the required parameters and signatures with extra parameters, and have both be callable from C++ or Tcl. The implementation of one signature should be in terms of the other signature, with the default values for the parameters encoded in the .cxx file method implementation.<br />
<br />
* Make sure your code compiles without any warnings with -Wall and -O2.<br />
<br />
* The indentation style can be characterized as the "indented brace" style. Indentations are two spaces, and the curly brace (scope delimiter) is placed on the following line and indented along with the code (i.e., the curly brace lines up with the code). Example:<br />
<br />
 if (this->Locator == locator)<br />
   {<br />
   return;<br />
   }<br />
 for (i = 0; i < this->Source->GetNumberOfPoints(); i++)<br />
   {<br />
   p1 = this->Source->GetPoint(i);<br />
   [...]<br />
   }<br />
<br />
* The header file of the class should include only the superclass's header file. If you do not, the header test run as part of the VTK dashboard will report an error. If any other includes are absolutely necessary, include a comment at each one describing why it is needed:<br />
<br />
 #include "vtkKWWindow.h"<br />
 #include "vtkClientServerID.h" // Needed for InteractorID<br />
 #include "vtkPVConfig.h" // Needed for PARAVIEW_USE_LOOKMARKS<br />
<br />
* Avoid using vtkSetObjectMacro since it will require including the header file of another class. Use the vtkCxxSetObjectMacro instead. For example:<br />
<br />
 // Class declaration:<br />
 // Description:<br />
 // Set/Get the array used to store the visibility flags.<br />
 virtual void SetVisibilityById(vtkUnsignedCharArray* vis);<br />
 <br />
 // Cxx file<br />
 vtkCxxSetObjectMacro(vtkStructuredVisibilityConstraint,<br />
                      VisibilityById,<br />
                      vtkUnsignedCharArray);<br />
<br />
* All subclasses of vtkObject should include a PrintSelf() method that prints all publicly accessible ivars. For example:<br />
<br />
 void vtkObject::PrintSelf(ostream& os, vtkIndent indent)<br />
 {<br />
   os << indent << "Debug: " << (this->Debug ? "On\n" : "Off\n");<br />
   os << indent << "Modified Time: " << this->GetMTime() << "\n";<br />
   this->Superclass::PrintSelf(os, indent);<br />
   os << indent << "Registered Events: ";<br />
   if ( this->SubjectHelper )<br />
     {<br />
     os << endl;<br />
     this->SubjectHelper->PrintSelf(os,indent.GetNextIndent());<br />
     }<br />
   else<br />
     {<br />
     os << "(none)\n";<br />
     }<br />
 }<br />
<br />
* All subclasses of vtkObject should include a type macro in their class declaration. For example:<br />
<br />
 class VTK_COMMON_EXPORT vtkBox : public vtkImplicitFunction<br />
 {<br />
 public:<br />
   vtkTypeRevisionMacro(vtkBox,vtkImplicitFunction);<br />
   void PrintSelf(ostream& os, vtkIndent indent);<br />
   ...<br />
 };<br />
<br />
* STL usage:<br />
** STL is for implementation, not interface. STL references should be contained in a .cxx file or in the private section of the .h header file.<br />
** Use the PIMPL idiom to forward reference/contain STL classes in heavily used superclasses. STL is big, fat, and slow to compile so we do not want to include STL headers in any .h files that are included by most of VTK, e.g. vtkObject.h vtkSource.h etc.<br />
** Include the VTK wrapper header files: vtkstd/map instead of map.<br />
** Use the vtkstd:: namespace to refer to STL classes and functions (except for iostreams as mentioned below)<br />
** For an example of STL usage, see the [[VTK FAQ#Can I use STL with VTK?|VTK FAQ]].<br />
** If you do need to include STL files within a VTK header, you must add a comment after it so that the automated repository commit checks will accept your changes:<br />
#include <vtksys/stl/vector> // STL Header <additional Comment here><br />
<br />
* When using anything declared in iostream, do not use std:: or vtkstd::. Examples include cerr, cout, ios... Do not include the iostream header. It is already included.<br />
<br />
* Do not use 'id' as a variable name in public headers as it is a reserved word in Objective-C++.<br />
<br />
= Common Pitfalls =<br />
<br />
== Set method in constructor ==<br />
<br />
<tt>Set</tt> methods defined with <tt>vtkSetMacro</tt> cannot be used in the default constructor to initialize ivars, because the first thing a <tt>Set</tt> method does is compare the current value of the ivar with the value of its argument. Since at this point (in the constructor) the ivar has not been initialized yet, the comparison reads an uninitialized value.<br />
<br />
valgrind will detect this kind of fault.<br />
<br />
For this reason, <b>in the constructor, the value of an ivar has to be initialized directly with the assignment operator</b> not through a <tt>Set</tt> method defined with <tt>vtkSetMacro</tt>.<br />
<br />
Example:<br />
<pre><br />
class vtkFoo : public vtkObject<br />
{<br />
public:<br />
  ...<br />
  vtkGetMacro(X,int);<br />
  vtkSetMacro(X,int);<br />
  ...<br />
protected:<br />
  vtkFoo();<br />
  int X;<br />
  ...<br />
};<br />
<br />
vtkFoo::vtkFoo()<br />
{<br />
  this->SetX(12); // wrong<br />
}<br />
<br />
vtkFoo::vtkFoo()<br />
{<br />
  this->X = 12; // good<br />
}<br />
</pre><br />
<br />
The issue looks pretty obvious in an isolated example like the one shown above. Sometimes it can also happen with an indirect call:<br />
<br />
<pre><br />
void vtkFoo::SetXToZero()<br />
{<br />
  this->SetX(0);<br />
}<br />
<br />
vtkFoo::vtkFoo()<br />
{<br />
  this->SetXToZero(); // wrong<br />
}<br />
</pre><br />
<br />
<br />
{{VTK/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK/FAQ&diff=15537VTK/FAQ2009-06-05T19:34:08Z<p>Dcthomp: /* General information and availability */ Add link to VTK Datasets page.</p>
<hr />
<div>== General information and availability ==<br />
<br />
=== What is the Visualization Toolkit? ===<br />
<br />
The '''Visualization ToolKit (vtk)''' is a software system for 3D Computer<br />
Graphics and Visualization.<br />
<br />
VTK includes a textbook published by Kitware Inc. ([http://www.kitware.com/products/vtktextbook.html The Visualization<br />
Toolkit, An Object-Oriented Approach to 3D Graphics]),<br />
a C++ class library, and Tcl, Python and Java implementations based on<br />
the class library.<br />
<br />
For more information, see http://www.vtk.org and http://www.kitware.com.<br />
<br />
=== What is the current release? ===<br />
<br />
The current release of VTK is 5.4.0 (released on 2009-03-26). This release is available for download from:<br />
<br />
http://www.vtk.org/VTK/resources/software.html<br />
<br />
Nightly development releases are available at:<br />
<br />
http://www.vtk.org/files/nightly<br />
<br />
=== Can I contribute code or bug fixes? ===<br />
<br />
We encourage people to contribute bug fixes as well as new contributions<br />
to the code. We will try to incorporate these into future releases so<br />
that the entire user community will benefit from them.<br />
<br />
See http://www.vtk.org/contribute.php for information on contributing to<br />
VTK.<br />
<br />
For some ideas take a look at some of the entries in the "Changes to the<br />
VTK API" FAQ section, for example: <br />
[[VTK_FAQ#Roadmap:_What_changes_are_being_considered_for_VTK|What changes are being considered for VTK]]<br />
<br />
We now have a bug tracker that allows keeping track of any bug you might find. See [http://www.vtk.org/Bug BugTracker].<br />
You'll need an email address to report a bug.<br />
To improve the chance of a bug being fixed, do not hesitate to add as many details as possible; demo sample code plus sample data is always a good idea.<br />
Providing a patch almost guarantees that it will be incorporated into VTK.<br />
<br />
=== Can I contribute money? ===<br />
<br />
Please don't send money. Not that we think you're going to send in<br />
unsolicited money. But if you were thinking about it, stop. It would<br />
just complicate our lives and make for all sorts of tax problems.<br />
<br />
(Note: if you are a company or funding institution, and would like to fund<br />
features or development, please contact Kitware http://www.kitware.com .)<br />
<br />
=== Is there a mailing list or Usenet newsgroup for VTK? ===<br />
<br />
There is a mailing list: vtkusers@vtk.org<br />
<br />
To subscribe or unsubscribe to the mailing list, go to:<br />
http://www.vtk.org/mailman/listinfo/vtkusers<br />
<br />
To search the list archives go to: http://www.kitware.com/search.html<br />
<br />
There is also a newsgroup that mirrors the mailing list. At this point it<br />
seems that the mirror is down. Mail to the mailing list used to be posted to the<br />
newsgroup, but posts on the newsgroup were not sent to the mailing list.<br />
The newsgroup was located at:<br />
news://scully.esat.kuleuven.ac.be/vtk.mailinglist<br />
<br />
http://www.gmane.org is a bidirectional mail-to-news gateway that carries the vtkusers mailing list. It is located here: news://news.gmane.org/gmane.comp.lib.vtk.user or here: http://news.gmane.org/gmane.comp.lib.vtk.user. vtkusers mails have been archived since April 2002 and they never expire. You can read and send mail to the vtkusers list, but sent mail will bounce back unless you have subscribed to the list first.<br />
<br />
=== Is the VTK mailing list archived anywhere? ===<br />
<br />
The mailing list is archived at:<br />
http://www.vtk.org/pipermail/vtkusers/<br />
<br />
You can search the archive at: http://www.kitware.com/search.html<br />
<br />
=== Are answers for the exercises in the VTK book available? ===<br />
<br />
Not anymore.<br />
<br />
The answers to the exercises of the textbook used to be maintained by<br />
Martin Stoufer (kudos), and will be made available by Kitware in the<br />
near future.<br />
<br />
=== Is VTK regression tested on a regular basis? Can I help? ===<br />
<br />
Yes, it is.<br />
<br />
You can view the current regression test results at:<br />
http://public.kitware.com/dashboard.php?name=vtk<br />
<br />
VTK uses Dart to perform builds, run tests, and generate dashboards. You<br />
can find more information about Dart at: http://public.kitware.com/Dart/<br />
<br />
You can help improve the quality of VTK by supplying the authors with<br />
Tcl scripts that can be used as or turned into regression tests. A good<br />
regression test will:<br />
<br />
# Cover code that is not already covered.<br />
# Illustrate a bug that is occurring now or that has occurred in the past.<br />
# Use data that is on the 2nd Edition book CDROM or use "small" data files or use no data at all.<br />
# Optionally, produce an interesting result. <br />
<br />
Currently almost all regression tests are written in Tcl.<br />
<br />
Please send your Tcl regression tests to:<br />
mailto:wlorens1@mail.nycap.rr.com<br />
<br />
Bill will evaluate them for applicability and integrate them into the<br />
nightly test process.<br />
<br />
=== What's the best way to learn VTK? ===<br />
<br />
There are five things you might want to try:<br />
<br />
# Purchase the book [http://www.kitware.com/products/vtktextbook.html The Visualization Toolkit] from Kitware Inc.<br />
# Purchase the book [http://www.kitware.com/products/vtkguide.html VTK Users Guide] from Kitware Inc. <br />
# [http://www.vtk.org/get-software.php Download the source code and/or binaries] (available on Windows) and work through the examples (there are 400-500 examples). <br />
# To learn the innards of VTK, you can attend a [http://www.kitware.com/products/proftrain.html#VTKCourse VTK course] or [http://www.kitware.com/products/proftrain.html sponsor a VTK course at your site] through Kitware. http://www.kitware.com/products/index.html<br />
# Buy Bill a beer and get him talking about VTK<br />
<br />
=== How should I ask questions on the mailing lists? ===<br />
<br />
The best online resource for this question is Eric S. Raymond's<br />
excellent guide on the topic titled "How to ask questions the smart<br />
way". Read it here:<br />
<br />
http://www.catb.org/~esr/faqs/smart-questions.html<br />
<br />
Please do read it and follow his advice. Thanks!<br />
<br />
Please also remember the following when you post your messages to the<br />
VTK mailing lists.<br />
<br />
* Mention the version of VTK you are using and the version of the compiler or scripting language you are using.<br />
<br />
* Mention your platform, OS and their versions.<br />
<br />
* Include hardware details if relevant.<br />
<br />
* Include all relevant error messages (appropriately trimmed of course).<br />
<br />
* The lists have a very large number of subscribers (in the thousands), so please keep messages to the point.<br />
<br />
* Avoid HTML emails.<br />
<br />
* Use a sensible and descriptive subject line.<br />
<br />
* Do NOT post large data files or images to the list. Instead put them in your web page and mention the URLs.<br />
<br />
* Quote the messages you reply to appropriately. Remove unnecessary details.<br />
<br />
When asking a question or reporting a problem try to include a small<br />
example program that demonstrates the problem. Make sure that this<br />
example program is as small as you can make it, simple (and uses VTK<br />
alone), complete and demonstrates the problem adequately. Doing this<br />
will go a *long way* towards getting a quick and meaningful response.<br />
<br />
Sometimes you might not get any acceptable response. This happens<br />
because the others think the question has either already been answered<br />
elsewhere (the archives, FAQ and google are your friends), or believe<br />
that you have not done enough homework to warrant their attention, or<br />
they don't know the answer or simply don't have the time to answer.<br />
Please do be patient and understanding. Most questions are answered by<br />
people volunteering their time to help you.<br />
<br />
Happy posting!<br />
<br />
=== How NOT to go about a programming assignment ===<br />
<br />
This is really a link you should read before posting to the mailing list. <br />
[This article is an attempt to show these irrational attitudes in an ironical way, <br />
intending to make our students aware of bad habits without admonishing them.]<br />
<br />
http://www.di.uniovi.es/~cernuda/noprog_ENG.html<br />
<br />
=== Accessing VTK CVS from behind a firewall ===<br />
<br />
Use the sourceforge project:<br />
<br />
http://cvsgrab.sourceforge.net/<br />
<br />
Just download the script and type something like:<br />
<br />
cvsgrab -rootUrl http://public.kitware.com/cgi-bin/cvsweb.cgi/ -packagePath VTK -destDir . <br />
-proxyUser xxx -proxyPassword xxx -proxyHost xxx -proxyPort xx<br />
<br />
(Thanks to Ingo H. de Boer)<br />
<br />
cvsgrab also supports the following option to access a particular branch:<br />
<br />
-tag <version tag> [optional] The version tag of the files to download<br />
<br />
For example to get the latest 4.4 branch:<br />
<br />
cvsgrab -rootUrl http://public.kitware.com/cgi-bin/cvsweb.cgi/ -packagePath VTK -destDir . <br />
-proxyUser xxx -proxyPassword xxx -proxyHost xxx -proxyPort xxx<br />
-tag release-4-4<br />
<br />
=== Where can I obtain test and sample datasets? ===<br />
<br />
See [[VTK Datasets|this page]] for details on downloading datasets that VTK can read.<br />
<br />
== Language bindings ==<br />
<br />
=== Are there bindings to languages other than Tcl? ===<br />
<br />
Aside from C++ (which it's written in) and Tcl, vtk is also bound into<br />
Java as of JDK 1.1 and Python 1.5, 1.6 and 2.X. All of the<br />
Tcl/Java/Python wrapper code is generated from some LEX and YACC code<br />
that parses our classes and extracts the required information to<br />
generate the wrapper code.<br />
<br />
=== What version of Tcl/Tk should I use with VTK? ===<br />
<br />
Currently we recommend that you use Tcl/Tk 8.2.3 with VTK. This is the<br />
best-supported version combination at this time.<br />
<br />
VTK has also been tested with Tcl/Tk 8.3.2 and works well.<br />
<br />
Tcl/Tk 8.3.4 has been tested to a limited extent but seems to have more<br />
memory leaks than Tcl/Tk 8.3.2.<br />
<br />
Tcl/Tk 8.4.x seems to work well with VTK too, but you might have to<br />
change a couple of configuration settings depending on the version of<br />
VTK you are using. See [[VTK_FAQ#Does_VTK_support_Tcl.2FTk_8.4_.3F|Does VTK support Tcl/Tk 8.4?]].<br />
<br />
=== Where can I find Python 2.x binaries? ===<br />
<br />
All of the Python binaries available on the kitware site are built for<br />
Python 1.5.2. This includes the official release VTK3.2 and the nightly<br />
builds (as of 2001-07-16).<br />
<br />
For Python 2.x binaries, you will have to compile your own from source.<br />
It is worth checking the mailing list archives for comments by others<br />
who have been through this process.<br />
<br />
There are some user-contributed binaries available at other sites. Check<br />
the mailing list archives for possible leads. Some win32 binaries for<br />
Python 2.1 are available at:<br />
<br />
http://basic.netmeg.net/godzilla/<br />
<br />
YMMV...<br />
<br />
=== Why do I get the Python error -- ValueError: method requires a VTK object? ===<br />
<br />
You just built VTK with Python support and everything went smoothly.<br />
After you install everything and try running a Python-VTK script you get<br />
a traceback with this error:<br />
<br />
ValueError: method requires a VTK object.<br />
<br />
This error occurs if you have two copies of the VTK libraries on your<br />
system. These copies need not be in your linker's path. The VTK libraries<br />
are usually built with an rpath flag (under *nix). This is necessary to<br />
be able to test the build in place. When you install VTK into another<br />
directory in your linker's path and then run a Python script, the Python<br />
modules remember the old path and load the libraries in the build<br />
directory as well. This triggers the above error since the object you<br />
passed the method was instantiated from the other copy.<br />
<br />
So how do you fix it? The easiest solution is to simply delete the copy<br />
of the libraries inside your build directory or move the build directory<br />
to another place. For example, if you build the libraries in VTK/bin<br />
then move VTK/bin to VTK/bin1 or remove all the VTK/bin/*.so files. The<br />
error should no longer occur.<br />
<br />
Another way to fix the error is to turn the CMAKE_SKIP_RPATH boolean to<br />
ON in your CMakeCache.txt file and then rebuild VTK. You shouldn't have<br />
to rebuild all of VTK, just delete the libraries (*.so files) and then<br />
re-run cmake and make. The only trouble with this approach is that you<br />
cannot have BUILD_TESTING to ON when you do this.<br />
<br />
Alternatively, starting with recent VTK CVS versions (post Dec. 6, 2002)<br />
and with VTK versions greater than 4.1 (i.e. 4.2 and beyond) there is a<br />
special VTK-Python interpreter built as part of VTK called 'vtkpython'<br />
that should eliminate this problem. Simply use vtkpython in place of the<br />
usual python interpreter when you use VTK-Python scripts and the problem<br />
should not occur. This is because vtkpython uses the libraries inside<br />
the build directory.<br />
<br />
2002 by Prabhu Ramachandran<br />
<br />
=== Does VTK support Tcl/Tk 8.4 ? ===<br />
<br />
Short answer: yes, but it might require some adjustments, depending on<br />
the VTK and CMake versions you are using.<br />
<br />
# The VTK 4.x CVS nightly/development distribution supports Tcl/Tk 8.4 as long as you use a release version of CMake > 1.4.5. Since VTK 4.2 will require CMake 1.6, the next release version will support Tcl/Tk 8.4.<br />
# The VTK 4.0 release distribution does not support Tcl/Tk 8.4 out-of-the-box.<br />
<br />
In either case, the following solutions will address the problem. This<br />
basically involves setting two definition symbols that make Tcl/Tk<br />
8.4 backward compatible with previous versions of Tcl/Tk (i.e., discarding<br />
the "const correctness" and Tk_PhotoPutBlock compositing rule features):<br />
<br />
a) Edit your C/C++ flags:<br />
<br />
Run your favorite CMake cache editor (i.e. CMakeSetup, or ccmake),<br />
display the advanced values and add the USE_NON_CONST and<br />
USE_COMPOSITELESS_PHOTO_PUT_BLOCK definition symbols to the end of any<br />
of the following CMake variables (if they exist): CMAKE_CXX_FLAGS,<br />
CMAKE_C_FLAGS.<br />
<br />
Example: On Unix your CMAKE_CXX_FLAGS will probably look like:<br />
<br />
-g -O2 -DUSE_NON_CONST -DUSE_COMPOSITELESS_PHOTO_PUT_BLOCK<br />
<br />
On Windows (Microsoft MSDev nmake mode):<br />
<br />
/W3 /Zm1000 /GX /GR /YX /DUSE_NON_CONST /DUSE_COMPOSITELESS_PHOTO_PUT_BLOCK<br />
<br />
b) or a more intrusive solution:<br />
<br />
Edit the top-level VTK/CMakeLists.txt file and add the following lines '''at the<br />
top''' of this file:<br />
<br />
ADD_DEFINITIONS(<br />
-DUSE_NON_CONST<br />
-DUSE_COMPOSITELESS_PHOTO_PUT_BLOCK<br />
)<br />
<br />
<br />
=== When I try to run my program with Java-wrapped VTK, why do I get "java.lang.NoClassDefFoundError: vtk/vtkSomeClassName"? ===<br />
The file '''vtk.jar''' is not in your CLASSPATH in your execution environment.<br />
<br />
<br />
=== When I try to run my program with Java-wrapped VTK, why do I get "java.lang.UnsatisfiedLinkError: no vtkSomeLibraryName"? ===<br />
Some or all of the library (e.g., dll) files cannot be found. Make sure the files exist and that the PATH environment variable of your execution environment points to them.<br />
<br />
<br />
=== When I try to run my program with Java-wrapped VTK, why do I get Exception in thread "main" java.lang.UnsatisfiedLinkError: GetOutput_2 at vtk.vtkPolyDataAlgorithm.GetOutput_2(Native Method) ? ===<br />
<br />
== Using VTK ==<br />
<br />
=== The C++ compiler cannot convert some pointer type to another pointer type in my little program ===<br />
<br />
For instance, the C++ compiler cannot convert a <b><tt>vtkDataSet *</tt></b> type to a <b><tt>vtkImageData *</tt></b> type.<br />
<br />
It means the compiler does not know the relationship between a <b><tt>vtkDataSet</tt></b> and a <b><tt>vtkImageData</tt></b>. This relationship is actually inheritance: <b><tt>vtkImageData</tt></b> is a subclass of <b><tt>vtkDataSet</tt></b>. The only way for the compiler to know this relationship is to include the header file of the subclass, that is:<br />
<br />
#include "vtkImageData.h"<br />
<br />
If you wonder why the compiler did not complain about an unknown type, it is because somewhere (probably in a filter header file) there is a forward class declaration, like:<br />
<br />
class vtkImageData;<br />
<br />
=== Accessing a pointer in Python ===<br />
<br />
If you use your own C++ code with Python and need to pass some VTK data to it, there are two approaches to wrapping your code:<br />
# first, you can use the VTK wrapper (already used for the wrapping of VTK code)<br />
# you can use SWIG, which results in a light-weight module.<br />
<br />
In the second case, you will need to convert some VTK data, say a vtkPolyData, to a void pointer (no, it is not sufficient to just pass the object). For that, you can use the __this__ member variable in Python for the VTK data - see mailing archives:<br />
<br />
* [http://public.kitware.com/pipermail/vtkusers/2003-October/070054.html vtk, Python and SWIG - 'state of the union']<br />
<br />
=== What object/filter should I use to do ??? ===<br />
<br />
Frequently when starting out with a large visualization system people<br />
are not sure what object to use to achieve a desired effect.<br />
<br />
The most up-to-date information can be found in the VTK User's Guide<br />
(http://www.kitware.com/products/vtkguide.html).<br />
<br />
Alternative sources for information are the appendix of the book which<br />
has nice one line descriptions of what the different objects do and the<br />
VTK man pages (http://www.vtk.org/doc/nightly/html/classes.html).<br />
<br />
Additionally, the VTK man pages feature a "Related" section that provides<br />
links from each class to all the examples or tests using that class<br />
(http://www.vtk.org/doc/nightly/html/pages.html). This information is<br />
also provided in each class man page under the "Tests" or "Examples"<br />
sub-section.<br />
<br />
Some useful books are listed at http://www.vtk.org/buy-books.php<br />
<br />
=== What 3D file formats can VTK import and export? ===<br />
<br />
The following table identifies the file formats that VTK can read and<br />
write. Importer and Exporter classes move full scene information into or<br />
out of VTK. Reader and Writer classes move just geometry.<br />
<br />
{| border="1" cellpadding="2" cellspacing="0"<br />
|- bgcolor="#abcdef"<br />
! File Format !! Read !! Write<br />
|-<br />
| 3D Studio || vtk3DSImporter || <br />
|-<br />
| AVS "UCD" format || vtkAVSucdReader || <br />
|-<br />
| Movie BYU || vtkBYUReader || vtkBYUWriter<br />
|-<br />
| Renderman || || vtkRIBExporter<br />
|-<br />
| Open Inventor 2.0 || || vtkIVExporter/vtkIVWriter<br />
|-<br />
| CAD STL || vtkSTLReader || vtkSTLWriter<br />
|-<br />
| Fluent GAMBIT ASCII || vtkGAMBITReader || <br />
|-<br />
| Unigraphics Facet Files || vtkUGFacetReader || <br />
|-<br />
| Marching Cubes || vtkMCubesReader || vtkMCubesWriter<br />
|-<br />
| Wavefront OBJ || || vtkOBJExporter<br />
|-<br />
| VRML 2.0 || || vtkVRMLExporter<br />
|-<br />
| VTK Structured Grid &dagger; || vtkStructuredGridReader || vtkStructuredGridWriter<br />
|-<br />
| VTK Poly Data &dagger; || vtkPolyDataReader || vtkPolyDataWriter<br />
|-<br />
| PLOT3D || vtkPLOT3DReader || <br />
|-<br />
| CGM || || vtkCGMWriter<br />
|-<br />
| OBJ || vtkOBJReader || <br />
|-<br />
| Particle || vtkParticleReader || <br />
|-<br />
| PDB || vtkPDBReader || <br />
|-<br />
| PLY || vtkPLYReader || vtkPLYWriter<br />
|-<br />
| Gaussian || vtkGaussianCubeReader || <br />
|-<br />
| Facet || vtkFacetReader || vtkFacetWriter<br />
|-<br />
| XYZ || vtkXYZMolReader || <br />
|-<br />
| EnSight &Dagger; || vtkGenericEnSightReader || <br />
|}<br />
<br />
&dagger; See the books [http://www.kitware.com/products/vtktextbook.html The<br />
Visualization Toolkit, An Object-Oriented Approach to 3D Graphics] or<br />
[http://www.kitware.com/products/vtkguide.html the User's Guide] for details<br />
about structured grid and poly data file formats.<br />
<br />
&Dagger; The class vtkGenericEnSightReader allows the user to read an EnSight<br />
data set without a priori knowledge of what type of EnSight data set it<br />
is (among vtkEnSight6BinaryReader, vtkEnSight6Reader,<br />
vtkEnSightGoldBinaryReader, vtkEnSightGoldReader,<br />
vtkEnSightMasterServerReader, vtkEnSightReader).<br />
<br />
For any other file format you may want to search for a converter to a<br />
known VTK file format, more info on:<br />
http://www.tech-edv.co.at/lunix/UTILlinks.html<br />
<br />
=== Why can't I find vtktcl (vtktcl.c)? ===<br />
<br />
In versions of VTK prior to 4.0 VTK Tcl scripts would require a:<br />
<br />
catch {load vtktcl} <br />
<br />
so that they could be executed directly from wish. In VTK 4.0 the<br />
correct mechanism is to use:<br />
<br />
package require vtk<br />
<br />
For people using versions earlier than 4.0, vtktcl is a shared library<br />
that is built only on the PC. Most examples used the "catch" notation so<br />
that they will work on UNIX and on the PC. On UNIX you must use the vtk<br />
executable/shell which should be in vtk/tcl/vtk.<br />
<br />
=== Why does this filter not produce any output? eg. GetPoints()==0 ===<br />
<br />
This is a very common question for VTK users. VTK uses a demand-driven pipeline mechanism for rendering, which has multiple benefits, including the fact that filters that aren't needed don't get executed. This means that a call such as x->GetOutput()->GetPoints() will return 0 if the filter has not yet been executed. Just call x->Update() beforehand to make the pipeline update everything up to that point and it should work. -timh<br />
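The demand-driven behavior can be illustrated without VTK by a toy class (ToyFilter is a made-up name for illustration, not a VTK class) whose output stays empty until update() is called, just as GetPoints() returns 0 before Update():

```python
class ToyFilter:
    """Mimics VTK's demand-driven pipeline: output is empty until update()."""

    def __init__(self, source):
        self.source = source
        self.points = None  # analogous to GetOutput()->GetPoints() == 0

    def update(self):
        # Only now is the (potentially expensive) work performed.
        self.points = [(x, x * x) for x in self.source]

f = ToyFilter([1, 2, 3])
print(f.points)   # None -- nothing has been computed yet
f.update()        # equivalent to x->Update() in VTK
print(f.points)   # [(1, 1), (2, 4), (3, 9)]
```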
<br />
=== Problems with vtkDecimate and vtkDecimatePro ===<br />
<br />
''vtkDecimate'' and ''vtkDecimatePro'' have been tested fairly heavily so<br />
all known bugs have been removed. However, there are three situations<br />
where you can encounter weird behavior:<br />
<br />
# The mesh is not all triangles. Solution: use ''vtkTriangleFilter'' to triangulate polygons.<br />
# The mesh consists of independent triangles (i.e., not joined at vertices - no decimation occurs). Solution: use ''vtkCleanPolyData'' to link triangles.<br />
# Bad triangles are present: e.g., triangles with duplicate vertices such as (1,2,1) or (100,100,112), or (57,57,57), and so on. Solution: use ''vtkCleanPolyData''.<br />
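The bad-triangle situation in point 3 can be recognized with a few lines of plain Python (a sketch of the idea only, not how vtkCleanPolyData is actually implemented):

```python
def is_degenerate(tri):
    """A triangle is degenerate if any two of its vertex indices coincide."""
    a, b, c = tri
    return a == b or b == c or a == c

# The bad examples from the list above, plus one valid triangle.
triangles = [(1, 2, 3), (1, 2, 1), (100, 100, 112), (57, 57, 57)]
clean = [t for t in triangles if not is_degenerate(t)]
print(clean)  # [(1, 2, 3)] -- only the valid triangle survives
```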
<br />
=== How can I read DICOM files ? ===<br />
<br />
Starting with VTK 4.4, you can use the [http://www.vtk.org/doc/nightly/html/classvtkDICOMImageReader.html vtkDICOMImageReader class] to read DICOM files. Note however that DICOM is a huge protocol, and vtkDICOMImageReader is not able to read every DICOM file out there. If it does not meet your needs, we suggest you look for an existing converter before coding your own. Some of them are listed in the [http://www.dclunie.com/medical-image-faq/html/part8.html The Medical Image Format FAQ (Part 8)].<br />
<br />
==== GDCM ====<br />
<br />
For a more elaborate DICOM library that supports more image formats, you might try [http://gdcm.sourceforge.net GDCM].<br />
Specifically: [http://gdcm.sourceforge.net/html/classvtkGDCMImageReader.html vtkGDCMImageReader] & [http://gdcm.sourceforge.net/html/classvtkGDCMImageWriter.html vtkGDCMImageWriter]<br />
<br />
Grassroots DiCoM is a C++ library for DICOM medical files. It is automatically wrapped to Python/C#/Java (using SWIG). It supports RAW, JPEG (lossy/lossless), J2K, JPEG-LS, RLE and deflated. It also comes with DICOM Parts 3, 6 & 7 of the standard as XML files.<br />
<br />
If GDCM is too complex to integrate into your environment, you can also consider simply using the command-line converter [http://apps.sourceforge.net/mediawiki/gdcm/index.php?title=Gdcmconv gdcmconv] to convert an unsupported DICOM file into something that vtkDICOMImageReader can support. Typically you would want:<br />
<br />
gdcmconv --raw compressed_input.dcm uncompressed_output.dcm<br />
<br />
==== dicom2 ====<br />
<br />
Sebastien BARRE wrote a free DICOM converter, named dicom2, that can be<br />
used to convert medical images to raw format. This tool is a command<br />
line program and does not provide any GUI at the moment.<br />
http://dicom2.barre.nom.fr/<br />
<br />
There is a special section dedicated to the VTK:<br />
http://dicom2.barre.nom.fr/how-to.html, then "Convert to raw (vtk)"<br />
<br />
The following page also provides links to several other DICOM converters:<br />
http://www.barre.nom.fr/medical/samples/index.html#links<br />
<br />
==== vtkVolume16Reader ====<br />
<br />
When searching the vtkusers mailing list you will find a lot of posts still using vtkVolume16Reader to read DICOM files. It will work in the following cases:<br />
* You know the dimension (cols & rows) of your image<br />
* You know the spacing of your image<br />
* You know the pixel type (pixel type & #components) of your image<br />
* You know Pixel Data (7fe0,0010) is the last element in the image<br />
* You know Pixel Data (7fe0,0010) was sent in uncompressed format (not encapsulated)<br />
<br />
These requirements are a stronger set than those of vtkDICOMImageReader, therefore using vtkDICOMImageReader instead is encouraged.<br />
<br />
==== The spacing in my DICOM files is wrong ====<br />
<br />
Image Position (Patient) (0020,0032) is the only attribute that can be relied on to determine the "reconstruction interval" or "space between the center of slices".<br />
<br />
If the distance between Image Position (Patient) (0020,0032) of two parallel slices, measured along the normal to Image Orientation (Patient) (0020,0037), is not the same as whatever happens to be in the DICOM Spacing Between Slices (0018,0088) attribute, then (0018,0088) is incorrect, without question.<br />
<br />
This is a known bug in some scanners.<br />
<br />
When Slice Thickness (0018,0050) + Spacing Between Slices (0018,0088) equals the computed reconstruction interval, then chances are the modality implementor has made the obvious mistake of misinterpreting the definition of<br />
(0018,0088) to mean the distance between edges (gap) rather than the distance between centers.<br />
<br />
Further, one should never use Slice Location (0020,1041) either, an optional and purely annotative attribute, though chances are that the distance between the Slice Location (0020,1041) values of two slices will match the distance along the<br />
normal to the orientation derived from the position.<br />
<br />
The GDCM library simply discards any information present in the (0018,0088) tag and instead recomputes the spacing as the distance between two consecutive slices (along the normal).<br />
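The recomputation GDCM performs can be sketched in plain Python; the slice positions and orientation below are made-up values for a hypothetical axial series:

```python
def cross(u, v):
    # Cross product of two 3-vectors.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def z_spacing(ipp1, ipp2, row_dir, col_dir):
    """Distance between two slice origins, projected onto the slice normal.

    ipp1/ipp2 come from Image Position (Patient) (0020,0032); row_dir and
    col_dir are the two unit direction cosines from Image Orientation
    (Patient) (0020,0037), whose cross product is the slice normal.
    """
    normal = cross(row_dir, col_dir)
    delta = tuple(b - a for a, b in zip(ipp1, ipp2))
    return abs(dot(normal, delta))

# Axial slices: rows along +x, columns along +y, so the normal is +z.
s = z_spacing((0.0, 0.0, 10.0), (0.0, 0.0, 12.5),
              (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(s)  # 2.5
```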
<br />
GDCM 1.x:<br />
typedef std::vector<gdcm::File *> FileList;<br />
FileList l;<br />
gdcm::SerieHelper sh;<br />
sh.OrderFileList(l); // calls ImagePositionPatientOrdering()<br />
zspacing = sh.GetZSpacing();<br />
<br />
GDCM 2.x:<br />
IPPSorter ipp;<br />
ipp.Sort( filenames );<br />
zspacing = ipp.GetZSpacing();<br />
<br />
=== How to handle large data sets in VTK ===<br />
<br />
One of the challenges in VTK is to efficiently handle large datasets. By<br />
default VTK is tuned towards smaller datasets. For large datasets there<br />
are a couple of changes you can make that should yield a much smaller<br />
memory footprint (less swapping) and also improve rendering performance.<br />
The solution is to:<br />
<br />
# Use ReleaseDataFlag<br />
# Turn on ImmediateModeRendering<br />
# Use triangle strips via vtkStripper<br />
# Use a different filter or mapper<br />
<br />
Each of these will be discussed below.<br />
<br />
==== Using ReleaseDataFlag ====<br />
<br />
By default VTK keeps a copy of all intermediate results between filters<br />
in a pipeline. For a pipeline with five filters this can result in<br />
having six copies of the data in memory at once. This can be controlled<br />
using ReleaseDataFlag and GlobalReleaseDataFlag. If ReleaseDataFlag is<br />
set to one on a data object, then once a filter has finished using that<br />
data object, it will release its memory. Likewise, if<br />
GlobalReleaseDataFlag is set on ANY data object, all data objects will<br />
release their memory once their dependent filter has finished executing.<br />
For example in Tcl and C++<br />
<br />
# Tcl<br />
vtkPolyDataReader reader<br />
[reader GetOutput] ReleaseDataFlagOn<br />
<br />
// C++<br />
vtkPolyDataReader *reader = vtkPolyDataReader::New();<br />
reader->GetOutput()->ReleaseDataFlagOn();<br />
<br />
or<br />
<br />
// C++<br />
vtkPolyDataReader *reader = vtkPolyDataReader::New();<br />
reader->GetOutput()->GlobalReleaseDataFlagOn();<br />
<br />
While turning on the ReleaseDataFlag will reduce your memory footprint,<br />
the disadvantage is that none of the intermediate results are kept in<br />
memory. So if you interactively change a parameter of a filter (such as<br />
the isosurface value), all the filters will have to re-execute to<br />
produce the new result. When the intermediate results are stored in<br />
memory, only the downstream filters would have to re-execute.<br />
<br />
Here is one hint for good interactive performance: if only one stage of the<br />
pipeline can have its parameters changed interactively (such as the<br />
target reduction in a decimation filter), only retain the data just<br />
prior to that step (which is the default) and turn ReleaseDataFlag on<br />
for all other steps.<br />
<br />
==== Use ImmediateModeRendering ====<br />
<br />
By default, VTK uses OpenGL display lists which results in another copy<br />
of the data being stored in memory. For most large datasets you will be<br />
better off saving memory by not using display lists. You can turn off<br />
display lists by turning on ImmediateModeRendering. This can be<br />
controlled on a mapper by mapper basis using ImmediateModeRendering, or<br />
globally for all mappers in a process by using<br />
GlobalImmediateModeRendering. For example:<br />
<br />
# Tcl<br />
vtkPolyDataMapper mapper<br />
mapper ImmediateModeRenderingOn<br />
<br />
// C++<br />
vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();<br />
mapper->ImmediateModeRenderingOn();<br />
<br />
or<br />
<br />
// C++<br />
vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();<br />
mapper->GlobalImmediateModeRenderingOn();<br />
<br />
The disadvantage to using ImmediateModeRendering is that if memory is<br />
not a problem, your rendering rates will typically be slower with<br />
ImmediateModeRendering turned on.<br />
<br />
==== Use triangle strips via vtkStripper. ====<br />
<br />
Most filters in VTK produce independent triangles or polygons which are<br />
not the most compact or efficient to render. To create triangle strips<br />
from polydata you can first use vtkTriangleFilter to convert any<br />
polygons to triangles (not required if you only have triangles to start<br />
with) then run it through a vtkStripper to convert the triangles into<br />
triangle strips. For example in C++<br />
<br />
vtkPolyDataReader *reader = vtkPolyDataReader::New();<br />
reader->SetFileName("yourdatafile.vtk");<br />
reader->GetOutput()->ReleaseDataFlagOn();<br />
<br />
vtkTriangleFilter *tris = vtkTriangleFilter::New();<br />
tris->SetInput(reader->GetOutput());<br />
tris->GetOutput()->ReleaseDataFlagOn();<br />
<br />
vtkStripper *strip = vtkStripper::New();<br />
strip->SetInput(tris->GetOutput());<br />
strip->GetOutput()->ReleaseDataFlagOn();<br />
<br />
vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();<br />
mapper->ImmediateModeRenderingOn();<br />
mapper->SetInput(strip->GetOutput());<br />
<br />
The only disadvantage to using triangle strips is that they require time<br />
to compute, so if your data is changing every time you render, it could<br />
actually be slower.<br />
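The saving is easy to quantify: n independent triangles need 3n vertex indices, while a single strip covering n triangles needs only n + 2 (real meshes split into several strips, so the actual saving is smaller). A quick back-of-the-envelope sketch:

```python
def indices_independent(n_triangles):
    # Independent triangles: three vertex indices per triangle.
    return 3 * n_triangles

def indices_strip(n_triangles):
    # One ideal strip: two starting indices, then one per extra triangle.
    return n_triangles + 2

n = 1_000_000
print(indices_independent(n))  # 3000000
print(indices_strip(n))        # 1000002
```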
<br />
==== Use a different filter or mapper ====<br />
<br />
This is a tough issue. In VTK there are typically a couple of ways to<br />
solve any problem. For example an image can be rendered as a polygon for<br />
each pixel, or it can be rendered as a single polygon with a texture map<br />
on it. For almost all cases the second approach will be much faster than<br />
the first, even though VTK supports both. There isn't a single good<br />
answer for how to find the best approach. If you suspect that it is<br />
running more slowly than it should, try posting to the mailing list or<br />
looking for other ways to achieve the same result.<br />
<br />
=== VTK is slow, what is wrong? ===<br />
<br />
We have heard people say that VTK is really slow. In many of these<br />
cases, changing a few parameters can make a huge difference in performance.<br />
<br />
If you find that VTK is slower than other visualization systems running<br />
the same problem first take a look at the FAQ section dealing with large<br />
data: [[VTK_FAQ#How_to_handle_large_data_sets_in_VTK|How to handle large data sets in VTK]]. Many of its suggestions<br />
will improve VTK's performance significantly for many datasets.<br />
<br />
If you still find VTK slow, please let us know and send us an example<br />
(to mailto:kitware@kitware.com). In the past there<br />
have been some filters that simply were not written to be fast. When we<br />
come across one of these we frequently can make minor changes to the<br />
filter that will make it run much more quickly. In fact many changes in<br />
the past couple years have been this type of performance improvement.<br />
<br />
=== Is VTK thread-safe ? ===<br />
<br />
The short answer is no.<br />
<br />
Many VTK sources and filters cache information and will not perform as<br />
expected when used in multiple threads. When writing a multithreaded<br />
filter, the developer has to be very careful about how she accesses data.<br />
<br />
For example, GetXXX() methods which return a pointer should only be used<br />
to read. If the pointer returned by these methods are used to change<br />
data in multiple threads (without mutex locks), the result will most<br />
probably be wrong and unpredictable. In many cases, there are<br />
alternative methods which copy the data referred by the pointer. For<br />
example:<br />
<br />
float* vtkDataArray::GetTuple(const vtkIdType i);<br />
<br />
is thread-safe only for reading whereas:<br />
<br />
void vtkDataArray::GetTuple (const vtkIdType i, float * tuple);<br />
<br />
copies the requested tuple and is thread safe even if tuple is modified<br />
afterwards (as long as the same pointer is not passed as the argument<br />
tuple simultaneously by different threads).<br />
<br />
Unfortunately, only very few methods are clearly marked as<br />
thread-(un)safe and, in many situations, the developer has to dig into<br />
the source code to figure out whether an accessor is thread safe or not.<br />
<br />
''vtkDataSet'' and most of its subclasses are well documented and almost<br />
all methods are marked thread-safe or not thread-safe. This might be a<br />
good place to start. Most of the filters in imaging and some filters in<br />
graphics (like ''vtkStreamer'') are good examples of how a multi-threaded<br />
filter can be written in VTK.<br />
<br />
However, if you are not interested in developing multithreaded filters<br />
but want to process some data in parallel using the same (or similar)<br />
pipeline, your job is much easier. To do this, create a different copy<br />
of the pipeline on each thread and execute them in parallel on a<br />
different piece of the data. This is best accomplished by using<br />
''vtkThreadedController'' (instead of ''vtkMultiThreader''). See the<br />
documentation of ''vtkMultiProcessController'' and ''vtkThreadedController''<br />
and the examples in the parallel directory for details on how this can<br />
be done.<br />
<br />
Also, note that most of the OpenGL libraries are not thread-safe.<br />
Therefore, if you are rendering to multiple render windows from<br />
different threads, you are likely to get in trouble, even if you have<br />
mutex locks around the render calls.<br />
<br />
=== Can I use STL with VTK? ===<br />
<br />
As of VTK version 4.2, you can use STL. However, the following policy<br />
applies.<br />
<br />
# STL is for implementation, not interface. All STL references should be contained in a .cxx file or the private section of the .h header file.<br />
# Use the PIMPL idiom to forward reference/contain STL classes in heavily used superclasses. STL is big, fat, and slow to compile so we do not want to include STL headers in any .h files that are included by most of VTK, e.g. vtkObject.h vtkSource.h etc.<br />
# Include the VTK wrapper header files: vtkstd/map instead of map.<br />
# Use the vtkstd:: namespace to refer to STL classes and functions.<br />
<br />
Here's an example (from vtkInterpolateVelocityField):<br />
<br />
In the .h file (the PIMPL) forward declare<br />
<br />
class vtkInterpolatedVelocityFieldDataSetsType;<br />
//<br />
class VTK_COMMON_EXPORT vtkInterpolatedVelocityField : public vtkFunctionSet<br />
{<br />
private:<br />
vtkInterpolatedVelocityFieldDataSetsType* DataSets;<br />
};<br />
<br />
In the .cxx file define the class (here deriving from the STL vector<br />
container)<br />
<br />
# include <vtkstd/vector><br />
typedef vtkstd::vector< vtkSmartPointer<vtkDataSet> > DataSetsTypeBase;<br />
class vtkInterpolatedVelocityFieldDataSetsType: public DataSetsTypeBase<br />
{};<br />
<br />
In the .cxx file construct and destruct the class:<br />
<br />
vtkInterpolatedVelocityField::vtkInterpolatedVelocityField()<br />
{<br />
this->DataSets = new vtkInterpolatedVelocityFieldDataSetsType;<br />
}<br />
vtkInterpolatedVelocityField::~vtkInterpolatedVelocityField()<br />
{<br />
delete this->DataSets;<br />
}<br />
<br />
And in the .cxx file use the container as you would any STL container:<br />
<br />
for ( DataSetsTypeBase::iterator i = this->DataSets->begin();<br />
i != this->DataSets->end(); ++i)<br />
{<br />
ds = i->GetPointer();<br />
....<br />
}<br />
<br />
=== What image file formats can VTK read and write? ===<br />
<br />
The following table identifies the image file formats that VTK can read<br />
and write.<br />
<br />
{| border="1" cellpadding="2" cellspacing="0"<br />
|- bgcolor="#abcdef"<br />
! Image File !! Read !! Write<br />
|-<br />
| AVI || || vtkAVIWriter<br />
|-<br />
| Bitmap || vtkBMPReader || vtkBMPWriter<br />
|-<br />
| Digital Elevation Model (DEM) || vtkDEMReader || <br />
|-<br />
| DICOM || vtkDICOMImageReader || <br />
|-<br />
| GE Signal || vtkGESignaReader || <br />
|-<br />
| JPEG || vtkJPEGReader || vtkJPEGWriter<br />
|-<br />
| FFMPEG || || vtkFFMPEGWriter<br />
|-<br />
| MINC (1.1) || vtkMINCImageReader || vtkMINCImageWriter<br />
|-<br />
| MPEG2 || || vtkMPEG2Writer<br />
|-<br />
| Binary UNC meta image data || vtkMetaImageReader || vtkMetaImageWriter<br />
|-<br />
| PNG || vtkPNGReader || vtkPNGWriter<br />
|-<br />
| PNM || vtkPNMReader || vtkPNMWriter<br />
|-<br />
| PostScript || || vtkPostScriptWriter <br />
|-<br />
| SLC || vtkSLCReader || <br />
|-<br />
| TIFF || vtkTIFFReader || vtkTIFFWriter<br />
|-<br />
| RAW files &dagger; || vtkImageReader, vtkVolumeReader || <br />
|}<br />
<br />
&dagger; A typical example of use is:<br />
<br />
# Image pipeline<br />
reader = vtkImageReader()<br />
reader.SetDataByteOrderToBigEndian()<br />
reader.SetDataExtent(0,511,0,511,0,511)<br />
reader.SetFilePrefix("Ser397")<br />
reader.SetFilePattern("%s/I.%03d")<br />
reader.SetDataScalarTypeToUnsignedShort()<br />
reader.SetHeaderSize(5432)<br />
<br />
=== Printing an object. ===<br />
<br />
Sometimes when debugging you need to print an object to a string, either<br />
for logging purposes or, in the case of Windows applications, to display in a window.<br />
<br />
Here is a way to do this:<br />
<br />
std::ostringstream os;<br />
//<br />
// "SomeVTKObject" could be, for example, <br />
// declared somewhere as: vtkCamera *SomeVTKObject;<br />
//<br />
SomeVTKObject->Print(os);<br />
 std::string str = os.str();<br />
//<br />
// Process the string as you want<br />
<br />
=== Writing a simple CMakeLists.txt. ===<br />
<br />
If you get something that looks like:<br />
<br />
undefined reference to<br />
`__imp___ZN13vtkTIFFReader3NewEv'<br />
collect2: ld returned 1 exit status <br />
<br />
You most likely forgot to link a library into your executable. The easiest way to get the link line right is to use a CMakeLists.txt file.<br />
<br />
For example the minimal project is:<br />
<br />
FIND_PACKAGE(VTK)<br />
IF (VTK_FOUND)<br />
INCLUDE (${VTK_USE_FILE})<br />
ENDIF (VTK_FOUND)<br />
ADD_EXECUTABLE(tiff tiff.cxx )<br />
TARGET_LINK_LIBRARIES (tiff<br />
vtkRendering<br />
)<br />
<br />
vtkRendering is linked against all the other VTK libraries, so linking against it alone is usually sufficient. The exception is when you build VTK with Hybrid or Parallel support; in that case you need to explicitly specify which additional libraries you want to link against.<br />
<br />
=== Testing for VTK within a configure script ===<br />
<br />
VTK uses CMake as its build tool, but if your VTK-based application wants to use autoconf and/or automake, you will find an M4 macro file very useful; it lets your configure script detect the presence or absence of VTK on the user's system. VTK won't add such a file to the official distribution, but you can always write your own, as I did.<br />
Look in [[VTK_Autoconf]] page for more info.<br />
<br />
=== How do I get my C++ code editor to do VTK-style indentation? ===<br />
<br />
If you are writing code with VTK, you may want to follow the [[VTK Coding Standards]]. This is particularly important if you plan to contribute back to VTK. Most C++ code editors will help you with indenting, but the indenting may differ significantly from that prescribed by the [[VTK Coding Standards]]. Fortunately, most editors have enough options to allow you to change the indention enough to get at least close to the VTK-style indentation.<br />
<br />
Below is a list of C++ editors and some suggestions on getting the indentation VTK compliant. If you use a popular editor that is not listed here, please feel free to contribute.<br />
<br />
==== Microsoft Visual C++ .NET indentation ====<br />
<br />
Under the "Tools" menu, select "Options". Go to the options under "Text Editor" and then "C/C++". Click the "Tabs" options. Set "Indenting" to "Smart", "Indent Size" to 2, and select "Insert spaces". Click the "Formatting" options and enable "Indent braces".<br />
<br />
This will make most of the indentation correct. However, it will indent all of the braces. In VTK classes, most of the braces are indented, but those starting a class, method, or function are typically flush left. You will have to correct this on your own.<br />
<br />
==== Emacs indentation ====<br />
<br />
Place the [[Elisp Code for VTK-Style C Indentation]] in your .emacs file.<br />
<br />
==== Vim indentation ====<br />
<br />
[[user talk:Andy|Andy Cedilnik]] has some information on following the VTK coding guidelines using vim. You may place the following in your <code>~/.vimrc</code> file<br />
set tabstop=2 " Tabs are two characters<br />
 set shiftwidth=2 " Indents are two characters too<br />
set expandtab " Do not use tabs<br />
set cinoptions={1s,:0,l1,g0,c0,(0,(s,m1<br />
"Keep tabs in makefiles as they are significant:<br />
:autocmd BufRead,BufNewFile [Mm]akefile :set noexpandtab<br />
<br />
=== How to display transparent objects? ===<br />
(keywords: alpha, correct, depth, geometry, object, opacity, opaque, order, ordering, peel, peeling, sorting, translucent, transparent.)<br />
<br />
When opaque geometry is rendered, there is no need to sort it because the depth buffer (or z-buffer) is used and the sorting is done automatically by keeping the geometry closest to the viewpoint at<br />
a given pixel. (It is easy because it is a MAX/MIN calculation, not a real sorting).<br />
<br />
With translucent geometry, the final color of a pixel is the contribution of all the geometry primitives visible through that pixel. The color of the pixel is the result of <b>a</b> blending operation between the colors of all visible primitives. Blending operations themselves are usually order-dependent (i.e., not commutative). That is why depth sorting is required. There are two ways to fix the ordering in VTK:<br />
<br />
* 1. Append all your polygonal geometry with [http://www.vtk.org/doc/nightly/html/classvtkAppendPolyData.html vtkAppendPolyData] and pass it to [http://www.vtk.org/doc/nightly/html/classvtkDepthSortPolyData.html vtkDepthSortPolyData]. See this Tcl [http://public.kitware.com/cgi-bin/viewcvs.cgi/*checkout*/Hybrid/Testing/Tcl/depthSort.tcl?root=VTK&content-type=text/plain example]. Depth sorting is done per centroid of each geometry primitive, not per pixel. For this reason it is not exact, but it resolves <b>most</b> of the ordering problems and usually gives good enough results.<br />
* 2. If the graphics card supports it, use "[[VTK/Depth_Peeling | depth peeling]]". It performs per-pixel sorting (better results) but it is really slow.<br />
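Why ordering matters at all can be seen with the standard "over" compositing operator: blending the same two translucent primitives in different orders yields different pixel colors. A minimal sketch (plain Python, independent of VTK):<br />

```python
def over(src, dst, alpha):
    """Composite color 'src' over 'dst' with the given source alpha."""
    return tuple(alpha * s + (1.0 - alpha) * d for s, d in zip(src, dst))

background = (0.0, 0.0, 0.0)
red = (1.0, 0.0, 0.0)
blue = (0.0, 0.0, 1.0)

# Red nearest the camera: draw blue first, then red over it.
red_in_front = over(red, over(blue, background, 0.5), 0.5)
# Blue nearest the camera: draw red first, then blue over it.
blue_in_front = over(blue, over(red, background, 0.5), 0.5)

print(red_in_front)   # (0.5, 0.0, 0.25)
print(blue_in_front)  # (0.25, 0.0, 0.5)
```

Since the two orders disagree, translucent primitives must be composited back-to-front, which is exactly what vtkDepthSortPolyData (per centroid) and depth peeling (per pixel) arrange.<br />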
<br />
== Platform-specific questions ==<br />
<br />
=== What platforms does VTK run on? ===<br />
<br />
VTK should compile and run on most versions of Unix, Linux, Windows, and Mac OS X. It has been tested on Suns, SGIs, HPs, Alphas, RS6000s and many Windows and Mac workstations.<br />
<br />
=== What graphics cards work with VTK? ===<br />
<br />
VTK uses OpenGL to perform almost all of its rendering, and some graphics cards/drivers have better OpenGL support than others. This is not a listing of which cards perform well, but of which cards actually produce correct results. Here is a list of cards and their status, roughly in best-to-worst order.<br />
<br />
* Any Nvidia desktop card on Windows -- 100% compatible<br /> <br />
* Any ATI desktop cards on Windows -- 100% compatible<br /><br />
* Mesa -- most releases pass all VTK tests<br /><br />
* Microsoft Software OpenGL -- passes all VTK tests but does have a couple of bugs<br /><br />
* Mac graphics cards -- these usually pass all VTK tests. Older cards may have some issues, for example, the ATI Rage 128 Pro does not support textures larger than 1024x1024.<br /><br />
* Non-Linux UNIX cards (Sun, HP, SGI) -- These generally work<br /><br />
* Any Nvidia card under Linux -- these usually pass all VTK tests but have some issues<br /><br />
* Any ATI card under Linux -- these usually pass all VTK tests but have some issues<br /><br />
* Nvidia laptop graphics cards under Windows -- known to have some issues, newer cards pass all tests<br /><br />
* ATI laptop graphics cards under Windows -- known to have some issues, newer cards pass all tests (e.g. [http://public.kitware.com/pipermail/vtkusers/2004-August/075966.html ATI Mobility Radeon 9600])<br /><br />
* Intel Extreme Graphics -- fails some VTK tests<br /><br />
<br />
=== How do I build the examples on the PC running Windows? ===<br />
<br />
Since building the C++ examples on the PC isn't all that easy, here are<br />
some instructions from Jack McInerney.<br />
<br />
Steps for creating a VTK C++ project 8/14/96<br />
<br />
This is based on what I learned creating a project to run the Mace<br />
example. These steps allowed me to successfully built and run this example.<br />
<br />
# Create a console project (File, New, then select Console application).<br />
# Add the files of interest to the project. (e.g., Mace.cxx)<br />
# Under Build, select Update all Dependencies. A long list of .hh files will show up under dependencies.<br /> For this to work, Visual C++ needs to know where to look to find the include files. In my case they are at C:\VTK\VTK12SRC\INCLUDE. To tell Visual C++ to look there, go to Tools, Options. Select the tab Directories. Under the list for Include files add: C:\VTK\VTK12SRC\INCLUDE<br />
# Compile the file Mace.cxx. This will lead to many warnings about data possibly being lost as double variables are converted to float variables. These can be silenced by going to Build, Settings, selecting the C++ tab and, under the General category, setting the Warning Level to 1 (instead of 3).<br />
# Before linking, some additional settings must be modified. Go to Build, Settings, and select the Link tab. In the General category, add the libraries opengl32.lib and glaux.lib to the Object/Library Modules. Put a space between each file name. Then select the C++ tab and the Category: Code Generation. Under Use Run-Time Library, select Debug Multithreaded DLL. Select OK to exit the dialog box. The above libraries are available from Microsoft's Web site at: http://www.microsoft.com/softlib/mslfiles/Opengl95.exe or ftp://ftp.microsoft.com/softlib/mslfiles/Opengl95.exe <br /> This is a self-extracting archive which contains these files. Simply place them in your Windows system directory.<br />
# Link the code by selecting Build, Build MaceProject.exe. I still get one warning when I do this, but it appears to be harmless.<br /><br />
<br />
When you go to run the program, it will bomb out unless it can find 2<br />
DLLs: Opengl32.dll and Glu32.dll. These need to be located either in the<br />
project directory or the C:\WINDOWS directory. These files are supplied<br />
on the vtk CD-ROM (in the vtk\bin directory).<br />
<br />
=== How do I build the Java examples on the PC running Windows? ===<br />
One common issue building the examples is missing one or all of the vtkPanel, vtkCanvas and AxesActor<br />
classes. For whatever reason these are not in vtk.jar (at least for 4.2.2),<br />
but you can get them from the source distribution: just unzip the source, extract<br />
the needed .java files, and point your Java compiler at them.<br />
<br />
Another common issue is class-loading dependency errors. Make sure the<br />
directory with the .dll files is on your PATH when you run (the default location<br />
is C:\Program Files\vtk42\bin\). Yet this still seems insufficient for some of the<br />
libraries. One possible solution is to copy the Java awt.dll to this directory as<br />
well.<br />
<br />
=== 64-bit System Issues ===<br />
<br />
VTK builds on 64-bit systems, that is, systems where sizeof(void*) is 64 bits. However, parts of the VTK codebase are not 64-bit clean, so runtime problems are likely if that code is used.<br />
<br />
===== General =====<br />
VTK binary files are not compatible between 32-bit and 64-bit systems. For portability, use the default file type, ASCII, for vtkPolyDataWriter, etc. You may be able to write a binary file on a 64-bit system and read it back in.<br />
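The underlying portability problem is not specific to VTK: the raw bytes of a value depend on the machine's byte order and native type sizes, so a binary dump written on one system need not read back correctly on another. A small illustrative sketch (plain Python, not VTK code):<br />

```python
import struct

value = 1.0

# The same float has different byte layouts on big- and little-endian
# machines; one machine's binary dump is the other's garbage.
big = struct.pack('>f', value)     # big-endian:    3f 80 00 00
little = struct.pack('<f', value)  # little-endian: 00 00 80 3f
assert big == little[::-1]

# Native type sizes differ too: a C 'long' packed natively may be
# 4 or 8 bytes depending on the platform, while the standard ('=l')
# size is always 4. This is why ASCII output is the portable choice.
print(struct.calcsize('l'), struct.calcsize('=l'))
```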
<br />
===== Mac OS X Specific =====<br />
Mac OS X 10.3 and earlier have no support for 64 bit. On Mac OS X 10.4, VTK cannot be built as 64 bit because it requires Carbon, Cocoa, or X11, none of which are available to 64 bit processes. On Mac OS X 10.5, Cocoa is available to 64 bit processes, but Carbon is not. VTK is known to work reasonably with 64 bit Cocoa.<br />
<br />
===== Windows Specific =====<br />
todo<br />
<br />
=== What size swap space should I use on a PC? ===<br />
<br />
Building vtk on the PC requires a significant amount of memory (at least<br />
when using Visual C++)... but the final product is nice and compact. To<br />
build vtk on the PC, we recommend setting the min/max swap space to at<br />
least 400MB/500MB (depending on how much RAM you have... the sum of RAM<br />
and swap space should be roughly 500+ MB).<br />
<br />
=== Are there any benchmarks of VTK and/or the hardware it runs on? ===<br />
<br />
Take a look at the "Simple Sphere Benchmark":<br />
<br />
http://www.barre.nom.fr/vtk/bench.html<br />
<br />
It is not a "real world" benchmark, but it provides synthetic results<br />
comparing different hardware running VTK:<br />
<br />
http://purl.oclc.org/NET/rriv/vtk/sphere-bench<br />
<br />
=== Why is XtString undefined when using VTK+Python on Unix? ===<br />
<br />
This is a side effect of dynamic linking on (some?) Unix systems. It<br />
appears often on Linux with the Mesa libraries at least. The solution is<br />
to make sure your Mesa libraries are linked with the Xt library. One way<br />
to do this is to add "-lXt" to MESA_LIB in your user.make file.<br />
<br />
=== How do I get the Python bindings to work when building VTK with Borland C++? ===<br />
<br />
If you've built VTK with the freely downloadable Borland C++ 5.5 (or its<br />
commercial counterpart) and you're using the Python binaries from<br />
http://www.python.org/, you'll note that when you try to run a VTK<br />
Python example you get something similar to the following error message:<br />
<br />
from vtkCommonPython import * <br />
ImportError: dynamic module does not define init function<br />
(initvtkCommonPython)<br />
<br />
This is because BCC32 prepends an underscore ("_") to all exported<br />
functions, so (in this case) the vtkCommonPython.dll contains a symbol<br />
_initvtkCommonPython which Python does not find. All kits (e.g.<br />
Rendering, Filtering, Patented) will suffer from this problem.<br />
<br />
The solution is to create a Borland module definition (.def) file in the VTK binary<br />
(output) directory, in my case VTK/bin. You have to do this for all kits<br />
that you are planning to use in Python. Each .def file must have the<br />
same basename as the DLL, e.g. "vtkCommonPython.def" for<br />
vtkCommonPython.dll, and it must be present at VTK link time. The .def<br />
file contains an export alias, e.g.:<br />
<br />
EXPORTS<br />
initvtkCommonPython=_initvtkCommonPython<br />
<br />
The Borland compiler will create an underscore-less alias in the DLL<br />
file and Python will be able to load it as a module.<br />
<br />
=== How do I build Python bindings on AIX? ===<br />
<br />
There is a problem with dynamic loading on AIX. Old AIX releases did not<br />
have dlopen/dlsym; they used the load mechanism instead, and Python still<br />
reflects this. VTK, however, is not compatible with the old load mechanism.<br />
<br />
The following patch to Python 2.2.2 makes python use dlopen/dlsym on AIX<br />
5 or greater.<br />
<br />
http://www.vtk.org/files/misc/python_aix.diff<br />
<br />
=== How to build VTK for offscreen rendering? ===<br />
<br />
<b>[this section is obsolete. Mangle Mesa is not supported anymore in VTK>=5.2]</b> (not sure about 5.0)<br />
<br />
I struggled for a few hours to get VTK to do offscreen rendering. I use it to<br />
batch-process medical images: without actually producing output on the<br />
screen, I can still print the resulting images in a report to easily review the<br />
results of an experiment.<br />
<br />
Here is how I solved this problem for VTK version 4.2.2.<br />
<br />
1. Download Mesa-4.0.4 source<br />
<br />
Modify Mesa-4.0.4/Make-config, changing the following variables in the 'linux:' target:<br />
<br />
GL_LIB = libVTKMesaGL.so<br />
GLU_LIB = libVTKMesaGLU.so<br />
GLUT_LIB = libVTKMesaglut.so<br />
GLW_LIB = libVTKMesaGLw.so<br />
OSMESA_LIB = libOSVTKMesa.so<br />
<br />
In Mesa 6.2.1 you need to edit Mesa/configs/default instead:<br />
<br />
# Library names (base name)<br />
GL_LIB = VTKMesaGL<br />
GLU_LIB = VTKMesaGLU<br />
GLUT_LIB = VTKMesaglut<br />
GLW_LIB = VTKMesaGLw<br />
OSMESA_LIB = VTKMesaOSMesa<br />
<br />
<br />
And then export this env var:<br />
<br />
 export CFLAGS="-O -g -ansi -pedantic -fPIC -ffast-math -DUSE_MGL_NAMESPACE -D_POSIX_SOURCE -D_POSIX_C_SOURCE=199309L -D_SVID_SOURCE -D_BSD_SOURCE -DUSE_XSHM -DPTHREADS -I/usr/X11R6/include"<br />
<br />
then<br />
<br />
For Mesa 4.0.4<br />
<br />
make -f Makefile.X11 linux <br />
cp Mesa-4.0.4/lib/* /data/usr/mesa404/lib/<br />
<br />
in Mesa 6.2.1:<br />
<br />
make linux-x86<br />
make install<br />
(I generally use /opt/VTKMesa/*)<br />
<br />
I use the 'VTKMesa' name extension to avoid conflicts with my RH9.0<br />
libs (especially the OSMesa lib shipped with XFree86!). I'm using shared<br />
libraries because that lets the VTK dynamic libs, rather than the vtk<br />
program itself, pick them up, without my app explicitly having to load<br />
VTKMesaGL. I copied the 'VTKMesa' libs into /data/usr/mesa404/lib/, but<br />
almost any location will work; just avoid /usr/lib and /usr/local/lib for now.<br />
<br />
2. Follow the normal instructions to get a properly working VTK, then run<br />
<br />
ccmake <br />
<br />
with the following options:<br />
<br />
{| border="1" cellpadding="2" cellspacing="0"<br />
| VTK_USE_MANGLED_MESA || ON<br />
|-<br />
| MANGLED_MESA_INCLUDE_DIR || /data/usr/mesa404/include<br />
|-<br />
| MANGLED_MESA_LIBRARY || /data/usr/mesa404/lib/libVTKMesaGL.so<br />
|-<br />
| MANGLED_OSMESA_INCLUDE_DIR || /data/usr/mesa404/include<br />
|-<br />
| MANGLED_OSMESA_LIBRARY || /data/usr/mesa404/lib/libVTKMesaOSMesa.so<br />
|-<br />
| OPENGL_xmesa_INCLUDE_DIR || /data/usr/mesa404/include<br />
|}<br />
<br />
Test using the Tcl scripts in /data/prog/VTK-4.2.2/Examples/MangledMesa/Tcl.<br />
<br />
<br />
If you're doing things on UNIX, you should also look at [[VTK Classes]]. It has links to RenderWindow objects that are probably easier to use than rebuilding VTK with Mesa.<br />
<br />
=== How to get keyboard events working on Mac OS X? ===<br />
<br />
On Mac OS X, there are (at least) two kinds of executables:<br />
* [http://developer.apple.com/documentation/MacOSX/Conceptual/BPInternational/Articles/InternatSupport.html#//apple_ref/doc/uid/20000278-73764 Application Bundles]<br />
* plain UNIX executables<br />
<br />
For a program to be able to display a graphical interface (that is, display windows that allow mouse and keyboard interaction) it really should be an Application Bundle. If a plain UNIX executable tries, there will be various bugs, such as keyboard and mouse events not working reliably.<br />
<br />
Many, but not all, of the example VTK applications are built as plain UNIX executables, and thus have these problems. This is [http://www.vtk.org/Bug/bug.php?op=show&bugid=2025 VTK bug 2025].<br />
<br />
When you build your own VTK application, it is best to make it in the form of an Application Bundle. With CMake 2.0.5 or later, simply add the following to your CMakeLists.txt file:<br />
<br />
IF(APPLE)<br />
SET(EXECUTABLE_FLAG MACOSX_BUNDLE)<br />
ENDIF(APPLE)<br />
<br />
If for some reason you cannot build as an Application Bundle (perhaps because your app needs command line parameters) you might be able to avoid the above problems by adding an [http://developer.apple.com/documentation/MacOSX/Conceptual/BPRuntimeConfig/Articles/ConfigFiles.html#//apple_ref/doc/uid/20002091-SW1 __info_plist section] to your Mach-O executable. If you succeed, please post to the VTK list.<br />
<br />
=== Can VTK be built as a Universal Binary on Mac OS X? ===<br />
<br />
For VTK 5.0.4 and older, the short answer is "no".<br />
<br />
For VTK CVS the short answer is "mostly". You need to set CMAKE_OSX_ARCHITECTURES to the architectures you want and CMAKE_OSX_SYSROOT to a Mac OS X SDK that supports Universal builds. The usual settings are:<br />
<br />
CMAKE_OSX_ARCHITECTURES=ppc;i386 <br />
CMAKE_OSX_SYSROOT=/Developer/SDKs/MacOSX10.4u.sdk <br />
<br />
This will result in a Universal build. However, there may be runtime bugs due to VTK's use of TRY_RUN. Work is being done to improve this situation.<br />
<br />
=== How can I stop Java Swing or AWT components from flashing or bouncing between values? ===<br />
<br />
While not strictly a VTK problem, this comes up fairly often when using Java-wrapped VTK. Try the following two JRE arguments to stop the Swing/AWT components flashing:<br />
-Dsun.java2d.ddoffscreen=false -Dsun.java2d.gdiblit=false<br />
Note that these are classified as "unsupported properties", so they may not work on all platforms or installations (in particular, ddoffscreen refers to DirectDraw and, as such, is specific to Windows).<br />
<br />
=== How can a user process access more than 2 GB of ram in 32-bit Windows? ===<br />
<br />
By default on Windows, the most memory that a user process can access is 2 GB, no matter how much RAM you have installed in your system. With Windows XP Professional you can make it possible for a process to use up to 3 GB of memory by doing two things:<br />
<br />
1) Modify the boot parameters in boot.ini (on my 32 bit WinXP Pro machine, it's in: "C:\boot.ini") to tell the operating system that you want user processes to have access to up to 3GB of RAM (This is a really important file, and if you don't know what you are doing, stop reading this and go back to work!). This is done by adding the /3GB flag to the line of the file that tells the boot loader where the operating system is. My boot.ini file looks like:<br />
<br />
[boot loader]<br />
timeout=30<br />
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS<br />
[operating systems]<br />
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /3GB<br />
<br />
This is a very bad file to make mistakes on, so don't: if you mess it up, it may be very difficult to get your computer to boot again. There is a nice description of this in the Microsoft article<br />
[http://www.microsoft.com/whdc/system/platform/server/PAE/PAEmem.mspx Memory Support and Windows Operating Systems].<br />
<br />
2) The other thing that you need to do is make your executable LARGEADDRESSAWARE. Assuming that you have a Windows binary that you want to try this on, you can use the 'editbin' utility that comes with Visual Studio to change the setting of one bit (the IMAGE_FILE_LARGE_ADDRESS_AWARE bit) in the image header of the executable. For a program 'prog.exe' you can make the change by<br />
<br />
editbin /LARGEADDRESSAWARE prog.exe<br />
<br />
Of course, depending on how your program handles memory you might find that it crashes when you try to use the extra memory, but that's a separate issue. If you are compiling your program with a version of Visual Studio you should be able to find the switch to make your program /LARGEADDRESSAWARE.<br />
<br />
=== Shared builds of VTK and debugging QVTKWidget using Visual Studio ===<br />
<br />
Assume that you have a shared build of VTK, and that you may or may<br />
not have set things up so that the release version<br />
of VTK is in your PATH statement.<br />
<br />
Then, if you debug a project that uses QVTKWidget, you will run into<br />
a problem: the debug build of your application depends upon the debug<br />
version of QVTK.dll, which in turn depends upon the debug Qt libraries<br />
(QtGuid4.dll, among others) and loads them. But because the release<br />
version of QVTK.dll is in the path, the release QtGui4.dll will also be<br />
loaded, preventing the application from running. You will get a<br />
"QWidget: Must construct a QApplication before a QPaintDevice"<br />
message.<br />
<br />
The solution to this problem is to set the path to the correct build<br />
of VTK on the "'''Debugging'''" properties of your project. Right click on<br />
your project, bring up the properties dialog, and select "'''Debugging'''"<br />
from the list on the left. There should be an "'''Environment'''" line. You<br />
can add variables here using key=value pairs.<br />
For example, add the following line:<br />
PATH=<Path To VTK>\bin\$(OutDir);%PATH%<br />
You can then add the same line to other configurations, such as the release one, by selecting<br />
them from the top left drop down box labelled '''Configuration'''.<br />
<br />
$(OutDir) will be set by Visual Studio to either Debug or Release,<br />
depending upon what configuration you have selected. Make sure <br />
that ;%PATH% is appended so that Qt and other files can be appended <br />
to the PATH statement.<br />
<br />
<br />
== Changes to the VTK API ==<br />
<br />
=== What is the policy on Changes to the API ===<br />
<br />
Between patch releases, maintain the API unless there is a really strong reason not to.<br />
<br />
Between regular releases, maintain backwards compatibility with the API of prior VTK releases when doing so does not add complexity or hurt the readability of the current code, or when the benefits of breaking the API are negligible.<br />
<br />
Clearly these statements leave a lot of wiggle room. For example, vtkLightKit was released with methods named BackLight and Headlight. Consistent spellings (BackLight and HeadLight) might make more sense and would probably be easier for non-native English speakers, but is it worth breaking the API for? Probably not. Another factor is how long the API has been around and how widely it is used; these indicate how painful it will be to change the API, which is half of the cost/benefit decision.<br />
<br />
=== Change to vtkIdList::IsId() ===<br />
<br />
vtkIdList::IsId(int id) used to return 0 or 1 to indicate whether the<br />
specified id is in the list. Now it returns -1 if the id is not in the<br />
list, or a non-negative number indicating its position in the list.<br />
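In other words, IsId() changed from a membership test into a find operation. The new contract can be sketched in plain Python (the is_id helper below is illustrative, not part of VTK):<br />

```python
def is_id(id_list, vtk_id):
    """Mimic the new vtkIdList::IsId() contract: return the position of
    vtk_id in id_list, or -1 if it is not present."""
    try:
        return id_list.index(vtk_id)
    except ValueError:
        return -1

ids = [10, 20, 30]
print(is_id(ids, 20))   # 1  (present, at position 1)
print(is_id(ids, 99))   # -1 (absent)

# Old callers that tested 'if (list->IsId(id))' must now test '>= 0':
# position 0 is a valid "found" result but evaluates as false.
assert is_id(ids, 10) >= 0
```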
<br />
=== Changes to vtkEdgeTable ===<br />
<br />
vtkEdgeTable had two changes. The constructor now takes no arguments,<br />
and you use InitEdgeInsertion() to tell the class how many points are in<br />
the dataset. Also, IsEdge(p1,p2) now returns -1 if the edge (defined<br />
by points p1,p2) is not defined; otherwise, a non-negative integer value<br />
is returned.<br />
<br />
These changes were made to support the association of attributes with<br />
edges.<br />
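The essential behavior, an edge keyed by an unordered point pair with -1 meaning "edge not defined", can be sketched as follows (a toy class, not the actual VTK API):<br />

```python
class EdgeTable:
    """Toy edge table: maps an unordered point pair (p1, p2) to an
    edge id, mirroring the new IsEdge() return convention."""
    def __init__(self):
        self._edges = {}

    def insert_edge(self, p1, p2):
        key = (min(p1, p2), max(p1, p2))     # edges are undirected
        self._edges.setdefault(key, len(self._edges))
        return self._edges[key]

    def is_edge(self, p1, p2):
        key = (min(p1, p2), max(p1, p2))
        return self._edges.get(key, -1)      # -1: edge not defined

table = EdgeTable()
eid = table.insert_edge(3, 7)
print(table.is_edge(7, 3))  # 0  (same edge id, point order irrelevant)
print(table.is_edge(1, 2))  # -1
```

Returning the edge's non-negative position is what supports the attribute association mentioned above: the returned value can index directly into an edge-attribute array.<br />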
<br />
=== Changes between VTK 4.2 and VTK 4.4 (and how to update) ===<br />
<br />
We have removed the CVS date, revision, and language from the<br />
copyright header on all the files. This information wasn't being used much, and<br />
it created extra work for developers. For example, you edit vtkObject.h,<br />
rebuild all of VTK, check in your change, and then you must rebuild all of<br />
VTK again, because committing the header file causes it to be changed by<br />
CVS (the revision number changes). This change will also make it<br />
easier to compare different branches of VTK, since these revision-number<br />
differences will no longer show up. The CVS revision number is still in<br />
the .cxx file in the RevisionMacro. You don't need to make any changes to<br />
your code for this.<br />
<br />
The DataArray classes now use a templated intermediate class to share<br />
their implementation. Again there is no need for you to make changes to<br />
your code.<br />
<br />
Legacy code has been removed. Specifically, none of the old-style<br />
callbacks are supported; observers should be used instead. So where<br />
you used filter->SetStartMethod(myFunc), you should now do<br />
filter->AddObserver(vtkCommand::StartEvent, myCommand). Usually this will<br />
require you to create a small class for the observer.<br />
vtkImageOpenClose3D.cxx has an example of using an observer, and there<br />
are a few other examples in VTK. If you switch to using observers, your<br />
code should also work with versions of VTK from 3.2 onwards, since<br />
observers have been in VTK since VTK 3.2.<br />
<br />
Many functions that previously took or returned float now take or return<br />
double. To change your code to work with VTK 4.4 or later you can just<br />
replace float with double for the appropriate calls and variables. If<br />
you want your code to work with both old and new versions of VTK you can<br />
use vtkFloatingPointType, which is defined to be double in VTK 4.4 and<br />
later and float in VTK 4.2.5. In versions of VTK prior to 4.2.5,<br />
you can use something like:<br />
<br />
#ifndef vtkFloatingPointType<br />
#define vtkFloatingPointType vtkFloatingPointType<br />
typedef float vtkFloatingPointType;<br />
#endif<br />
<br />
at the beginning of your code. That will set it to the correct value for<br />
all versions of VTK old and new.<br />
<br />
=== Use of New() and Delete() now enforced (vs. new & delete) ===<br />
<br />
Constructors and destructors in VTK are now protected. This means you<br />
can no longer use little "new" or "delete" to create VTK instances.<br />
You'll have to use the methods ::New() and ::Delete() (as has been<br />
standard practice for some time).<br />
<br />
The reason for this is to enforce the use of New() and Delete(). Not<br />
using New() and Delete() can lead to bad mojo, mainly reference counting<br />
problems or not taking advantage of special procedures incorporated into<br />
the New() method (e.g., selecting the appropriate hardware interface<br />
during instance creation time).<br />
<br />
If you've used New() and Delete() in your code, these changes will not<br />
affect you at all. If you're using little "new" or "delete", your code<br />
will no longer compile and you'll have to switch to New() and Delete().<br />
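The reference-counting rationale can be illustrated with a toy factory pattern (plain Python, not the actual VTK implementation): New() is the only sanctioned way to construct an instance, and Delete() decrements a reference count instead of destroying the object outright:<br />

```python
class RefCounted:
    """Toy version of the VTK New()/Delete() discipline."""
    _allow_new = False  # emulates a protected constructor

    def __init__(self):
        if not RefCounted._allow_new:
            raise TypeError("use RefCounted.New(), not the constructor")
        self.ref_count = 1

    @classmethod
    def New(cls):
        # The factory is the only code path allowed to construct.
        RefCounted._allow_new = True
        try:
            return cls()
        finally:
            RefCounted._allow_new = False

    def Register(self):
        self.ref_count += 1

    def Delete(self):
        self.ref_count -= 1
        if self.ref_count == 0:
            pass  # the real implementation would free resources here

obj = RefCounted.New()
obj.Register()        # a second owner takes a reference
obj.Delete()          # the first owner releases: the object must survive
print(obj.ref_count)  # 1
```

Bypassing New()/Delete() with plain new/delete would destroy the object while the second owner still holds it, which is exactly the "bad mojo" the protected constructors prevent.<br />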
<br />
=== Changes between VTK 4.4 and VTK 4.6 ===<br />
<br />
Collection Changes<br />
<br />
Collections have had some small changes (originally started by Chris<br />
Volpe) to better support reentrant iteration. Specifically, all the<br />
collections now have InitTraversal(sit) and GetNextFoobar(sit) methods<br />
(where Foobar is what the collection contains; for example,<br />
GetNextActor(sit)). The argument to both of these methods is a<br />
vtkCollectionSimpleIterator. Most of the collection use in VTK has been<br />
modified to use these new methods. The advantage is that the same<br />
collection can now be iterated through in a reentrant-safe manner. In<br />
the past this was not true, which led to a number of problems. In the<br />
future, for C++ class development, please use this approach to iterating<br />
through a collection. These changes are fully backwards compatible and<br />
no old APIs were harmed in the making of these changes. So, in summary,<br />
where you would have written:<br />
<br />
for (actors->InitTraversal();<br />
(actor = actors->GetNextActor());)<br />
<br />
you would now have:<br />
<br />
vtkCollectionSimpleIterator actorIt;<br />
for (actors->InitTraversal(actorIt);<br />
(actor = actors->GetNextActor(actorIt));)<br />
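To see why the caller-owned iterator matters, here is a self-contained sketch (plain C++, not VTK source; IntCollection is a made-up stand-in for vtkCollection). The old style kept a single cursor inside the collection, so nested traversals of the same collection clobbered each other; with an external iterator, nested loops are safe:<br />

```cpp
#include <cstddef>
#include <vector>

// Sketch (not VTK source) of reentrant collection traversal. The
// cursor lives in the caller, like vtkCollectionSimpleIterator, so two
// traversals of the same collection can be active at once.
class IntCollection
{
public:
  void AddItem(int v) { this->Items.push_back(v); }

  // New style: the caller owns the cursor.
  void InitTraversal(std::size_t& it) const { it = 0; }
  bool GetNextItem(std::size_t& it, int& value) const
  {
    if (it >= this->Items.size()) return false;
    value = this->Items[it++];
    return true;
  }

private:
  std::vector<int> Items;
};

// Sums a + b over all ordered pairs: this needs two simultaneous
// traversals of the same collection, which a single internal cursor
// (the pre-4.6 style) could not support.
int SumOfPairs(const IntCollection& c)
{
  int total = 0;
  std::size_t outer, inner;
  int a, b;
  for (c.InitTraversal(outer); c.GetNextItem(outer, a);)
    for (c.InitTraversal(inner); c.GetNextItem(inner, b);)
      total += a + b;
  return total;
}
```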
<br />
=== Changes in VTK between 3.2 and 4.0 ===<br />
<br />
* Changes to vtkDataSetAttributes, vtkFieldData and vtkDataArray: All attributes (scalars, vectors...) are now stored in the field data as vtkDataArray's. vtkDataSetAttributes became a sub-class of vtkFieldData. For backwards compatibility, the interface which allows setting/getting the attributes the old way (by passing in a sub-class of vtkAttributeData such as vtkScalars) is still supported, but it will be removed in the future. Therefore, developers should use the new interface, which requires passing in a vtkDataArray to set an attribute. vtkAttributeData and its sub-classes (vtkScalars, vtkVectors...) will be deprecated in the near future; developers should use vtkDataArray and its sub-classes instead. We are in the process of removing the use of these classes from VTK filters.<br />
<br />
* Subclasses of vtkAttributeData (vtkScalars, vtkVectors, vtkNormals, vtkTCoords, vtkTensors) were removed. As of VTK 4.0, vtkDataArray and its sub-classes should be used to represent attributes and fields. A detailed description of the changes and utilities for upgrading from 3.2 to 4.0 can be found in the package: http://www.vtk.org/files/misc/Upgrading.zip<br />
<br />
* Added special methods to data arrays to replace methods like<br />
<br />
tc SetTCoord i x y 0<br />
<br />
or<br />
<br />
vc SetVector i vx vy vz<br />
<br />
in interpreted languages (Tcl, Python, Java). Use:<br />
<br />
tc SetTuple2 i x y<br />
<br />
or<br />
<br />
vc SetTuple3 i vx vy vz<br />
<br />
* Improved support for parallel visualization: vtkMultiProcessController and its sub-classes have been re-structured and mostly re-written. The functionality of vtkMultiProcessController has been re-distributed between vtkMultiProcessController and vtkCommunicator. vtkCommunicator is responsible for sending/receiving messages, whereas vtkMultiProcessController (and its subclasses) is responsible for program flow/control (for example, processing RMIs). New classes have been added to the Parallel directory. These include vtkCommunicator, vtkMPIGroup, vtkMPICommunicator, vtkSharedMemoryCommunicator, vtkMPIEventLog... There is now a Tcl interpreter which supports parallel scripts. It is called pvtk and can be built on Windows and Unix. Examples for both Tcl and C++ can be found in the examples directories.<br />
<br />
* vtkSocketCommunicator and vtkSocketController have been added. These support message passing via BSD sockets. Best used together with input-output ports.<br />
<br />
* Since it was causing very long compile times (it essentially included every VTK header file) and was hard to maintain (you had to add a line whenever you added a class to VTK), vtk.h was removed. You will have to identify the header files needed by your application and include them one by one.<br />
<br />
* vtkIterativeClosestPointTransform has been added. This class is an implementation of the ICP algorithm. It matches two surfaces using the iterative closest point (ICP) algorithm. The core of the algorithm is to match each vertex in one surface with the closest surface point on the other, then apply the transformation that modifies one surface to best match the other (in a least-squares sense).<br />
<br />
* The SetFileName, SaveImageAsPPM and related methods in vtkRenderWindow have been removed. vtkWindowToImageFilter combined with any of the image writers provides greater functionality.<br />
<br />
* Support for reading and writing PGM and JPEG images has been included.<br />
<br />
* Methods with parameters of the form "type param[n]" are wrapped. Previously, these methods were only wrapped if the array was declared 'const'. The python wrappers will allow values to be returned in the array.<br />
<br />
* The directory structure was completely reorganized. There are now subdirectories for:<br />
** Common (core common classes)<br />
** Filtering (superclasses for filtering operations)<br />
** Imaging (filters and sources that produce images or structured points)<br />
** Graphics (filters or sources that produce data types other than ImageData and StructuredPoints)<br />
** IO (file IO classes that do not require Rendering support)<br />
** Rendering (all actor, mapper, annotation and rendering classes)<br />
** Hybrid (typically filters and sources that require support from Rendering, or from both Imaging and Graphics)<br />
** Parallel (parallel visualization support classes)<br />
** Patented (patented classes)<br />
** Examples (documented examples)<br />
** Wrapping (support for the language wrappers)<br />
* In many directories you will see a Testing subdirectory. The Testing subdirectories contain tests used to validate VTK's operation. Some tests may be useful as examples, but they are not well documented.<br />
<br />
* The build process for VTK now uses CMake (found at www.cmake.org). This replaces pcmaker on Windows and configure on UNIX, resolves some longstanding problems and limitations we were having with pcmaker and configure, and unifies the build process into one place.<br />
<br />
=== Changes to VTK between 4.0 and 4.2 ===<br />
<br />
* Use of macros to support serialization, standardize the New method, and provide the Superclass typedef.<br />
<br />
* Subclassing of VTK classes in the python wrappers (virtual method hooks are not provided).<br />
<br />
* vtkImageWindow, vtkImager, vtkTkImageWindowWidget and their subclasses have been removed to reduce duplicated code and enable interaction in image windows. People should now use vtkRenderer and vtkRenderWindow instead. vtkImageViewer still works as a turnkey image viewing class, although it now uses vtkRenderWindow and vtkRenderer internally instead of vtkImageWindow and vtkImager.<br />
<br />
* New class: vtkBandedPolyDataContourFilter. Creates solid colored bands (like you find on maps) of scalar value.<br />
<br />
* Event processing: Several new events to VTK were added (see vtkCommand.h). Also event processing can now be prioritized and aborted. This allows applications to manage who processes which events, and terminates the processing of a particular event if desired.<br />
<br />
* 3D Widgets: A new class vtkInteractorObserver was added to observe events on vtkRenderWindowInteractor. Using the new event processing infrastructure, multiple 3D widgets (subclasses of vtkInteractorObserver) can be used simultaneously to process interactions. Several new 3D widgets have been added including:<br />
** vtkLineWidget<br />
** vtkPlaneWidget<br />
** vtkImagePlaneWidget<br />
** vtkBoxWidget<br />
** vtkSphereWidget<br />
<br />
* Besides providing a representation, widgets also provide auxiliary functionality such as providing transforms, implicit functions, plane normals, sphere radius and center, etc.<br />
<br />
* New class: vtkInstantiator provides a means by which one can create an instance of a VTK class using only the name of the class as a string.<br />
<br />
* New class: vtkXMLParser provides a wrapper around the Expat XML parsing library. A new parser can be written by subclassing from vtkXMLParser and providing a few simple virtual method implementations.<br />
<br />
* The TIFF reader is now implemented using libtiff, which makes it capable of reading almost all available TIFF formats. libtiff is also available internally as vtktiff.<br />
<br />
* New method (all sub-classes of vtkObject): Added a virtual function called NewInstance to vtkTypeMacro. NewInstance creates and returns an object of the same type as the current one. It does not copy any properties. The returned pointer is of the same type as the pointer the method was invoked with. This method should replace all the MakeObject methods scattered throughout VTK.<br />
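The mechanism can be sketched in plain C++ (hypothetical MiniObject/MiniSubclass classes, not VTK source): a virtual factory method that each subclass overrides so the returned object always matches the receiver's dynamic type:<br />

```cpp
#include <typeinfo>

// Sketch (not VTK source) of the NewInstance pattern added by
// vtkTypeMacro: a virtual factory that creates a fresh, default-
// constructed object of the same dynamic type as the receiver,
// without copying any state.
class MiniObject
{
public:
  static MiniObject* New() { return new MiniObject; }
  virtual MiniObject* NewInstance() const { return MiniObject::New(); }
  virtual ~MiniObject() {}
};

class MiniSubclass : public MiniObject
{
public:
  static MiniSubclass* New() { return new MiniSubclass; }
  // The macro re-declares NewInstance in every subclass, so the
  // returned object matches the dynamic type of the receiver.
  MiniObject* NewInstance() const override { return MiniSubclass::New(); }
};
```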
<br />
* The vtkSetObjectMacro is deprecated for use inside VTK. It is still a valid construct in projects that use VTK. Instead, use vtkCxxSetObjectMacro, which does the same thing.<br />
<br />
* vtkPLOT3DReader has been improved. It now supports:<br />
** multigrid (each block is one output)<br />
** ascii<br />
** fortran-style byte counts<br />
** little/big endian<br />
** i-blanking (partial)<br />
<br />
* A new vtkTextProperty class has been created, and duplicated text APIs have been obsoleted accordingly. Check the<br />
[[VTK_FAQ#Text_properties_in_VTK_4.2|Text properties in VTK 4.2]] FAQ entry for a full description of the change.<br />
<br />
=== How do I upgrade my existing C++ code from 3.2 to 4.x? ===<br />
<br />
This is (a corrected version of) an email that was posted to vtkusers.<br />
Please feel free to correct or add anything.<br />
<br />
{| cellspacing="3" <br />
|- valign="top"<br />
|width="55%" bgcolor="#f0f0ff" style="border:1px solid #ffc9c9;padding:1em;padding-top:0.5em;"|<br />
<br />
I've just ported my medium-sized (40K lines) application from vtk3.2 to<br />
vtk4.x. I thought I would share my experiences with you, in case there<br />
were people out there contemplating it but a bit scared.<br />
<br />
One source of information for upgrading code is:<br />
<br />
http://www.vtk.org/files/misc/Upgrading.zip<br />
<br />
I'm using VC++6 + MFC on Win2K and was unable/unwilling to run the<br />
script in the zip file.<br />
<br />
So,<br />
<br />
I switched all my include directories to the new VTK ones and<br />
recompiled. 337 errors, not unexpectedly. Most concerned vtkScalars and<br />
vtkTCoords which have both been removed. Where I was using single value<br />
scalars, like this:<br />
<br />
vtkScalars *scalars = vtkScalars::New();<br />
scalars->SetNumberOfScalars(N_POINTS);<br />
...<br />
polydata->GetPointData()->SetScalars(scalars);<br />
...<br />
scalars->SetScalar(i,2.3);<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkFloatArray *scalars = vtkFloatArray::New();<br />
scalars->SetNumberOfComponents(1);<br />
scalars->SetNumberOfTuples(N_POINTS);<br />
...<br />
polydata->GetPointData()->SetScalars(scalars);<br />
...<br />
scalars->SetTuple1(i,2.3);<br />
...<br />
<br />
OK so far, far fewer errors.<br />
<br />
Where I had 2D texture coordinates:<br />
<br />
vtkTCoords *tcoords = vtkTCoords::New();<br />
tcoords->SetNumberOfTCoords(N);<br />
...<br />
float p[3];<br />
p[0]=x; p[1]=y;<br />
tcoords->SetTCoord(i,p);<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkFloatArray *tcoords = vtkFloatArray::New();<br />
tcoords->SetNumberOfComponents(2);<br />
tcoords->SetNumberOfTuples(N);<br />
...<br />
float p[2];<br />
p[0]=x; p[1]=y;<br />
tcoords->SetTuple(i,p);<br />
....<br />
<br />
All well and good, still fewer errors. Make sure you call<br />
SetNumberOfComponents *before* SetNumberOfTuples else you'll get<br />
problems (I did!).<br />
<br />
Where I was creating 0-255 image data and had been using:<br />
<br />
vtkScalars* scalars = vtkScalars::New();<br />
scalars->SetDataTypeToUnsignedChar();<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkUnsignedCharArray *scalars = vtkUnsignedCharArray::New()<br />
...<br />
<br />
Going well!<br />
<br />
When creating RGB images, I had been using:<br />
<br />
vtkScalars *scalars = vtkScalars::New();<br />
scalars->SetDataTypeToUnsignedChar();<br />
scalars->SetNumberOfComponents(3);<br />
scalars->SetNumberOfScalars(X*Y);<br />
...<br />
scalars->SetActiveComponent(0);<br />
scalars->SetScalar(i,val1);<br />
scalars->SetActiveComponent(1);<br />
scalars->SetScalar(i,val2);<br />
scalars->SetActiveComponent(2);<br />
scalars->SetScalar(i,val3);<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkUnsignedCharArray *scalars = vtkUnsignedCharArray::New()<br />
scalars->SetNumberOfComponents(3);<br />
scalars->SetNumberOfTuples(X*Y);<br />
...<br />
scalars->SetComponent(i,0,val1);<br />
scalars->SetComponent(i,1,val2);<br />
scalars->SetComponent(i,2,val3);<br />
...<br />
<br />
My remaining errors concerned vtkWin32OffscreenRenderWindow that has<br />
been removed. Where I had been using:<br />
<br />
vtkWin32OffscreenRenderWindow *offscreen = vtkWin32OffscreenRenderWindow::New();<br />
...<br />
<br />
I replaced with:<br />
<br />
vtkWin32OpenGLRenderWindow *offscreen = vtkWin32OpenGLRenderWindow::New();<br />
offscreen->SetOffScreenRendering(1);<br />
...<br />
<br />
All done. I'd had to throw in some #include "vtkFloatArray.h" and things<br />
like that of course. Zero compile errors.<br />
<br />
Had to remember to link against the new vtk lib files, so I removed<br />
<br />
vtkdll.lib <br />
<br />
and added<br />
<br />
vtkCommon.lib<br />
vtkGraphics.lib<br />
<br />
etc.<br />
<br />
Zero link errors. My program is up and running again, no apparent<br />
problems. Plus now I can use all the new features of vtk4. (And I'm sure<br />
it's faster but maybe that's my imagination.)<br />
<br />
All this took me about three hours.<br />
<br />
Bye!<br />
<br />
Tim.<br />
|}<br />
<br />
=== What is the release schedule for VTK ===<br />
<br />
VTK has a formal release every eight to sixteen months. VTK 4.0 was cut in December 2001 and released in March 2002. VTK 4.2 was released in February 2003. VTK 4.4 (which was an interim release) was released at the end of 2003. VTK 5.0 was released in January 2006, 5.0.1 in July 2006, 5.0.2 in September 2006, 5.0.3 in March 2007, and 5.0.4 in January 2008.<br />
<br />
=== Roadmap: What changes are being considered for VTK ===<br />
<br />
This is a list of changes that are being considered for inclusion into<br />
VTK. Some of these changes will happen; others we would like to see<br />
happen, but they may not due to funding or time constraints. For each<br />
change we try to list what the change is, when we hope to complete it,<br />
and whether it is actively being developed. Detailed discussion of<br />
changes is limited to the vtk-developers mailing list.<br />
<br />
# Modify existing image filters to use the new vtkImageIterator etc. Most simple filters have been modified to use the iterator in VTK 4.2. It would be nice to have some sort of efficient neighborhood iterator, but so far we haven't come up with any.<br />
# Rework the polydata and unstructured grid structures (vtkMesh ??). Related ideas include:<br />
#* Make UnstructuredGrid more compact by removing the cell point count from the vtkCellArray. This will reduce the storage required by each cell by 4 bytes.<br />
#* Make vtkPolyData an empty subclass of vtkUnstructuredGrid. There are a number of good reasons for this but it is a tricky task and backwards compatibility needs to be maintained.<br />
# More parallel support, including parallel compositing algorithms<br />
# Algorithms like LIC (http://www-courses.cs.uiuc.edu/~cs419/lic.pdf), maybe a couple terrain-decimation algorithms<br />
# Further integration of STL and other important C++ constructs (like templates)<br />
<br />
VTK 4.4 (intermediate release, end of 2003)<br />
<br />
* convert APIs to double (done)<br />
* remove old callbacks (done)<br />
* blanking<br />
* ref count observers (done)<br />
* switch collections to use iterators (done)<br />
* improve copyright (done)<br />
<br />
VTK 5.0 (major release, early 2005)<br />
<br />
* new pipeline mechanism (see [[Media:Pipeline.pdf|Pipeline.pdf]])<br />
* time support<br />
* true AMR support<br />
<br />
=== Changes to Interactors ===<br />
<br />
The Interactors have been updated to use the Command/Observer events of<br />
vtk. The vtkRenderWindowInteractor now has ivars for all the event<br />
information. There is a new class called<br />
vtkGenericRenderWindowInteractor that can be used to set up the bindings<br />
from other languages like Python, Java or Tcl.<br />
<br />
A new class vtkInteractorObserver was also added. It has a<br />
SetInteractor() method. It observes the keypress and delete events<br />
invoked by the render window interactor. The keypress activation value<br />
for a widget is now 'i' (although this can be programmed).<br />
vtkInteractorObserver has the state ivar Enabled. All subclasses must<br />
have the SetEnabled(int) method. Convenience methods like On(), Off(),<br />
EnabledOn(), and EnabledOff() are available. The state of the interactor<br />
observer is obtained using GetEnabled(). The SetEnabled(1) method adds<br />
observers to watch the interactor (appropriate to the particular<br />
interactor observer); SetEnabled(0) removes the observers. There are<br />
two new events, EnableEvent and DisableEvent, which are invoked by the<br />
SetEnabled() method.<br />
<br />
The events also support the idea of priority now. When you add an<br />
observer, you can specify a priority from 0 to 1. Higher values will be<br />
called back first. An observer can also tell the object not to call any<br />
more observers. This way you can handle an event, and stop further<br />
processing. In this way you can add handlers to InteractorStyles without<br />
sub-classing and from wrapped languages.<br />
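A minimal sketch of this dispatch scheme (plain C++, not VTK source; MiniDispatcher is a made-up stand-in for vtkObject's observer machinery):<br />

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// Sketch (not VTK source) of prioritized, abortable event dispatch in
// the spirit of vtkObject::AddObserver with a priority argument.
// Observers run highest priority first; any observer may set the abort
// flag to stop further processing of the event.
struct MiniDispatcher
{
  struct Entry
  {
    double Priority;
    std::function<void(bool& abort)> Callback;
  };
  std::vector<Entry> Observers;

  void AddObserver(std::function<void(bool&)> cb, double priority = 0.0)
  {
    Observers.push_back({priority, cb});
    // Stable sort keeps same-priority observers in insertion order.
    std::stable_sort(Observers.begin(), Observers.end(),
                     [](const Entry& a, const Entry& b)
                     { return a.Priority > b.Priority; });
  }

  void InvokeEvent()
  {
    bool abort = false;
    for (const Entry& e : Observers)
    {
      e.Callback(abort);
      if (abort) break;  // an observer handled the event and stopped it
    }
  }
};
```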
<br />
For more information see: vtkGenericRenderWindowInteractor,<br />
vtkRenderWindowInteractor, vtkInteractorObserver.<br />
<br />
=== Header files and vtkSetObjectMacro ===<br />
<br />
On some platforms, such as MS Visual Studio .NET, the compiler cannot<br />
handle very large input files, and some VTK files with all their<br />
includes become big enough to overwhelm it. The solution is to minimize<br />
the number of includes. This especially applies to header files, because<br />
their includes propagate to other files. Every class header file should<br />
include only the parent class header file. If there is no other<br />
alternative, you should put a comment next to the include explaining why<br />
the file has to be included.<br />
<br />
A related issue concerns vtkSetObjectMacro. This macro calls some<br />
methods on an argument class, which implies that the argument class<br />
header file has to be included. The result is bloat in the header files.<br />
The solution is to use vtkCxxSetObjectMacro instead of vtkSetObjectMacro.<br />
The difference is that vtkCxxSetObjectMacro goes in the .cxx file and not<br />
in the header file.<br />
<br />
Example: Instead of<br />
<br />
#include "vtkBar.h"<br />
class vtkFoo : public vtkObject<br />
{ ...<br />
vtkSetObjectMacro(Bar, vtkBar);<br />
...<br />
};<br />
<br />
Do:<br />
<br />
class vtkBar;<br />
class vtkFoo : public vtkObject<br />
{<br />
...<br />
virtual void SetBar(vtkBar*);<br />
...<br />
};<br />
<br />
and add the following line to vtkFoo.cxx<br />
<br />
vtkCxxSetObjectMacro(vtkFoo,Bar,vtkBar);<br />
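For reference, the generated setter looks roughly like the following self-contained sketch (MiniFoo/MiniBar are made-up stand-ins for vtkFoo/vtkBar; the real macro's setter also calls Modified() on the owner):<br />

```cpp
// Sketch (not VTK source) of the reference-counted setter that
// vtkCxxSetObjectMacro(vtkFoo, Bar, vtkBar) generates in the .cxx
// file. Because the method body lives in the .cxx file, only the .cxx
// file needs the full header; the class header gets away with a
// forward declaration.
class MiniBar
{
public:
  static MiniBar* New() { return new MiniBar; }
  void Register()   { ++this->RefCount; }
  void UnRegister() { if (--this->RefCount == 0) delete this; }
  int GetReferenceCount() const { return this->RefCount; }

protected:
  MiniBar() : RefCount(1) {}

private:
  int RefCount;
};

class MiniFoo
{
public:
  MiniFoo() : Bar(nullptr) {}
  ~MiniFoo() { this->SetBar(nullptr); }  // release our reference

  // Body of the generated setter: swap pointers with correct
  // reference counting (register the new object before releasing the
  // old one, so a self-assignment-like chain cannot destroy it).
  void SetBar(MiniBar* bar)
  {
    if (this->Bar != bar)
    {
      MiniBar* temp = this->Bar;
      this->Bar = bar;
      if (this->Bar) { this->Bar->Register(); }
      if (temp)      { temp->UnRegister(); }
    }
  }
  MiniBar* GetBar() const { return this->Bar; }

private:
  MiniBar* Bar;
};
```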
<br />
=== Text properties in VTK 4.2 ===<br />
<br />
A new<br />
[http://public.kitware.com/VTK/doc/nightly/html/classvtkTextProperty.html vtkTextProperty]<br />
class has been added to VTK 4.2.<br />
<br />
This class factorizes text attributes that used to be spread out and<br />
duplicated in many different classes (mostly 2D actors and text<br />
mappers). Among those attributes, font family, font size,<br />
bold/italic/shadow properties, horizontal and vertical justification,<br />
line spacing and offset have been retained, whereas new attributes like<br />
color and opacity have been introduced.<br />
<br />
We tried to make sure that you can use a vtkTextProperty to modify text<br />
properties in the same way a vtkProperty can be used to modify the<br />
surface properties of a geometric object. In that regard, you should be<br />
able to share a vtkTextProperty between different actors or assign the<br />
same vtkTextProperty to an actor that offers multiple vtkTextProperty<br />
attributes ([http://www.vtk.org/doc/nightly/html/classvtkXYPlotActor.html vtkXYPlot]<br />
for example).<br />
<br />
Here is a quick example:<br />
<br />
vtkTextActor *actor0 = vtkTextActor::New();<br />
actor0->GetTextProperty()->SetItalic(1);<br />
//<br />
vtkTextProperty *tprop = vtkTextProperty::New();<br />
tprop->SetBold(1);<br />
//<br />
vtkTextActor *actor1 = vtkTextActor::New();<br />
actor1->SetTextProperty(tprop);<br />
//<br />
vtkTextActor *actor2 = vtkTextActor::New();<br />
actor2->SetTextProperty(tprop);<br />
<br />
'''Backward compatibility issues''':<br />
<br />
1) Color and Opacity:<br />
<br />
The text color and text opacity settings are now controlled by the<br />
vtkTextProperty Color and Opacity attributes instead of the<br />
corresponding actor's color and opacity attributes. In the following<br />
example, those settings were controlled by the attributes of the<br />
vtkProperty2D attached to the vtkActor2D (vtkTextActor). The<br />
vtkTextProperty attributes should be used instead:<br />
<br />
vtkTextActor *actor = vtkTextActor::New();<br />
actor->GetProperty()->SetColor(...);<br />
actor->GetProperty()->SetOpacity(...);<br />
<br />
becomes:<br />
<br />
actor->GetTextProperty()->SetColor(...);<br />
actor->GetTextProperty()->SetOpacity(...);<br />
<br />
To make migration easier for a while, we have set the vtkTextProperty<br />
default color to ''(-1.0, -1.0, -1.0)'' and the default opacity to ''-1.0''.<br />
These "magic" values are checked by the underlying text mappers at<br />
rendering time. If they are found, the color and opacity of the 2D<br />
actor's vtkProperty2D are used, just as it was in VTK 4.1.<br />
<br />
2) API (i.e. SetBold(), SetItalic(), etc)<br />
<br />
Most of the VTK classes involving text used to provide their own text<br />
attributes like Bold, Italic, Shadow, FontFamily. Thus, each of those<br />
classes would duplicate the vtkTextMapper API through methods like<br />
SetItalic(), SetBold(), SetFontFamily(), etc.<br />
<br />
Moreover, if one class had different text elements (say, for example,<br />
the title and the labels of a scalar bar), there was no way to modify<br />
the text properties of these elements separately.<br />
<br />
The vtkTextProperty class has been created to address both issues, by<br />
obsoleting those duplicated attributes and methods and providing a<br />
unified way to access text properties, and by allowing each class to<br />
associate different vtkTextProperty to different text elements.<br />
<br />
Migrating your code basically involves using the old API on your actor's<br />
vtkTextProperty instead of the actor itself. For example:<br />
<br />
actor->SetBold(1);<br />
<br />
becomes:<br />
<br />
actor->GetTextProperty()->SetBold(1);<br />
<br />
When a class provides different vtkTextProperty for different text<br />
elements, the TextProperty attribute is usually prefixed with that<br />
element type. Example: AxisTitleTextProperty, or AxisLabelTextProperty.<br />
This allows you to set a different aspect for each text element. If you<br />
want to use the same properties, you can either set the same values on<br />
each vtkTextProperty, or make both attributes point to the same<br />
vtkTextProperty object. Example:<br />
<br />
actor->GetAxisLabelTextProperty()->SetBold(1);<br />
actor->GetAxisTitleTextProperty()->SetBold(1);<br />
<br />
or:<br />
<br />
vtkTextProperty *tprop = vtkTextProperty::New();<br />
tprop->SetBold(1);<br />
actor->SetAxisLabelTextProperty(tprop);<br />
actor->SetAxisTitleTextProperty(tprop);<br />
<br />
or:<br />
<br />
actor->SetAxisLabelTextProperty(actor->GetAxisTitleTextProperty());<br />
actor->GetAxisTitleTextProperty()->SetBold(1);<br />
<br />
The following list specifies the name of the text properties used in the<br />
VTK classes involving text.<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkTextMapper.html vtkTextMapper]:<br />
* you can still use the vtkTextMapper + vtkActor2D combination, but we would advise you to use a single vtkTextActor instead; this will give you maximum flexibility.<br />
* has 1 text prop: TextProperty. Although you have access to it, do not tweak it unless you are using vtkTextMapper with a vtkActor2D. In all other cases, use the text prop provided by the actor (see below).<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkTextActor.html vtkTextActor]:<br />
* has 1 text prop: TextProperty. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkLabeledDataMapper.html vtkLabeledDataMapper]:<br />
* has 1 text prop: LabelTextProperty. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkCaptionActor2D.html vtkCaptionActor2D]:<br />
* has 1 text prop: CaptionTextProperty. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkLegendBoxActor.html vtkLegendBoxActor]:<br />
* has 1 text prop: EntryTextProperty.<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkAxisActor2D.html vtkAxisActor2D],<br />
[http://www.vtk.org/doc/nightly/html/classvtkParallelCoordinatesActor.html vtkParallelCoordinatesActor], and<br />
[http://www.vtk.org/doc/nightly/html/classvtkScalarBarActor.html vtkScalarBarActor]:<br />
* have 2 text props: TitleTextProperty, LabelTextProperty.<br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkXYPlotActor.html vtkXYPlotActor]:<br />
* has 3 text prop: TitleTextProperty (plot title), AxisTitleTextProperty, AxisLabelTextProperty (title and labels of all axes)<br />
* the legend box text prop (i.e. entry text prop) can be retrieved through actor->GetLegendBoxActor()->GetEntryTextProperty()<br />
* the X (or Y) axis text props (i.e. title and label text props) can be retrieved through actor->GetX/YAxisActor2D->GetTitle/LabelTextProperty(), and will override the corresponding AxisTitleTextProperty or AxisLabelTextProperty props as long as they remain untouched. <br />
<br />
[http://www.vtk.org/doc/nightly/html/classvtkCubeAxesActor2D.html vtkCubeAxesActor2D]:<br />
* has 2 text props: AxisTitleTextProperty, AxisLabelTextProperty (title and label of all axes)<br />
* the X (Y or Z) axis text props (i.e. title and label text props) can be retrieved through actor->GetX/Y/ZAxisActor2D->GetTitle/LabelTextProperty(), and will override the corresponding AxisTitleTextProperty or AxisLabelTextProperty props as long as they remain untouched.<br />
<br />
=== Forward declaration in VTK 4.x ===<br />
<br />
Since VTK 4.x, all classes have been carefully inspected to include only the necessary headers, using what is called 'forward declaration' for all other needed classes. Thus, when you compile a project using a filter that takes a dataset as input and you pass it a vtkImageData, you need to explicitly include vtkImageData in your implementation file. This is true for all data types.<br />
<br />
For example, if you get this error:<br />
<br />
no matching function for call to `vtkContourFilter::SetInput(vtkImageData*)'<br />
VTK/Filtering/vtkDataSetToPolyDataFilter.h:44:<br />
candidates are: virtual void vtkDataSetToPolyDataFilter::SetInput(vtkDataSet*)<br />
<br />
This means you need to add to your code: #include "vtkImageData.h"<br />
<br />
=== Using Volume Rendering in VTK ===<br />
<br />
I recently updated my VTK CVS version, and my C++ code that used to work fine is now failing with:<br />
<br />
undefined reference to `vtkUnstructuredGridAlgorithm::SetInput(vtkDataObject*)'<br />
undefined reference to `vtkUnstructuredGridAlgorithm::GetOutput()' <br />
<br />
There is now a new subfolder and a new option to enable building the VolumeRendering library. You have to turn VTK_USE_VOLUMERENDERING to ON in order to use it. Also make sure that your executable is linking properly to this new library:<br />
<br />
ADD_EXECUTABLE(foo foo.cxx)<br />
TARGET_LINK_LIBRARIES(foo vtkVolumeRendering)<br />
<br />
=== API Changes in VTK 5.2 ===<br />
<br />
==== <tt>vtkProp::RenderTranslucentGeometry()</tt> is gone ====<br />
<br />
<tt>vtkProp::RenderTranslucentGeometry()</tt> is gone and has been broken down into 3 methods:<br />
* <tt>HasTranslucentPolygonalGeometry()</tt><br />
* <tt>RenderTranslucentPolygonalGeometry()</tt><br />
* <tt>RenderVolumetricGeometry()</tt><br />
<br />
Here is what to change in a vtkProp subclass:<br />
* If <tt>RenderTranslucentGeometry()</tt> was used to render translucent polygonal geometry only, override <tt>HasTranslucentPolygonalGeometry()</tt> and <tt>RenderTranslucentPolygonalGeometry()</tt>. <b>Just renaming <tt>RenderTranslucentGeometry()</tt> as <tt>RenderTranslucentPolygonalGeometry()</tt> is not enough!</b><br />
* If <tt>RenderTranslucentGeometry()</tt> was used to render translucent volumetric geometry only, override <tt>RenderVolumetricGeometry()</tt>. In this case, just renaming <tt>RenderTranslucentGeometry()</tt> as <tt>RenderVolumetricGeometry()</tt> is OK.<br />
* If <tt>RenderTranslucentGeometry()</tt> was used to render translucent polygonal geometry and translucent volumetric geometry, override all 3 methods.<br />
<br />
The reason for this change is that <tt>HasTranslucentPolygonalGeometry()</tt> is used to decide if an expensive initialization of the new rendering algorithm for translucent polygonal geometry (depth peeling) is necessary. <tt>RenderTranslucentPolygonalGeometry()</tt> is called multiple times during the rendering of the translucent polygonal geometry of the scene. <tt>RenderVolumetricGeometry()</tt> is called in an additional pass, after depth peeling. For this reason, <b><tt>RenderTranslucentGeometry()</tt> could not just be marked as deprecated but had to be removed from the API</b>.<br />
<br />
<br />
<br />
==== <tt>vtkImagePlaneWidget</tt> has action names changed ====<br />
from:<br />
enum<br />
{<br />
CURSOR_ACTION = 0,<br />
SLICE_MOTION_ACTION = 1,<br />
WINDOW_LEVEL_ACTION = 2<br />
};<br />
to:<br />
enum<br />
{<br />
VTK_CURSOR_ACTION = 0,<br />
VTK_SLICE_MOTION_ACTION = 1,<br />
VTK_WINDOW_LEVEL_ACTION = 2<br />
};<br />
<br />
==== <tt>GetOutput()</tt> now returns <tt>vtkDataObject</tt> for some algorithms ====<br />
<br />
The following algorithms now work on <tt>vtkGraph</tt> as well as <tt>vtkDataSet</tt>, so <tt>GetOutput()</tt> no longer returns <tt>vtkDataSet</tt>. To obtain the dataset, use <tt>vtkDataSet::SafeDownCast(filter->GetOutput())</tt>:<br />
* <tt>vtkArrayCalculator</tt><br />
* <tt>vtkAssignAttribute</tt><br />
* <tt>vtkProgrammableFilter</tt><br />
<br />
=== API Changes in VTK 5.4 ===<br />
* empty right now.<br />
=== API Changes in VTK 5.5 ===<br />
<br />
* vtkStreamTracer<br />
Changed<br />
enum Units <br />
{ <br />
TIME_UNIT, <br />
LENGTH_UNIT, <br />
CELL_LENGTH_UNIT <br />
}<br />
to<br />
enum Units<br />
{ <br />
TIME_UNIT = 0, <br />
CELL_LENGTH_UNIT = 2 <br />
}<br />
<br />
Changed<br />
* MaximumPropagation<br />
* MaximumIntegrationStep<br />
* MinimumIntegrationStep<br />
* InitialIntegrationStep <br />
from type ''IntervalInformation'' to type ''double''.<br />
<br />
Added a member variable to the class<br />
* int IntegrationStepUnit<br />
<br />
The following APIs were '''removed''' from the class:<br />
* void SetMaximumPropagation(int unit, double max)<br />
* void SetMaximumPropagationUnit(int unit)<br />
* int GetMaximumPropagationUnit()<br />
* void SetMaximumPropagationUnitToTimeUnit()<br />
* void SetMaximumPropagationUnitToLengthUnit()<br />
* void SetMaximumPropagationUnitToCellLengthUnit()<br />
* void SetMinimumIntegrationStep(int unit, double step)<br />
* void SetMinimumIntegrationStepUnit(int unit)<br />
* int GetMinimumIntegrationStepUnit()<br />
* void SetMinimumIntegrationStepUnitToTimeUnit()<br />
* void SetMinimumIntegrationStepUnitToLengthUnit()<br />
* void SetMinimumIntegrationStepUnitToCellLengthUnit()<br />
* void SetMaximumIntegrationStep(int unit, double step)<br />
* void SetMaximumIntegrationStepUnit(int unit)<br />
* int GetMaximumIntegrationStepUnit()<br />
* void SetMaximumIntegrationStepUnitToTimeUnit()<br />
* void SetMaximumIntegrationStepUnitToLengthUnit()<br />
* void SetMaximumIntegrationStepUnitToCellLengthUnit()<br />
* void SetInitialIntegrationStep(int unit, double step)<br />
* void SetInitialIntegrationStepUnit(int unit)<br />
* int GetInitialIntegrationStepUnit()<br />
* void SetInitialIntegrationStepUnitToTimeUnit()<br />
* void SetInitialIntegrationStepUnitToLengthUnit()<br />
* void SetInitialIntegrationStepUnitToCellLengthUnit()<br />
* void ConvertIntervals(double& step, double& minStep, double& maxStep, int direction, double cellLength, double speed)<br />
* static double ConvertToTime(IntervalInformation& interval, double cellLength, double speed)<br />
* static double ConvertToLength(IntervalInformation& interval, double cellLength, double speed)<br />
* static double ConvertToCellLength(IntervalInformation& interval, double cellLength, double speed)<br />
* static double ConvertToUnit(IntervalInformation& interval, double cellLength, double speed)<br />
<br />
The following APIs were added to the class:<br />
* int GetIntegrationStepUnit()<br />
* void SetIntegrationStepUnit(int unit)<br />
* void ConvertIntervals(double& step, double& minStep, double& maxStep, int direction, double cellLength)<br />
* static double ConvertToTime(double interval, int unit, double cellLength)<br />
* static double ConvertToTime(IntervalInformation& interval, double cellLength)<br />
* static double ConvertToCellLength(IntervalInformation& interval, double cellLength)<br />
* static double ConvertToUnit(IntervalInformation& interval, int unit, double cellLength)<br />
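A hedged migration sketch (assuming the plain-<tt>double</tt> setters generated for the members listed above; the values are illustrative only):<br />

```cpp
#include <vtkStreamTracer.h>

// VTK 5.5 style: the step unit is set once through
// SetIntegrationStepUnit(), and the step/propagation values are
// plain doubles interpreted in that unit.
void ConfigureTracer(vtkStreamTracer* tracer)
{
  // Old (removed): tracer->SetMaximumPropagation(vtkStreamTracer::TIME_UNIT, 100.0);
  tracer->SetIntegrationStepUnit(vtkStreamTracer::CELL_LENGTH_UNIT);
  tracer->SetMaximumPropagation(100.0);
  tracer->SetInitialIntegrationStep(0.5);
}
```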
<br />
<br />
* vtkInterpolatedVelocityField<br />
Added a new member variable and two associated functions:<br />
* bool NormalizeVector<br />
* vtkSetMacro(NormalizeVector, bool)<br />
* vtkGetMacro(NormalizeVector, bool)<br />
<br />
== OpenGL requirements ==<br />
<br />
=== Terminology ===<br />
<br />
* a software component using OpenGL (like VTK) <b>requires</b> some minimal version of OpenGL and some minimal set of OpenGL extensions at runtime. At compile time, it <b>requires</b> an OpenGL header file (<tt>gl.h</tt>) compatible with some minimal version of the OpenGL API.<br />
* an OpenGL implementation (software (like Mesa) or hardware (combination of a graphic card and its driver) ) <b>supports</b> some OpenGL versions and a set of extensions.<br />
<br />
=== How do I check which OpenGL versions or extensions are supported by my graphic card or OpenGL implementation? ===<br />
<br />
==== Linux/Unix ====<br />
<br />
Two ways:<br />
<br />
* General method<br />
<pre><br />
$ glxinfo<br />
</pre><br />
<br />
* vendor specific tool<br />
<br />
If you have an nVidia card with <tt>nvidia-settings</tt> installed, run it and go to the OpenGL/GLX Information item under the X Screen 0 item.<br />
<br />
==== Windows ====<br />
<br />
You can download and use GLview: http://www.realtech-vr.com/glview<br />
<br />
==== Mac OS X ====<br />
<br />
With Xcode installed, Macintosh HD->Developer->Applications->Graphic Tools->OpenGL Driver Monitor.app->Monitors->Renderer Info-><name of the OpenGL driver>->OpenGL Extensions<br />
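An application can also test for an extension at runtime by searching the space-separated string returned by <tt>glGetString(GL_EXTENSIONS)</tt>. A naive substring search can match a prefix of a longer extension name, so a sketch of a correct token search follows (<tt>HasExtension</tt> is a hypothetical helper, not a VTK API; obtaining a GL context is omitted):<br />

```cpp
#include <cstring>

// Returns true if 'name' appears as a complete token in the
// space-separated extension list 'extList' (as returned by
// glGetString(GL_EXTENSIONS)). Avoids false prefix matches,
// e.g. finding "GL_EXT_texture3" inside "GL_EXT_texture3D".
bool HasExtension(const char* extList, const char* name)
{
  if (!extList || !name || !*name) return false;
  const size_t len = std::strlen(name);
  const char* p = extList;
  while ((p = std::strstr(p, name)) != 0)
  {
    const char* end = p + len;
    bool startOk = (p == extList) || (p[-1] == ' ');
    bool endOk = (*end == ' ') || (*end == '\0');
    if (startOk && endOk) return true;
    p = end; // keep scanning past this partial match
  }
  return false;
}
```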
<br />
=== VTK 5.0 ===<br />
<br />
==== What is the minimal OpenGL version of the API required to compile VTK5.0? ====<br />
<br />
The <tt>gl.h</tt> file provided by your compiler/system/SDK has to define at least the OpenGL 1.1 API.<br />
<br />
(Note: the functions and macros defined in higher OpenGL API versions or in other OpenGL extensions are provided by <tt>glext.h</tt>, <tt>glxext.h</tt> and <tt>wglext.h</tt>. Those 3 files are official files taken from http://opengl.org/registry/ and already part of the VTK source tree).<br />
<br />
==== What is the minimal OpenGL version required by VTK5.0 at runtime? ====<br />
<br />
All the VTK classes using OpenGL require an OpenGL implementation (software or hardware) >=1.1 except for <tt>vtkVolumeTextureMapper3D</tt>.<br />
<br />
If you want to use <tt>vtkVolumeTextureMapper3D</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* extension <tt>GL_EXT_texture3D</tt> or OpenGL>=1.2<br />
and<br />
* extension <tt>GL_ARB_multitexture</tt> or OpenGL>=1.3<br />
and either:<br />
* extensions <tt>GL_ARB_fragment_program</tt> and <tt>GL_ARB_vertex_program</tt><br />
or:<br />
* extensions <tt>GL_NV_texture_shader2</tt> and <tt>GL_NV_register_combiners</tt> and <tt>GL_NV_register_combiners2</tt><br />
<br />
=== VTK 5.2 ===<br />
<br />
==== What is the minimal OpenGL version of the API required to compile VTK5.2? ====<br />
<br />
Same answer as for VTK 5.0.<br />
<br />
==== What is the minimal OpenGL version required by VTK5.2 at runtime? ====<br />
<br />
All the VTK classes using OpenGL require an OpenGL implementation (software or hardware) >=1.1 except for <tt>vtkVolumeTextureMapper3D</tt>, <tt>vtkHAVSVolumeMapper</tt>,<br />
<tt>vtkGLSLShaderProgram</tt>, depth peeling and some hardware offscreen rendering using framebuffer objects (FBO).<br />
<br />
If you want to use <tt>vtkVolumeTextureMapper3D</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* extension <tt>GL_EXT_texture3D</tt> or OpenGL>=1.2<br />
and<br />
* extension <tt>GL_ARB_multitexture</tt> or OpenGL>=1.3<br />
and either:<br />
* extensions <tt>GL_ARB_fragment_program</tt> and <tt>GL_ARB_vertex_program</tt><br />
or:<br />
* extensions <tt>GL_NV_texture_shader2</tt> and <tt>GL_NV_register_combiners</tt> and <tt>GL_NV_register_combiners2</tt><br />
<br />
If you want to use <tt>vtkHAVSVolumeMapper</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* OpenGL>=1.3<br />
* <tt>GL_ARB_draw_buffers</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_fragment_program</tt><br />
* <tt>GL_ARB_vertex_program</tt><br />
* <tt>GL_EXT_framebuffer_object</tt><br />
* either <tt>GL_ARB_texture_float</tt> or <tt>GL_ATI_texture_float</tt><br />
<br />
The following extension or OpenGL version is used by <tt>vtkHAVSVolumeMapper</tt> if provided (at runtime), but it is optional:<br />
* <tt>GL_ARB_vertex_buffer_object</tt> or OpenGL>=1.5<br />
<br />
If you want to use <tt>vtkGLSLShaderProgram</tt>, the following extensions or OpenGL versions are required (at runtime):<br />
* OpenGL>=1.3<br />
* <tt>GL_ARB_shading_language_100</tt> or OpenGL>=2.0,<br />
* <tt>GL_ARB_shader_objects</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_vertex_shader</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_fragment_shader</tt> or OpenGL>=2.0.<br />
<br />
Depth peeling ( see [[VTK/Depth_Peeling | VTK Depth Peeling]] for more information) requires (at runtime):<br />
* <tt>GL_ARB_depth_texture</tt> or OpenGL>=1.4<br />
* <tt>GL_ARB_shadow</tt> or OpenGL>=1.4<br />
* <tt>GL_EXT_shadow_funcs</tt> or OpenGL>=1.5<br />
* <tt>GL_ARB_vertex_shader</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_fragment_shader</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_shader_objects</tt> or OpenGL>=2.0<br />
* <tt>GL_ARB_occlusion_query</tt> or OpenGL>=1.5<br />
* <tt>GL_ARB_multitexture</tt> or OpenGL>=1.3<br />
* <tt>GL_ARB_texture_rectangle</tt><br />
* <tt>GL_SGIS_texture_edge_clamp</tt> or <tt>GL_EXT_texture_edge_clamp</tt> or OpenGL>=1.2<br />
<br />
Hardware-based offscreen rendering using framebuffer object (FBO) will be used as the default offscreen method if the following extensions or OpenGL version are available (at runtime):<br />
* <tt>GL_EXT_framebuffer_object</tt><br />
and either <br />
* <tt>GL_ARB_texture_non_power_of_two</tt> or OpenGL>=2.0<br />
or<br />
* <tt>GL_ARB_texture_rectangle</tt><br />
In addition, if the framebuffer needs a stencil buffer, the extension <tt>GL_EXT_packed_depth_stencil</tt> is required. Even if all those extensions are supported, the chosen FBO format might<br />
not be supported by the card; in that case, this method of offscreen rendering is not used.<br />
<br />
== Miscellaneous questions ==<br />
<br />
=== Can't you split up the data file? ===<br />
<br />
The data is now in one file that is about 15 megabytes. This is smaller<br />
than the original data files VTK used, and we hope that this size is no<br />
longer a problem. If it is, please let us know.<br />
<br />
=== Do you have any shared library tips? ===<br />
<br />
VTK version 4.0 and later supports both shared and static libraries on<br />
almost all platforms. For development we typically use shared libraries,<br />
since they are faster to link when making small changes. You can control<br />
how VTK builds by setting the BUILD_SHARED_LIBS option in CMake.<br />
<br />
== Legal issues ==<br />
<br />
=== Is VTK FDA-Approved ? ===<br />
<br />
Because VTK is a software toolkit, it cannot be the<br />
subject of FDA approval as a medical device. We have discussed<br />
this topic on several occasions and received advice from FDA<br />
representatives that can be summarized as follows:<br />
<br />
<br />
VTK is to be considered an off-the-shelf (OTS) product that<br />
is used to support a higher-level medical application/product.<br />
The developer of such an application/product will be responsible for<br />
performing the validation processes described in the FDA's published<br />
guidelines for the development of software-related medical devices.<br />
<br />
For more details see the page [[FDA Guidelines for Software Development]].<br />
<br />
=== What are the legal issues? ===<br />
<br />
The Visualization Toolkit software is provided under the following<br />
copyright. We think it's pretty reasonable. We do restrict the<br />
distribution of modified code. This is primarily a revision control<br />
issue. We don't want a bunch of renegade vtks running around without us<br />
having some idea why the changes were made and giving us a chance to<br />
incorporate them into the general release.<br />
<br />
The text of the VTK copyright is available [http://www.vtk.org/copyright.php here].<br />
<br />
=== What is the deal with the patents? ===<br />
<br />
As the copyright mentions there are some patents used in VTK. If you use<br />
any code in the Patented/ directory for commercial application you<br />
should contact the patent holder and obtain a license.<br />
<br />
As of VTK4.0 the following classes are known to use algorithms patented<br />
by General Electric Company: vtkDecimate, vtkMarchingCubes,<br />
vtkMarchingSquares, vtkDividingCubes, vtkSliceCubes and vtkSweptSurface.<br />
The GE contact is:<br />
<br />
Carl B. Horton<br />
Sr. Counsel, Intellectual Property<br />
3000 N. Grandview Blvd., W-710<br />
Waukesha, WI 53188<br />
Phone: (262) 513-4022<br />
E-Mail: mailto:Carl.Horton@med.ge.com<br />
<br />
As of VTK4.0 the following classes are known to use algorithms patented<br />
by Kitware, Inc.: vtkGridSynchronizedTemplates3D,<br />
vtkKitwareContourFilter, vtkSynchronizedTemplates2D, and<br />
vtkSynchronizedTemplates3D. The Kitware contact is:<br />
<br />
Ken Martin<br />
Kitware<br />
28 Corporate Drive, Suite 204,<br />
Clifton Park, NY 12065<br />
Phone:1-518-371-3971<br />
E-Mail: mailto:kitware@kitware.com<br />
<br />
=== Can VTK be used as part of a project distributed under a GPL License ? ===<br />
<br />
==== Short Answer ====<br />
<br />
Yes, it is fine to take VTK code and to include it in a project that is distributed under a GPL license.<br />
<br />
==== Long Answer ====<br />
<br />
===== Terms =====<br />
<br />
Let's call project X the larger project that:<br />
<br />
# Will include source code from VTK (in part or as a whole)<br />
# Will be distributed under GPL license<br />
<br />
Note in particular that:<br />
<br />
# The copyright notices in VTK files must be kept.<br />
# If VTK files are modified by the developers of project X, that fact must be clearly indicated.<br />
# Only the modifications of VTK files made by the developers of project X will be covered by a GPL license. The original VTK code remains covered by the VTK license.<br />
# The collection of copyrighted works (project X in this case), that includes VTK (in part or as a whole) and their software will be covered by a GPL license.<br />
<br />
===== Details =====<br />
<br />
As the [http://www.vtk.org/copyright.php VTK license] is a variation of the [http://www.opensource.org/licenses/bsd-license.php Modified BSD license], to which only the following term has been added:<br />
<br />
Modified source versions must be plainly marked as such, <br />
and must not be misrepresented as being the original software.<br />
<br />
and that the Modified BSD license is itself compatible with the GPL <br />
<br />
http://www.gnu.org/philosophy/license-list.html (Modified BSD license)<br />
<br />
The VTK license is therefore also compatible with the GPL license, since the terms of the GPL license do not preclude the additional term of the VTK license from being followed.<br />
<br />
NOTE: The licenses are only '''one way compatible'''.<br />
<br />
* You can use VTK code inside a GPL licensed project.<br />
* You '''can not''' use GPL licensed code inside VTK.<br />
<br />
That is the reason why there are no GPL third party libraries in VTK. Having GPL third party libraries in VTK would prevent closed source projects from being built against VTK.<br />
<br />
{{VTK/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK_Datasets&diff=15536VTK Datasets2009-06-05T19:31:37Z<p>Dcthomp: New page: Kitware maintains two public repositories that hold data VTK can read: * '''VTKData''' is a repository intended for small datasets used in regression tests * '''VTKLargeData''' is a reposi...</p>
<hr />
<div>Kitware maintains two public repositories that hold data VTK can read:<br />
* '''VTKData''' is a repository intended for small datasets used in regression tests<br />
* '''VTKLargeData''' is a repository intended for medium to large datasets used in examples<br />
<br />
== VTKData ==<br />
<br />
You may obtain this data through CVS:<br />
<br />
cvs -d :pserver:anonymous@www.vtk.org:/cvsroot/VTKData login<br />
cvs -d :pserver:anonymous@www.vtk.org:/cvsroot/VTKData -z3 co VTKData<br />
<br />
This repository is frequently fetched by dashboard machines and the repository contents change often as new baseline images for tests are added. So, in general, datasets in VTKData should not exceed 2MiB to avoid excessive network traffic.<br />
<br />
== VTKLargeData ==<br />
<br />
You may obtain this data through CVS:<br />
<br />
cvs -d :pserver:anonymous@www.vtk.org:/cvsroot/VTK login<br />
cvs -d :pserver:anonymous@www.vtk.org:/cvsroot/VTK -z3 co VTKLargeData<br />
<br />
This repository is intended as a place to keep sample data. It can hold large data (in the sense of datasets available over the internet -- not large in the sense of high performance computing). It should not change frequently and is ''not'' intended to hold data for use in nightly tests.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=VTK&diff=15535VTK2009-06-05T19:20:06Z<p>Dcthomp: /* Administrative Topics */ Add link to test and sample data page.</p>
<hr />
<div>http://public.kitware.com/images/logos/vtk-logo2.jpg<br />
<br /><br />
The Visualization ToolKit (VTK) is an open source, freely available software system for 3D computer graphics, image processing, and visualization used by thousands of researchers and developers around the world. VTK consists of a C++ class library, and several interpreted interface layers including Tcl/Tk, Java, and Python. Professional support and products for VTK are provided by Kitware, Inc. ([http://www.kitware.com www.kitware.com]) VTK supports a wide variety of visualization algorithms including scalar, vector, tensor, texture, and volumetric methods; and advanced modeling techniques such as implicit modelling, polygon reduction, mesh smoothing, cutting, contouring, and Delaunay triangulation. In addition, dozens of imaging algorithms have been directly integrated to allow the user to mix 2D imaging / 3D graphics algorithms and data.<br />
<br />
These are fully independent, compilable examples. There is significant overlap in the square examples, but they are each intended to illustrate a different concept.<br />
<br />
==Example Usage (C++)==<br />
===Working with PolyData===<br />
* [[Which Libraries Do I Link To?]]<br />
* [[Which Header Files Do I Include?]]<br />
* [[Useful Classes]]<br />
* [[Write a VTP file]]<br />
* [[Read a VTP file]]<br />
* [[Write a file of the four corners of a square]]<br />
* [[Write a file of a triangulated square]]<br />
* [[Write a file of the four corners of square, each with a different color]]<br />
* [[Write a file of a triangulated square, each corner with a different color]]<br />
* [[Write a file of a triangulated square, each triangle with a different color]]<br />
* [[Write a sphere to a VTP file]]<br />
* [[Write a plane to a VTP file]]<br />
* [[Write two cubes to a VTP file]]<br />
* [[Add Normals to a Polydata]]<br />
* [[Add Miscellaneous Data to Points in a Polydata]]<br />
* [[Add Global Miscellaneous Data to a Polydata]]<br />
* [[Extract Normals from a Polydata]]<br />
<br />
===Other File Types===<br />
* [[Write a VTU file]]<br />
* [[Read a VTU file]]<br />
* [[Read an OBJ File]]<br />
* [[Convert a series of DICOM files into a VTI File]]<br />
<br />
===Data Structures===<br />
* [[KDTree]]<br />
* [[Octree]]<br />
<br />
===Filters===<br />
* [[Apply a Transformation to Points]]<br />
* [[Landmark Transform]]<br />
* [[Iterative Closest Points (ICP) Transform]]<br />
<br />
==Administrative Topics==<br />
<br />
* Where can I find more [[VTK Additional Information|information about VTK]]?<br />
<br />
* [[VTK 5.4 Release Planning]]<br />
<br />
* Where can I [http://vtk.org/get-software.php download VTK]?<br />
<br />
* Where can I download a tarball of the [http://vtk.org/files/nightly/vtkNightlyDocHtml.tar.gz nightly HTML documentation]?<br />
<br />
* Where can I get [[VTK Datasets]]?<br />
<br />
* [[VTK Classes|Extending VTK]]<br />
<br />
* [[VTK Coding Standards]]<br />
<br />
* [[VTK cvs commit Guidelines]]<br />
<br />
* [[VTK Patch Procedure]] -- merge requests for the current release branch<br />
<br />
* [[VTK Scripts|Extending VTK with Scripts]]<br />
<br />
* [[VTK Tools|VTK-Based Tools and Applications]]<br />
<br />
* What are some [[VTK Projects|projects using VTK]]?<br />
<br />
* [[Proposed Changes to VTK | Proposed Changes to VTK]]<br />
<br />
* [[VTK FAQ|Frequently asked questions (FAQ)]]<br />
<br />
* [[VTK OpenGL|Common OpenGL troubles]]<br />
<br />
* [[VTK Related Job Opportunities|VTK Related Job Opportunities]]<br />
<br />
* [[VTK/Writing_VTK_files_using_python | Writing VTK files using python]]<br />
<br />
* [[VTK/mesh quality | Geometric mesh quality]]<br />
<br />
* [[VTK_XML_Formats | VTK XML Format Details]]<br />
<br />
* [[VTK_Third_Party_Library_Patrol | VTK 3rd Party Library Patrol]]<br />
<br />
* [[Python Wrapping FAQ]]<br />
<br />
== Current Projects ==<br />
* [[VTK/Graph Layout | VTK Graph Layout]]<br />
* [[VTK/Java Wrapping | VTK Java Wrapping]]<br />
* [[VTK/Composite Data Redesign | Composite Data Redesign]]<br />
* [[VTKWidgets | VTK Widget Redesign]]<br />
* [[VTKShaders | Shaders in VTK]]<br />
* [[VTK/VTKMatlab | VTK with Matlab]]<br />
* [[VTK/Time_Support | VTK Time support]]<br />
* [[VTK/Depth_Peeling | VTK Depth Peeling]]<br />
* [[VTK/MultiPass_Rendering | VTK Multi-Pass Rendering]]<br />
* [[VTK/Using_JRuby | Using VTK with JRuby]]<br />
* [[VTK/Painters | Painters]]<br />
* [[VTK/Cray XT3 Compilation| Cray XT3 Compilation]]<br />
* [[VTK/statistics | Statistics]]<br />
* [[VTK/Array Refactoring | Array Refactoring]]<br />
* [[VTK/Multicore and Streaming | Multicore and Streaming]]<br />
<br />
== External Links ==<br />
*[http://www.imtek.uni-freiburg.de/simulation/mathematica/IMSweb/ IMTEK Mathematica Supplement (IMS)], the Open Source IMTEK Mathematica Supplement (IMS) interfaces VTK and generates ParaView batch scripts<br />
*[http://zorayasantos.tripod.com/vtk_csharp_examples VTK examples in C#] (Visual Studio 5.0 and .NET 2.0)<br />
{{VTK/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=CMake_Borland_Compiler_Issues&diff=14088CMake Borland Compiler Issues2008-11-19T23:30:33Z<p>Dcthomp: Note about STL multisets with comparators</p>
<hr />
<div>== '+' (plus) in the file name problem==<br />
<br />
* The Borland linker does not support '+' in the name when creating an executable.<br />
<br />
* CMake solves this problem by creating object files with '+' replaced by '_p_'.<br />
<br />
== Broken streampos implementation ==<br />
<br />
Using the free bcc 5.5.1 compiler.<br />
<br />
<pre><br />
#include <iostream><br />
<br />
int main()<br />
{<br />
  std::streampos a = 9;<br />
  long c = 10;<br />
<br />
  if( a >= c )<br />
  {<br />
    std::cerr << "Impossible happened!" << std::endl;<br />
  }<br />
  else<br />
  {<br />
    std::cerr << "Seems reasonable" << std::endl;<br />
  }<br />
<br />
  return 0;<br />
}<br />
</pre><br />
<br />
On Borland, a >= c is true...<br />
<br />
According to the doc:<br />
<br />
http://www.cplusplus.com/ref/iostream/streampos.html<br />
<br />
This type describes a class to contain all the information needed to restore an arbitrary file-position indicator within a stream. It can be constructed from or casted to an integer offset value (streamoff).<br />
<br />
There is a bug in the definition of std::streampos::operator<= where the implementation does the opposite of what is expected.<br />
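One hedged workaround (assuming the position fits in an integral offset; <tt>StreamposAtLeast</tt> is a hypothetical helper) is to convert the <tt>streampos</tt> to <tt>std::streamoff</tt> before comparing, so the broken comparison operator is never invoked:<br />

```cpp
#include <iostream>

// Compare a stream position against a long offset without using
// Borland's broken streampos comparison operators: convert the
// position to std::streamoff (an integral type) first.
bool StreamposAtLeast(std::streampos pos, long offset)
{
  std::streamoff off = pos; // fpos provides this conversion
  return off >= static_cast<std::streamoff>(offset);
}
```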
<br />
== STL multisets with comparators ==<br />
<br />
The Borland 5.5 compiler's STL implementation does not properly copy comparators during assignment operations. For an example of how to work around this, see [http://public.kitware.com/cgi-bin/viewcvs.cgi/Rendering/vtkLabelHierarchy.cxx?r1=1.12&r2=1.13 vtkLabelHierarchy.cxx (rev 1.13)]<br />
<br />
{{CMake/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=IEEE_Vis08_ParaView_Tutorial&diff=13894IEEE Vis08 ParaView Tutorial2008-10-21T18:44:52Z<p>Dcthomp: Add link to plugin example code.</p>
<hr />
<div>Here are the slides for the '''''Advanced ParaView Visualization''''' tutorial given at the IEEE Vis08 ParaView tutorial. This tutorial comprises a collection of advanced topics given by a group of ParaView developers from various organizations. Most of the topics are intended for visualization experts and those already familiar with ParaView.<br />
<br />
Click one of the links in the agenda below to retrieve the slides for that presentation.<br />
<br />
{|<br />
| [[Media:ParaViewVis08_Introduction.ppt|Introduction]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_Selection.ppt|Selection]]<br />
| Utkarsh Ayachit<br />
| Kitware, Inc.<br />
|-<br />
| [[Media:ParaViewVis08_CustomizingParaView.ppt|Customizing ParaView]]<br />
| Timothy Shead<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_Python.ppt|Python Scripting]] ([[Media:ParaViewVis08_PythonScrtips.zip|Scripts]])<br />
| Utkarsh Ayachit<br />
| Kitware, Inc.<br />
|-<br />
| [[Media:IEEE08_Time-In-ParaView.ppt|Time in ParaView]]<br />
| John Biddiscombe<br />
| Swiss National Supercomputing Centre<br />
|-<br />
| [[Media:IEEE08_Particle-Rendering.ppt|Particle Visualization]]<br />
| John Biddiscombe<br />
| Swiss National Supercomputing Centre<br />
|-<br />
| [[Media:ParaViewVis08_ParallelVisualization.ppt|Parallel Visualization]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_GenericDataSetAPI.pdf|Generic Data Set API]]<br />
| David Thompson<br />
| Sandia National Laboratories<br />
|-<br />
|<br />
| See [[Plugin Examples]] for sample code.<br />
|<br />
|-<br />
| [[Media:ParaViewVis08_InfoVisInParaView.ppt|InfoVis in ParaView]]<br />
| Timothy Shead<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_Conclusion.ppt|Concluding Remarks]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:GenericDataSetExample1.png&diff=13893File:GenericDataSetExample1.png2008-10-21T18:08:01Z<p>Dcthomp: A screenshot of the edge/face plugin based on the GenericDataSet API running in ParaView.</p>
<hr />
<div>A screenshot of the edge/face plugin based on the GenericDataSet API running in ParaView.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView/Plugin_Examples&diff=13892ParaView/Plugin Examples2008-10-21T18:00:49Z<p>Dcthomp: /* Generic Data Set */</p>
<hr />
<div>This page contains plugins for use with ParaView. The ParaView CVS repository contains many examples in the Plugins directory.<br />
<br />
= Generic Data Set =<br />
<br />
[[Image:GenericDataSetExample1.png|thumb|right|A screenshot of the edge/face plugin based on the GenericDataSet API.]]<br />
If you would like to add new cell types to VTK that violate the assumptions that the unstructured grid and polydata classes make, this plugin illustrates what's required. The Skeleton subdirectory contains empty implementations of the classes required for creating a new mesh type. The main directory contains an implementation of many of the members and a small test program that provides interpolation of attributes defined by values attached to edges and faces instead of the traditional VTK point and cell data.<br />
<br />
* [[Media:GenericDataSetExample.tar.gz|GenericDataSetExample.tar.gz (60 kiB)]]<br />
<br />
<br />
{{ParaView/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:GenericDataSetExample.tar.gz&diff=13891File:GenericDataSetExample.tar.gz2008-10-21T17:54:35Z<p>Dcthomp: uploaded a new version of "Image:GenericDataSetExample.tar.gz": An example of how to implement the vtkGenericDataSet API. Please see the enclosed ReadMe.txt file for more information.</p>
<hr />
<div>This is an example of how to implement higher order or other novel finite element types into VTK using the vtkGenericDataSet API. It is a partial implementation the API to provide edge/face element interpolation.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:ParaViewVis08_GenericDataSetAPI.pdf&diff=13885File:ParaViewVis08 GenericDataSetAPI.pdf2008-10-20T21:41:36Z<p>Dcthomp: A presentation on how to subclass the VTK generic dataset classes to provide access to novel finite element formulations.</p>
<hr />
<div>A presentation on how to subclass the VTK generic dataset classes to provide access to novel finite element formulations.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=IEEE_Vis08_ParaView_Tutorial&diff=13884IEEE Vis08 ParaView Tutorial2008-10-20T21:39:42Z<p>Dcthomp: </p>
<hr />
<div>Here are the slides for the '''''Advanced ParaView Visualization''''' tutorial given at the IEEE Vis08 ParaView tutorial. This tutorial comprises a collection of advanced topics given by a group of ParaView developers from various organizations. Most of the topics are intended for visualization experts and those already familiar with ParaView.<br />
<br />
Click one of the links in the agenda below to retrieve the slides for that presentation.<br />
<br />
{|<br />
| [[Media:ParaViewVis08_Introduction.ppt|Introduction]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_Selection.ppt|Selection]]<br />
| Utkarsh Ayachit<br />
| Kitware, Inc.<br />
|-<br />
| [[Media:ParaViewVis08_CustomizingParaView.ppt|Customizing ParaView]]<br />
| Timothy Shead<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_Python.ppt|Python Scripting]] ([[Media:ParaViewVis08_PythonScrtips.zip|Scripts]])<br />
| Utkarsh Ayachit<br />
| Kitware, Inc.<br />
|-<br />
| [[Media:IEEE08_Time-In-ParaView.ppt|Time in ParaView]]<br />
| John Biddiscombe<br />
| Swiss National Supercomputing Centre<br />
|-<br />
| [[Media:IEEE08_Particle-Rendering.ppt|Particle Visualization]]<br />
| John Biddiscombe<br />
| Swiss National Supercomputing Centre<br />
|-<br />
| [[Media:ParaViewVis08_ParallelVisualization.ppt|Parallel Visualization]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_GenericDataSetAPI.pdf|Generic Data Set API]]<br />
| David Thompson<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_InfoVisInParaView.ppt|InfoVis in ParaView]]<br />
| Timothy Shead<br />
| Sandia National Laboratories<br />
|-<br />
| [[Media:ParaViewVis08_Conclusion.ppt|Concluding Remarks]]<br />
| Kenneth Moreland<br />
| Sandia National Laboratories<br />
|}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=File:GenericDataSetExample.tar.gz&diff=13879File:GenericDataSetExample.tar.gz2008-10-20T15:37:48Z<p>Dcthomp: This is an example of how to implement higher order or other novel finite element types into VTK using the vtkGenericDataSet API. It is a partial implementation the API to provide edge/face element interpolation.</p>
<hr />
<div>This is an example of how to implement higher order or other novel finite element types into VTK using the vtkGenericDataSet API. It is a partial implementation the API to provide edge/face element interpolation.</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView/Plugin_Examples&diff=13878ParaView/Plugin Examples2008-10-20T15:35:47Z<p>Dcthomp: New page: This page contains plugins for use with ParaView. The ParaView CVS repository contains many examples in the Plugins directory. = Generic Data Set = If you would like to add new cell type...</p>
<hr />
<div>This page contains plugins for use with ParaView. The ParaView CVS repository contains many examples in the Plugins directory.<br />
<br />
= Generic Data Set =<br />
<br />
If you would like to add new cell types to VTK that violate the assumptions that the unstructured grid and polydata classes make, this plugin illustrates what's required. The Skeleton subdirectory contains empty implementations of the classes required for creating a new mesh type. The main directory contains an implementation of many of the members and a small test program that provides interpolation of attributes defined by values attached to edges and faces instead of the traditional VTK point and cell data.<br />
<br />
* [[Media:GenericDataSetExample.tar.gz|GenericDataSetExample.tar.gz (50 kiB)]]<br />
<br />
<br />
{{ParaView/Template/Footer}}</div>Dcthomphttps://public.kitware.com/Wiki/index.php?title=ParaView/Plugin_HowTo&diff=13877ParaView/Plugin HowTo2008-10-20T15:23:59Z<p>Dcthomp: </p>
<hr />
<div>= Overview =<br />
<br />
ParaView comes with a plethora of functionality bundled in: several readers, a multitude of filters, quite a few different types of views, etc. However, it is not uncommon for developers to want to add new functionality to ParaView, e.g. to add support for a new file format or to incorporate a new filter. ParaView makes it possible to add new functionality through an extensive plugin mechanism.<br />
<br />
Plugins can be used to extend ParaView in several ways:<br />
* Add new readers, writers, and filters<br />
* Add custom GUI components, such as toolbar buttons, to perform common tasks<br />
* Add new views for displaying data<br />
<br />
Examples for different types of plugins are provided with the ParaView source under '''Examples/Plugins/'''.<br />
<br />
This document has two major sections:<br />
* First section covers how to use existing plugins in ParaView.<br />
* Second section contains information for developers about writing new plugins for ParaView.<br />
<br />
=Using Plugins=<br />
<br />
Plugins are distributed as shared libraries (*.so on Unix, *.dylib on Mac, *.dll on Windows, etc.). For a plugin to be loadable in ParaView, it must be built with the same version of ParaView that it is expected to be deployed on. Plugins can be classified into two broad categories:<br />
* Server-side plugins<br />
: These are plugins that extend the algorithmic capabilities of ParaView, e.g. new filters, readers, writers, etc. Since ParaView processes data on the server side, these plugins need to be loaded on the server.<br />
* Client-side plugins<br />
: These are plugins that extend the ParaView GUI, e.g. property panels for new filters, toolbars, and views. These plugins need to be loaded on the client.<br />
<br />
Often a plugin has both server-side and client-side components, e.g. a plugin that adds a new filter together with a property panel for that filter. Such plugins need to be loaded on both the server and the client. <br />
<br />
Generally, users don't have to worry about whether a plugin is a server-side or client-side plugin: simply load the plugin on both the server and the client. ParaView will load the relevant components of the plugin on each process.<br />
<br />
There are two ways for loading plugins:<br />
<br />
* Using the GUI ('''Plugin Manager''')<br />
: Plugins can be loaded into ParaView using the '''Plugin Manager''', accessible from the '''Tools | Manage Plugins/Extensions''' menu. The Plugin Manager has two sections, for loading local plugins and remote plugins (the latter enabled only when connected to a server). To load a plugin on the local or remote side, simply browse to the plugin's shared library. If the loading is successful, the plugin will appear in the list of loaded plugins. The Plugin Manager also lists the paths it searches to load plugins automatically.<br />
<table><br />
<tr><br />
<td><br />
[[Image:LocalPlugin_Manager.png|thumb|300px|'''Figure 1:''' Plugin Manager when not connected to a remote server, showing loaded plugins on the local site.''']]<br />
</td><br />
<td><br />
[[Image:RemotePlugin_Manager.png|thumb|300px|'''Figure 2:''' Plugin Manager when connected to a server showing loaded plugins on the local as well as remote sites.''']]<br />
</td><br />
</table><br />
* Using environment variable (Auto-loading plugins)<br />
: Plugins loaded using the Plugin Manager are not saved across sessions, so one has to reload them every time ParaView is started. If one wants ParaView to automatically load a set of plugins on startup, one can use the '''PV_PLUGIN_PATH''' environment variable. '''PV_PLUGIN_PATH''' lists a set of directories (separated by colon (:) or semicolon (;)) that ParaView searches on startup for plugins to load. This environment variable needs to be set both on the client node (to load local plugins) and on the remote server (to load remote plugins).<br />
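: For example, in a Unix shell one might set the variable as follows (the directories below are hypothetical; substitute the locations of your own plugin libraries):<br />

```shell
# Hypothetical plugin directories; replace with your own paths.
export PV_PLUGIN_PATH="/opt/paraview/plugins:/home/user/paraview_plugins"
# Any ParaView client (or server) started from this shell will scan
# these directories for plugins on startup.
echo "$PV_PLUGIN_PATH"
```

: On the server side, the same variable would be set in the environment of the server process before it is launched.<br />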
<br />
=Writing Plugins=<br />
This section covers writing and compiling different types of plugins. To create a plugin, one must have one's own build of ParaView. Binaries downloaded from www.paraview.org do not include the header files or import libraries (where applicable) necessary for compiling plugins.<br />
<br />
The beginning of a plugin's CMakeLists.txt file contains<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
CMake will ask for ParaView_DIR, which you should point to your ParaView build directory. Including PARAVIEW_USE_FILE brings in the build parameters and macros needed for building plugins.<br />
<br />
==Adding a Filter==<br />
<br />
In this example, we want to add a new filter to ParaView. The filter has to be a VTK-based algorithm, written following the standard procedures for writing VTK algorithms. Generally, for such cases where we are adding a new VTK class to ParaView (be it a filter, reader, or writer), we need to do the following tasks:<br />
* Write a '''Server Manager Configuration XML''' which describes the ''Proxy'' interface for the new VTK class. Basically, this defines the interface for the client to create and modify instances of the new class on the server side. Please refer to the [http://www.kitware.com/products/paraview.html ParaView Guide] for details about writing these server-manager XML files.<br />
* Write a configuration XML for the GUI to make the ParaView GUI aware of this new class, if applicable. For filters, this is optional, since ParaView automatically recognizes filters added through plugins and lists them in the '''Alphabetical''' sub-menu. One may use the GUI configuration XML to add the new filter to a specific category in the ''Filters'' menu, or to add a new category. For readers and writers, this is required, since the ParaView GUI needs to know which file extensions your reader/writer supports.<br />
<br />
For this example, refer to '''Examples/Plugins/Filter''' in the ParaView source. Let's say we have written a new vtkMyElevationFilter (vtkMyElevationFilter.h|cxx), which extends the functionality of vtkElevationFilter, and we want to package it as a plugin for ParaView. For starters, we simply want to use this filter in ParaView (without doing anything fancy with Filters menu categories). As described above, we need to write the server manager configuration XML (MyElevationFilter.xml). Once that's done, we write a CMakeLists.txt file to package this into a plugin. <br />
<br />
This CMakeLists.txt simply needs to include the following lines:<br />
<br />
<font color="green"># Locate ParaView build and then import CMake configuration, <br />
# macros etc. from it.</font><br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
<br />
<font color="green"># Use the ADD_PARAVIEW_PLUGIN macro to build a plugin</font><br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(<br />
MyElevation <font color="green">#<--Name for the plugin</font><br />
"1.0" <font color="green">#<--Version string</font><br />
<font color="purple">SERVER_MANAGER_XML</font> MyElevationFilter.xml <font color="green">#<-- server manager xml</font><br />
<font color="purple">SERVER_MANAGER_SOURCES</font> vtkMyElevationFilter.cxx <font color="green">#<-- source files for the new classes</font><br />
)<br />
<br />
Then, using CMake and a build system, one can build a plugin for this new filter. Once this plugin is loaded, the filter will appear under the "Alphabetical" list in the Filters menu.<br />
<br />
===Adding ''Categories'' to the Filters Menu===<br />
<br />
Now suppose we want to add a new category to the Filters menu, called "Extensions" and then show this filter in that submenu. In that case, we'll need a GUI configuration xml to tell the ParaView GUI to create the category. This GUI configuration xml will look as such:<br />
<br />
<source lang="xml"><br />
<ParaViewFilters><br />
<Category name="Extensions" menu_label="&amp;Extensions"><br />
<!-- adds a new category and then adds our filter to it --><br />
<Filter name="MyElevationFilter" /><br />
</Category><br />
</ParaViewFilters><br />
</source><br />
<br />
If the name of the category is the same as that of an already existing category, e.g. ''Data Analysis'', then the filter gets added to the existing category.<br />
<br />
The CMakeLists.txt must change to include this new xml (let's call it MyElevationGUI.xml) as follows:<br />
<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
<br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(MyElevation "1.0"<br />
<font color="purple">SERVER_MANAGER_XML</font> MyElevationFilter.xml <br />
<font color="purple">SERVER_MANAGER_SOURCES</font> vtkMyElevationFilter.cxx<br />
<font color="purple">GUI_RESOURCE_FILES</font> MyElevationGUI.xml)<br />
<br />
===Adding Icons===<br />
You can see that some filters in the Filters menu (e.g. Clip) have icons associated with them. A plugin can add icons for the filters it adds as well. For that you need to write a Qt resource file (say MyElevation.qrc) as follows:<br />
<br />
<source lang="xml"><br />
<RCC><br />
<qresource prefix="/MyIcons" ><br />
<file>MyElevationIcon.png</file><br />
</qresource><br />
</RCC><br />
</source><br />
<br />
The GUI configuration xml now refers to the icon provided by this resource as follows:<br />
<source lang="xml"><br />
<ParaViewFilters><br />
<Category name="Extensions" menu_label="&amp;Extensions"><br />
<!-- adds a new category and then adds our filter to it --><br />
<Filter name="MyElevationFilter" icon=":/MyIcons/MyElevationIcon.png" /><br />
</Category><br />
</ParaViewFilters><br />
</source><br />
<br />
Finally, the CMakeLists.txt file must change to include our MyElevation.qrc file as follows:<br />
<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
<br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(MyElevation "1.0"<br />
<font color="purple">SERVER_MANAGER_XML</font> MyElevationFilter.xml <br />
<font color="purple">SERVER_MANAGER_SOURCES</font> vtkMyElevationFilter.cxx<br />
<font color="purple">GUI_RESOURCES</font> MyElevation.qrc<br />
<font color="purple">GUI_RESOURCE_FILES</font> MyElevationGUI.xml)<br />
<br />
==Enabling a filter in VTK==<br />
<br />
Sometimes, the filter that one wants to add to ParaView is already available in VTK; it's just not exposed through the ParaView GUI. For such filters, too, one can create a plugin just as for a new filter. In this case we again need the server manager configuration XML describing the filter's API and the optional GUI XML to add the filter to a specific category. <br />
<br />
For example, let's say we simply want to expose the '''vtkCellDerivatives''' filter in VTK. Then first, we write the server manager configuration XML (call it CellDerivatives.xml), similar to what we would have done for adding a new filter. Please refer to the [http://www.kitware.com/products/paraview.html ParaView Guide] for details about writing this XML.<br />
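A minimal version of such an XML might look like the following sketch, patterned on the proxy definitions shown elsewhere on this page (property elements for the filter's options are omitted; this is an illustration, not the exact XML shipped with ParaView):<br />
<br />
<source lang="xml"><br />
<ServerManagerConfiguration><br />
  <ProxyGroup name="filters"><br />
    <SourceProxy name="CellDerivatives" class="vtkCellDerivatives"<br />
                 label="Cell Derivatives"><br />
      <InputProperty name="Input" command="SetInputConnection"><br />
        <ProxyGroupDomain name="groups"><br />
          <Group name="sources"/><br />
          <Group name="filters"/><br />
        </ProxyGroupDomain><br />
        <DataTypeDomain name="input_type"><br />
          <DataType value="vtkDataSet"/><br />
        </DataTypeDomain><br />
      </InputProperty><br />
    </SourceProxy><br />
  </ProxyGroup><br />
</ServerManagerConfiguration><br />
</source><br />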
<br />
Now we can compile this into a plugin using the following CMakeLists.txt<br />
<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
<br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(CellDerivatives "1.0"<br />
<font color="purple">SERVER_MANAGER_XML</font> CellDerivatives.xml)<br />
<br />
However, this approach requires ParaView built from source (as described earlier). Alternatively, we can simply load the XML from the '''Plugin Manager''' (just select the file type to be *.xml in the file open dialog).<br />
<br />
Similarly, compiled Qt resources (*.bqrc) can be loaded at runtime. A *.bqrc is a binary file containing resources, which can include icons and the GUI configuration XMLs for adding categories. A .bqrc can be made from a .qrc by running the rcc utility provided by Qt:<br />
 rcc -binary -o myfile.bqrc myfile.qrc<br />
<br />
==Adding a Reader==<br />
<br />
Adding a new reader through a plugin is similar to adding a filter, except that instead of a GUI configuration XML describing categories in the Filters menu, we require an XML defining which file extensions the reader can handle. This XML (MyReaderGUI.xml) looks like this:<br />
<br />
<source lang="xml"><br />
<ParaViewReaders><br />
<Reader name="MyPNGReader" extensions="mypng"<br />
file_description="My PNG Files"><br />
</Reader><br />
</ParaViewReaders><br />
</source><br />
<br />
And the CMakeLists.txt looks as follows, where vtkMyPNGReader.cxx is the source for the reader, MyPNGReader.xml is the server manager configuration XML, and MyReaderGUI.xml is the GUI configuration XML described above:<br />
<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(MyReader "1.0" <br />
<font color="purple">SERVER_MANAGER_XML</font> MyPNGReader.xml<br />
<font color="purple">SERVER_MANAGER_SOURCES</font> vtkMyPNGReader.cxx <br />
<font color="purple">GUI_RESOURCE_FILES</font> MyReaderGUI.xml)<br />
<br />
If you want your reader to work correctly with a file series, please refer to [[Animating legacy VTK file series#Making custom readers work with file series|file series animation]] for details.<br />
<br />
==Adding a Writer==<br />
<br />
As with a reader, for a writer we need to tell ParaView which file extensions the writer supports. This can be done using a GUI configuration XML as follows:<br />
<br />
<source lang="xml"><br />
<ParaViewWriters><br />
<Writer name="MyTIFFWriter"<br />
extensions="tif"<br />
file_description="My Tiff Files"><br />
</Writer><br />
</ParaViewWriters><br />
</source><br />
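The CMakeLists.txt for packaging such a writer mirrors the reader example above; the file names used here (vtkMyTIFFWriter.cxx, MyTIFFWriter.xml, MyWriterGUI.xml) are hypothetical:<br />
<br />
 FIND_PACKAGE(ParaView REQUIRED)<br />
 INCLUDE(${PARAVIEW_USE_FILE})<br />
 ADD_PARAVIEW_PLUGIN(MyWriter "1.0"<br />
   SERVER_MANAGER_XML MyTIFFWriter.xml<br />
   SERVER_MANAGER_SOURCES vtkMyTIFFWriter.cxx<br />
   GUI_RESOURCE_FILES MyWriterGUI.xml)<br />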
<br />
==Adding a Toolbar==<br />
<br />
Filters, readers, and writers are by far the most common ways of extending ParaView. However, ParaView's plugin functionality goes far beyond that. The following sections cover some of the more advanced plugins that can be written.<br />
<br />
Applications use toolbars to provide easy access to commonly used functionality. It is possible to have plugins that add new toolbars to ParaView. The plugin developer implements their own C++ code to handle the callback for each button on the toolbar. Hence one can perform virtually any operation from a toolbar plugin, given some understanding of the ParaView Server Manager framework and the ParaView GUI components. <br />
<br />
Please refer to '''Examples/Plugins/SourceToolbar''' for this section. There we are adding a toolbar with two buttons to create a sphere and a cylinder source. To add a toolbar, one needs to implement a subclass of [http://doc.trolltech.com/4.3/qactiongroup.html QActionGroup] which adds a [http://doc.trolltech.com/4.3/qaction.html QAction] for each toolbar button and then implements the handler for the callback when the user clicks any of the buttons. In the example, '''SourceToolbarActions.h|cxx''' is the QActionGroup subclass that adds the two tool buttons.<br />
<br />
To build the plugin, the CMakeLists.txt file is:<br />
<br />
<font color="green"># We need to wrap for Qt stuff such as signals/slots etc. to work correctly.</font><br />
QT4_WRAP_CPP(MOC_SRCS SourceToolbarActions.h)<br />
<br />
<font color="green"># This is a macro for adding QActionGroup subclasses automatically as toolbars.</font><br />
<font color="violet">ADD_PARAVIEW_ACTION_GROUP</font>(IFACES IFACE_SRCS <br />
<font color="purple">CLASS_NAME</font> SourceToolbarActions<br />
<font color="purple">GROUP_NAME</font> "ToolBar/SourceToolbar")<br />
<br />
<font color="green"># Now create a plugin for the toolbar. Here we pass IFACES and IFACE_SRCS<br />
# which are filled up by the above macro with relevant entries</font><br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(SourceToolbar "1.0"<br />
<font color="purple">GUI_INTERFACES</font> ${IFACES}<br />
<font color="purple">SOURCES</font> ${MOC_SRCS} ${IFACE_SRCS} <br />
SourceToolbarActions.cxx)<br />
<br />
For the GROUP_NAME, we are using '''ToolBar/SourceToolbar'''; here '''ToolBar''' is a keyword which implies that the action group is a toolbar (and shows up under '''View | Toolbars''' menu) with the name '''SourceToolbar'''. When the plugin is loaded, this toolbar will show up with two buttons.<br />
<br />
== Adding an object panel ==<br />
Object Panels are the panels for editing object properties.<br />
<br />
ParaView3 contains automatic panel generation code which is suitable for most objects. If you find your object doesn't have a good auto-generated panel, you can make your own.<br />
<br />
To make your own, see the explanation on [[CustomObjectPanels]].<br />
<br />
Now let's say we want to make our own panel for a ConeSource. In this example, we'll just add a simple label saying that this panel came from the plugin. In ConePanel.h:<br />
<br />
<source lang="cpp"><br />
#include "pqAutoGeneratedObjectPanel.h"<br />
#include <QLabel><br />
#include <QLayout><br />
<br />
class ConePanel : public pqAutoGeneratedObjectPanel<br />
{<br />
Q_OBJECT<br />
public:<br />
ConePanel(pqProxy* pxy, QWidget* p)<br />
: pqAutoGeneratedObjectPanel(pxy, p)<br />
{<br />
this->layout()->addWidget(new QLabel("This is from a plugin", this));<br />
}<br />
};<br />
</source><br />
<br />
Then in our CMakeLists.txt file:<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
QT4_WRAP_CPP(MOC_SRCS ConePanel.h)<br />
<font color="violet">ADD_PARAVIEW_OBJECT_PANEL</font>(IFACES IFACE_SRCS <br />
<font color="purple">CLASS_NAME</font> ConePanel<br />
<font color="purple">XML_NAME</font> ConeSource <font color="purple">XML_GROUP</font> sources)<br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(GUIConePanel "1.0"<br />
<font color="purple">GUI_INTERFACES</font> ${IFACES}<br />
<font color="purple">SOURCES</font> ${MOC_SRCS} ${IFACE_SRCS})<br />
<br />
==Adding components to Display Panel (decorating display panels)==<br />
The display panel is the panel shown on the '''Display''' tab in the '''Object Inspector'''. It is possible to add GUI components to existing [http://www.paraview.org/ParaView3/Doc/Nightly/html/classpqDisplayPanel.html display panels].<br />
<br />
In this example we want to add a GUI element to the display panel shown for the spreadsheet view, to control the amount of data that is fetched to the client at one time, referred to as the ''Block Size''.<br />
<br />
For that, we write the implementation in a QObject subclass (say MySpreadsheetDecorator) with a constructor that takes the pqDisplayPanel to be decorated.<br />
<br />
<source lang="cpp"><br />
...<br />
class MySpreadsheetDecorator : public QObject<br />
{<br />
...<br />
public:<br />
MySpreadsheetDecorator(pqDisplayPanel* panel);<br />
virtual ~MySpreadsheetDecorator();<br />
...<br />
};<br />
</source><br />
<br />
In the constructor, we have access to the panel; hence we can get its ''layout'' and add custom widgets to it. In this case, it would be a spin-box or a line edit to enter the block size. <br />
''pqDisplayPanel::getRepresentation()'' provides access to the representation being shown on the panel. We can use [http://www.paraview.org/ParaView3/Doc/Nightly/html/classpqPropertyLinks.html pqPropertyLinks] to link the "BlockSize" property on the representation with the spin-box for the block size so that when the widget is changed by the user, the property changes and vice-versa.<br />
<br />
Now the CMakeLists.txt to package this plugin looks as follows:<br />
<br />
QT4_WRAP_CPP(MOC_SRCS MySpreadsheetDecorator.h)<br />
<br />
<font color="green"># This is the macro to add a display panel decorator.<br />
# It needs the class name, and the panel types we are decorating. It fills up <br />
# IFACES and IFACE_SRCS with proper values as needed by ADD_PARAVIEW_PLUGIN macro.</font><br />
<font color="violet">ADD_PARAVIEW_DISPLAY_PANEL_DECORATOR</font>(<br />
IFACES IFACE_SRCS <br />
<font color="purple">CLASS_NAME</font> MySpreadsheetDecorator<br />
<font color="purple">PANEL_TYPES</font> pqSpreadSheetDisplayEditor <br />
<font color="green"># <-- This identifies the panel type(s) to decorate<br />
# Our decorator will only be instantiated for the panel types indicated here</font><br />
)<br />
<br />
<font color="green"># create a plugin</font><br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(MySpreadsheetDecorator "1.0" <br />
<font color="purple">GUI_INTERFACES</font> ${IFACES} <br />
<font color="purple">SOURCES</font> MySpreadsheetDecorator.cxx ${MOC_SRCS} ${IFACE_SRCS})<br />
<br />
An example panel decorator is available under '''Examples/Plugins/DisplayPanelDecorator''' in the ParaView source.<br />
<br />
==Autostart Plugins==<br />
This refers to a plugin which needs to be notified when ParaView starts up or when the plugin is loaded, whichever happens later, and then notified when ParaView quits. An example is in '''Examples/Plugins/Autostart''' in the ParaView source. For such a plugin, we need to provide a QObject subclass (pqMyApplicationStarter) with methods that are to be called on startup and shutdown.<br />
<br />
<source lang="cpp"><br />
...<br />
class pqMyApplicationStarter : public QObject<br />
{<br />
...<br />
public:<br />
// Callback for startup.<br />
// This cannot take any arguments<br />
void onStartup();<br />
<br />
// Callback for shutdown.<br />
// This cannot take any arguments<br />
void onShutdown();<br />
...<br />
};<br />
</source><br />
<br />
The CMakeLists.txt looks as follows:<br />
<br />
QT4_WRAP_CPP(MOC_SRCS pqMyApplicationStarter.h)<br />
<br />
<font color="green"># Macro for auto-start plugins. We specify the class name<br />
# and the methods to call on startup and shutdown on an instance of that class.<br />
# It fills IFACES and IFACE_SRCS with proper values as needed by ADD_PARAVIEW_PLUGIN macro.</font><br />
<font color="violet">ADD_PARAVIEW_AUTO_START</font>(IFACES IFACE_SRCS <br />
<font color="purple">CLASS_NAME</font> pqMyApplicationStarter <font color="green"># the class name for our class</font><br />
<font color="purple">STARTUP</font> onStartup <font color="green"># specify the method to call on startup</font><br />
<font color="purple">SHUTDOWN</font> onShutdown <font color="green"># specify the method to call on shutdown</font><br />
)<br />
<br />
<font color="green"># Create a plugin for this starter </font><br />
<font color="violet">ADD_PARAVIEW_PLUGIN</font>(Autostart "1.0" <br />
<font color="purple">GUI_INTERFACES</font> ${IFACES} <br />
<font color="purple">SOURCES</font> pqMyApplicationStarter.cxx ${MOC_SRCS} ${IFACE_SRCS})<br />
<br />
== Adding a custom view ==<br />
ParaView contains a render view for rendering 3D images. It also contains chart views to visualize data in line charts and histogram charts. You may want to create a custom view that presents the data in your own way.<br />
<br />
For this example, we'll just make a simple Qt widget with labels showing the displays that have been added to the view.<br />
<br />
To make a custom view, we need both client and server side plugins.<br />
<br />
For our server side, we simply have:<br />
<source lang="xml"><br />
<ServerManagerConfiguration><br />
<ProxyGroup name="displays"><br />
<GenericViewDisplayProxy name="MyDisplay"<br />
base_proxygroup="displays" base_proxyname="GenericViewDisplay"><br />
</GenericViewDisplayProxy><br />
</ProxyGroup><br />
<ProxyGroup name="views"><br />
<ViewModuleProxy name="MyViewViewModule"<br />
base_proxygroup="rendermodules" base_proxyname="ViewModule"<br />
display_name="MyDisplay"><br />
</ViewModuleProxy><br />
</ProxyGroup><br />
<ProxyGroup name="filters"><br />
<SourceProxy name="MyExtractEdges" class="vtkExtractEdges"<br />
label="My Extract Edges"><br />
<InputProperty<br />
name="Input"<br />
command="SetInputConnection"><br />
<ProxyGroupDomain name="groups"><br />
<Group name="sources"/><br />
<Group name="filters"/><br />
</ProxyGroupDomain><br />
<DataTypeDomain name="input_type"><br />
<DataType value="vtkDataSet"/><br />
</DataTypeDomain><br />
</InputProperty><br />
<Hints><br />
<View type="MyView"/><br />
</Hints><br />
</SourceProxy><br />
</ProxyGroup><br />
</ServerManagerConfiguration><br />
</source><br />
<br />
We define "MyDisplay" as a simple display proxy, and "MyViewModule" as a simple view module.<br />
We have our own filter "MyExtractEdges" with a hint saying it prefers to be shown in a view of type "MyView". So if we create a MyExtractEdges in ParaView, it'll automatically be shown in our custom view.<br />
<br />
We build the server plugin with a CMakeLists.txt file as:<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
ADD_PARAVIEW_PLUGIN(SMMyView "1.0" SERVER_MANAGER_XML MyViewSM.xml)<br />
<br />
<br />
Our client side plugin will contain an extension of pqGenericViewModule.<br />
We can let ParaView give us a display panel for these displays, or we can make our own deriving from pqDisplayPanel. In this example, we'll make a simple display panel.<br />
<br />
We implement MyView in MyView.h:<br />
<source lang="cpp"><br />
#include "pqGenericViewModule.h"<br />
#include <QMap><br />
#include <QLabel><br />
#include <QVBoxLayout><br />
#include <vtkSMProxy.h><br />
#include <pqDisplay.h><br />
#include <pqServer.h><br />
#include <pqPipelineSource.h><br />
<br />
/// a simple view that shows a QLabel with the display's name in the view<br />
class MyView : public pqGenericViewModule<br />
{<br />
Q_OBJECT<br />
public:<br />
MyView(const QString& viewtypemodule, const QString& group, const QString& name,<br />
vtkSMAbstractViewModuleProxy* viewmodule, pqServer* server, QObject* p)<br />
: pqGenericViewModule(viewtypemodule, group, name, viewmodule, server, p)<br />
{<br />
this->MyWidget = new QWidget;<br />
new QVBoxLayout(this->MyWidget);<br />
<br />
// connect to display creation so we can show them in our view<br />
this->connect(this, SIGNAL(displayAdded(pqDisplay*)),<br />
SLOT(onDisplayAdded(pqDisplay*)));<br />
this->connect(this, SIGNAL(displayRemoved(pqDisplay*)),<br />
SLOT(onDisplayRemoved(pqDisplay*)));<br />
<br />
}<br />
~MyView()<br />
{<br />
delete this->MyWidget;<br />
}<br />
<br />
/// we don't support save images<br />
bool saveImage(int, int, const QString& ) { return false; }<br />
vtkImageData* captureImage(int) { return NULL; }<br />
<br />
/// return the QWidget to give to ParaView's view manager<br />
QWidget* getWidget()<br />
{<br />
return this->MyWidget;<br />
}<br />
/// returns whether this view can display the given source<br />
bool canDisplaySource(pqPipelineSource* source) const<br />
{<br />
if(!source ||<br />
this->getServer()->GetConnectionID() != source->getServer()->GetConnectionID() ||<br />
QString("MyExtractEdges") != source->getProxy()->GetXMLName())<br />
{<br />
return false;<br />
}<br />
return true;<br />
}<br />
<br />
protected slots:<br />
void onDisplayAdded(pqDisplay* d)<br />
{<br />
QString text = QString("Display (%1)").arg(d->getProxy()->GetSelfIDAsString());<br />
QLabel* label = new QLabel(text, this->MyWidget);<br />
this->MyWidget->layout()->addWidget(label);<br />
this->Labels.insert(d, label);<br />
}<br />
<br />
void onDisplayRemoved(pqDisplay* d)<br />
{<br />
QLabel* label = this->Labels.take(d);<br />
if(label)<br />
{<br />
this->MyWidget->layout()->removeWidget(label);<br />
delete label;<br />
}<br />
}<br />
<br />
protected:<br />
<br />
QWidget* MyWidget;<br />
QMap<pqDisplay*, QLabel*> Labels;<br />
<br />
};<br />
</source><br />
<br />
And MyDisplay.h is:<br />
<source lang="cpp"><br />
#include "pqDisplayPanel.h"<br />
#include <QVBoxLayout><br />
#include <QLabel><br />
<br />
class MyDisplay : public pqDisplayPanel<br />
{<br />
Q_OBJECT<br />
public:<br />
MyDisplay(pqDisplay* display, QWidget* p)<br />
: pqDisplayPanel(display, p)<br />
{<br />
QVBoxLayout* l = new QVBoxLayout(this);<br />
l->addWidget(new QLabel("From Plugin", this));<br />
}<br />
};<br />
</source><br />
<br />
The CMakeLists.txt file to build the client plugin would be:<br />
FIND_PACKAGE(ParaView REQUIRED)<br />
INCLUDE(${PARAVIEW_USE_FILE})<br />
<br />
QT4_WRAP_CPP(MOC_SRCS MyView.h MyDisplay.h)<br />
<br />
ADD_PARAVIEW_VIEW_MODULE(IFACES IFACE_SRCS VIEW_TYPE MyView VIEW_XML_GROUP views<br />
DISPLAY_XML MyDisplay DISPLAY_PANEL MyDisplay)<br />
<br />
ADD_PARAVIEW_PLUGIN(GUIMyView "1.0" GUI_INTERFACES ${IFACES}<br />
SOURCES ${MOC_SRCS} ${IFACE_SRCS})<br />
<br />
We load the plugins in ParaView, create something like a Cone, and then create a "My Extract Edges" filter. The multiview manager will create a new view showing a label such as "Display (151)".<br />
<br />
In ParaView 3.4, there is also a macro, ADD_PARAVIEW_VIEW_OPTIONS(), which allows adding options pages for the custom view, accessible from Edit -> View Settings. The example in ParaView3/Examples/Plugins/GUIView demonstrates this.<br />
<br />
= Examples =<br />
<br />
The ParaView CVS repository contains many examples in the Plugins directory. Additional examples are available on this wiki at the [[Plugin Examples]] entry.<br />
<br />
{{ParaView/Template/Footer}}</div>Dcthomp