Difference between revisions of "ParaView/Users Guide/List of filters"

From KitwarePublic
|
|
Only the values 0 and 1 are accepted.
|-
| '''Isosurface'''<br>''(ContourValue)''
|
This property specifies the values at which to compute the isosurface.
| 1
|
The value must lie within the range of the selected data array.






|-
| '''Select Material Arrays'''<br>''(SelectMaterialArrays)''
|
This property specifies the cell arrays from which the contour filter will compute contour cells.


|
|
|
|
Only the values 0 and 1 are accepted.
|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|
This property specifies the values at which to compute the isosurface.
| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.








Clip with scalars. Tetrahedra.




| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Degenerate Cells'''<br>''(DegenerateCells)''
|
If this property is on, a transition mesh between levels is created.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Input'''<br>''(Input)''


The selected dataset must be one of the following types (or a subclass of one of them): vtkCompositeDataSet.
|-
| '''Internal Decimation'''<br>''(InternalDecimation)''
|
If this property is on, internal tetrahedra are decimated.
| 1
|
Only the values 0 and 1 are accepted.








Copies geometry from first input. Puts all of the arrays into the output.


The Append Attributes filter takes multiple input data sets with the same geometry and merges their point and cell attributes to produce a single output containing all the point and cell attributes of the inputs. Any inputs without the same number of points and cells as the first input are ignored. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.<br>
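The merge rule above can be sketched in plain Python. This is a conceptual illustration only, not ParaView's implementation; the dictionary layout and the name `append_attributes` are invented for the example.

```python
# Conceptual sketch of the Append Attributes merge rule: inputs share geometry,
# attribute arrays are merged, and any input whose number of points differs
# from the first input's is ignored.
def append_attributes(inputs):
    """Each input: {'n_points': int, 'point_data': {name: list_of_values}}."""
    if not inputs:
        return {}
    n = inputs[0]['n_points']
    merged = {}
    for dataset in inputs:
        if dataset['n_points'] != n:   # mismatched inputs are skipped
            continue
        merged.update(dataset['point_data'])
    return merged

a = {'n_points': 3, 'point_data': {'Temperature': [1.0, 2.0, 3.0]}}
b = {'n_points': 3, 'point_data': {'Pressure': [9.0, 8.0, 7.0]}}
c = {'n_points': 5, 'point_data': {'Ignored': [0.0] * 5}}  # wrong size -> skipped
print(sorted(append_attributes([a, b, c])))  # ['Pressure', 'Temperature']
```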




==Clean Cells to Grid==




This filter merges cells and converts the data set to unstructured grid.


Merges degenerate cells. Assumes the input grid does not contain duplicate<br>
points; you may want to run vtkCleanUnstructuredGrid first to ensure this. If<br>
duplicated cells are found, they are removed in the output. The filter also<br>
handles the case where a cell contains degenerate nodes (i.e., one and<br>
the same node is referenced by a cell more than once).<br>
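The cleanup described above can be sketched in plain Python. This is a conceptual illustration of the stated behavior, not the VTK code; `clean_cells` and the tuple-based cell representation are invented for the example.

```python
# Sketch of degenerate-cell cleanup: point indices referenced more than once
# within a cell are collapsed (order preserved), and duplicated cells are
# emitted only once.
def clean_cells(cells):
    seen = set()
    out = []
    for cell in cells:
        nodes = tuple(dict.fromkeys(cell))  # drop repeated node references
        if nodes in seen:                   # duplicated cell -> removed
            continue
        seen.add(nodes)
        out.append(nodes)
    return out

print(clean_cells([(0, 1, 2), (0, 1, 2), (3, 3, 4)]))  # [(0, 1, 2), (3, 4)]
```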


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Clean Cells to Grid filter.


|
|




The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.








==Clean to Grid==




This filter merges points and converts the data set to unstructured grid.


The Clean to Grid filter merges points that are exactly coincident. It also converts the data set to an unstructured grid. You may wish to do this if you want to apply a filter to your data set that is available for unstructured grids but not for the initial type of your data set (e.g., applying warp vector to volumetric data). The Clean to Grid filter operates on any type of data set.<br>
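The point-merging step can be sketched in plain Python. This is a conceptual illustration, not the VTK implementation; `merge_points` and its tuple-based point representation are invented for the example, and only exactly coincident points are merged, as described above.

```python
# Sketch of exact coincident-point merging: identical coordinates collapse to
# one point id, and cell connectivity is remapped to the new ids.
def merge_points(points, cells):
    index_of, unique, old_to_new = {}, [], []
    for p in points:
        if p not in index_of:          # exact coincidence only
            index_of[p] = len(unique)
            unique.append(p)
        old_to_new.append(index_of[p])
    new_cells = [tuple(old_to_new[i] for i in cell) for cell in cells]
    return unique, new_cells

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 0.0)]      # point 2 duplicates point 0
u, cells = merge_points(pts, [(0, 1, 2)])
print(u, cells)  # [(0.0, 0.0), (1.0, 0.0)] [(0, 1, 0)]
```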


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Clean to Grid filter.
 
|
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|}
 
 
==Clip==
 
 
Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.
 
The Clip filter cuts away a portion of the input data set using an implicit plane. This filter operates on all types of data sets, and it returns unstructured grid data on output.<br>
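The plane test at the heart of clipping can be sketched in plain Python. This is a conceptual illustration under stated assumptions, not ParaView's implementation: `kept_by_plane` is an invented name, points with n·x − d ≤ 0 are treated as the kept side, and the real filter additionally splits cells that straddle the plane, which is omitted here.

```python
# Sketch of clipping by an implicit plane: classify points by the sign of the
# plane function f(x) = n . x - d; Inside Out flips which side is kept.
def kept_by_plane(points, normal, offset, inside_out=False):
    def f(p):
        return sum(n * x for n, x in zip(normal, p)) - offset
    return [p for p in points if (f(p) <= 0.0) != inside_out]

pts = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(kept_by_plane(pts, (1.0, 0.0, 0.0), 1.0))  # [(0.0, 0.0, 0.0)]
```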
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Clip Type'''<br>''(ClipFunction)''
|
This property specifies the parameters of the clip function (an implicit plane) used to clip the dataset.


|
|




==Clip Generic Dataset==




Clip with an implicit plane, sphere or with scalars. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.


The Generic Clip filter cuts away a portion of the input data set using a plane, a sphere, a box, or a scalar value. The menu in the Clip Function portion of the interface allows the user to select which implicit function to use or whether to clip using a scalar value. Making this selection loads the appropriate user interface. For the implicit functions, the appropriate 3D widget (plane, sphere, or box) is also displayed. The use of these 3D widgets, including their user interface components, is discussed in section 7.4.<br>
If an implicit function is selected, the clip filter returns that portion of the input data set that lies inside the function. If Scalars is selected, then the user must specify a scalar array to clip according to. The clip filter will return the portions of the data set whose value in the selected Scalars array is larger than the Clip value. Regardless of the selection from the Clip Function menu, if the Inside Out option is checked, the opposite portions of the data set will be returned.<br>
This filter operates on all types of data sets, and it returns unstructured grid data on output.<br>
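The scalar-clip rule described above (keep values larger than the Clip value; Inside Out returns the opposite portion) can be sketched in plain Python. This is a conceptual illustration only; `scalar_clip` is an invented name, and the real filter operates on cells and geometry rather than a bare value list.

```python
# Sketch of the scalar-clip rule: keep entries whose scalar is larger than the
# clip value; inside_out selects the opposite portion instead.
def scalar_clip(scalars, value, inside_out=False):
    return [s for s in scalars if (s > value) != inside_out]

print(scalar_clip([0.2, 0.6, 0.9], 0.5))                   # [0.6, 0.9]
print(scalar_clip([0.2, 0.6, 0.9], 0.5, inside_out=True))  # [0.2]
```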


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
|-
| '''Clip Type'''<br>''(ClipFunction)''
|
Set the parameters of the clip function.


|
|
The value must be set to one of the following: Plane, Box, Sphere, Scalar.




|-
| '''Input'''<br>''(Input)''
|
Set the input to the Generic Clip filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.




|-
| '''Inside Out'''<br>''(InsideOut)''
|
Choose which portion of the dataset should be clipped away.


| 0
|
Only the values 0 and 1 are accepted.




| '''Scalars'''<br>''(SelectInputScalars)''
|
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.


|
|
An array of scalars is required.
Valid array names will be chosen from point and cell data.




|-
| '''Value'''<br>''(Value)''
|
If clipping with a scalar array, choose the clipping value.


| 0
|
The value must lie within the range of the selected data array.








==Compute Derivatives==




This filter computes derivatives of scalars and vectors.


CellDerivatives is a filter that computes derivatives of scalars and vectors at the center of cells. You can choose to generate different output including the scalar gradient (a vector), computed tensor vorticity (a vector), gradient of input vectors (a tensor), and strain matrix of the input vectors (a tensor); or you may choose to pass data through to the output.<br>
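The filter's default vector output, a cell-centered scalar gradient, can be sketched with a finite difference in plain Python. This is a conceptual illustration only; `cell_gradient_1d` is an invented name, and the real filter handles arbitrary cell types via shape-function derivatives rather than this 1-D two-point case.

```python
# Sketch of a cell-centered scalar gradient: for a 1-D cell spanning [x0, x1]
# with scalar values s0, s1, the gradient at the cell center is (s1-s0)/(x1-x0).
def cell_gradient_1d(x0, x1, s0, s1):
    return (s1 - s0) / (x1 - x0)

print(cell_gradient_1d(0.0, 2.0, 1.0, 5.0))  # 2.0
```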


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.




|-
| '''Output Tensor Type'''<br>''(OutputTensorType)''
|
This property controls how the filter works to generate tensor cell data. You can choose to compute the gradient of the input vectors, or compute the strain tensor of the vector gradient tensor. By default, the filter will take the gradient of the vector data to construct a tensor.


| 1
|
The value must be one of the following: Nothing (0), Vector Gradient (1), Strain (2).




|-
| '''Output Vector Type'''<br>''(OutputVectorType)''
|
This property controls how the filter works to generate vector cell data. You can choose to compute the gradient of the input scalars, or extract the vorticity of the computed vector gradient tensor. By default, the filter will take the gradient of the input scalar data.


| 1
|
The value must be one of the following: Nothing (0), Scalar Gradient (1), Vorticity (2).
|-
| '''Scalars'''<br>''(SelectInputScalars)''
|
This property indicates the name of the scalar array to differentiate.


|
|
An array of scalars is required.


 
|-
| '''Vectors'''<br>''(SelectInputVectors)''
|
This property indicates the name of the vector array to differentiate.
 
| 1
|
An array of vectors is required.
 
 
|}




==Connectivity==




Mark connected components with integer point attribute array.


The Connectivity filter assigns a region id to connected components of the input data set. (The region id is assigned as a point scalar value.) This filter takes any data set type as input and produces unstructured grid output.<br>
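The region-id assignment can be sketched in plain Python as breadth-first search over an adjacency list. This is a conceptual illustration, not the VTK implementation; `region_ids` and the edge-list input format are invented for the example.

```python
# Sketch of connected-component labeling: each connected region of points
# receives an integer id, analogous to the region-id point scalar described.
from collections import deque

def region_ids(n_points, edges):
    adj = [[] for _ in range(n_points)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    ids, next_id = [-1] * n_points, 0
    for seed in range(n_points):
        if ids[seed] != -1:
            continue
        queue = deque([seed])
        ids[seed] = next_id
        while queue:
            p = queue.popleft()
            for q in adj[p]:
                if ids[q] == -1:
                    ids[q] = next_id
                    queue.append(q)
        next_id += 1
    return ids

print(region_ids(5, [(0, 1), (3, 4)]))  # [0, 0, 1, 2, 2]
```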


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
|-
| '''Color Regions'''<br>''(ColorRegions)''
|
Controls the coloring of the connected regions.


| 1
|
Only the values 0 and 1 are accepted.




|-
| '''Extraction Mode'''<br>''(ExtractionMode)''
|
Controls the extraction of connected surfaces.


| 5
|
The value must be one of the following: Extract Point Seeded Regions (1), Extract Cell Seeded Regions (2), Extract Specified Regions (3), Extract Largest Region (4), Extract All Regions (5), Extract Closest Point Region (6).




|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Connectivity filter.


|
|




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.




|}
==Contingency Statistics==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.


This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.<br>
This filter computes contingency tables between pairs of attributes.  This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.  Data is assessed by computing <br>
*  the probability of observing both variables simultaneously;<br>
 
*  the probability of each variable conditioned on the other (the two values need not be identical); and<br>
 
*  the pointwise mutual information (PMI).
<br>
Finally, the summary statistics include the information entropy of the observations.<br>
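The quantities listed above can be sketched in plain Python: a contingency table of joint counts, the marginals, and the pointwise mutual information PMI(x, y) = log[p(x, y) / (p(x) p(y))]. This is a conceptual illustration only; `pmi_table` is an invented name and the real filter produces tabular model outputs rather than a dictionary.

```python
# Sketch of a contingency table and pointwise mutual information over two
# observed categorical variables.
import math
from collections import Counter

def pmi_table(xs, ys):
    n = len(xs)
    joint = Counter(zip(xs, ys))        # contingency table of joint counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return {
        (x, y): math.log((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in joint.items()
    }

table = pmi_table(['a', 'a', 'b', 'b'], ['u', 'u', 'v', 'v'])
print(round(table[('a', 'u')], 4))  # PMI of a perfectly associated pair: log(2)
```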


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
Specify which type of field data the arrays will be drawn from.


| 0
|
Valid array names will be chosen from point and cell data.




|-
| '''Input'''<br>''(Input)''
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point or cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
|-
| '''Model Input'''<br>''(ModelInput)''
|
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.




|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
Choose arrays whose entries will be used to form observations for statistical analysis.


|
|
An array of scalars is required.




|-
| '''Task'''<br>''(Task)''
|
Specify the task to be performed: modeling and/or assessment.
#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.
 
When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.


| 3
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).




|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|}








==Contour==




Generate isolines or isosurfaces using point scalars.


The Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The Contour filter operates on any type of data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.<br>
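The core contouring step can be sketched in plain Python: where an edge's two point scalars straddle the isovalue, the crossing point is located by linear interpolation. This is a conceptual illustration only; `edge_crossing` is an invented name, and full contouring (e.g., marching cubes) assembles many such crossings into lines or surfaces.

```python
# Sketch of locating an isovalue crossing on one edge by linear interpolation.
def edge_crossing(p0, p1, s0, s1, isovalue):
    if (s0 - isovalue) * (s1 - isovalue) > 0:
        return None                     # both endpoints on the same side
    t = (isovalue - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

print(edge_crossing((0.0, 0.0), (1.0, 0.0), 0.0, 1.0, 0.25))  # (0.25, 0.0)
```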


{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
|-
| '''Compute Gradients'''<br>''(ComputeGradients)''
|
If this property is set to 1, a scalar array containing a gradient value at each point in the isosurface or isoline will be created by this filter; otherwise an array of gradients will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0. Note that if ComputeNormals is set to 1, then gradients will have to be calculated, but they will only be stored in the output dataset if ComputeGradients is also set to 1.


| 0
|
Only the values 0 and 1 are accepted.




|-
| '''Compute Normals'''<br>''(ComputeNormals)''
|
If this property is set to 1, a scalar array containing a normal value at each point in the isosurface or isoline will be created by the contour filter; otherwise an array of normals will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0.
Select whether to compute normals.


| 1
|
Only the values 0 and 1 are accepted.




|-
| '''Compute Scalars'''<br>''(ComputeScalars)''
|
If this property is set to 1, an array of scalars (containing the contour value) will be added to the output dataset. If set to 0, the output will not contain this array.


| 0




|-
| '''Isosurfaces'''<br>''(ContourValues)''
|
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.

|
|
The value must lie within the range of the selected data array.




Line 1,271: Line 1,261:
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to be used by the contour filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point array with 1 component.




Line 1,282: Line 1,275:


|-
| '''Point Merge Method'''<br>''(Locator)''
|
This property specifies an incremental point locator for merging duplicate / coincident points.


|
|
The selected object must be the result of the following: incremental_point_locators.
 
 
The value must be set to one of the following: MergePoints, IncrementalOctreeMergePoints, NonMergingPointLocator.
 
 
|-
| '''Contour By'''<br>''(SelectInputScalars)''
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.
 
|
|
An array of scalars is required.
 
 
Valid array names will be chosen from point and cell data.




Line 1,294: Line 1,303:




==Decimate==
==Contour Generic Dataset==




Simplify a polygonal model using an adaptive edge collapse algorithm. This filter works with triangles only.
Generate isolines or isosurfaces using point scalars.


The Decimate filter reduces the number of triangles in a polygonal data set. Because this filter only operates on triangles, first run the Triangulate filter on a dataset that contains polygons other than triangles.<br>
The Generic Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The available scalar arrays are listed in the Scalars menu. The scalar range of the selected array will be displayed.<br>
The interface for adding contour values is very similar to the one for selecting cut offsets (in the Cut filter). To add a single contour value, select the value from the New Value slider in the Add value portion of the interface and click the Add button, or press Enter. To instead add several evenly spaced contours, use the controls in the Generate range of values section. Select the number of contour values to generate using the Number of Values slider. The Range slider controls the interval in which to generate the contour values. Once the number of values and range have been selected, click the Generate button. The new values will be added to the Contour Values list. To delete a value from the Contour Values list, select the value and click the Delete button. (If no value is selected, the last value in the list will be removed.) Clicking the Delete All button removes all the values in the list. If no values are in the Contour Values list when Accept is pressed, the current value of the New Value slider will be used.<br>
In addition to selecting contour values, you can also select additional computations to perform. If any of Compute Normals, Compute Gradients, or Compute Scalars is selected, the appropriate computation will be performed, and a corresponding point-centered array will be added to the output.<br>
The Generic Contour filter operates on a generic data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.<br>
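As a rough illustration of what computing an isosurface means numerically, the sketch below (plain Python, not ParaView/VTK code) shows how a contour filter can locate the point where a chosen isovalue crosses a cell edge by linearly interpolating the point scalars:

```python
# Illustrative sketch only (not the ParaView/VTK implementation):
# locate where a scalar field crosses an isovalue along one cell edge.
def edge_crossing(p0, p1, s0, s1, isovalue):
    """Return the interpolated point on edge (p0, p1) where the scalar
    equals isovalue, or None if the edge is not crossed."""
    if s0 == s1:
        return None  # scalar is constant along the edge: no single crossing
    if (s0 - isovalue) * (s1 - isovalue) > 0:
        return None  # both endpoints lie on the same side of the isovalue
    t = (isovalue - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# Edge from (0,0,0) to (1,0,0) with scalars 0 and 10; isovalue 2.5 crosses
# one quarter of the way along the edge.
print(edge_crossing((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.0, 10.0, 2.5))
```

The contour surface for a cell is then assembled from these per-edge crossing points.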


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,308: Line 1,320:
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Boundary Vertex Deletion'''<br>''(BoundaryVertexDeletion)''
| '''Compute Gradients'''<br>''(ComputeGradients)''
|
|
If this property is set to 1, then vertices on the boundary of the dataset can be removed. Setting the value of this property to 0 preserves the boundary of the dataset, but it may cause the filter not to reach its reduction target.
Select whether to compute gradients.


| 1
| 0
|
|
Only the values 0 and 1 are accepted.
Only the values 0 and 1 are accepted.
Line 1,318: Line 1,330:


|-
|-
| '''Feature Angle'''<br>''(FeatureAngle)''
| '''Compute Normals'''<br>''(ComputeNormals)''
|
|
The value of this property is used in determining where the data set may be split. If the angle between two adjacent triangles is greater than or equal to the FeatureAngle value, then their boundary is considered a feature edge where the dataset can be split.
Select whether to compute normals.


| 15
| 1
|
|
The value must be greater than or equal to 0 and less than or equal to 180.
Only the values 0 and 1 are accepted.




|-
|-
| '''Input'''<br>''(Input)''
| '''Compute Scalars'''<br>''(ComputeScalars)''
|
|
This property specifies the input to the Decimate filter.
Select whether to compute scalars.


| 0
|
|
|
Only the values 0 and 1 are accepted.
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.




|-
|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
| '''Isosurfaces'''<br>''(ContourValues)''
|
|
If this property is set to 1, decimation will not split the dataset or produce holes, but it may keep the filter from reaching the reduction target. If it is set to 0, better reduction can occur (reaching the reduction target), but holes in the model may be produced.
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.


| 0
|
|
Only the values 0 and 1 are accepted.
|
The value must lie within the range of the selected data array.




|-
|-
| '''Target Reduction'''<br>''(TargetReduction)''
| '''Input'''<br>''(Input)''
|
|
This property specifies the desired reduction in the total number of polygons in the output dataset. For example, if the TargetReduction value is 0.9, the Decimate filter will attempt to produce an output dataset that is 10% of the size of the input.
Set the input to the Generic Contour filter.


| 0.9
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.
 
 
|-
| '''Point Merge Method'''<br>''(Locator)''
|
This property specifies an incremental point locator for merging duplicate / coincident points.
 
|
|
The selected object must be the result of the following: incremental_point_locators.
 
 
The value must be set to one of the following: MergePoints, IncrementalOctreeMergePoints, NonMergingPointLocator.
 
 
|-
| '''Contour By'''<br>''(SelectInputScalars)''
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.
 
|
|
An array of scalars is required.
 
 
Valid array names will be chosen from point and cell data.




Line 1,363: Line 1,401:




==Delaunay 2D==
==Curvature==




Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.
This filter will compute the Gaussian or mean curvature of the mesh at each point.


Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh.<br><br><br>
The Curvature filter computes the curvature at each point in a polygonal data set. This filter supports both Gaussian and mean curvatures.<br><br><br>
The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation.<br><br><br>
The curvature type can be selected from the Curvature type menu button.<br>
Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the ProjectionPlaneMode option to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions.<br><br><br>
The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", which enables a smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.<br>
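To give a concrete feel for the Gaussian curvature the Curvature filter computes, here is a small plain-Python sketch (not the VTK implementation) of the classic angle-deficit approximation at a mesh vertex: on a flat fan of triangles the deficit is zero, while at a cube corner it is pi/2. A full estimator would also normalize by the area around the vertex, which is omitted here.

```python
import math

# Illustrative sketch only: the angle-deficit approximation of Gaussian
# curvature at a vertex. Summing the triangle angles incident at the
# vertex, the deficit 2*pi - sum is zero on flat meshes and positive at
# convex corners. (Area normalization is omitted for brevity.)
def angle_deficit(vertex, neighbors):
    """neighbors: ring of adjacent vertices, ordered around `vertex`."""
    total = 0.0
    for a, b in zip(neighbors, neighbors[1:] + neighbors[:1]):
        u = [ai - vi for ai, vi in zip(a, vertex)]
        w = [bi - vi for bi, vi in zip(b, vertex)]
        dot = sum(x * y for x, y in zip(u, w))
        norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in w))
        total += math.acos(dot / norm)
    return 2.0 * math.pi - total

# Corner of a cube: three incident right angles, so the deficit is pi/2.
print(angle_deficit((0, 0, 0), [(1, 0, 0), (0, 1, 0), (0, 0, 1)]))
```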


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,383: Line 1,416:
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Alpha'''<br>''(Alpha)''
| '''Curvature Type'''<br>''(CurvatureType)''
|
|
The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.
This property specifies which type of curvature to compute.


| 0
| 0
|
|
The value must be greater than or equal to 0.
The value must be one of the following: Gaussian (0), Mean (1).




|-
|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
| '''Input'''<br>''(Input)''
|
|
If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.
This property specifies the input to the Curvature filter.


| 0
|
|
Only the values 0 and 1 are accepted.
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.




|-
|-
| '''Input'''<br>''(Input)''
| '''Invert Mean Curvature'''<br>''(InvertMeanCurvature)''
|
|
This property specifies the input dataset to the Delaunay 2D filter.
If this property is set to 1, the mean curvature calculation will be inverted. This is useful for meshes with inward-pointing normals.


| 0
|
|
|
Only the values 0 and 1 are accepted.
The selected object must be the result of the following: sources (includes readers), filters.
 
 
|}
 
 
==D3==




The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
Repartition a data set into load-balanced spatially convex regions.  Create ghost cells if requested.


The D3 filter is available when ParaView is run in parallel. It operates on any type of data set to evenly divide it across the processors into spatially contiguous regions. The output of this filter is of type unstructured grid.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Offset'''<br>''(Offset)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Boundary Mode'''<br>''(BoundaryMode)''
|
|
This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.
This property determines how cells that lie on processor boundaries are handled. The "Assign cells uniquely" option assigns each boundary cell to exactly one process, which is useful for isosurfacing. Selecting "Duplicate cells" causes the cells on the boundaries to be copied to each process that shares that boundary. The "Divide cells" option breaks cells across process boundary lines so that pieces of the cell lie in different processes. This option is useful for volume rendering.


| 1
| 0
|
|
The value must be greater than or equal to 0.75.
The value must be one of the following: Assign cells uniquely (0), Duplicate cells (1), Divide cells (2).




|-
|-
| '''Projection Plane Mode'''<br>''(ProjectionPlaneMode)''
| '''Input'''<br>''(Input)''
|
|
This property determines the type of projection plane to use in performing the triangulation.
This property specifies the input to the D3 filter.


| 0
|
|
The value must be one of the following: XY Plane (0), Best-Fitting Plane (2).
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.




|-
|-
| '''Tolerance'''<br>''(Tolerance)''
| '''Minimal Memory'''<br>''(UseMinimalMemory)''
|
|
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.
If this property is set to 1, the D3 filter restricts its communication routines to use minimal memory.


| 1e-05
| 0
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
Only the values 0 and 1 are accepted.




Line 1,448: Line 1,500:




==Delaunay 3D==
==Decimate==




Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.
Simplify a polygonal model using an adaptive edge collapse algorithm. This filter works with triangles only.


Delaunay3D is a filter that constructs a 3D Delaunay triangulation<br>
The Decimate filter reduces the number of triangles in a polygonal data set. Because this filter only operates on triangles, first run the Triangulate filter on a dataset that contains polygons other than triangles.<br>
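The relationship between the TargetReduction property and the output size can be sketched with one line of arithmetic (plain Python, illustrative only; the filter treats the target as a goal, not a guarantee):

```python
# Illustrative arithmetic only: a TargetReduction of 0.9 asks Decimate to
# keep roughly 10% of the input triangles. The filter may stop short of
# the target when constraints such as PreserveTopology get in the way.
def expected_triangle_count(input_triangles, target_reduction):
    return round(input_triangles * (1.0 - target_reduction))

print(expected_triangle_count(10000, 0.9))  # 1000
```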
from a list of input points. These points may be represented by any<br>
 
dataset of type vtkPointSet and subclasses. The output of the filter<br>
{| class="PropertiesTable" border="1" cellpadding="5"
is an unstructured grid dataset. Usually the output is a tetrahedral<br>
|-
mesh, but if a non-zero alpha distance value is specified (called<br>
| '''Property'''
the "alpha" value), then only tetrahedra, triangles, edges, and<br>
| '''Description'''
vertices lying within the alpha radius are output. In other words,<br>
| '''Default Value(s)'''
non-zero alpha values may result in arbitrary combinations of<br>
| '''Restrictions'''
tetrahedra, triangles, lines, and vertices. (The notion of alpha<br>
|-
value is derived from Edelsbrunner's work on "alpha shapes".)<br><br><br>
| '''Boundary Vertex Deletion'''<br>''(BoundaryVertexDeletion)''
The 3D Delaunay triangulation is defined as the triangulation that<br>
|
satisfies the Delaunay criterion for n-dimensional simplexes (in<br>
If this property is set to 1, then vertices on the boundary of the dataset can be removed. Setting the value of this property to 0 preserves the boundary of the dataset, but it may cause the filter not to reach its reduction target.
this case n=3 and the simplexes are tetrahedra). This criterion<br>
 
states that a circumsphere of each simplex in a triangulation<br>
| 1
contains only the n+1 defining points of the simplex. (See text for<br>
|
more information.) While in two dimensions this translates into an<br>
Only the values 0 and 1 are accepted.
"optimal" triangulation, this is not true in 3D, since a measurement<br>
 
for optimality in 3D is not agreed on.<br><br><br>
 
Delaunay triangulations are used to build topological structures<br>
|-
from unorganized (or unstructured) points. The input to this filter<br>
| '''Feature Angle'''<br>''(FeatureAngle)''
is a list of points specified in 3D. (If you wish to create 2D<br>
|
triangulations see Delaunay2D.) The output is an unstructured<br>
The value of this property is used in determining where the data set may be split. If the angle between two adjacent triangles is greater than or equal to the FeatureAngle value, then their boundary is considered a feature edge where the dataset can be split.
grid.<br><br><br>
 
The Delaunay triangulation can be numerically sensitive. To prevent<br>
| 15
problems, try to avoid injecting points that will result in<br>
|
triangles with bad aspect ratios (1000:1 or greater). In practice<br>
The value must be greater than or equal to 0 and less than or equal to 180.
this means inserting points that are "widely dispersed", and enables<br>
 
smooth transition of triangle sizes throughout the mesh. (You may<br>
 
even want to add extra points to create a better point<br>
|-
distribution.) If numerical problems are present, you will see a<br>
| '''Input'''<br>''(Input)''
warning message to this effect at the end of the triangulation<br>
|
process.<br><br><br>
This property specifies the input to the Decimate filter.
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can<br>
be triangulated in more than one way (at least according to the<br>
Delaunay criterion). The choice of triangulation (as implemented by<br>
this algorithm) depends on the order of the input points. The first<br>
four points will form a tetrahedron; other degenerate points<br>
(relative to this initial tetrahedron) will not break it.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the<br>
algorithm. This is because the Delaunay triangulation requires<br>
unique input points. You can control the definition of coincidence<br>
with the "Tolerance" instance variable.<br><br><br>
The output of the Delaunay triangulation is supposedly a convex<br>
hull. In certain cases this implementation may not generate the<br>
convex hull. This behavior can be controlled by the Offset instance<br>
variable. Offset is a multiplier used to control the size of the<br>
initial triangulation. The larger the offset value, the more likely<br>
you will generate a convex hull; and the more likely you are to see<br>
numerical problems.<br><br><br>
The implementation of this algorithm varies from the 2D Delaunay<br>
algorithm (i.e., Delaunay2D) in an important way. When points are<br>
injected into the triangulation, the search for the enclosing<br>
tetrahedron is quite different. In the 3D case, the closest<br>
previously inserted point point is found, and then the connected<br>
tetrahedra are searched to find the containing one. (In 2D, a "walk"<br>
towards the enclosing triangle is performed.) If the triangulation<br>
is Delaunay, then an enclosing tetrahedron will be found. However,<br>
in degenerate cases an enclosing tetrahedron may not be found and<br>
the point will be rejected.<br>
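The alpha test described above can be illustrated for the simplest output primitive, an edge, whose smallest enclosing sphere has radius equal to half its length (plain Python, illustrative only; the filter applies the analogous circumsphere test to triangles and tetrahedra):

```python
import math

# Illustrative sketch only: with a non-zero alpha value, an edge survives
# when it fits inside a sphere of radius alpha. For an edge, the smallest
# enclosing sphere is centered at its midpoint with radius = half length.
def edge_within_alpha(p0, p1, alpha):
    return math.dist(p0, p1) / 2.0 <= alpha

print(edge_within_alpha((0, 0, 0), (1, 0, 0), 0.5))   # True
print(edge_within_alpha((0, 0, 0), (1, 0, 0), 0.4))   # False
```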


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Alpha'''<br>''(Alpha)''
|
|
This property specifies the alpha (or distance) value to control
|
the output of this filter. For a non-zero alpha value, only
The selected object must be the result of the following: sources (includes readers), filters.
edges, faces, or tetra contained within the circumsphere (of
 
radius alpha) will be output. Otherwise, only tetrahedra will be
output.


| 0
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|
The value must be greater than or equal to 0.




|-
|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
|
This boolean controls whether bounding triangulation points (and
If this property is set to 1, decimation will not split the dataset or produce holes, but it may keep the filter from reaching the reduction target. If it is set to 0, better reduction can occur (reaching the reduction target), but holes in the model may be produced.
associated triangles) are included in the output. (These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)


| 0
| 0
Line 1,548: Line 1,557:


|-
|-
| '''Input'''<br>''(Input)''
| '''Target Reduction'''<br>''(TargetReduction)''
|
|
This property specifies the input dataset to the Delaunay 3D filter.
This property specifies the desired reduction in the total number of polygons in the output dataset. For example, if the TargetReduction value is 0.9, the Decimate filter will attempt to produce an output dataset that is 10% of the size of the input.


|
| 0.9
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
 
 
|-
| '''Offset'''<br>''(Offset)''
|
This property specifies a multiplier to control the size of the
initial, bounding Delaunay triangulation.
 
| 2.5
|
The value must be greater than or equal to 2.5.
 
 
|-
| '''Tolerance'''<br>''(Tolerance)''
|
This property specifies a tolerance to control discarding of
closely spaced points. This tolerance is specified as a fraction
of the diagonal length of the bounding box of the points.
 
| 0.001
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value must be greater than or equal to 0 and less than or equal to 1.
Line 1,586: Line 1,569:




==Descriptive Statistics==
==Delaunay 2D==




Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.
 
This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.
<br>
This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.
 
<br>
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations each input point lies from the mean.<br>
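The assessment described above is an ordinary z-score. A minimal plain-Python sketch (not the ParaView implementation):

```python
import statistics

# Illustrative sketch only: assess data against a univariate Gaussian model
# by detrending (subtracting the mean) and dividing by the standard
# deviation, yielding each value's signed distance in standard deviations.
def assess(values):
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)  # sample standard deviation
    return [(v - mean) / stdev for v in values]

z = assess([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(["%.2f" % s for s in z])
```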


Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh.<br><br><br>
The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation.<br><br><br>
Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the ProjectionPlaneMode option to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions.<br><br><br>
The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", which enables a smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.<br>
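The Delaunay criterion quoted above can be checked directly with the classic "incircle" determinant. The sketch below (plain Python, illustrative only) reports whether a fourth point lies inside the circumcircle of a counter-clockwise triangle, i.e. whether the triangle would violate the criterion:

```python
# Illustrative sketch only: the empty-circumcircle (Delaunay) test in 2D.
# For a counter-clockwise triangle (a, b, c), the sign of this determinant
# tells whether point d lies inside the triangle's circumcircle.
def in_circumcircle(a, b, c, d):
    rows = [(x - d[0], y - d[1], (x - d[0]) ** 2 + (y - d[1]) ** 2)
            for (x, y) in (a, b, c)]
    (ax, ay, a2), (bx, by, b2), (cx, cy, c2) = rows
    det = (ax * (by * c2 - b2 * cy)
           - ay * (bx * c2 - b2 * cx)
           + a2 * (bx * cy - by * cx))
    return det > 0  # inside => the triangle is not Delaunay w.r.t. d

# The circumcenter of triangle (0,0), (1,0), (0,1) is (0.5, 0.5):
print(in_circumcircle((0, 0), (1, 0), (0, 1), (0.5, 0.5)))  # True
print(in_circumcircle((0, 0), (1, 0), (0, 1), (2, 2)))      # False
```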


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 1,606: Line 1,589:
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
| '''Alpha'''<br>''(Alpha)''
|
|
Specify which type of field data the arrays will be drawn from.
The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.


| 0
| 0
|
|
Valid array names will be chosen from point and cell data.
The value must be greater than or equal to 0.




|-
|-
| '''Input'''<br>''(Input)''
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.
If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.


| 0
|
|
|
Only the values 0 and 1 are accepted.
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The dataset must contain a point or cell array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.




|-
|-
| '''Model Input'''<br>''(ModelInput)''
| '''Input'''<br>''(Input)''
|
|
A previously-calculated model with which to assess a separate dataset. This input is optional.
This property specifies the input dataset to the Delaunay 2D filter.


|
|
Line 1,641: Line 1,618:




The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.




|-
|-
| '''Variables of Interest'''<br>''(SelectArrays)''
| '''Offset'''<br>''(Offset)''
|
|
Choose arrays whose entries will be used to form observations for statistical analysis.
This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.


| 1
|
|
|
The value must be greater than or equal to 0.75.
An array of scalars is required.




|-
|-
| '''Deviations should be'''<br>''(SignedDeviations)''
| '''Projection Plane Mode'''<br>''(ProjectionPlaneMode)''
|
|
Should the assessed values be signed deviations or unsigned?
This property determines the type of projection plane to use in performing the triangulation.


| 0
| 0
|
|
The value must be one of the following: Unsigned (0), Signed (1).
The value must be one of the following: XY Plane (0), Best-Fitting Plane (2).




|-
|-
| '''Task'''<br>''(Task)''
| '''Tolerance'''<br>''(Tolerance)''
|
|
Specify the task to be performed: modeling and/or assessment.
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.
#  "Statistics of all the data," creates an output table (or tables) summarizing the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.
| 1e-05
 
| 3
|
The value must be one of the following: Statistics of all the data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).
 
 
|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.
 
| 0.1
|
|
The value must be greater than or equal to 0 and less than or equal to 1.
The value must be greater than or equal to 0 and less than or equal to 1.
Line 1,693: Line 1,654:




==Elevation==
==Delaunay 3D==




Create point attribute array by projecting points onto an elevation vector.
Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.
 
The Elevation filter generates point scalar values for an input dataset along a specified direction vector.<br><br><br>
The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.<br>
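The projection described above amounts to a clamped dot product. A minimal plain-Python sketch (not the ParaView implementation; clamping the parameter to the segment is an assumption of this sketch):

```python
# Illustrative sketch only: map a point to an elevation scalar by projecting
# it onto the line from low_point to high_point, then interpolating within
# the scalar range. The projection parameter is clamped to [0, 1] here, so
# points beyond the endpoints receive the minimum/maximum scalar.
def elevation_scalar(point, low_point, high_point, scalar_range=(0.0, 1.0)):
    axis = [h - l for h, l in zip(high_point, low_point)]
    t = (sum((p - l) * c for p, l, c in zip(point, low_point, axis))
         / sum(c * c for c in axis))
    t = min(1.0, max(0.0, t))
    lo, hi = scalar_range
    return lo + t * (hi - lo)

# Project onto the z axis from (0,0,0) to (0,0,1):
print(elevation_scalar((0.3, 0.7, 0.5), (0, 0, 0), (0, 0, 1)))  # 0.5
```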
 
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''High Point'''<br>''(HighPoint)''
|
This property defines the other end of the direction vector (large scalar values).
 
| 0 0 1
|
The coordinate must lie within the bounding box of the dataset. It will default to the maximum in each dimension.
 
 
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Elevation filter.
 
|
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


Delaunay3D is a filter that constructs a 3D Delaunay triangulation<br>
from a list of input points. These points may be represented by any<br>
dataset of type vtkPointSet and subclasses. The output of the filter<br>
is an unstructured grid dataset. Usually the output is a tetrahedral<br>
mesh, but if a non-zero alpha distance value is specified (called<br>
the "alpha" value), then only tetrahedra, triangles, edges, and<br>
vertices lying within the alpha radius are output. In other words,<br>
non-zero alpha values may result in arbitrary combinations of<br>
tetrahedra, triangles, lines, and vertices. (The notion of alpha<br>
value is derived from Edelsbrunner's work on "alpha shapes".)<br><br><br>
The 3D Delaunay triangulation is defined as the triangulation that<br>
satisfies the Delaunay criterion for n-dimensional simplexes (in<br>
this case n=3 and the simplexes are tetrahedra). This criterion<br>
states that a circumsphere of each simplex in a triangulation<br>
contains only the n+1 defining points of the simplex. (See text for<br>
more information.) While in two dimensions this translates into an<br>
"optimal" triangulation, this is not true in 3D, since a measurement<br>
for optimality in 3D is not agreed on.<br><br><br>
Delaunay triangulations are used to build topological structures<br>
from unorganized (or unstructured) points. The input to this filter<br>
is a list of points specified in 3D. (If you wish to create 2D<br>
triangulations see Delaunay2D.) The output is an unstructured<br>
grid.<br><br><br>
The Delaunay triangulation can be numerically sensitive. To prevent<br>
problems, try to avoid injecting points that will result in<br>
triangles with bad aspect ratios (1000:1 or greater). In practice<br>
this means inserting points that are "widely dispersed", and enables<br>
smooth transition of triangle sizes throughout the mesh. (You may<br>
even want to add extra points to create a better point<br>
distribution.) If numerical problems are present, you will see a<br>
warning message to this effect at the end of the triangulation<br>
process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can<br>
be triangulated in more than one way (at least according to the<br>
Delaunay criterion). The choice of triangulation (as implemented by<br>
this algorithm) depends on the order of the input points. The first<br>
four points will form a tetrahedron; other degenerate points<br>
(relative to this initial tetrahedron) will not break it.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the<br>
algorithm. This is because the Delaunay triangulation requires<br>
unique input points. You can control the definition of coincidence<br>
with the "Tolerance" instance variable.<br><br><br>
The output of the Delaunay triangulation is supposedly a convex<br>
hull. In certain cases this implementation may not generate the<br>
convex hull. This behavior can be controlled by the Offset instance<br>
variable. Offset is a multiplier used to control the size of the<br>
initial triangulation. The larger the offset value, the more likely<br>
you will generate a convex hull; and the more likely you are to see<br>
numerical problems.<br><br><br>
The implementation of this algorithm varies from the 2D Delaunay<br>
algorithm (i.e., Delaunay2D) in an important way. When points are<br>
injected into the triangulation, the search for the enclosing<br>
tetrahedron is quite different. In the 3D case, the closest<br>
previously inserted point is found, and then the connected<br>
tetrahedra are searched to find the containing one. (In 2D, a "walk"<br>
towards the enclosing triangle is performed.) If the triangulation<br>
is Delaunay, then an enclosing tetrahedron will be found. However,<br>
in degenerate cases an enclosing tetrahedron may not be found and<br>
the point will be rejected.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Alpha'''<br>''(Alpha)''
|
This property specifies the alpha (or distance) value to control
the output of this filter.  For a non-zero alpha value, only
edges, faces, or tetrahedra contained within the circumsphere (of
radius alpha) will be output.  Otherwise, only tetrahedra will be
output.

| 0
|
The value must be greater than or equal to 0.


|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|
This boolean controls whether bounding triangulation points (and
associated triangles) are included in the output. (These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Delaunay 3D filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.


|-
| '''Offset'''<br>''(Offset)''
|
This property specifies a multiplier to control the size of the
initial, bounding Delaunay triangulation.

| 2.5
|
The value must be greater than or equal to 2.5.


|-
| '''Tolerance'''<br>''(Tolerance)''
|
This property specifies a tolerance to control the discarding of
closely spaced points. This tolerance is specified as a fraction
of the diagonal length of the bounding box of the points.

| 0.001
|
The value must be greater than or equal to 0 and less than or equal to 1.
|}


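The Delaunay criterion described above can be checked directly: a tetrahedron satisfies it when its circumsphere contains no other input point. The sketch below is illustrative pure Python, not VTK code; the function names are invented for this example. The circumcenter c is found from 2(p_i − p_0)·c = |p_i|² − |p_0|², solved here by Cramer's rule.

```python
# Illustrative sketch (not VTK's implementation) of the Delaunay
# criterion for one tetrahedron: its circumsphere must contain no
# other input point.
import math

def circumsphere(p0, p1, p2, p3):
    # Build the 3x3 linear system 2*(p_i - p_0) . c = |p_i|^2 - |p_0|^2.
    rows, rhs = [], []
    for p in (p1, p2, p3):
        rows.append([2 * (p[k] - p0[k]) for k in range(3)])
        rhs.append(sum(p[k] ** 2 - p0[k] ** 2 for k in range(3)))

    def det(m):  # determinant of a 3x3 matrix
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det(rows)
    center = []
    for k in range(3):  # Cramer's rule, one coordinate at a time
        m = [row[:] for row in rows]
        for r in range(3):
            m[r][k] = rhs[r]
        center.append(det(m) / d)
    return center, math.dist(center, p0)

def violates_delaunay(tet, point):
    # True if `point` lies strictly inside the tetrahedron's circumsphere.
    center, radius = circumsphere(*tet)
    return math.dist(center, point) < radius
```

For the unit tetrahedron (0,0,0), (1,0,0), (0,1,0), (0,0,1) the circumcenter is (0.5, 0.5, 0.5), so a point at the centroid violates the criterion while a distant point does not.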


==Descriptive Statistics==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.

This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.<br>
This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.<br>
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
Specify which type of field data the arrays will be drawn from.

| 0
|
Valid array names will be chosen from point and cell data.


|-
| '''Input'''<br>''(Input)''
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The dataset must contain a point or cell array.


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.


|-
| '''Model Input'''<br>''(ModelInput)''
|
A previously-calculated model with which to assess a separate dataset. This input is optional.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.


|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
Choose arrays whose entries will be used to form observations for statistical analysis.

|
|
|-
| '''Deviations should be'''<br>''(SignedDeviations)''
|
Should the assessed values be signed deviations or unsigned?

| 0
|
The value must be one of the following: Unsigned (0), Signed (1).


|-
| '''Task'''<br>''(Task)''
|
Specify the task to be performed: modeling and/or assessment.

#  "Detailed model of input data," creates a set of output tables containing a calculated statistical model of the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.

When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training, as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.

| 3
|
The value must be one of the following: Detailed model of input data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).


|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|}


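The statistics and the assessment step of the Descriptive Statistics filter can be illustrated in a few lines. This is a pure-Python sketch with invented function names, using population (divide-by-n) central moments for simplicity; the filter's exact normalization conventions may differ.

```python
# Illustrative sketch (not ParaView code): descriptive statistics from
# central moments, plus the Gaussian "assessment" -- each value's
# distance from the mean in units of the standard deviation.
import math

def descriptive_stats(values):
    n = len(values)
    mean = sum(values) / n
    # Central moments M2..M4 (population convention, divide by n).
    m2 = sum((x - mean) ** 2 for x in values) / n
    m3 = sum((x - mean) ** 3 for x in values) / n
    m4 = sum((x - mean) ** 4 for x in values) / n
    std = math.sqrt(m2)
    return {
        "min": min(values),
        "max": max(values),
        "mean": mean,
        "stddev": std,
        "skewness": m3 / std ** 3,
        "kurtosis": m4 / m2 ** 2,
    }

def assess(values, mean, std, signed=True):
    # Detrend (subtract the mean) and scale by the standard deviation.
    devs = [(x - mean) / std for x in values]
    return devs if signed else [abs(d) for d in devs]
```

For the array [1, 2, 3, 4, 5] the mean is 3 and the standard deviation is sqrt(2), so the assessed value for 5 is (5 − 3)/sqrt(2) ≈ 1.41 standard deviations above the mean.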


==Elevation==


Create point attribute array by projecting points onto an elevation vector.

The Elevation filter generates point scalar values for an input dataset along a specified direction vector.<br><br><br>
The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''High Point'''<br>''(HighPoint)''
|
This property defines the other end of the direction vector (large scalar values).

| 0 0 1
|
The coordinate must lie within the bounding box of the dataset. It will default to the maximum in each dimension.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Elevation filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Low Point'''<br>''(LowPoint)''
|
This property defines one end of the direction vector (small scalar values).

| 0 0 0
|
The coordinate must lie within the bounding box of the dataset. It will default to the minimum in each dimension.


|-
| '''Scalar Range'''<br>''(ScalarRange)''
|
This property determines the range into which scalars will be mapped.

| 0 1
|
|}


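The projection the Elevation filter performs per point can be written out directly. This is an illustrative pure-Python sketch (the function name is invented, not ParaView API): project the point onto the Low Point–High Point line, clamp the parametric position to [0, 1], and map it into the scalar range.

```python
# Illustrative sketch (not ParaView code) of the Elevation filter's
# per-point scalar computation.
def elevation_scalar(point, low, high, scalar_range=(0.0, 1.0)):
    # Direction vector from Low Point to High Point.
    d = [h - l for h, l in zip(high, low)]
    length2 = sum(c * c for c in d)
    # Parametric position of the point's projection onto the line,
    # clamped to [0, 1].
    t = sum((p - l) * c for p, l, c in zip(point, low, d)) / length2
    t = min(1.0, max(0.0, t))
    # Map the clamped parameter into the requested scalar range.
    lo_s, hi_s = scalar_range
    return lo_s + t * (hi_s - lo_s)
```

With the defaults (Low Point 0 0 0, High Point 0 0 1, Scalar Range 0 1), the scalar is simply the clamped z coordinate: a point at z = 0.5 receives 0.5, and any point beyond the High Point receives the maximum scalar.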


==Extract AMR Blocks==


This filter extracts a list of datasets from hierarchical datasets.

This filter extracts a list of datasets from hierarchical datasets.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Datasets filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.


|-
| '''Selected Data Sets'''<br>''(SelectedDataSets)''
|
This property provides a list of datasets to extract.

|
|
|}


==Extract Block==


This filter extracts a range of blocks from a multiblock dataset.

This filter extracts a range of blocks from a multiblock dataset.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Block Indices'''<br>''(BlockIndices)''
|
This property lists the ids of the blocks to extract
from the input multiblock dataset.

|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Block filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.


|-
| '''Maintain Structure'''<br>''(MaintainStructure)''
|
This is used only when PruneOutput is ON. By default, when pruning the
output, i.e. removing empty blocks, if a node has only 1 non-null child
block, then that node is removed. To preserve these parent nodes, set
this flag to true.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Prune Output'''<br>''(PruneOutput)''
|
When set, the output multiblock dataset will be pruned to remove empty
nodes. On by default.

| 1
|
Only the values 0 and 1 are accepted.
|}




==Extract CTH Parts==


Create a surface from a CTH volume fraction.

Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Double Volume Arrays'''<br>''(AddDoubleVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.

|
|
An array of scalars is required.


|-
| '''Float Volume Arrays'''<br>''(AddFloatVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.

|
|
An array of scalars is required.


|-
| '''Unsigned Character Volume Arrays'''<br>''(AddUnsignedCharVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.

|
|
An array of scalars is required.


|-
| '''Clip Type'''<br>''(ClipPlane)''
|
This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.

|
|
The value must be set to one of the following: None, Plane, Box, Sphere.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract CTH Parts filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The dataset must contain a cell array with 1 component.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|
The value of this property is the volume fraction value for the surface.

| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|}


==Extract Selection==


Extract different types of selections.

This filter extracts a set of cells/points given a selection.<br>
The selection can be obtained from a rubber-band selection<br>
(either cell, visible or in a frustum) or threshold selection<br>
and passed to the filter or specified by providing an ID list.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input from which the selection is extracted.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable.


|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
If this property is set to 1 the output preserves the topology of its
input and adds an insidedness array to mark which cells are inside or
out. If 0 then the output is an unstructured grid which contains only
the subset of cells that are inside.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.


|-
| '''Show Bounds'''<br>''(ShowBounds)''
|
For frustum selection, if this property is set to 1 the output is the
outline of the frustum instead of the contents of the input that lie
within the frustum.

| 0
|
Only the values 0 and 1 are accepted.
|}




==Extract Cells By Region==


This filter extracts cells that are inside/outside a region or at a region boundary.

This filter extracts from its input dataset all cells that are either completely inside or outside of a specified region (implicit function). On output, the filter generates an unstructured grid.<br>
To use this filter you must specify a region (implicit function). You must also specify whether to extract cells lying inside or outside of the region. An option exists to extract cells that are neither inside nor outside (i.e., boundary).<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Extract intersected'''<br>''(Extract intersected)''
|
This parameter controls whether to extract cells that are on the boundary of the region.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Extract only intersected'''<br>''(Extract only intersected)''
|
This parameter controls whether to extract only cells that are on the boundary of the region. If this parameter is set, the Extraction Side parameter is ignored. If Extract Intersected is off, this parameter has no effect.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Extraction Side'''<br>''(ExtractInside)''
|
This parameter controls whether to extract cells that are inside or outside the region.

| 1
|
The value must be one of the following: outside (0), inside (1).


|-
| '''Intersect With'''<br>''(ImplicitFunction)''
|
This property sets the region used to extract cells.

|
|
The value must be set to one of the following: Plane, Box, Sphere.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Cells By Region filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}


==Extract Subset==


Extract a subgrid from a structured grid with the option of setting subsample strides.

The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Include Boundary'''<br>''(IncludeBoundary)''
|
If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Grid filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkRectilinearGrid, vtkStructuredPoints, vtkStructuredGrid.


|-
| '''Sample Rate I'''<br>''(SampleRateI)''
|
This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

| 1
|
The value must be greater than or equal to 1.


|-
| '''Sample Rate J'''<br>''(SampleRateJ)''
|
This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

| 1
|
The value must be greater than or equal to 1.


|-
| '''Sample Rate K'''<br>''(SampleRateK)''
|
This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

| 1
|
The value must be greater than or equal to 1.


|-
| '''V OI'''<br>''(VOI)''
|
This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.

| 0 0 0 0 0 0
|
The values must lie within the extent of the input dataset.
|}


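The interaction between the Extract Subset filter's VOI, sample rate, and Include Boundary properties along a single axis can be sketched as follows. This is an illustrative pure-Python sketch of the intent, with an invented function name; it is not VTK's actual extraction code.

```python
# Illustrative sketch (not VTK code): which point indices survive along
# one axis of the Extract Subset filter.
def sampled_indices(voi_min, voi_max, rate, include_boundary=False):
    # Keep every `rate`-th index inside the volume of interest.
    idx = list(range(voi_min, voi_max + 1, rate))
    # Include Boundary: keep the boundary index even when the extent is
    # not an even multiple of the sample rate.
    if include_boundary and idx[-1] != voi_max:
        idx.append(voi_max)
    return idx
```

For a VOI of 0–10 with a sample rate of 3, the output indices are 0, 3, 6, 9; enabling Include Boundary also keeps index 10.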


==Extract Edges==


Extract edges of 2D and 3D cells as lines.

The Extract Edges filter produces a wireframe version of the input dataset by extracting all the edges of the dataset's cells as lines. This filter operates on any type of data set and produces polygonal output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Edges filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}


==Extract Surface==


Extract a 2D boundary surface using neighbor relations to eliminate internal faces.

The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Surface filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Nonlinear Subdivision Level'''<br>''(NonlinearSubdivisionLevel)''
|
If the input is an unstructured grid with nonlinear faces, this
parameter determines how many times the face is subdivided into
linear faces. If 0, the output is the equivalent of its linear
counterpart (and the midpoints determining the nonlinear
interpolation are discarded). If 1, the nonlinear face is
triangulated based on the midpoints. If greater than 1, the
triangulated pieces are recursively subdivided to reach the
desired subdivision. Setting the value to greater than 1 may
cause some point data to not be passed even if no quadratic faces
exist. This option has no effect if the input is not an
unstructured grid.

| 1
|
The value must be greater than or equal to 0 and less than or equal to 4.


|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.

| 1
|
Only the values 0 and 1 are accepted.
|}


==Extract Generic Dataset Surface==


Extract geometry from a higher-order dataset.

Extract geometry from a higher-order dataset.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Generic Geometry Filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkGenericDataSet.


|-
| '''Pass Through Cell Ids'''<br>''(PassThroughCellIds)''
|
Select whether to forward original ids.

| 1
|
Only the values 0 and 1 are accepted.
|}


==FFT Of Selection Over Time==


Extracts selection over time and plots the FFT.

Extracts the data of a selection (e.g. points or cells) over time,<br>
takes the FFT of them, and plots them.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
The input from which the selection is extracted.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable, vtkCompositeDataSet.


|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.
|}



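The friends-of-friends (FOF) linking used by the FOF/SOD Halo Finder's ''bb'' property can be sketched in a few lines: particles closer than bb times the mean interparticle spacing are linked, and connected components of the resulting link graph are halos. This is an illustrative pure-Python sketch with invented names and a brute-force O(n²) pairing; the real finder uses spatial search structures.

```python
# Illustrative sketch (not the filter's implementation) of
# friends-of-friends halo finding with linking length bb.
import math

def fof_halos(points, bb, spacing):
    # Physical linking length: bb * mean interparticle spacing.
    link = bb * spacing
    parent = list(range(len(points)))

    def find(i):  # union-find root with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Link every pair of particles closer than the linking length.
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= link:
                parent[find(i)] = find(j)

    # Group particle indices by their connected component.
    halos = {}
    for i in range(len(points)):
        halos.setdefault(find(i), []).append(i)
    return list(halos.values())
```

With the default bb of 0.2 and unit spacing, two particles 0.1 apart are linked into one halo while a distant particle forms its own single-member group.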

==FOF/SOD Halo Finder==
==Extract Level==




Sorry, no help is currently available.
This filter extracts a range of groups from a hierarchical dataset.


This filter extracts a range of levels from a hierarchical dataset<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 2,329: Line 2,287:
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''bb (linking length)'''<br>''(BB)''
| '''Input'''<br>''(Input)''
|
|
Linking length measured in units of interparticle spacing and is dimensionless. Used to link particles into halos for the friends-of-friends (FOF) algorithm.
This property specifies the input to the Extract Group filter.


| 0.2
|
|
The value must be greater than or equal to 0.
|
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.




|-
|-
| '''Compute the most bound particle'''<br>''(ComputeMostBoundParticle)''
| '''Levels'''<br>''(Levels)''
|
|
If checked, the most bound particle for an FOF halo will be calculated. WARNING: This can be very slow.
This property lists the levels to extract
from the input hierarchical dataset.


| 0
|
|
Only the values 0 and 1 are accepted.
|
|}
 


==Extract Selection==


Extract different type of selections.


This filter extracts a set of cells/points given a selection.<br>
The selection can be obtained from a rubber-band selection<br>
(either cell, visible or in a frustum) or threshold selection<br>
and passed to the filter or specified by providing an ID list.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input from which the selection is extracted.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable.


|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
If this property is set to 1 the output preserves the topology of its
input and adds an insidedness array to mark which cells are inside or
out. If 0 then the output is an unstructured grid which contains only
the subset of cells that are inside.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.


|-
| '''Show Bounds'''<br>''(ShowBounds)''
|
For frustum selection, if this property is set to 1 the output is the
outline of the frustum instead of the contents of the input that lie
within the frustum.

| 0
|
Only the values 0 and 1 are accepted.


|}
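The behavior described above can be sketched in plain Python. This is an illustrative analogy, not ParaView code: `extract_selection` is a hypothetical helper showing how an ID-list selection either discards unselected cells or, with Preserve Topology on, keeps every cell and adds an insidedness flag.

```python
# Hypothetical sketch of ID-based selection extraction (not the ParaView API).
def extract_selection(cells, selected_ids, preserve_topology=False):
    """cells: dict of cell id -> cell data.

    With preserve_topology, keep all cells and mark insidedness;
    otherwise return only the selected subset.
    """
    if preserve_topology:
        return {cid: {"data": c, "insidedness": cid in selected_ids}
                for cid, c in cells.items()}
    return {cid: c for cid, c in cells.items() if cid in selected_ids}


cells = {0: "tet", 1: "hex", 2: "wedge"}
subset = extract_selection(cells, {0, 2})
marked = extract_selection(cells, {0, 2}, preserve_topology=True)
print(sorted(subset))   # only the selected cell ids survive
```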


==Extract Subset==


Extract a subgrid from a structured grid with the option of setting subsample strides.


The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Include Boundary'''<br>''(IncludeBoundary)''
|
If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Grid filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkRectilinearGrid, vtkStructuredPoints, vtkStructuredGrid.


|-
| '''Sample Rate I'''<br>''(SampleRateI)''
|
This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

| 1
|
The value must be greater than or equal to 1.


|-
| '''Sample Rate J'''<br>''(SampleRateJ)''
|
This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

| 1
|
The value must be greater than or equal to 1.


|-
| '''Sample Rate K'''<br>''(SampleRateK)''
|
This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.

| 1
|
The value must be greater than or equal to 1.


|-
| '''V OI'''<br>''(VOI)''
|
This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.

| 0 0 0 0 0 0
|
The values must lie within the extent of the input dataset.


|}
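The sample-rate and Include Boundary behavior can be sketched for one dimension in plain Python. This is a minimal illustration of the stride rule described above, not ParaView's implementation; `sampled_indices` is a hypothetical helper.

```python
# Hypothetical sketch: which I-indices of a VOI [imin, imax] survive a given
# sample rate, with the IncludeBoundary behavior described above.
def sampled_indices(imin, imax, rate, include_boundary=False):
    idx = list(range(imin, imax + 1, rate))
    # IncludeBoundary: pass the boundary index to the output even when the
    # extent is not an even multiple of the sample rate.
    if include_boundary and idx[-1] != imax:
        idx.append(imax)
    return idx


print(sampled_indices(0, 10, 3))                        # strided subset
print(sampled_indices(0, 10, 3, include_boundary=True)) # boundary kept
```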
==Extract Surface==


Extract a 2D boundary surface using neighbor relations to eliminate internal faces.


The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Surface filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Nonlinear Subdivision Level'''<br>''(NonlinearSubdivisionLevel)''
|
If the input is an unstructured grid with nonlinear faces, this
parameter determines how many times the face is subdivided into
linear faces.  If 0, the output is the equivalent of its linear
counterpart (and the midpoints determining the nonlinear
interpolation are discarded).  If 1, the nonlinear face is
triangulated based on the midpoints.  If greater than 1, the
triangulated pieces are recursively subdivided to reach the
desired subdivision.  Setting the value to greater than 1 may
cause some point data to not be passed even if no quadratic faces
exist. This option has no effect if the input is not an
unstructured grid.

| 1
|
The value must be greater than or equal to 0 and less than or equal to 4.


|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.

| 1
|
Only the values 0 and 1 are accepted.


|}


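The "neighbor relations" idea behind surface extraction can be sketched in plain Python: a face shared by two cells is internal, while a face used by exactly one cell lies on the outer boundary. This is an illustrative sketch of the principle, not the VTK implementation; `boundary_faces` is a hypothetical helper.

```python
# Hypothetical sketch: find boundary faces by counting how many cells use
# each face. A face used once is on the surface; a face used twice is internal.
from collections import Counter


def boundary_faces(cells):
    """cells: list of cells, each a list of faces; a face is a tuple of
    point ids. Faces are canonicalized by sorting so shared faces match."""
    counts = Counter(tuple(sorted(f)) for cell in cells for f in cell)
    return [f for f, n in counts.items() if n == 1]


# Two tetrahedra sharing face (1, 2, 3): the shared face is internal.
tet1 = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
tet2 = [(1, 2, 3), (1, 2, 4), (1, 3, 4), (2, 3, 4)]
surface = boundary_faces([tet1, tet2])
print(len(surface))   # the six outer faces remain
```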
==FFT Of Selection Over Time==


Extracts selection over time and plots the FFT


Extracts the data of a selection (e.g. points or cells) over time,<br>
takes the FFT of them, and plots them.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
The input from which the selection is extracted.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable, vtkCompositeDataSet.


|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.


|}


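The core of this filter — sampling a selected quantity at each timestep and transforming it to the frequency domain — can be sketched with a small discrete Fourier transform in plain Python. The data and the `dft_magnitudes` helper are hypothetical; this only illustrates the transform step, not the selection-over-time machinery.

```python
# Hypothetical sketch: DFT magnitudes of one selected point's scalar sampled
# over timesteps, the kind of series this filter transforms.
import cmath
import math


def dft_magnitudes(samples):
    n = len(samples)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(samples)))
            for k in range(n)]


# A cosine with 2 cycles over the 8-sample window: energy lands in bin 2
# (and its mirror bin n-2).
samples = [math.cos(2 * math.pi * 2 * t / 8) for t in range(8)]
mags = dft_magnitudes(samples)
peak = max(range(len(mags)), key=lambda k: mags[k])
print(peak)
```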
==FOF/SOD Halo Finder==


Sorry, no help is currently available.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''bb (linking length)'''<br>''(BB)''
|
Linking length measured in units of interparticle spacing and is dimensionless. Used to link particles into halos for the friends-of-friends (FOF) algorithm.

| 0.2
|
The value must be greater than or equal to 0.


|-
| '''Compute the most bound particle'''<br>''(ComputeMostBoundParticle)''
|
If checked, the most bound particle for an FOF halo will be calculated. WARNING: This can be very slow.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Compute the most connected particle'''<br>''(ComputeMostConnectedParticle)''
|
If checked, the most connected particle for an FOF halo will be calculated. WARNING: This can be very slow.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Compute spherical overdensity (SOD) halos'''<br>''(ComputeSOD)''
|
If checked, spherical overdensity (SOD) halos will be calculated in addition to friends-of-friends (FOF) halos.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Copy FOF halo catalog to original particles'''<br>''(CopyHaloDataToParticles)''
|
If checked, the friends-of-friends (FOF) halo catalog information will be copied to the original particles as well.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.


|-
| '''maximum radius factor'''<br>''(MaxRadiusFactor)''
|
Maximum radius factor for SOD finding.

| 2
|
The value must be greater than or equal to 0.


|-
| '''minimum FOF mass'''<br>''(MinFOFMass)''
|
Minimum FOF mass to calculate an SOD halo.

| 5e+12
|
|-
| '''minimum FOF size'''<br>''(MinFOFSize)''
|
Minimum FOF halo size to calculate an SOD halo.

| 1000
|
The value must be greater than or equal to 0.


|-
| '''minimum radius factor'''<br>''(MinRadiusFactor)''
|
Minimum radius factor for SOD finding.

| 0.5
|
The value must be greater than or equal to 0.


|-
| '''np (number of seeded particles in one dimension, i.e., total particles = np^3)'''<br>''(NP)''
|
Number of seeded particles in one dimension. Therefore, total simulation particles is np^3 (cubed).

| 256
|
The value must be greater than or equal to 0.


|-
| '''overlap (shared point/ghost cell gap distance)'''<br>''(Overlap)''
|
The space (in rL units) to extend processor particle ownership for ghost particles/cells. Needed for correct halo calculation when halos cross processor boundaries in parallel computation.

| 5
|
The value must be greater than or equal to 0.


|-
| '''pmin (minimum particle threshold for an FOF halo)'''<br>''(PMin)''
|
Minimum number of particles (threshold) needed before a group is called a friends-of-friends (FOF) halo.

| 100
|
The value must be greater than or equal to 1.


|-
| '''rL (physical box side length)'''<br>''(RL)''
|
The box side length used to wrap particles around if they exceed rL (or less than 0) in any dimension (only positive positions are allowed in the input, or they are wrapped around).

| 100
|
The value must be greater than or equal to 0.


|-
| '''rho_c'''<br>''(RhoC)''
|
rho_c (critical density) for SOD halo finding.

| 2.77537e+11
|
|-
| '''number of bins'''<br>''(SODBins)''
|
Number of bins for SOD finding.

| 20
|
The value must be greater than or equal to 1.


|-
| '''initial SOD center'''<br>''(SODCenterType)''
|
The initial friends-of-friends (FOF) center used for calculating a spherical overdensity (SOD) halo. WARNING: Using MBP or MCP can be very slow.

| 0
|
The value must be one of the following: Center of mass (0), Average position (1), Most bound particle (2), Most connected particle (3).


|-
| '''initial SOD mass'''<br>''(SODMass)''
|
The initial SOD mass.

| 1e+14
|
The value must be greater than or equal to 0.


|}


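The friends-of-friends rule that the bb and pmin properties control can be sketched in plain Python: particles closer than the linking length join the same halo, and groups smaller than pmin are discarded. This is a toy union-find illustration under those assumptions, not the parallel halo-finder code the filter wraps.

```python
# Hypothetical sketch of friends-of-friends linking with a pmin threshold.
import math


def fof_halos(points, linking_length, pmin=1):
    parent = list(range(len(points)))

    def find(i):  # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Link every pair of particles within the linking length (O(n^2) toy).
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= linking_length:
                parent[find(i)] = find(j)

    halos = {}
    for i in range(len(points)):
        halos.setdefault(find(i), []).append(i)
    # pmin: a group only counts as a halo if it has at least pmin members.
    return [h for h in halos.values() if len(h) >= pmin]


pts = [(0, 0, 0), (0.1, 0, 0), (0.2, 0, 0), (5, 5, 5)]
halos = fof_halos(pts, linking_length=0.2, pmin=2)
print(len(halos))   # one halo of three linked particles; the far point is dropped
```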
==Feature Edges==


This filter will extract edges along sharp edges of surfaces or boundaries of surfaces.


The Feature Edges filter extracts various subsets of edges from the input data set. This filter operates on polygonal data and produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Boundary Edges'''<br>''(BoundaryEdges)''
|
If the value of this property is set to 1, boundary edges will be extracted. Boundary edges are defined as line cells or edges that are used by only one polygon.

| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Coloring'''<br>''(Coloring)''
|
If the value of this property is set to 1, then the extracted edges are assigned a scalar value based on the type of the edge.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|
The value of this property is used to define a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. (See the FeatureEdges property.)

| 30
|
The value must be greater than or equal to 0 and less than or equal to 180.


|-
| '''Feature Edges'''<br>''(FeatureEdges)''
|
If the value of this property is set to 1, feature edges will be extracted. Feature edges are defined as edges that are used by two polygons whose dihedral angle is greater than the feature angle. (See the FeatureAngle property.) Toggle whether to extract feature edges.

| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Feature Edges filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Manifold Edges'''<br>''(ManifoldEdges)''
|
If the value of this property is set to 1, manifold edges will be extracted. Manifold edges are defined as edges that are used by exactly two polygons.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Non-Manifold Edges'''<br>''(NonManifoldEdges)''
|
If the value of this property is set to 1, non-manifold edges will be extracted. Non-manifold edges are defined as edges that are used by three or more polygons.

| 1
|
Only the values 0 and 1 are accepted.


|}


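The Feature Angle test can be sketched in plain Python: an edge qualifies as a feature edge when the angle between the normals of its two adjacent polygons meets or exceeds the threshold. A minimal sketch of that comparison, with a hypothetical `is_feature_edge` helper; not the VTK implementation.

```python
# Hypothetical sketch: classify an edge by the angle between the normals
# of its two adjacent polygons, as the Feature Angle property describes.
import math


def is_feature_edge(n1, n2, feature_angle_deg=30.0):
    dot = sum(a * b for a, b in zip(n1, n2))
    norm = (math.sqrt(sum(a * a for a in n1))
            * math.sqrt(sum(b * b for b in n2)))
    # Clamp to guard against floating-point drift outside [-1, 1].
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle >= feature_angle_deg


flat = is_feature_edge((0, 0, 1), (0, 0, 1))    # coplanar: 0 degrees
crease = is_feature_edge((0, 0, 1), (0, 1, 1))  # 45-degree crease
print(flat, crease)
```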
==Gaussian Resampling==


The value must lie within the range of the selected data array.


Splat points into a volume with an elliptical, Gaussian distribution.


The value must lie within the range of the selected data array.
vtkGaussianSplatter is a filter that injects input points into a<br>
 
structured points (volume) dataset. As each point is injected, it "splats"<br>
or distributes values to nearby voxels. Data is distributed using an<br>
elliptical, Gaussian distribution function. The distribution function is<br>
modified using scalar values (expands distribution) or normals<br>
(creates ellipsoidal distribution rather than spherical).<br><br><br>
Warning: results may be incorrect in parallel as points can't splat<br>
into other processor's cells.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Scale Mode'''<br>''(SetScaleMode)''
| '''Splat Accumulation Mode'''<br>''(Accumulation Mode)''
|
|
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.
Specify the scalar accumulation mode. This mode expresses how scalar values are combined when splats are overlapped. The Max mode acts like a set union operation and is the most commonly used; the Min mode acts like a set intersection, and the sum is just weird.


| 1
| 1
|
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).
The value must be one of the following: Min (0), Max (1), Sum (2).




|-
|-
| '''Glyph Type'''<br>''(Source)''
| '''Fill Value'''<br>''(CapValue)''
|
|
This property determines which type of glyph will be placed at the points in the input dataset.
Specify the cap value to use. (This instance variable only has effect if the ivar Capping is on.)


| 0
|
|
|
The selected object must be the result of the following: sources (includes readers), glyph_sources.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
The value must be set to one of the following: ArrowSource, ConeSource, CubeSource, CylinderSource, LineSource, SphereSource, GlyphSource2D.
|-
|-
| '''Mask Points'''<br>''(UseMaskPoints)''
| '''Fill Volume Boundary'''<br>''(Capping)''
|
|
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)
Turn on/off the capping of the outer boundary of the volume to a specified cap value. This can be used to close surfaces (after iso-surfacing) and create other effects.


| 1
| 1
Line 2,887: Line 2,888:




|}
|-
| '''Ellipitical Eccentricity'''<br>''(Eccentricity)''
|
Control the shape of elliptical splatting. Eccentricity is the ratio of the major axis (aligned along normal) to the minor (axes) aligned along other two axes. So Eccentricity gt 1 creates needles with the long axis in the direction of the normal; Eccentricity lt 1 creates pancakes perpendicular to the normal vector.


| 2.5
|
|-
| '''Gaussian Exponent Factor'''<br>''(ExponentFactor)''
|
Set / get the sharpness of decay of the splats. This is the exponent constant in the Gaussian equation. Normally this is a negative value.


==Glyph With Custom Source==
| -5
 
|
 
The value must be less than or equal to 0.
This filter generates a glyph at each point of the input data set. The glyphs can be oriented and scaled by point attributes of the input dataset.


The Glyph filter generates a glyph at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.
This property specifies the input to the filter.


|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected object must be the result of the following: sources (includes readers), filters.
The dataset must contain a point array with 1 components.




Line 2,917: Line 2,922:


|-
|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
| '''Extent to Resample'''<br>''(ModelBounds)''
|
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)
Set / get the (xmin,xmax, ymin,ymax, zmin,zmax) bounding box in which the sampling is performed. If any of the (min,max) bounds values are min >= max, then the bounds will be computed automatically from the input data. Otherwise, the user-specified bounds will be used.


| 5000
| 0 0 0 0 0 0
|
|
The value must be greater than or equal to 0.
|-
|-
| '''Random Mode'''<br>''(RandomMode)''
| '''Elliptical Splats'''<br>''(NormalWarping)''
|
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.
Turn on/off the generation of elliptical splats. If normal warping is on, then the input normals affect the distribution of the splat. This boolean is used in combination with the Eccentricity ivar.


| 1
| 1
Line 2,937: Line 2,939:


|-
|-
| '''Scalars'''<br>''(SelectInputScalars)''
| '''Empty Cell Value'''<br>''(NullValue)''
|
|
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)
Set the Null value for output points not receiving a contribution from the input points. (This is the initial value of the voxel samples.)


| 0
|
|
|-
| '''Gaussian Splat Radius'''<br>''(Radius)''
|
|
An array of scalars is required.
Set / get the radius of propagation of the splat. This value is expressed as a percentage of the length of the longest side of the sampling volume. Smaller numbers greatly reduce execution time.
 


| 0.1
|
|-
|-
| '''Vectors'''<br>''(SelectInputVectors)''
| '''Resampling Grid'''<br>''(SampleDimensions)''
|
|
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)
Set / get the dimensions of the sampling structured point set. Higher values produce better results but are much slower.


| 1
| 50 50 50
|
|
An array of vectors is required.
|-
|-
| '''Orient'''<br>''(SetOrient)''
| '''Scale Splats'''<br>''(ScalarWarping)''
|
|
If this property is set to 1, the glyphs will be oriented based on the selected vector array.
Turn on/off the scaling of splats by scalar value.


| 1
| 1


|-
|-
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
| '''Scale Factor'''<br>''(ScaleFactor)''
|
|
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.
Multiply Gaussian splat distribution by this value. If ScalarWarping is on, then the Scalar value will be multiplied by the ScaleFactor times the Gaussian function.


| 1
| 1
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.
|-
| '''Resample Field'''<br>''(SelectInputScalars)''
|
Choose a scalar array to splat into the output cells. If "ignore arrays" is chosen, point density will be counted instead.


|
|
An array of scalars is required.


The value must lie within the range of the selected data array.


Valid array names will be chosen from point and cell data.


The value must lie within the range of the selected data array.


|}
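The splatting controlled by the Gaussian Splat Radius, Scale Factor, and Resampling Grid properties above can be sketched in one dimension (plain Python, illustrative only — not ParaView's implementation; the grid stands in for one axis of the sampling structured point set):

```python
import math

def splat_1d(points, sample_dims=50, radius=0.1, bounds=(0.0, 1.0)):
    """Accumulate a Gaussian splat from each input point onto a regular
    grid of samples. radius is a fraction of the sampling volume's
    length, mirroring the Gaussian Splat Radius property."""
    lo, hi = bounds
    sigma = radius * (hi - lo)
    grid = [0.0] * sample_dims
    for p in points:
        for i in range(sample_dims):
            x = lo + (hi - lo) * i / (sample_dims - 1)
            grid[i] += math.exp(-((x - p) / sigma) ** 2)
    return grid
```

Larger radii spread each point's contribution over more samples, which is why smaller values greatly reduce execution time.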
==Generate Ids==
Generate scalars from point and cell ids.
This filter generates scalars using cell and point ids. That is, the point attribute data scalars are generated from the point ids, and the cell attribute data scalars or field data are generated from the cell ids.<br>
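A minimal sketch of the output this filter produces (hypothetical plain-Python layout, not ParaView's API):

```python
def generate_ids(num_points, num_cells, array_name="Ids"):
    """Emit id-valued scalar arrays for points and cells, as the
    Generate Ids filter does (the dict layout is illustrative only)."""
    return {
        "point_data": {array_name: list(range(num_points))},
        "cell_data": {array_name: list(range(num_cells))},
    }
```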


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Scale Mode'''<br>''(SetScaleMode)''
| '''Array Name'''<br>''(ArrayName)''
|
|
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.
The name of the array that will contain ids.


| 1
| Ids
|
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).
|-
|-
| '''Glyph Type'''<br>''(Source)''
| '''Input'''<br>''(Input)''
|
|
This property determines which type of glyph will be placed at the points in the input dataset.
This property specifies the input to the Cell Data to Point Data filter.


|
|
|
|
The selected object must be the result of the following: sources (includes readers), glyph_sources.
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
 
 
|-
| '''Mask Points'''<br>''(UseMaskPoints)''
|
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)
 
| 1
|
Only the values 0 and 1 are accepted.








==Gradient==
==Generate Quadrature Points==




This filter computes gradient vectors for an image/volume.
Create a point set with data at quadrature points.


The Gradient filter computes the gradient vector at each point in an image or volume. This filter uses central differences to compute the gradients. The Gradient filter operates on uniform rectilinear (image) data and produces image data output.<br>
Create a point set with data at quadrature points.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Dimensionality'''<br>''(Dimensionality)''
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
This property indicates whether to compute the gradient in two dimensions or in three. If the gradient is being computed in two dimensions, the X and Y dimensions are used.
| 3
|
|
The value must be one of the following: Two (2), Three (3).
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The dataset must contain a cell array.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.




|-
|-
| '''Input'''<br>''(Input)''
| '''Select Source Array'''<br>''(SelectSourceArray)''
|
|
This property specifies the input to the Gradient filter.
Specifies the offset array from which we generate quadrature points.


|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
An array of scalars is required.
 
 
|}




The dataset must contain a point array with 1 component.
==Generate Quadrature Scheme Dictionary==




The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData.
Generate quadrature scheme dictionaries in data sets that do not have them.


Generate quadrature scheme dictionaries in data sets that do not have them.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Select Input Scalars'''<br>''(SelectInputScalars)''
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
|
This property lists the name of the array from which to compute the gradient.
The selected object must be the result of the following: sources (includes readers), filters.
 


|
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.
|
An array of scalars is required.








==Gradient Of Unstructured DataSet==
==Generate Surface Normals==




Estimate the gradient for each point or cell in any type of dataset.
This filter will produce surface normals used for smooth shading. Splitting is used to avoid smoothing across feature edges.


The Gradient (Unstructured) filter estimates the gradient vector at each point or cell. It operates on any type of vtkDataSet, and the output is the same type as the input. If the dataset is a vtkImageData, use the Gradient filter instead; it will be more efficient for this type of dataset.<br>
This filter generates surface normals at the points of the input polygonal dataset to provide smooth shading of the dataset. The resulting dataset is also polygonal. The filter works by calculating a normal vector for each polygon in the dataset and then averaging the normals at the shared points.<br>
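The averaging step described above can be sketched in plain Python for a triangle mesh (illustrative only; the real filter also handles splitting, consistency, and general polygons):

```python
def triangle_normal(a, b, c):
    """Unit normal of a triangle from the cross product of two edges."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    mag = sum(x * x for x in n) ** 0.5
    return [x / mag for x in n]

def point_normals(points, triangles):
    """Average the normals of the triangles sharing each point,
    then renormalize, as the Normals Generation filter does."""
    acc = [[0.0, 0.0, 0.0] for _ in points]
    for tri in triangles:
        n = triangle_normal(*(points[i] for i in tri))
        for i in tri:
            for k in range(3):
                acc[i][k] += n[k]
    result = []
    for v in acc:
        mag = sum(x * x for x in v) ** 0.5
        result.append([x / mag for x in v])
    return result
```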


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Compute Vorticity'''<br>''(ComputeVorticity)''
| '''Compute Cell Normals'''<br>''(ComputeCellNormals)''
|
|
When this flag is on, the gradient filter will compute the
This filter computes the normals at the points in the data set. In the process of doing this it computes polygon normals too. If you want these normals to be passed to the output of this filter, set the value of this property to 1.
vorticity/curl of a 3 component array.


| 0
| 0
Line 3,095: Line 3,119:


|-
|-
| '''Faster Approximation'''<br>''(FasterApproximation)''
| '''Consistency'''<br>''(Consistency)''
|
|
When this flag is on, the gradient filter will provide a less
The value of this property controls whether consistent polygon ordering is enforced. Generally the normals for a data set should either all point inward or all point outward. If the value of this property is 1, then this filter will reorder the points of cells whose normal vectors are oriented in the opposite direction from the rest of those in the data set.
accurate (but close) algorithm that performs fewer derivative
 
calculations (and is therefore faster). The error contains some
| 1
smoothing of the output data and some possible errors on the
|
boundary. This parameter has no effect when performing the
Only the values 0 and 1 are accepted.
gradient of cell data.
 
 
|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|
The value of this property defines a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. If Splitting is on, points are duplicated along these feature edges. (See the Splitting property.)
 
| 30
|
The value must be greater than or equal to 0 and less than or equal to 180.
 
 
|-
| '''Flip Normals'''<br>''(FlipNormals)''
|
If the value of this property is 1, this filter will reverse the normal direction (and reorder the points accordingly) for all polygons in the data set; this changes front-facing polygons to back-facing ones, and vice versa. You might want to do this if your viewing position will be inside the data set instead of outside of it.


| 0
| 0
| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Gradient (Unstructured) filter.
This property specifies the input to the Normals Generation filter.


|
|




The dataset must contain a point or cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
 


|-
| '''Non-Manifold Traversal'''<br>''(NonManifoldTraversal)''
|
Turn on/off traversal across non-manifold edges. Not traversing non-manifold edges will prevent problems where the consistency of polygonal ordering is corrupted due to topological loops.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.
| 1
|
Only the values 0 and 1 are accepted.




|-
|-
| '''Result Array Name'''<br>''(ResultArrayName)''
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
|
This property provides a name for the output array containing the gradient vectors.
Turn this option on to produce the same results regardless of the number of processors used (i.e., to avoid seams along processor boundaries). Turn it off if you do not want to process ghost levels and do not mind seams.


| Gradients
| 1
|
|
Only the values 0 and 1 are accepted.
|-
|-
| '''Scalar Array'''<br>''(SelectInputScalars)''
| '''Splitting'''<br>''(Splitting)''
|
|
This property lists the name of the scalar array from which to compute the gradient.
This property controls the splitting of sharp edges. If sharp edges are split (property value = 1), then points are duplicated along these edges, and separate normals are computed for both sets of points to give crisp (rendered) surface definition.


| 1
|
|
|
Only the values 0 and 1 are accepted.
An array of scalars is required.




Valud array names will be chosen from point and cell data.
|}




|}
==Glyph==




==Grid Connectivity==
This filter generates an arrow, cone, cube, cylinder, line, sphere, or 2D glyph at each point of the input data set.  The glyphs can be oriented and scaled by point attributes of the input dataset.


 
The Glyph filter generates a glyph (i.e., an arrow, cone, cube, cylinder, line, sphere, or 2D glyph) at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>
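The point-masking behavior of the Glyph filter's Mask Points, Maximum Number of Points, and Random Mode properties can be sketched as (plain Python, illustrative only):

```python
import random

def mask_point_ids(total_points, maximum_number_of_points,
                   random_mode=False, seed=0):
    """Choose which input point ids receive glyphs: either evenly
    spaced ids or a random sample, capped at the maximum count."""
    if total_points <= maximum_number_of_points:
        return list(range(total_points))
    if random_mode:
        rng = random.Random(seed)  # seeded only to keep this sketch reproducible
        return sorted(rng.sample(range(total_points), maximum_number_of_points))
    stride = total_points / maximum_number_of_points
    return [int(i * stride) for i in range(maximum_number_of_points)]
```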
Mass properties of connected fragments for unstructured grids.
 
This filter works on multiblock unstructured grid inputs and also works in<br>
parallel. It ignores any cells with a cell data Status value of 0.<br>
It identifies distinct fragments via connectivity, then integrates<br>
attributes of the fragments.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Restrictions'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
| '''Glyph Transform'''<br>''(GlyphTransform)''
| This property specifies the input of the filter.
|
The values in this property allow you to specify the transform
(translation, rotation, and scaling) to apply to the glyph source.
 
|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The value must be set to one of the following: Transform2.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid, vtkCompositeDataSet.
 
 
|}
 
 
==Group Datasets==
 
 
Group data sets.


Groups multiple datasets to create a multiblock dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
This property indicates the the inputs to the Group Datasets filter.
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.


|
|




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.




|}
|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)


| 5000
|
The value must be greater than or equal to 0.


==Histogram==


|-
| '''Random Mode'''<br>''(RandomMode)''
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.


Extract a histogram from field data.
| 1
|
Only the values 0 and 1 are accepted.
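The Histogram filter's Bin Count and Custom Bin Ranges behavior can be sketched as (plain Python, illustrative only):

```python
def histogram(values, bin_count=10, bin_range=None):
    """Count values into bin_count equal-width bins. bin_range plays
    the role of Custom Bin Ranges; by default the full data range of
    the selected array is used."""
    lo, hi = bin_range if bin_range is not None else (min(values), max(values))
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        i = min(int((v - lo) / width), bin_count - 1)  # top edge falls in last bin
        counts[i] += 1
    return counts
```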




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
| '''Scalars'''<br>''(SelectInputScalars)''
| '''Description'''
|
| '''Default Value(s)'''
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)
| '''Restrictions'''
 
|
|
An array of scalars is required.
 
 
|-
|-
| '''Bin Count'''<br>''(BinCount)''
| '''Vectors'''<br>''(SelectInputVectors)''
|
|
The value of this property specifies the number of bins for the histogram.
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)


| 10
| 1
|
|
The value must be greater than or equal to 1 and less than or equal to 256.
An array of vectors is required.




|-
|-
| '''Calculate Averages'''<br>''(CalculateAverages)''
| '''Orient'''<br>''(SetOrient)''
|
|
This option controls whether the algorithm calculates averages
If this property is set to 1, the glyphs will be oriented based on the selected vector array.
of variables other than the primary variable that fall into each
bin.


| 1
| 1


|-
|-
| '''Component'''<br>''(Component)''
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
|
|
The value of this property specifies the array component from which the histogram should be computed.
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.


| 0
| 1
|
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.
The value must lie within the range of the selected data array.
|-
|-
| '''Custom Bin Ranges'''<br>''(CustomBinRanges)''
| '''Scale Mode'''<br>''(SetScaleMode)''
|
|
Set custom bin ranges to use. These are used only when
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.
UseCustomBinRanges is set to true.


| 0 100
| 1
|
|
The value must lie within the range of the selected data array.
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).




|-
|-
| '''Input'''<br>''(Input)''
| '''Glyph Type'''<br>''(Source)''
|
|
This property specifies the input to the Histogram filter.
This property determines which type of glyph will be placed at the points in the input dataset.


|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected object must be the result of the following: sources (includes readers), glyph_sources.




The dataset must contain a point or cell array.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
The value must be set to one of the following: ArrowSource, ConeSource, CubeSource, CylinderSource, LineSource, SphereSource, GlyphSource2D.




|-
|-
| '''Select Input Array'''<br>''(SelectInputArray)''
| '''Mask Points'''<br>''(UseMaskPoints)''
|
|
This property indicates the name of the array from which to compute the histogram.
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)


|
| 1
|
An array of scalars is required.
 
 
Valid array names will be chosen from point and cell data.
 
 
|-
| '''Use Custom Bin Ranges'''<br>''(UseCustomBinRanges)''
|
When set to true, CustomBinRanges will be used instead of using the
full range for the selected array. By default, set to false.
 
| 0
|
|
Only the values 0 and 1 are accepted.
Only the values 0 and 1 are accepted.




==Integrate Variables==
==Glyph With Custom Source==




This filter integrates cell and point attributes.
This filter generates a glyph at each point of the input data set.  The glyphs can be oriented and scaled by point attributes of the input dataset.


The Integrate Attributes filter integrates point and cell data over lines and surfaces. It also computes length of lines, area of surface, or volume.<br>
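For a polyline, the integration amounts to length-weighted accumulation of the point data; a minimal sketch (trapezoidal rule over segments — not ParaView's code):

```python
def integrate_over_polyline(points, values):
    """Integrate point data along a polyline and return
    (integral, total length), as Integrate Attributes reports
    both the integrated value and the line's length."""
    total = 0.0
    length = 0.0
    for i in range(len(points) - 1):
        a, b = points[i], points[i + 1]
        seg = sum((bk - ak) ** 2 for ak, bk in zip(a, b)) ** 0.5
        length += seg
        total += 0.5 * (values[i] + values[i + 1]) * seg
    return total, length
```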
The Glyph filter generates a glyph at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Integrate Attributes filter.
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.


|
|




|}
|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)


| 5000
|
The value must be greater than or equal to 0.


==Interpolate to Quadrature Points==


|-
| '''Random Mode'''<br>''(RandomMode)''
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.


Create scalar/vector data arrays interpolated to quadrature points.
| 1
|
Only the values 0 and 1 are accepted.


Create scalar/vector data arrays interpolated to quadrature points.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
| '''Scalars'''<br>''(SelectInputScalars)''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)
|
|
The selected object must be the result of the following: sources (includes readers), filters.
|
An array of scalars is required.
 


|-
| '''Vectors'''<br>''(SelectInputVectors)''
|
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.
| 1
|
An array of vectors is required.




|-
|-
| '''Select Source Array'''<br>''(SelectSourceArray)''
| '''Orient'''<br>''(SetOrient)''
|
|
Specifies the offset array from which we interpolate values to quadrature points.
If this property is set to 1, the glyphs will be oriented based on the selected vector array.


| 1
|
|
Only the values 0 and 1 are accepted.
|-
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
|
|
An array of scalars is required.
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.


 
| 1
|}
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.




==Intersect Fragments==
The value must lie within the range of the selected data array.




The Intersect Fragments filter performs geometric intersections on sets of fragments.
The value must lie within the range of the selected data array.


The Intersect Fragments filter performs geometric intersections on sets of<br>
fragments. The filter takes two inputs, the first containing fragment<br>
geometry and the second containing fragment centers. The filter has two<br>
outputs. The first is geometry that results from the intersection. The<br>
second is a set of points that is an approximation of the center of where<br>
each fragment has been intersected.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
| '''Scale Mode'''<br>''(SetScaleMode)''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Slice Type'''<br>''(CutFunction)''
|
|
This property sets the type of intersecting geometry, and
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.
associated parameters.


| 1
|
|
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).
The value must be set to one of the following: Plane, Box, Sphere.




|-
|-
| '''Input'''<br>''(Input)''
| '''Glyph Type'''<br>''(Source)''
|
|
This input must contain fragment geometry.
This property determines which type of glyph will be placed at the points in the input dataset.


|
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.
The selected object must be the result of the following: sources (includes readers), glyph_sources.




The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.




|-
|-
| '''Source'''<br>''(Source)''
| '''Mask Points'''<br>''(UseMaskPoints)''
|
|
This input must contain fragment centers.
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)


| 1
|
|
|
Only the values 0 and 1 are accepted.
The selected object must be the result of the following: sources (includes readers), filters.
 
 
The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.








==Iso Volume==
==Gradient==




This filter extracts cells by clipping cells that have point scalars not in the specified range.
This filter computes gradient vectors for an image/volume.


This filter clips away the cells using lower and upper thresholds.<br>
The Gradient filter computes the gradient vector at each point in an image or volume. This filter uses central differences to compute the gradients. The Gradient filter operates on uniform rectilinear (image) data and produces image data output.<br>
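Along one axis of a uniform image, the central-difference scheme the Gradient filter uses looks like this (plain-Python sketch; boundaries fall back to one-sided differences):

```python
def central_differences(samples, spacing=1.0):
    """Gradient of a 1-D array of uniformly spaced samples via central
    differences, with one-sided differences at the two boundaries."""
    n = len(samples)
    grad = []
    for i in range(n):
        lo = max(i - 1, 0)
        hi = min(i + 1, n - 1)
        grad.append((samples[hi] - samples[lo]) / ((hi - lo) * spacing))
    return grad
```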


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
| '''Default Value(s)'''
| '''Default Value(s)'''
| '''Restrictions'''
| '''Restrictions'''
|-
| '''Dimensionality'''<br>''(Dimensionality)''
|
This property indicates whether to compute the gradient in two dimensions or in three. If the gradient is being computed in two dimensions, the X and Y dimensions are used.
| 3
|
The value must be one of the following: Two (2), Three (3).
|-
|-
| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Threshold filter.
This property specifies the input to the Gradient filter.


|
|




The dataset must contain a point or cell array with 1 component.
The dataset must contain a point array with 1 component.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData.




|-
|-
| '''Input Scalars'''<br>''(SelectInputScalars)''
| '''Select Input Scalars'''<br>''(SelectInputScalars)''
|
|
The value of this property contains the name of the scalar array from which to perform thresholding.
This property lists the name of the array from which to compute the gradient.


|
|




Valid array names will be chosen from point and cell data.
|}




|-
==Gradient Of Unstructured DataSet==
| '''Threshold Range'''<br>''(ThresholdBetween)''
|
The values of this property specify the upper and lower bounds of the thresholding operation.


| 0 0
|
The value must lie within the range of the selected data array.


Estimate the gradient for each point or cell in any type of dataset.


|}
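The Iso Volume thresholding described above — keeping cells whose point scalars fall within the Threshold Range — can be sketched as (plain Python; illustrative only):

```python
def iso_volume_cells(cells, lower, upper):
    """Keep cells (given as lists of point scalars) whose values all lie
    in [lower, upper]. The real Iso Volume filter additionally clips
    cells that straddle the bounds; this sketch simply discards them."""
    return [cell for cell in cells
            if all(lower <= v <= upper for v in cell)]
```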
The Gradient (Unstructured) filter estimates the gradient vector at each point or cell. It operates on any type of vtkDataSet, and the output is the same type as the input. If the dataset is a vtkImageData, use the Gradient filter instead; it will be more efficient for this type of dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Compute QCriterion'''<br>''(ComputeQCriterion)''
|
When this flag is on, the gradient filter will compute the
Q-criterion of a 3 component array.


==K Means==
| 0
|
Only the values 0 and 1 are accepted.




Compute a statistical model of a dataset and/or assess the dataset with a statistical model.
|-
| '''Compute Vorticity'''<br>''(ComputeVorticity)''
|
When this flag is on, the gradient filter will compute the
vorticity/curl of a 3 component array.


This filter either computes a statistical model of a dataset or takes such a model as its second input. Then, the model (however it is obtained) may optionally be used to assess the input dataset.
| 0
<br>
|
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select. The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center. The model is then a set of cluster centers. Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br>
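The clustering loop this description outlines — assign each point to its nearest center, recompute centers, repeat until the centers stop moving — can be sketched as (plain Python, Lloyd's algorithm with naive deterministic seeding for brevity; not ParaView's implementation):

```python
def k_means(points, k, max_iterations=50, tolerance=1e-4):
    """Iteratively move k cluster centers to local minima of the sum of
    squared Euclidean distances; the model is the list of centers."""
    centers = [tuple(p) for p in points[:k]]  # naive seeding for the sketch
    for _ in range(max_iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        new_centers = [
            tuple(sum(coord) / len(members) for coord in zip(*members))
            if members else centers[c]
            for c, members in enumerate(clusters)
        ]
        shift = max(sum((a - b) ** 2 for a, b in zip(o, n))
                    for o, n in zip(centers, new_centers))
        centers = new_centers
        if shift < tolerance:
            break
    return centers
```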
Only the values 0 and 1 are accepted.




{| class="PropertiesTable" border="1" cellpadding="5"
|-
|-
| '''Property'''
| '''Faster Approximation'''<br>''(FasterApproximation)''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
|
Specify which type of field data the arrays will be drawn from.
When this flag is on, the gradient filter will provide a less
accurate (but close) algorithm that performs fewer derivative
calculations (and is therefore faster).  The error contains some
smoothing of the output data and some possible errors on the
boundary.  This parameter has no effect when performing the
gradient of cell data.


| 0
| 0
|
|
Valid array names will be chosen from point and cell data.
Only the values 0 and 1 are accepted.




| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
The input to the filter. Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.
This property specifies the input to the Gradient (Unstructured) filter.


|
|




The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.




|-
|-
| '''k'''<br>''(K)''
| '''Result Array Name'''<br>''(ResultArrayName)''
|
|
Specify the number of clusters.
This property provides a name for the output array containing the gradient vectors.


| 5
| Gradients
|
|
The value must be greater than or equal to 1.
|-
|-
| '''Max Iterations'''<br>''(MaxNumIterations)''
| '''Scalar Array'''<br>''(SelectInputScalars)''
|
|
Specify the maximum number of iterations in which cluster centers are moved before the algorithm terminates.
This property lists the name of the scalar array from which to compute the gradient.


| 50
|
|
The value must be greater than or equal to 1.
|
An array of scalars is required.




|-
Valid array names will be chosen from point and cell data.
| '''Model Input'''<br>''(ModelInput)''
|
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


|}


The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.


==Grid Connectivity==


|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
Choose arrays whose entries will be used to form observations for statistical analysis.


|
Mass properties of connected fragments for unstructured grids.
|
An array of scalars is required.


This filter works on multiblock unstructured grid inputs and also works in<br>
parallel. It ignores any cells with a cell data Status value of 0.<br>
It identifies distinct fragments via connectivity, then integrates<br>
attributes of the fragments.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
|-
| '''Task'''<br>''(Task)''
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
|
Specify the task to be performed: modeling and/or assessment.
The selected object must be the result of the following: sources (includes readers), filters.
#  "Statistics of all the data," creates an output table (or tables) summarizing the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset. The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2, and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.


| 3
The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid, vtkCompositeDataSet.
|
The value must be one of the following: Statistics of all the data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).




|-
|}
| '''Tolerance'''<br>''(Tolerance)''
|
Specify the relative tolerance that will cause early termination.


| 0.01
|
The value must be greater than or equal to 0 and less than or equal to 1.


==Group Datasets==


|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
Group data sets.
|
The value must be greater than or equal to 0 and less than or equal to 1.


 
Groups multiple datasets to create a multiblock dataset<br>
|}
 
 
==Level Scalars==
 
 
The Level Scalars filter uses colors to show levels of a hierarchical dataset.
 
The Level Scalars filter uses colors to show levels of a hierarchical dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
{| class="PropertiesTable" border="1" cellpadding="5"
Line 3,621: Line 3,651:
| '''Input'''<br>''(Input)''
| '''Input'''<br>''(Input)''
|
|
This property specifies the input to the Level Scalars filter.
This property indicates the the inputs to the Group Datasets filter.


|
|
Line 3,628: Line 3,658:




The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.
The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.




Line 3,634: Line 3,664:

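The ''Task'' and ''Training Fraction'' properties of the statistical filter above describe a train-on-a-subset, assess-everything workflow. The sketch below illustrates that split in plain Python; it is not ParaView code, and the function names are hypothetical (the "model" here is just a mean).

```python
import random

def split_training_set(values, training_fraction, seed=0):
    # Tasks 2 and 4: pick a random subset of the input for model fitting.
    rng = random.Random(seed)
    n_train = max(1, int(len(values) * training_fraction))
    return rng.sample(values, n_train)

def assess_with_model(values, model_mean):
    # Task 3: annotate every input value with its deviation from the model.
    return [(v, v - model_mean) for v in values]

values = [float(i) for i in range(100)]
train = split_training_set(values, training_fraction=0.1)  # 10 of 100 values
model_mean = sum(train) / len(train)              # the "model" is just a mean
assessed = assess_with_model(values, model_mean)  # the WHOLE dataset is assessed
```

Note that only the training step uses the fraction; assessment always covers the entire input, which is why a small training fraction still lets you check the model against unseen values.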

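A relative ''Tolerance'' like the one above terminates an iterative fit early once the change between passes becomes small compared to the current value. A generic sketch of that stopping rule (the update function below is an arbitrary stand-in, not the filter's actual iteration):

```python
def iterate_until_converged(update, x0, tolerance=0.01, max_iterations=100):
    # Stop once the relative change between iterations drops below `tolerance`.
    x = x0
    for i in range(max_iterations):
        x_next = update(x)
        if abs(x_next - x) <= tolerance * max(abs(x_next), 1e-12):
            return x_next, i + 1
        x = x_next
    return x, max_iterations

# A damped update converging to 2.0 stops long before max_iterations.
value, steps = iterate_until_converged(lambda x: 0.5 * x + 1.0, 0.0)
```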


==Histogram==


Extract a histogram from field data.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Bin Count'''<br>''(BinCount)''
|
The value of this property specifies the number of bins for the histogram.


| 10
|
The value must be greater than or equal to 1 and less than or equal to 256.


|-
| '''Calculate Averages'''<br>''(CalculateAverages)''
|
This option controls whether the algorithm calculates averages
of variables other than the primary variable that fall into each
bin.


| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Component'''<br>''(Component)''
|
The value of this property specifies the array component from which the histogram should be computed.


| 0
|
|-
| '''Custom Bin Ranges'''<br>''(CustomBinRanges)''
|
Set custom bin ranges to use. These are used only when
UseCustomBinRanges is set to true.


| 0 100
|
The value must lie within the range of the selected data array.
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Histogram filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The dataset must contain a point or cell array.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Select Input Array'''<br>''(SelectInputArray)''
|
This property indicates the name of the array from which to compute the histogram.


|
|
An array of scalars is required.


Valid array names will be chosen from point and cell data.


|-
| '''Use Custom Bin Ranges'''<br>''(UseCustomBinRanges)''
|
When set to true, CustomBinRanges will be used instead of using the
full range for the selected array. By default, set to false.


| 0
|
Only the values 0 and 1 are accepted.
|}


==Linear Extrusion==


This filter creates a swept surface defined by translating the input along a vector.


The Linear Extrusion filter creates a swept surface by translating the input dataset along a specified vector. This filter is intended to operate on 2D polygonal data; it operates on polygonal data and produces polygonal data output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Capping'''<br>''(Capping)''
|
The value of this property indicates whether to cap the ends of the swept surface. Capping works by placing a copy of the input dataset on either end of the swept surface, so it behaves properly if the input is a 2D surface composed of filled polygons. If the input dataset is a closed solid (e.g., a sphere), then if capping is on (i.e., this property is set to 1), two copies of the data set will be displayed on output (the second translated from the first one along the specified vector). If instead capping is off (i.e., this property is set to 0), then an input closed solid will produce no output.


| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Linear Extrusion filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
The value of this property determines whether the output will be the same regardless of the number of processors used to compute the result. The difference is whether there are internal polygonal faces on the processor boundaries. A value of 1 will keep the results the same; a value of 0 will allow internal faces on processor boundaries.


| 0
|
Only the values 0 and 1 are accepted.
|-
| '''Scale Factor'''<br>''(ScaleFactor)''
|
The value of this property determines the distance along the vector the dataset will be translated. (A scale factor of 0.5 will move the dataset half the length of the vector, and a scale factor of 2 will move it twice the vector's length.)


| 1
|
|-
| '''Vector'''<br>''(Vector)''
|
The value of this property indicates the X, Y, and Z components of the vector along which to sweep the input dataset.


| 0 0 1
|
|}


==Loop Subdivision==


This filter iteratively divides each triangle into four triangles. New points are placed so the output surface is smooth.


The Loop Subdivision filter increases the granularity of a polygonal mesh. It works by dividing each triangle in the input into four new triangles. It is named for Charles Loop, the person who devised this subdivision scheme. This filter only operates on triangles, so a data set that contains other types of polygons should be passed through the Triangulate filter before applying this filter to it. This filter only operates on polygonal data (specifically triangle meshes), and it produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Loop Subdivision filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Number of Subdivisions'''<br>''(NumberOfSubdivisions)''
|
Set the number of subdivision iterations to perform. Each subdivision divides single triangles into four new triangles.


| 1
|
The value must be greater than or equal to 1 and less than or equal to 4.
|}

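The ''Bin Count'' and ''Custom Bin Ranges'' properties of the Histogram filter above amount to the following binning rule, sketched here in plain Python rather than ParaView code:

```python
def histogram(values, bin_count=10, custom_bin_range=None):
    # Bin over the full data range unless a custom range is supplied
    # (the UseCustomBinRanges / CustomBinRanges pair in the table above).
    lo, hi = custom_bin_range if custom_bin_range else (min(values), max(values))
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        if lo <= v <= hi:  # values outside a custom range are skipped
            # Clamp so the upper edge falls into the last bin.
            counts[min(int((v - lo) / width), bin_count - 1)] += 1
    return counts

counts = histogram([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], bin_count=5)  # [2, 2, 2, 2, 2]
```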

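Geometrically, the Linear Extrusion filter's ''Vector'' and ''Scale Factor'' properties combine as sketched below (points only; the real filter also builds the side walls and optional caps):

```python
def linear_extrusion(points, vector=(0.0, 0.0, 1.0), scale_factor=1.0):
    # Emit the original points plus a copy translated by scale_factor * vector;
    # a scale factor of 2 moves the copy twice the vector's length.
    dx, dy, dz = (scale_factor * c for c in vector)
    swept = [(x + dx, y + dy, z + dz) for x, y, z in points]
    return points + swept

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
prism_points = linear_extrusion(square, vector=(0, 0, 1), scale_factor=2.0)
```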

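Because each Loop subdivision pass replaces every triangle with four, the triangle count grows by a factor of 4 per iteration, which is why the ''Number of Subdivisions'' property is capped at 4:

```python
def triangles_after_subdivision(triangle_count, iterations):
    # Each pass turns every triangle into four smaller triangles.
    return triangle_count * 4 ** iterations

# A 100-triangle mesh after 0..4 iterations:
growth = [triangles_after_subdivision(100, n) for n in range(5)]
# growth == [100, 400, 1600, 6400, 25600]
```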

==Integrate Variables==


This filter integrates cell and point attributes.


The Integrate Attributes filter integrates point and cell data over lines and surfaces. It also computes the length of lines, the area of surfaces, or the volume.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Integrate Attributes filter.


|
|
|}


==Interpolate to Quadrature Points==


Create scalar/vector data arrays interpolated to quadrature points.


Create scalar/vector data arrays interpolated to quadrature points.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
| This property specifies the input of the filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.


|-
| '''Select Source Array'''<br>''(SelectSourceArray)''
|
Specifies the offset array from which we interpolate values to quadrature points.


|
|
An array of scalars is required.
|}


==Mask Points==


Reduce the number of points. This filter is often used before glyphing. Generating vertices is an option.


The Mask Points filter reduces the number of points in the dataset. It operates on any type of dataset, but produces only points / vertices as output. This filter is often used before the Glyph filter, but the basic point-masking functionality is also available on the Properties page for the Glyph filter.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Generate Vertices'''<br>''(GenerateVertices)''
|
This property specifies whether to generate vertex cells as the topology of the output. If set to 1, the geometry (vertices) will be displayed in the rendering window; otherwise no geometry will be displayed.
| 0
|
Only the values 0 and 1 are accepted.
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Mask Points filter.


|
|
|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|
The value of this property indicates the maximum number of points in the output dataset.


| 5000
|
The value must be greater than or equal to 0.


|-
| '''Offset'''<br>''(Offset)''
|
The value of this property indicates the point in the input dataset from which to start masking.


| 0
|
The value must be greater than or equal to 0.


|-
| '''On Ratio'''<br>''(OnRatio)''
|
The value of this property specifies the ratio of points to retain in the output. (For example, if the on ratio is 3, then the output will contain 1/3 as many points -- up to the value of the MaximumNumberOfPoints property -- as the input.)
| 2
|
The value must be greater than or equal to 1.


|-
| '''Random'''<br>''(RandomMode)''
|
If the value of this property is set to 1, then the points in the output will be randomly selected from the input; otherwise this filter will subsample regularly. Selecting points at random is helpful to avoid striping when masking the points of a structured dataset.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Single Vertex Per Cell'''<br>''(SingleVertexPerCell)''
|
Tell the filter to generate only one vertex per cell instead of multiple vertices in one cell.


| 0
|
Only the values 0 and 1 are accepted.
|}

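The line case of the Integrate Attributes filter described above can be sketched as a midpoint-rule integral of a point attribute along a polyline (plain Python, not ParaView's implementation):

```python
import math

def integrate_over_polyline(points, values):
    # Accumulate attribute * segment length, plus the total line length.
    integral = 0.0
    total_length = 0.0
    for i in range(len(points) - 1):
        seg = math.dist(points[i], points[i + 1])
        total_length += seg
        integral += 0.5 * (values[i] + values[i + 1]) * seg
    return integral, total_length

# A constant value of 3 along a unit-length line integrates to 3.
integral, length = integrate_over_polyline([(0, 0, 0), (1, 0, 0)], [3.0, 3.0])
```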

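The regular (non-random) subsampling performed by the Mask Points filter's ''Offset'', ''On Ratio'', and ''Maximum Number of Points'' properties reduces to a slice, sketched here in plain Python:

```python
def mask_points(points, on_ratio=2, offset=0, maximum_number_of_points=5000):
    # Keep every on_ratio-th point starting at `offset`, capped at the maximum.
    return points[offset::on_ratio][:maximum_number_of_points]

kept = mask_points(list(range(100)), on_ratio=3, offset=10)
# Keeps points 10, 13, 16, ..., 97 -- 1/3 of the points after the offset.
```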


==Intersect Fragments==


The Intersect Fragments filter performs geometric intersections on sets of fragments.


The Intersect Fragments filter performs geometric intersections on sets of<br>
fragments. The filter takes two inputs, the first containing fragment<br>
geometry and the second containing fragment centers. The filter has two<br>
outputs. The first is geometry that results from the intersection. The<br>
second is a set of points that is an approximation of the center of where<br>
each fragment has been intersected.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Slice Type'''<br>''(CutFunction)''
|
This property sets the type of intersecting geometry, and
associated parameters.


|
|
The value must be set to one of the following: Plane, Box, Sphere.


|-
| '''Input'''<br>''(Input)''
|
This input must contain the fragment geometry.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.


|-
| '''Source'''<br>''(Source)''
|
This input must contain the fragment centers.


|
|
The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
|}


==Iso Volume==


This filter extracts cells by clipping cells that have point scalars not in the specified range.


This filter clips away the cells using lower and upper thresholds.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Threshold filter.


|
|
|}


==Material Interface Filter==


The Material Interface filter finds volumes in the input data containing material above a certain material fraction.


The Material Interface filter finds voxels inside of which a material<br>
fraction (or normalized amount of material) is higher than a given<br>
threshold. As these voxels are identified, surfaces enclosing adjacent<br>
voxels above the threshold are generated. The resulting volume and its<br>
surface are what we call a fragment. The filter has the ability to<br>
compute various volumetric attributes such as fragment volume, mass,<br>
and center of mass, as well as volume and mass weighted averages for any of<br>
the fields present. Any field selected for such computation will also<br>
be copied into the fragment surface's point data for visualization. The<br>
filter also has the ability to generate Oriented Bounding Boxes (OBB) for<br>
each fragment.<br><br><br>
The data generated by the filter is organized in three outputs. The<br>
"geometry" output contains the fragment surfaces. The "statistics"<br>
output contains a point set of the centers of mass. The "obb<br>
representation" output contains OBB representations (poly data). All<br>
computed attributes are copied into the statistics and geometry outputs.<br>
The obb representation output is used for validation and debugging<br>
purposes and is turned off by default.<br><br><br>
To measure the size of craters, the filter can invert a volume fraction<br>
and clip the volume fraction with a sphere and/or a plane.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Clip Type'''<br>''(ClipFunction)''
|
This property sets the type of clip geometry, and
associated parameters.


|
|
The value must be set to one of the following: None, Plane, Sphere.


|-
| '''Compute OBB'''<br>''(ComputeOBB)''
|
Compute Object Oriented Bounding boxes (OBB). When active, the result of
this computation is copied into the statistics output. In the case
that the filter is built in its validation mode, the OBBs are
rendered.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
Input to the filter can be a hierarchical box data set containing image
data or a multi-block of rectilinear grids.


|
|
The dataset must contain a cell array.


The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.


|-
| '''Invert Volume Fraction'''<br>''(InvertVolumeFraction)''
|
Inverting the volume fraction generates the negative of the material.
It is useful for analyzing craters.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Material Fraction Threshold'''<br>''(MaterialFractionThreshold)''
|
Material fraction is defined as the normalized amount of material per
voxel. Any voxel in the input data set with a material fraction greater
than this value is included in the output data set.


| 0.5