ParaView/Users Guide/List of filters

From KitwarePublic




==AMR Contour==


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Capping'''<br>''(Capping)''
|
If this property is on, the boundary of the data set is capped.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Isosurface'''<br>''(ContourValue)''
|
This property specifies the values at which to compute the isosurface.
|
|
The value must lie within the range of the selected data array.
|-
| '''Degenerate Cells'''<br>''(DegenerateCells)''
|
If this property is on, a transition mesh between levels is created.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Input'''<br>''(Input)''
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a cell array with 1 component.

The selected dataset must be one of the following types (or a subclass of one of them): vtkCompositeDataSet.
|-
| '''Merge Points'''<br>''(MergePoints)''
|
Use more memory to merge points on the boundaries of blocks.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Multiprocess Communication'''<br>''(MultiprocessCommunication)''
|
If this property is off, each process executes independently.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Contour By'''<br>''(SelectInputScalars)''
|
This property specifies the name of the cell scalar array from which the contour filter will compute isolines and/or isosurfaces.
|
|
An array of scalars is required.
|-
| '''Skip Ghost Copy'''<br>''(SkipGhostCopy)''
|
A simple test to see if ghost values are already set properly.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Triangulate'''<br>''(Triangulate)''
|
Use triangles instead of quads on capping surfaces.
| 1
|
Only the values 0 and 1 are accepted.
|}
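The restriction that the isosurface value lie within the range of the selected array follows from how contouring works: the surface passes through the points where the scalar field crosses the contour value, interpolated along cell edges. A minimal sketch of that edge-crossing test in plain Python (illustrative only, not the ParaView API):

```python
def edge_crossing(p0, p1, v0, v1, iso):
    """Return the interpolated point where the scalar field crosses
    `iso` along the edge (p0, p1), or None if there is no crossing."""
    if (v0 - iso) * (v1 - iso) > 0:  # both endpoint values on the same side
        return None
    if v0 == v1:                     # edge lies exactly on the isosurface
        return p0
    t = (iso - v0) / (v1 - v0)       # linear interpolation parameter
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# An iso value of 5 lies between the endpoint values 0 and 10,
# so the crossing sits halfway along the edge.
print(edge_crossing((0, 0, 0), (2, 0, 0), 0.0, 10.0, 5.0))
```

An iso value outside the array's range never satisfies the sign test on any edge, which is why the filter rejects it up front.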
==AMR Dual Clip==


Clip with scalars.  Tetrahedra.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Degenerate Cells'''<br>''(DegenerateCells)''
|
If this property is on, a transition mesh between levels is created.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Input'''<br>''(Input)''
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a cell array with 1 component.

The selected dataset must be one of the following types (or a subclass of one of them): vtkCompositeDataSet.
|-
| '''Merge Points'''<br>''(MergePoints)''
|
Use more memory to merge points on the boundaries of blocks.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Multiprocess Communication'''<br>''(MultiprocessCommunication)''
|
If this property is off, each process executes independently.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Select Material Arrays'''<br>''(SelectMaterialArrays)''
|
This property specifies the cell arrays from which the clip filter will compute clipped cells.
|
|
An array of scalars is required.
|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|
This property specifies the values at which to compute the isosurface.
| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.
|}
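The Volume Fraction Value is a threshold in [0, 1]: cells whose material volume fraction reaches the threshold are treated as inside the clipped material. A hypothetical sketch of that per-cell classification (plain Python; the function name and the >= convention are assumptions for illustration, not the VTK implementation):

```python
def inside_material(volume_fraction, surface_value=0.1):
    """Classify a cell by comparing its material volume fraction
    against the volume-fraction surface value (default 0.1).
    NOTE: illustrative sketch, not the AMR Dual Clip algorithm itself."""
    if not 0.0 <= surface_value <= 1.0:
        raise ValueError("surface value must lie in [0, 1]")
    return volume_fraction >= surface_value

fractions = [0.0, 0.05, 0.1, 0.6, 1.0]
print([inside_material(f) for f in fractions])
```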
==Annotate Time Filter==


Shows input data time as text annotation in the view.


The Annotate Time filter can be used to show the data time in a text annotation.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Format'''<br>''(Format)''
|
The value of this property is a format string used to display the input time. The format string is specified using printf style.
| Time: %f
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset for which to display the time.
|
|
The selected object must be the result of the following: sources (includes readers), filters.
|-
| '''Scale'''<br>''(Scale)''
|
The factor by which the input time is scaled.
| 1
|
|-
| '''Shift'''<br>''(Shift)''
|
The amount of time the input is shifted (after scaling).
| 0
|
|}
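The Format, Scale, and Shift properties combine as a printf-style string applied to the transformed time. A quick sketch of the resulting annotation text (plain Python; the Shift is applied after scaling, as the Shift description states):

```python
def annotate_time(time, fmt="Time: %f", scale=1.0, shift=0.0):
    """Render the annotation text: scale first, then shift, then format."""
    return fmt % (time * scale + shift)

print(annotate_time(0.25))                                   # default "Time: %f" format
print(annotate_time(0.25, "t = %.2f s", scale=2.0, shift=1.0))
```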




==Append Attributes==


Copies geometry from first input.  Puts all of the arrays into the output.


The Append Attributes filter takes multiple input data sets with the same geometry and merges their point and cell attributes to produce a single output containing all the point and cell attributes of the inputs. Any inputs without the same number of points and cells as the first input are ignored. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Append Attributes filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}
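The merging rule above (keep every attribute array, but ignore inputs whose value counts differ from the first input's) can be sketched as follows, with plain Python dictionaries standing in for attribute arrays (illustrative only, not the VTK API):

```python
def append_attributes(inputs):
    """Merge attribute arrays from inputs whose array length matches the
    first input; arrays with a different number of values are ignored."""
    if not inputs:
        return {}
    reference_size = len(next(iter(inputs[0].values())))  # size set by first input
    merged = {}
    for arrays in inputs:
        for name, values in arrays.items():
            if len(values) == reference_size:
                merged[name] = values
    return merged

a = {"Pressure": [1.0, 2.0, 3.0]}
b = {"Velocity": [0.1, 0.2, 0.3]}
c = {"Temp": [300.0]}          # wrong size: ignored
print(sorted(append_attributes([a, b, c])))
```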
==Append Datasets==


Takes an input of multiple datasets; the output is a single unstructured grid.


The Append Datasets filter operates on multiple data sets of any type (polygonal, structured, etc.). It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output. The input data sets must already be collected together, either as a result of a reader that loads multiple parts (e.g., EnSight reader) or because the Group Parts filter has been run to form a collection of data sets.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the datasets to be merged into a single dataset by the Append Datasets filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}
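Only the arrays common to all inputs survive the append; that intersection rule is easy to sketch (plain Python, with array names standing in for point or cell attributes):

```python
def common_arrays(inputs):
    """Return, sorted, the attribute names shared by every input dataset."""
    names = set(inputs[0])
    for arrays in inputs[1:]:
        names &= set(arrays)       # keep only names present in every input
    return sorted(names)

part1 = {"Pressure": [...], "Velocity": [...]}
part2 = {"Pressure": [...], "Temperature": [...]}
print(common_arrays([part1, part2]))   # only Pressure is common to both
```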
==Append Geometry==


Takes an input of multiple poly data parts; the output has a single part.


The Append Geometry filter operates on multiple polygonal data sets. It merges their geometry into a single data set. Only the point and cell attributes that all of the input data sets have in common will appear in the output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Append Geometry filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|}
==Block Scalars==


The Level Scalars filter uses colors to show levels of a multiblock dataset.


The Level Scalars filter uses colors to show levels of a multiblock dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Level Scalars filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.
|}
==Calculator==


Compute new attribute arrays as functions of existing arrays.


The Calculator filter computes a new data array or new point coordinates as a function of existing scalar or vector arrays. If point-centered arrays are used in the computation of a new data array, the resulting array will also be point-centered. Similarly, computations using cell-centered arrays will produce a new cell-centered array. If the function is computing point coordinates, the result of the function must be a three-component vector. The Calculator interface operates similarly to a scientific calculator. In creating the function to evaluate, the standard order of operations applies.<br>
Each of the calculator functions is described below. Unless otherwise noted, enclose the operand in parentheses using the ( and ) buttons.<br>
Clear: Erase the current function (displayed in the read-only text box above the calculator buttons).<br>
/: Divide one scalar by another. The operands for this function are not required to be enclosed in parentheses.<br>
*: Multiply two scalars, or multiply a vector by a scalar (scalar multiple). The operands for this function are not required to be enclosed in parentheses.<br>
-: Negate a scalar or vector (unary minus), or subtract one scalar or vector from another. The operands for this function are not required to be enclosed in parentheses.<br>
+: Add two scalars or two vectors. The operands for this function are not required to be enclosed in parentheses.<br>
sin: Compute the sine of a scalar.<br>
cos: Compute the cosine of a scalar.<br>
tan: Compute the tangent of a scalar.<br>
asin: Compute the arcsine of a scalar.<br>
acos: Compute the arccosine of a scalar.<br>
atan: Compute the arctangent of a scalar.<br>
sinh: Compute the hyperbolic sine of a scalar.<br>
cosh: Compute the hyperbolic cosine of a scalar.<br>
tanh: Compute the hyperbolic tangent of a scalar.<br>
min: Compute minimum of two scalars.<br>
max: Compute maximum of two scalars.<br>
x^y: Raise one scalar to the power of another scalar. The operands for this function are not required to be enclosed in parentheses.<br>
sqrt: Compute the square root of a scalar.<br>
e^x: Raise e to the power of a scalar.<br>
log: Compute the logarithm of a scalar (deprecated. same as log10).<br>
log10: Compute the logarithm of a scalar to the base 10.<br>
ln: Compute the logarithm of a scalar to the base 'e'.<br>
ceil: Compute the ceiling of a scalar.<br>
floor: Compute the floor of a scalar.<br>
abs: Compute the absolute value of a scalar.<br>
v1.v2: Compute the dot product of two vectors. The operands for this function are not required to be enclosed in parentheses.<br>
cross: Compute cross product of two vectors.<br>
mag: Compute the magnitude of a vector.<br>
norm: Normalize a vector.<br>
The operands are described below.<br>
The digits 0 - 9 and the decimal point are used to enter constant scalar values.<br>
iHat, jHat, and kHat are vector constants representing unit vectors in the X, Y, and Z directions, respectively.<br>
The scalars menu lists the names of the scalar arrays and the components of the vector arrays of either the point-centered or cell-centered data. The vectors menu lists the names of the point-centered or cell-centered vector arrays. The function will be computed for each point (or cell) using the scalar or vector value of the array at that point (or cell).<br>
The filter operates on any type of data set, but the input data set must have at least one scalar or vector array. The arrays can be either point-centered or cell-centered. The Calculator filter's output is of the same data set type as the input.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
This property determines whether the computation is to be performed on point-centered or cell-centered data.
| 0
|
The value must be one of the following: point_data (1), cell_data (2), field_data (5).
|-
| '''Coordinate Results'''<br>''(CoordinateResults)''
|
The value of this property determines whether the results of this computation should be used as point coordinates or as a new array.
| 0
|
Only the values 0 and 1 are accepted.
|-
| '''Function'''<br>''(Function)''
|
This property contains the equation for computing the new array.
|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Calculator filter. The scalar and vector variables may be chosen from this dataset's arrays.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Replace Invalid Results'''<br>''(ReplaceInvalidValues)''
|
This property determines whether invalid values in the computation will be replaced with a specific value. (See the ReplacementValue property.)
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Replacement Value'''<br>''(ReplacementValue)''
|
If invalid values in the computation are to be replaced with another value, this property contains that value.
| 0
|
|-
| '''Result Array Name'''<br>''(ResultArrayName)''
|
This property contains the name for the output array containing the result of this computation.
| Result
|
|}
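As a concrete instance of the behavior described above, a function over a point-centered array produces a point-centered result written under the Result Array Name. Here is a sketch of evaluating mag(V) per point (plain Python, not the Calculator's actual expression engine):

```python
import math

def evaluate_magnitude(vectors, result_name="Result"):
    """Compute mag(V) for each entry of a point-centered vector array;
    the output array is point-centered like its operand."""
    return {result_name: [math.sqrt(x * x + y * y + z * z)
                          for x, y, z in vectors]}

velocity = [(3.0, 4.0, 0.0), (1.0, 2.0, 2.0)]
print(evaluate_magnitude(velocity))    # one scalar per input point
```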
==Cell Centers==


Create a point (no geometry) at the center of each input cell.


The Cell Centers filter places a point at the center of each cell in the input data set. The center computed is the parametric center of the cell, not necessarily the geometric or bounding box center. The cell attributes of the input will be associated with these newly created points of the output. You have the option of creating a vertex cell per point in the output. This is useful because vertex cells are rendered, but points are not. The points themselves could be used for placing glyphs (using the Glyph filter). The Cell Centers filter takes any type of data set as input and produces a polygonal data set as output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Cell Centers filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Vertex Cells'''<br>''(VertexCells)''
|
If set to 1, a vertex cell will be generated per point in the output. Otherwise only points will be generated.
| 0
|
Only the values 0 and 1 are accepted.
|}
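The "parametric center" is evaluated in the cell's own coordinate system; for a linear simplex such as a triangle it coincides with the average of the cell's vertices (an assumption that holds for simplices, not for every cell type). A sketch for a triangle:

```python
def parametric_center_triangle(points):
    """Center of a linear triangle at parametric coordinates (1/3, 1/3):
    for simplices this equals the average of the vertices."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

tri = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 3.0, 0.0)]
print(parametric_center_triangle(tri))
```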


==Cell Data to Point Data==


Create point attributes by averaging cell attributes.


The Cell Data to Point Data filter averages the values of the cell attributes of the cells surrounding a point to compute point attributes. The Cell Data to Point Data filter operates on any type of data set, and the output data set is of the same type as the input.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Cell Data to Point Data filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|-
| '''Pass Cell Data'''<br>''(PassCellData)''
|
If this property is set to 1, then the input cell data is passed through to the output; otherwise, only the generated point data will be available in the output.
| 0
|
Only the values 0 and 1 are accepted.
|}
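The averaging rule above, in which each point receives the mean of the values on the cells that use it, can be sketched like this (plain Python; cells are lists of point indices):

```python
def cell_to_point(num_points, cells, cell_values):
    """Average cell-centered values onto each point over the cells
    that contain that point."""
    sums = [0.0] * num_points
    counts = [0] * num_points
    for cell, value in zip(cells, cell_values):
        for point in cell:
            sums[point] += value
            counts[point] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Two triangles sharing the edge (1, 2); the shared points average both cells.
cells = [[0, 1, 2], [1, 2, 3]]
print(cell_to_point(4, cells, [10.0, 20.0]))
```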
==Clean==


Merge coincident points if they do not meet a feature edge criterion.


The Clean filter takes polygonal data as input and generates polygonal data as output. This filter can merge duplicate points, remove unused points, and transform degenerate cells into their appropriate forms (e.g., a triangle is converted into a line if two of its points are merged).<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Absolute Tolerance'''<br>''(AbsoluteTolerance)''
|
If merging nearby points (see PointMerging property) and using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging in the spatial units of the input data set.
| 1
|
The value must be greater than or equal to 0.
|-
| '''Convert Lines To Points'''<br>''(ConvertLinesToPoints)''
|
If this property is set to 1, degenerate lines (a "line" whose endpoints are at the same spatial location) will be converted to points.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Convert Polys To Lines'''<br>''(ConvertPolysToLines)''
|
If this property is set to 1, degenerate polygons (a "polygon" with only two distinct point coordinates) will be converted to lines.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Convert Strips To Polys'''<br>''(ConvertStripsToPolys)''
|
If this property is set to 1, degenerate triangle strips (a triangle "strip" containing only one triangle) will be converted to triangles.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Input'''<br>''(Input)''
|
Set the input to the Clean filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.
|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If this property is set to 1, the whole data set will be processed at once so that cleaning the data set always produces the same results. If it is set to 0, the data set can be processed one piece at a time, so it is not necessary for the entire data set to fit into memory; however, the results are not guaranteed to be the same as they would be if the Piece Invariant option were on. Setting this option to 0 may produce seams in the output dataset when ParaView is run in parallel.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Point Merging'''<br>''(PointMerging)''
|
If this property is set to 1, points will be merged if they are within the specified Tolerance or AbsoluteTolerance (see the Tolerance and AbsoluteTolerance properties), depending on the value of the ToleranceIsAbsolute property. If this property is set to 0, points will not be merged.
| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Tolerance'''<br>''(Tolerance)''
|
If merging nearby points (see PointMerging property) and not using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging as a fraction of the length of the diagonal of the bounding box of the input data set.
| 0
|
The value must be greater than or equal to 0 and less than or equal to 1.
|-
| '''Tolerance Is Absolute'''<br>''(ToleranceIsAbsolute)''
|
This property determines whether to use absolute or relative (a percentage of the bounding box) tolerance when performing point merging.
| 0
|
Only the values 0 and 1 are accepted.
|}
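The interaction of PointMerging, Tolerance, and ToleranceIsAbsolute amounts to choosing a merge radius, either absolute or a fraction of the bounding-box diagonal, and welding points closer than that radius. A simplified first-match sketch (plain Python; the real filter uses a spatial locator rather than this O(n²) scan):

```python
import math

def merge_points(points, tolerance, absolute=True):
    """Weld points lying within the merge radius of an already-kept point.
    With absolute=False the tolerance is a fraction of the bbox diagonal."""
    if not absolute:
        mins = [min(c) for c in zip(*points)]
        maxs = [max(c) for c in zip(*points)]
        tolerance *= math.dist(mins, maxs)   # scale by bounding-box diagonal
    kept = []
    for p in points:
        if all(math.dist(p, q) > tolerance for q in kept):
            kept.append(p)
    return kept

pts = [(0.0, 0.0, 0.0), (0.0005, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(len(merge_points(pts, tolerance=0.001)))                # first two weld
print(len(merge_points(pts, tolerance=0.0, absolute=False)))  # nothing welds
```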
==Clean to Grid==


This filter merges points and converts the data set to unstructured grid.


The Clean to Grid filter merges points that are exactly coincident. It also converts the data set to an unstructured grid. You may wish to do this if you want to apply a filter to your data set that is available for unstructured grids but not for the initial type of your data set (e.g., applying warp vector to volumetric data). The Clean to Grid filter operates on any type of data set.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Clean to Grid filter.
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}
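Merging only exactly coincident points is simpler than tolerance-based welding: identical coordinates collapse to a single point index and cell connectivity is remapped. A sketch (plain Python, hashing coordinate tuples; valid for exact merging only):

```python
def merge_exact(points, cells):
    """Collapse exactly coincident points and remap cell connectivity,
    sketching what Clean to Grid does before producing the grid."""
    index_of = {}
    unique = []
    remap = []
    for p in points:
        if p not in index_of:          # first time we see these coordinates
            index_of[p] = len(unique)
            unique.append(p)
        remap.append(index_of[p])
    return unique, [[remap[i] for i in cell] for cell in cells]

points = [(0.0, 0.0), (1.0, 0.0), (0.0, 0.0)]   # point 2 duplicates point 0
unique, cells = merge_exact(points, [[0, 1], [1, 2]])
print(len(unique), cells)
```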
==Clip==


Clip with an implicit plane. Clipping does not reduce the dimensionality of the data set. The output data type of this filter is always an unstructured grid.

The Clip filter cuts away a portion of the input data set using an implicit plane. This filter operates on all types of data sets, and it returns unstructured grid data on output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Clip Type'''<br>''(ClipFunction)''
|
This property specifies the parameters of the clip function (an implicit plane) used to clip the dataset.

|
|
The value must be set to one of the following: Plane, Box, Sphere, Scalar.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the dataset on which the Clip filter will operate.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Inside Out'''<br>''(InsideOut)''
|
If this property is set to 0, the clip filter will return that portion of the dataset that lies within the clip function. If set to 1, the portions of the dataset that lie outside the clip function will be returned instead.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Scalars'''<br>''(SelectInputScalars)''
|
If clipping with scalars, this property specifies the name of the scalar array on which to perform the clip operation.

|
|
An array of scalars is required.

Valid array names will be chosen from point and cell data.

|-
| '''Use Value As Offset'''<br>''(UseValueAsOffset)''
|
If UseValueAsOffset is true, Value is used as an offset parameter to the implicit function. Otherwise, Value is used only when clipping using a scalar array.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Value'''<br>''(Value)''
|
If clipping with scalars, this property sets the scalar value about which to clip the dataset based on the scalar array chosen. (See SelectInputScalars.) If clipping with a clip function, this property specifies an offset from the clip function to use in the clipping operation. Neither functionality is currently available in ParaView's user interface.

| 0
|
The value must lie within the range of the selected data array.

|}
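The clip criterion above can be sketched in plain Python (an illustration, not ParaView code; the sign convention for which side counts as "inside" is an assumption of this sketch):

```python
# Sketch of the clip criterion: an implicit plane
# f(p) = dot(normal, p - origin) classifies points, and InsideOut flips
# which side of the plane is kept.

def plane_value(point, origin, normal):
    return sum(n * (p - o) for p, o, n in zip(point, origin, normal))

def keep_point(point, origin, normal, inside_out=0):
    inside = plane_value(point, origin, normal) >= 0.0   # positive side kept
    return not inside if inside_out else inside

origin, normal = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)        # plane x = 0
kept = [keep_point(p, origin, normal) for p in [(1, 0, 0), (-1, 0, 0)]]
```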




==Cell Centers==


Create a point (no geometry) at the center of each input cell.

The Cell Centers filter places a point at the center of each cell in the input data set. The center computed is the parametric center of the cell, not necessarily the geometric or bounding box center. The cell attributes of the input will be associated with these newly created points of the output. You have the option of creating a vertex cell per point in the output. This is useful because vertex cells are rendered, but points are not. The points themselves could be used for placing glyphs (using the Glyph filter). The Cell Centers filter takes any type of data set as input and produces a polygonal data set as output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Cell Centers filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Vertex Cells'''<br>''(VertexCells)''
|
If set to 1, a vertex cell will be generated per point in the output. Otherwise only points will be generated.

| 0
|
Only the values 0 and 1 are accepted.

|}


==Cell Data to Point Data==


Create point attributes by averaging cell attributes.

The Cell Data to Point Data filter averages the values of the cell attributes of the cells surrounding a point to compute point attributes. The Cell Data to Point Data filter operates on any type of data set, and the output data set is of the same type as the input.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Cell Data to Point Data filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Pass Cell Data'''<br>''(PassCellData)''
|
If this property is set to 1, then the input cell data is passed through to the output; otherwise, only the generated point data will be available in the output.

| 0
|
Only the values 0 and 1 are accepted.

|}


==Clip Closed Surface==


Clip a polygonal dataset with a plane to produce closed surfaces.

This clip filter cuts away a portion of the input polygonal dataset using a plane to generate a new polygonal dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Base Color'''<br>''(BaseColor)''
|
Specify the color for the faces from the input.

| 0.1 0.1 1
|
The value must be greater than or equal to (0, 0, 0) and less than or equal to (1, 1, 1).

|-
| '''Clip Color'''<br>''(ClipColor)''
|
Specify the color for the capping faces (generated on the clipping interface).

| 1 0.11 0.1
|
The value must be greater than or equal to (0, 0, 0) and less than or equal to (1, 1, 1).

|-
| '''Clipping Plane'''<br>''(ClippingPlane)''
|
This property specifies the parameters of the clipping plane used to clip the polygonal data.

|
|
The value must be set to one of the following: Plane.

|-
| '''Generate Cell Origins'''<br>''(GenerateColorScalars)''
|
Generate (cell) data for coloring purposes such that the newly generated cells (including capping faces and clipping outlines) can be distinguished from the input cells.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Generate Faces'''<br>''(GenerateFaces)''
|
Generate polygonal faces in the output.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Generate Outline'''<br>''(GenerateOutline)''
|
Generate clipping outlines in the output wherever an input face is cut by the clipping plane.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the dataset on which the Clip filter will operate.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.

|-
| '''Inside Out'''<br>''(InsideOut)''
|
If this flag is turned off, the clipper will return the portion of the data that lies within the clipping plane. Otherwise, the clipper will return the portion of the data that lies outside the clipping plane.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Clipping Tolerance'''<br>''(Tolerance)''
|
Specify the tolerance for creating new points. A small value might incur degenerate triangles.

| 1e-06
|
|}
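Where an input edge crosses the clipping plane, a new point must be created; the following sketch (not the VTK implementation) shows the linear interpolation and the role the Tolerance property plays in avoiding near-degenerate points:

```python
# Sketch (not ParaView internals): a polygon edge crossing the clipping
# plane yields a new point by linear interpolation; a small tolerance snaps
# crossings that land almost exactly on an existing vertex.

def intersect_edge(p0, p1, d0, d1, tol=1e-06):
    """p0, p1: edge endpoints; d0, d1: signed distances to the plane
    (assumed to straddle zero). Snaps to an endpoint within `tol`."""
    t = d0 / (d0 - d1)              # parametric location of the crossing
    if t < tol:
        return p0
    if t > 1.0 - tol:
        return p1
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

new_pt = intersect_edge((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), d0=-1.0, d1=1.0)
```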




==Clean==


Merge coincident points if they do not meet a feature edge criterion.

The Clean filter takes polygonal data as input and generates polygonal data as output. This filter can merge duplicate points, remove unused points, and transform degenerate cells into their appropriate forms (e.g., a triangle is converted into a line if two of its points are merged).<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Absolute Tolerance'''<br>''(AbsoluteTolerance)''
|
If merging nearby points (see PointMerging property) and using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging in the spatial units of the input data set.

| 1
|
The value must be greater than or equal to 0.

|-
| '''Convert Lines To Points'''<br>''(ConvertLinesToPoints)''
|
If this property is set to 1, degenerate lines (a "line" whose endpoints are at the same spatial location) will be converted to points.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Convert Polys To Lines'''<br>''(ConvertPolysToLines)''
|
If this property is set to 1, degenerate polygons (a "polygon" with only two distinct point coordinates) will be converted to lines.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Convert Strips To Polys'''<br>''(ConvertStripsToPolys)''
|
If this property is set to 1, degenerate triangle strips (a triangle "strip" containing only one triangle) will be converted to triangles.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Input'''<br>''(Input)''
|
Set the input to the Clean filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.

|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If this property is set to 1, the whole data set will be processed at once so that cleaning the data set always produces the same results. If it is set to 0, the data set can be processed one piece at a time, so it is not necessary for the entire data set to fit into memory; however the results are not guaranteed to be the same as they would be if the Piece invariant option was on. Setting this option to 0 may produce seams in the output dataset when ParaView is run in parallel.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Point Merging'''<br>''(PointMerging)''
|
If this property is set to 1, then points will be merged if they are within the specified Tolerance or AbsoluteTolerance (see the Tolerance and AbsoluteTolerance properties), depending on the value of the ToleranceIsAbsolute property. If this property is set to 0, points will not be merged.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Tolerance'''<br>''(Tolerance)''
|
If merging nearby points (see PointMerging property) and not using absolute tolerance (see ToleranceIsAbsolute property), this property specifies the tolerance for performing merging as a fraction of the length of the diagonal of the bounding box of the input data set.

| 0
|
The value must be greater than or equal to 0 and less than or equal to 1.

|-
| '''Tolerance Is Absolute'''<br>''(ToleranceIsAbsolute)''
|
This property determines whether to use absolute or relative (a percentage of the bounding box) tolerance when performing point merging.

| 0
|
Only the values 0 and 1 are accepted.

|}


==Compute Derivatives==


This filter computes derivatives of scalars and vectors.

CellDerivatives is a filter that computes derivatives of scalars and vectors at the center of cells. You can choose to generate different output including the scalar gradient (a vector), computed tensor vorticity (a vector), gradient of input vectors (a tensor), and strain matrix of the input vectors (a tensor); or you may choose to pass data through to the output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Output Tensor Type'''<br>''(OutputTensorType)''
|
This property controls how the filter works to generate tensor cell data. You can choose to compute the gradient of the input vectors, or compute the strain tensor of the vector gradient tensor. By default, the filter will take the gradient of the vector data to construct a tensor.

| 1
|
The value must be one of the following: Nothing (0), Vector Gradient (1), Strain (2).

|-
| '''Output Vector Type'''<br>''(OutputVectorType)''
|
This property controls how the filter works to generate vector cell data. You can choose to compute the gradient of the input scalars, or extract the vorticity of the computed vector gradient tensor. By default, the filter will take the gradient of the input scalar data.

| 1
|
The value must be one of the following: Nothing (0), Scalar Gradient (1), Vorticity (2).

|-
| '''Scalars'''<br>''(SelectInputScalars)''
|
This property indicates the name of the scalar array to differentiate.

|
|
An array of scalars is required.

|-
| '''Vectors'''<br>''(SelectInputVectors)''
|
This property indicates the name of the vector array to differentiate.

| 1
|
An array of vectors is required.

|}
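Given the vector gradient tensor at a cell center, the Vorticity and Strain outputs correspond to the curl of the vector field and to the symmetric part of the gradient tensor; a plain-Python sketch (not VTK code):

```python
# Sketch of the tensor/vector outputs: given the vector gradient
# J[i][j] = d(v_i)/d(x_j) at a cell center, "Vorticity" is the curl and
# "Strain" is the symmetric part of J.

def vorticity(J):
    return (J[2][1] - J[1][2],   # dw/dy - dv/dz
            J[0][2] - J[2][0],   # du/dz - dw/dx
            J[1][0] - J[0][1])   # dv/dx - du/dy

def strain(J):
    return [[0.5 * (J[i][j] + J[j][i]) for j in range(3)] for i in range(3)]

# Rigid rotation about z: v = (-y, x, 0), whose curl is (0, 0, 2).
J = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 0.0]]
w = vorticity(J)
S = strain(J)
```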


==Connectivity==


Mark connected components with integer point attribute array.

The Connectivity filter assigns a region id to connected components of the input data set. (The region id is assigned as a point scalar value.) This filter takes any data set type as input and produces unstructured grid output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Color Regions'''<br>''(ColorRegions)''
|
Controls the coloring of the connected regions.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Extraction Mode'''<br>''(ExtractionMode)''
|
Controls the extraction of connected surfaces.

| 5
|
The value must be one of the following: Extract Point Seeded Regions (1), Extract Cell Seeded Regions (2), Extract Specified Regions (3), Extract Largest Region (4), Extract All Regions (5), Extract Closest Point Region (6).

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Connectivity filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|}
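Region ids can be assigned with a breadth-first search over cell adjacency; this plain-Python sketch (not the VTK implementation) labels two connected components the way "Extract All Regions" would:

```python
# Sketch of region labeling: breadth-first search over cell adjacency
# assigns each connected component its own RegionId.

from collections import deque

def region_ids(n_cells, neighbors):
    """neighbors: dict mapping a cell id to a list of adjacent cell ids."""
    label = [-1] * n_cells
    next_id = 0
    for seed in range(n_cells):
        if label[seed] != -1:
            continue                      # already part of a labeled region
        queue = deque([seed])
        label[seed] = next_id
        while queue:
            c = queue.popleft()
            for nb in neighbors.get(c, []):
                if label[nb] == -1:
                    label[nb] = next_id
                    queue.append(nb)
        next_id += 1
    return label

# Two components: cells {0, 1, 2} and cells {3, 4}
adj = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
labels = region_ids(5, adj)
```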


==Contingency Statistics==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.

This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.<br>
This filter computes contingency tables between pairs of attributes.  This result is a tabular bivariate probability distribution which serves as a Bayesian-style prior model.  Data is assessed by computing <br>
*  the probability of observing both variables simultaneously;<br>
*  the probability of each variable conditioned on the other (the two values need not be identical); and<br>
*  the pointwise mutual information (PMI).
<br>
Finally, the summary statistics include the information entropy of the observations.<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
Specify which type of field data the arrays will be drawn from.

| 0
|
Valid array names will be chosen from point and cell data.

|-
| '''Input'''<br>''(Input)''
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point or cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.

|-
| '''Model Input'''<br>''(ModelInput)''
|
A previously-calculated model with which to assess a separate dataset. This input is optional.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.

|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
Choose arrays whose entries will be used to form observations for statistical analysis.

|
|
An array of scalars is required.

|-
| '''Task'''<br>''(Task)''
|
Specify the task to be performed: modeling and/or assessment.
#  "Statistics of all the data," creates an output table (or tables) summarizing the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.

When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.

| 3
|
The value must be one of the following: Statistics of all the data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).

|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.

|}
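The joint and marginal probabilities behind the PMI assessment can be sketched in plain Python (illustration only; the actual filter builds vtkTable outputs):

```python
# Sketch of the model: a contingency table over two variables gives joint
# and marginal probabilities, from which the pointwise mutual information is
# PMI(x, y) = log2( p(x, y) / (p(x) * p(y)) ).

import math
from collections import Counter

def contingency(xs, ys):
    n = len(xs)
    joint = Counter(zip(xs, ys))   # the contingency table (counts)
    px = Counter(xs)
    py = Counter(ys)
    def pmi(x, y):
        pxy = joint[(x, y)] / n
        return math.log2(pxy / ((px[x] / n) * (py[y] / n)))
    return joint, pmi

xs = ["a", "a", "b", "b"]
ys = ["u", "u", "v", "v"]
joint, pmi = contingency(xs, ys)
# x fully determines y here, so PMI("a", "u") = log2(0.5 / 0.25) = 1 bit
```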




==Contour==


Generate isolines or isosurfaces using point scalars.

The Contour filter computes isolines or isosurfaces using a selected point-centered scalar array. The Contour filter operates on any type of data set, but the input is required to have at least one point-centered scalar (single-component) array. The output of this filter is polygonal.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Compute Gradients'''<br>''(ComputeGradients)''
|
If this property is set to 1, a scalar array containing a gradient value at each point in the isosurface or isoline will be created by this filter; otherwise an array of gradients will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0. Note that if ComputeNormals is set to 1, then gradients will have to be calculated, but they will only be stored in the output dataset if ComputeGradients is also set to 1.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Compute Normals'''<br>''(ComputeNormals)''
|
Select whether to compute normals. If this property is set to 1, a scalar array containing a normal value at each point in the isosurface or isoline will be created by the contour filter; otherwise an array of normals will not be computed. This operation is fairly expensive both in terms of computation time and memory required, so if the output dataset produced by the contour filter will be processed by filters that modify the dataset's topology or geometry, it may be wise to set the value of this property to 0.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Compute Scalars'''<br>''(ComputeScalars)''
|
If this property is set to 1, an array of scalars (containing the contour value) will be added to the output dataset. If set to 0, the output will not contain this array.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Isosurfaces'''<br>''(ContourValues)''
|
This property specifies the values at which to compute isosurfaces/isolines and also the number of such values.

|
|
The value must lie within the range of the selected data array.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to be used by the contour filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point or cell array with 1 component.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Point Merge Method'''<br>''(Locator)''
|
This property specifies an incremental point locator for merging duplicate / coincident points.

|
|
The selected object must be the result of the following: incremental_point_locators.

The value must be set to one of the following: MergePoints, IncrementalOctreeMergePoints, NonMergingPointLocator.

|-
| '''Contour By'''<br>''(SelectInputScalars)''
|
This property specifies the name of the scalar array from which the contour filter will compute isolines and/or isosurfaces.

|
|
An array of scalars is required.

Valid array names will be chosen from point and cell data.

|}
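The core contouring step interpolates along cell edges whose scalar values straddle the contour value; a plain-Python sketch (not the VTK implementation):

```python
# Sketch of the core contouring step: along each cell edge whose endpoint
# scalars straddle the contour value, the crossing point is found by
# linear interpolation.

def edge_crossing(p0, p1, s0, s1, value):
    """Return the interpolated point where the scalar equals `value`,
    or None if the edge is not crossed."""
    if (s0 - value) * (s1 - value) > 0.0:
        return None                     # both endpoints on the same side
    t = (value - s0) / (s1 - s0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

pt = edge_crossing((0.0, 0.0, 0.0), (4.0, 0.0, 0.0), s0=0.0, s1=2.0, value=0.5)
```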




==Cosmology FOF Halo Finder==


Sorry, no help is currently available.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''bb (linking length/distance)'''<br>''(BB)''
|
Linking length measured in units of interparticle spacing and is dimensionless.  Used to link particles into halos for the friends-of-friends algorithm.

| 0.2
|
The value must be greater than or equal to 0.

|-
| '''Compute the most bound particle for halos'''<br>''(ComputeMostBoundParticle)''
|
If checked, the most bound particle will be calculated.  This can be very slow.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Compute the most connected particle for halos'''<br>''(ComputeMostConnectedParticle)''
|
If checked, the most connected particle will be calculated.  This can be very slow.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Copy halo catalog information to original particles'''<br>''(CopyHaloDataToParticles)''
|
If checked, the halo catalog information will be copied to the original particles as well.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Halo position for 3D visualization'''<br>''(HaloPositionType)''
|
This sets the position for the halo catalog particles (second output) in 3D space for visualization.  Input particle positions (first output) will be unaltered by this.  MBP and MCP for particle positions can potentially take a very long time to calculate.

| 0
|
The value must be one of the following: Average (0), Center of Mass (1), Most Bound Particle (2), Most Connected Particle (3).

|-
| '''Input'''<br>''(Input)''
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.

|-
| '''np (number of seeded particles in one dimension, i.e., total particles = np^3)'''<br>''(NP)''
|
Number of seeded particles in one dimension.  Therefore, total simulation particles is np^3 (cubed).

| 256
|
The value must be greater than or equal to 0.

|-
| '''overlap (shared point/ghost cell gap distance)'''<br>''(Overlap)''
|
The space in rL units to extend processor particle ownership for ghost particles/cells.  Needed for correct halo calculation when halos cross processor boundaries in parallel computation.

| 5
|
The value must be greater than or equal to 0.

|-
| '''pmin (minimum particle threshold for a halo)'''<br>''(PMin)''
|
Minimum number of particles (threshold) needed before a group is called a halo.

| 10
|
The value must be greater than or equal to 1.

|-
| '''rL (physical box side length)'''<br>''(RL)''
|
The box side length used to wrap particles around if they exceed rL (or are less than 0) in any dimension (only positive positions are allowed in the input, or they are wrapped around).

| 90.1408
|
The value must be greater than or equal to 0.

|}
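The friends-of-friends linking controlled by bb and pmin can be sketched with union-find in plain Python (an O(n^2) illustration, not the parallel ParaView implementation, which also handles periodic wrapping and ghost zones):

```python
# Sketch of friends-of-friends linking: particles closer than the linking
# length `bb` are joined with union-find, and groups with at least `pmin`
# members are reported as halos.

import itertools
import math

def fof_halos(points, bb, pmin):
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i, j in itertools.combinations(range(len(points)), 2):
        if math.dist(points[i], points[j]) <= bb:
            parent[find(i)] = find(j)       # union the two groups
    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), []).append(i)
    return [sorted(g) for g in groups.values() if len(g) >= pmin]

# Two tight pairs far apart plus one isolated particle; pmin=2 keeps the pairs.
pts = [(0, 0, 0), (0.1, 0, 0), (5, 5, 5), (5, 5.1, 5), (9, 9, 9)]
halos = sorted(fof_halos(pts, bb=0.2, pmin=2))
```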


==Curvature==


This filter will compute the Gaussian or mean curvature of the mesh at each point.

The Curvature filter computes the curvature at each point in a polygonal data set. This filter supports both Gaussian and mean curvatures; the type can be selected from the Curvature type menu button.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Curvature Type'''<br>''(CurvatureType)''
|
This property specifies which type of curvature to compute.

| 0
|
The value must be one of the following: Gaussian (0), Mean (1).

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Curvature filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.

|-
| '''Invert Mean Curvature'''<br>''(InvertMeanCurvature)''
|
If this property is set to 1, the mean curvature calculation will be inverted. This is useful for meshes with inward-pointing normals.

| 0
|
Only the values 0 and 1 are accepted.

|}
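Discrete Gaussian curvature at a mesh vertex is commonly estimated from the angle deficit; this sketch illustrates the standard approximation (not necessarily VTK's exact formula, which also normalizes by the surrounding area) at a cube corner:

```python
# Sketch of discrete Gaussian curvature: at a mesh vertex it can be
# estimated from the angle deficit, 2*pi minus the sum of the angles of the
# incident triangles at that vertex.

import math

def angle_at(v, a, b):
    """Angle at vertex v in triangle (v, a, b)."""
    ua = [x - y for x, y in zip(a, v)]
    ub = [x - y for x, y in zip(b, v)]
    dot = sum(p * q for p, q in zip(ua, ub))
    na = math.sqrt(sum(p * p for p in ua))
    nb = math.sqrt(sum(p * p for p in ub))
    return math.acos(dot / (na * nb))

def angle_deficit(v, incident_triangles):
    return 2.0 * math.pi - sum(angle_at(v, a, b) for a, b in incident_triangles)

# Corner of a cube: three right angles meet, leaving a deficit of pi/2.
v = (0.0, 0.0, 0.0)
tris = [((1, 0, 0), (0, 1, 0)), ((0, 1, 0), (0, 0, 1)), ((0, 0, 1), (1, 0, 0))]
deficit = angle_deficit(v, tris)
```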




==D3==
| '''Generate Faces'''<br>''(GenerateFaces)'


Generate polygonal faces in the output


|
Repartition a data set into load-balanced spatially convex regions.  Create ghost cells if requested.


Only the values 0 and 1 are accepted
The D3 filter is available when ParaView is run in parallel. It operates on any type of data set to evenly divide it across the processors into spatially contiguous regions. The output of this filter is of type unstructured grid.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Boundary Mode'''<br>''(BoundaryMode)''
|
This property determines how cells that lie on processor boundaries are handled. The "Assign cells uniquely" option assigns each boundary cell to exactly one process, which is useful for isosurfacing. Selecting "Duplicate cells" causes the cells on the boundaries to be copied to each process that shares that boundary. The "Divide cells" option breaks cells across process boundary lines so that pieces of the cell lie in different processes. This option is useful for volume rendering.


| 0
|
|
| '''Generate Outline'''<br>''(GenerateOutline)'
The value must be one of the following: Assign cells uniquely (0), Duplicate cells (1), Divide cells (2).


Generate clipping outlines in the output wherever an input face is cut by the clipping plane
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the D3 filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Minimal Memory'''<br>''(UseMinimalMemory)''
|
If this property is set to 1, the D3 filter's communication routines will use less memory than they would without this restriction.


| 0
|
Only the values 0 and 1 are accepted.

|}




==Decimate==




Simplify a polygonal model using an adaptive edge collapse algorithm.  This filter works with triangles only.


The Decimate filter reduces the number of triangles in a polygonal data set. Because this filter only operates on triangles, first run the Triangulate filter on a dataset that contains polygons other than triangles.<br>
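To see what the reduction target means numerically, here is a small, hypothetical helper (not part of the ParaView API) relating the TargetReduction fraction to the surviving triangle count:

```python
def expected_triangle_count(input_triangles, target_reduction):
    """Triangles remaining if the filter exactly meets its target:
    TargetReduction is the fraction removed, not the fraction kept."""
    return round(input_triangles * (1.0 - target_reduction))

# The default TargetReduction of 0.9 asks for an output ~10% the input size.
print(expected_triangle_count(10000, 0.9))  # -> 1000
```

In practice the filter may stop short of this count when constraints such as PreserveTopology or BoundaryVertexDeletion prevent further collapses.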


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Boundary Vertex Deletion'''<br>''(BoundaryVertexDeletion)''
|
If this property is set to 1, then vertices on the boundary of the dataset can be removed. Setting the value of this property to 0 preserves the boundary of the dataset, but it may cause the filter not to reach its reduction target.

| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|
The value of this property is used in determining where the data set may be split. If the angle between two adjacent triangles is greater than or equal to the FeatureAngle value, then their boundary is considered a feature edge where the dataset can be split.


| 15
|
The value must be greater than or equal to 0 and less than or equal to 180.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Decimate filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
If this property is set to 1, decimation will not split the dataset or produce holes, but it may keep the filter from reaching the reduction target. If it is set to 0, better reduction can occur (reaching the reduction target), but holes in the model may be produced.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Target Reduction'''<br>''(TargetReduction)''
|
This property specifies the desired reduction in the total number of polygons in the output dataset. For example, if the TargetReduction value is 0.9, the Decimate filter will attempt to produce an output dataset that is 10% the size of the input.


| 0.9
|
The value must be greater than or equal to 0 and less than or equal to 1.


|}


==Delaunay 2D==




Create 2D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkPolyData as output. The points are expected to be in a mostly planar distribution.


Delaunay2D is a filter that constructs a 2D Delaunay triangulation from a list of input points. These points may be represented by any dataset of type vtkPointSet and subclasses. The output of the filter is a polygonal dataset containing a triangle mesh.<br><br><br>
The 2D Delaunay triangulation is defined as the triangulation that satisfies the Delaunay criterion for n-dimensional simplexes (in this case n=2 and the simplexes are triangles). This criterion states that a circumsphere of each simplex in a triangulation contains only the n+1 defining points of the simplex. In two dimensions, this translates into an optimal triangulation. That is, the maximum interior angle of any triangle is less than or equal to that of any possible triangulation.<br><br><br>
Delaunay triangulations are used to build topological structures from unorganized (or unstructured) points. The input to this filter is a list of points specified in 3D, even though the triangulation is 2D. Thus the triangulation is constructed in the x-y plane, and the z coordinate is ignored (although carried through to the output). You can use the ProjectionPlaneMode option to compute the best-fitting plane to the set of points, project the points onto that plane, and then perform the triangulation using their projected positions.<br><br><br>
The Delaunay triangulation can be numerically sensitive in some cases. To prevent problems, try to avoid injecting points that will result in triangles with bad aspect ratios (1000:1 or greater). In practice this means inserting points that are "widely dispersed", and enables smooth transition of triangle sizes throughout the mesh. (You may even want to add extra points to create a better point distribution.) If numerical problems are present, you will see a warning message to this effect at the end of the triangulation process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can be triangulated in more than one way (at least according to the Delaunay criterion). The choice of triangulation (as implemented by this algorithm) depends on the order of the input points. The first three points will form a triangle; other degenerate points will not break this triangle.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the algorithm. This is because the Delaunay triangulation requires unique input points. The output of the Delaunay triangulation is supposedly a convex hull. In certain cases this implementation may not generate the convex hull.<br>
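The Delaunay criterion described above can be checked with the standard determinant (in-circumcircle) predicate. The sketch below is a plain-Python illustration of that test, not ParaView's implementation:

```python
# Empty-circumcircle test: a triangulation is Delaunay when no point lies
# strictly inside the circumcircle of any triangle.

def in_circumcircle(a, b, c, d):
    """True if d lies strictly inside the circumcircle of ccw triangle abc."""
    rows = []
    for px, py in (a, b, c):
        dx, dy = px - d[0], py - d[1]
        rows.append((dx, dy, dx * dx + dy * dy))
    # 3x3 determinant of the lifted (paraboloid) coordinates; positive
    # for a counter-clockwise triangle means d is inside the circle.
    det = (rows[0][0] * (rows[1][1] * rows[2][2] - rows[1][2] * rows[2][1])
           - rows[0][1] * (rows[1][0] * rows[2][2] - rows[1][2] * rows[2][0])
           + rows[0][2] * (rows[1][0] * rows[2][1] - rows[1][1] * rows[2][0]))
    return det > 0

a, b, c = (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)
print(in_circumcircle(a, b, c, (0.5, 0.5)))   # True: inside the circumcircle
print(in_circumcircle(a, b, c, (2.0, 2.0)))   # False: well outside
```

This determinant is also where the numerical sensitivity mentioned above comes from: for nearly degenerate (e.g., lattice-aligned) point sets the value is close to zero, and floating-point round-off can flip its sign.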


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Alpha'''<br>''(Alpha)''
|
The value of this property controls the output of this filter. For a non-zero alpha value, only edges or triangles contained within a sphere centered at mesh vertices will be output. Otherwise, only triangles will be output.

| 0
|
The value must be greater than or equal to 0.


|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|
If this property is set to 1, bounding triangulation points (and associated triangles) are included in the output. These are introduced as an initial triangulation to begin the triangulation process. This feature is nice for debugging output.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Delaunay 2D filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.



The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.


|-
| '''Offset'''<br>''(Offset)''
|
This property is a multiplier to control the size of the initial, bounding Delaunay triangulation.

| 1
|
The value must be greater than or equal to 0.75.


|-
| '''Projection Plane Mode'''<br>''(ProjectionPlaneMode)''
|
This property determines the type of projection plane to use in performing the triangulation.


| 0
|
The value must be one of the following: XY Plane (0), Best-Fitting Plane (2).


|-
| '''Tolerance'''<br>''(Tolerance)''
|
This property specifies a tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.


| 1e-05
|
The value must be greater than or equal to 0 and less than or equal to 1.

|}




==Delaunay 3D==




Create a 3D Delaunay triangulation of input points. It expects a vtkPointSet as input and produces vtkUnstructuredGrid as output.


Delaunay3D is a filter that constructs a 3D Delaunay triangulation<br>
from a list of input points. These points may be represented by any<br>
dataset of type vtkPointSet and subclasses. The output of the filter<br>
is an unstructured grid dataset. Usually the output is a tetrahedral<br>
mesh, but if a non-zero alpha distance value is specified (called<br>
the "alpha" value), then only tetrahedra, triangles, edges, and<br>
vertices lying within the alpha radius are output. In other words,<br>
non-zero alpha values may result in arbitrary combinations of<br>
tetrahedra, triangles, lines, and vertices. (The notion of alpha<br>
value is derived from Edelsbrunner's work on "alpha shapes".)<br><br><br>
The 3D Delaunay triangulation is defined as the triangulation that<br>
satisfies the Delaunay criterion for n-dimensional simplexes (in<br>
this case n=3 and the simplexes are tetrahedra). This criterion<br>
states that a circumsphere of each simplex in a triangulation<br>
contains only the n+1 defining points of the simplex. (See text for<br>
more information.) While in two dimensions this translates into an<br>
"optimal" triangulation, this is not true in 3D, since a measurement<br>
for optimality in 3D is not agreed on.<br><br><br>
Delaunay triangulations are used to build topological structures<br>
from unorganized (or unstructured) points. The input to this filter<br>
is a list of points specified in 3D. (If you wish to create 2D<br>
triangulations see Delaunay2D.) The output is an unstructured<br>
grid.<br><br><br>
The Delaunay triangulation can be numerically sensitive. To prevent<br>
problems, try to avoid injecting points that will result in<br>
triangles with bad aspect ratios (1000:1 or greater). In practice<br>
this means inserting points that are "widely dispersed", and enables<br>
smooth transition of triangle sizes throughout the mesh. (You may<br>
even want to add extra points to create a better point<br>
distribution.) If numerical problems are present, you will see a<br>
warning message to this effect at the end of the triangulation<br>
process.<br><br><br>
Warning:<br>
Points arranged on a regular lattice (termed degenerate cases) can<br>
be triangulated in more than one way (at least according to the<br>
Delaunay criterion). The choice of triangulation (as implemented by<br>
this algorithm) depends on the order of the input points. The first<br>
four points will form a tetrahedron; other degenerate points<br>
(relative to this initial tetrahedron) will not break it.<br><br><br>
Points that are coincident (or nearly so) may be discarded by the<br>
algorithm. This is because the Delaunay triangulation requires<br>
unique input points. You can control the definition of coincidence<br>
with the "Tolerance" instance variable.<br><br><br>
The output of the Delaunay triangulation is supposedly a convex<br>
hull. In certain cases this implementation may not generate the<br>
convex hull. This behavior can be controlled by the Offset instance<br>
variable. Offset is a multiplier used to control the size of the<br>
initial triangulation. The larger the offset value, the more likely<br>
you will generate a convex hull; and the more likely you are to see<br>
numerical problems.<br><br><br>
The implementation of this algorithm varies from the 2D Delaunay<br>
algorithm (i.e., Delaunay2D) in an important way. When points are<br>
injected into the triangulation, the search for the enclosing<br>
tetrahedron is quite different. In the 3D case, the closest<br>
previously inserted point is found, and then the connected<br>
tetrahedra are searched to find the containing one. (In 2D, a "walk"<br>
towards the enclosing triangle is performed.) If the triangulation<br>
is Delaunay, then an enclosing tetrahedron will be found. However,<br>
in degenerate cases an enclosing tetrahedron may not be found and<br>
the point will be rejected.<br>
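The role of the Tolerance property (coincident-point discarding, measured as a fraction of the bounding-box diagonal) can be illustrated with a naive sketch. ParaView's point locator is far more efficient; this is only a plain-Python illustration of the rule:

```python
import math

def merge_coincident(points, tolerance):
    """Drop points closer than tolerance * (bounding-box diagonal) to an
    already-kept point, mimicking how a coincidence tolerance is applied."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    cutoff = tolerance * math.dist(mins, maxs)   # tolerance scaled by diagonal
    kept = []
    for p in points:
        if all(math.dist(p, q) > cutoff for q in kept):
            kept.append(p)
    return kept

# With the Delaunay 3D default tolerance of 0.001, the near-duplicate of
# the origin is treated as coincident and discarded.
points = [(0.0, 0.0, 0.0), (1e-6, 0.0, 0.0), (1.0, 1.0, 1.0), (0.0, 1.0, 0.0)]
print(len(merge_coincident(points, 0.001)))  # -> 3
```

Because the cutoff scales with the bounding-box diagonal, the same tolerance value behaves consistently regardless of the dataset's absolute size.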


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Alpha'''<br>''(Alpha)''
|
This property specifies the alpha (or distance) value to control
the output of this filter.  For a non-zero alpha value, only
edges, faces, or tetra contained within the circumsphere (of
radius alpha) will be output.  Otherwise, only tetrahedra will be
output.


| 0
|
The value must be greater than or equal to 0.


|-
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)''
|
This boolean controls whether bounding triangulation points (and
associated triangles) are included in the output. (These are
introduced as an initial triangulation to begin the triangulation
process. This feature is nice for debugging output.)

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Delaunay 3D filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.



The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.


|-
| '''Offset'''<br>''(Offset)''
|
This property specifies a multiplier to control the size of the
initial, bounding Delaunay triangulation.


| 2.5
|
The value must be greater than or equal to 2.5.

|-
 
| '''Tolerance'''<br>''(Tolerance)''
|
This property specifies a tolerance to control discarding of
closely spaced points. This tolerance is specified as a fraction
of the diagonal length of the bounding box of the points.


| 0.001
|
The value must be greater than or equal to 0 and less than or equal to 1.

|}




==Descriptive Statistics==




Compute a statistical model of a dataset and/or assess the dataset with a statistical model.


This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.
<br>
This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.


<br>
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br>
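The assessment step above amounts to a z-score computation, which can be sketched directly in plain Python (illustration only, not ParaView code):

```python
import statistics

def assess(values):
    """Detrend by the mean and scale by the standard deviation, so each
    entry is the (signed) number of standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)       # population standard deviation
    return [(v - mean) / stdev for v in values]

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]   # mean 5.0, stdev 2.0
print(assess(data))  # -> [-1.5, -0.5, -0.5, -0.5, 0.0, 0.0, 1.0, 2.0]
```

With the ''Deviations should be'' property set to Unsigned, the filter reports the absolute value of these quantities instead.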




{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
Specify which type of field data the arrays will be drawn from.


| 0
|
Valid array names will be chosen from point and cell data.


|-
| '''Input'''<br>''(Input)''
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The dataset must contain a point or cell array.



The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.


|-
| '''Model Input'''<br>''(ModelInput)''
|
A previously-calculated model with which to assess a separate dataset. This input is optional.


|
|
The selected object must be the result of the following: sources (includes readers), filters.



The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.


|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
Choose arrays whose entries will be used to form observations for statistical analysis.


|
|
An array of scalars is required.


|-
| '''Deviations should be'''<br>''(SignedDeviations)''
|
Should the assessed values be signed deviations or unsigned?


| 0
|
The value must be one of the following: Unsigned (0), Signed (1).


|-
| '''Task'''<br>''(Task)''
|
Specify the task to be performed: modeling and/or assessment.
#  "Statistics of all the data," creates an output table (or tables) summarizing the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training. You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting.  The ''Training fraction'' setting will be ignored for tasks 1 and 3.


| 3
|
The value must be one of the following: Statistics of all the data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).

|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.


| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.


|}




==Elevation==




Create point attribute array by projecting points onto an elevation vector.


The Elevation filter generates point scalar values for an input dataset along a specified direction vector.<br><br><br>
The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.<br>
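The per-point computation described above (project the point onto the Low Point-High Point line, then map the parametric position into the scalar range) can be sketched as follows. This is a plain-Python illustration of the formula, not the filter itself:

```python
def elevation_scalar(point, low, high, scalar_range=(0.0, 1.0)):
    """Project point onto the low-high line; map the clamped parametric
    position into the scalar range."""
    direction = [h - l for h, l in zip(high, low)]
    length2 = sum(d * d for d in direction)
    # Parametric position of the projection: 0 at Low Point, 1 at High Point.
    t = sum((p - l) * d for p, l, d in zip(point, low, direction)) / length2
    t = min(1.0, max(0.0, t))               # clamp to the segment
    lo, hi = scalar_range
    return lo + t * (hi - lo)

# With the default Low Point (0,0,0) and High Point (0,0,1), the scalar
# is simply the clamped z coordinate of each point.
print(elevation_scalar((0.3, 0.7, 0.25), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # -> 0.25
```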


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''High Point'''<br>''(HighPoint)''
|
This property defines the other end of the direction vector (large scalar values).


| 0 0 1
|
The coordinate must lie within the bounding box of the dataset. It will default to the maximum in each dimension.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Elevation filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Low Point'''<br>''(LowPoint)''
|
This property defines one end of the direction vector (small scalar values).
| 0 0 0
|
The coordinate must lie within the bounding box of the dataset. It will default to the minimum in each dimension.


|-
| '''Scalar Range'''<br>''(ScalarRange)''
|
This property determines the range into which scalars will be mapped.


| 0 1
|
|}


==Extract AMR Blocks==




This filter extracts a list of datasets from hierarchical datasets.


This filter extracts a list of datasets from hierarchical datasets.<br>

{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Datasets filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.


|-
| '''Selected Data Sets'''<br>''(SelectedDataSets)''
|
This property provides a list of datasets to extract.


|
|
|}




==Extract Block==




This filter extracts a range of blocks from a multiblock dataset.


This filter extracts a range of groups from a multiblock dataset.<br>
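The block ids used by this filter are flat composite indices. As an assumption based on VTK's composite-data convention (vtkExtractBlock), they are assigned depth-first with the root as index 0; the sketch below illustrates that numbering using nested lists as a hypothetical stand-in for a multiblock tree:

```python
# Illustration of flat composite indexing: every node (including interior
# multiblock nodes) gets the next id in depth-first order, root first.

def flat_indices(node, start=0, path=()):
    """Return [(flat_index, path)] pairs for a nested-list block tree."""
    entries = [(start, path)]
    next_index = start + 1
    if isinstance(node, list):                 # an interior (multiblock) node
        for child_number, child in enumerate(node):
            child_entries = flat_indices(child, next_index, path + (child_number,))
            entries.extend(child_entries)
            next_index = child_entries[-1][0] + 1
    return entries

# Root with two children; the first child is itself a group of two leaves.
for flat, path in flat_indices([["a", "b"], "c"]):
    print(flat, path)   # 0 (), 1 (0,), 2 (0, 0), 3 (0, 1), 4 (1,)
```

Selecting an interior node's id implicitly selects everything beneath it, which is why extracting a "group" pulls in all of its leaf datasets.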


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Block Indices'''<br>''(BlockIndices)''
|
This property lists the ids of the blocks to extract
from the input multiblock dataset.


|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Group filter.


|
|
The selected object must be the result of the following: sources (includes readers), filters.



The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.


|-
| '''Maintain Structure'''<br>''(MaintainStructure)''
|
This is used only when PruneOutput is ON. By default, when pruning the
output (i.e., removing empty blocks), if a node has only one non-null child
block, then that node is removed. To preserve these parent nodes, set
this flag to true.


| 0
|
Only the values 0 and 1 are accepted.

|-
 
| '''Prune Output'''<br>''(PruneOutput)''
|
When set, the output multiblock dataset will be pruned to remove empty
nodes. On by default.


| 1
|
Only the values 0 and 1 are accepted.

|}




==Extract CTH Parts==




Create a surface from a CTH volume fraction.


Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.<br>
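The contouring step places the surface where the point-centered volume fraction crosses the contour value, by linear interpolation along cell edges. A minimal plain-Python sketch of that interpolation (illustration only, not ParaView code):

```python
def crossing_parameter(f0, f1, value=0.5):
    """Parametric position (0 at end 0, 1 at end 1) of the iso-crossing
    along an edge whose end values are f0 and f1, or None if the edge
    does not straddle the value."""
    if (f0 - value) * (f1 - value) > 0:
        return None                         # both ends on the same side
    if f0 == f1:
        return None                         # degenerate edge, no unique crossing
    return (value - f0) / (f1 - f0)

print(crossing_parameter(0.25, 0.75))       # -> 0.5 (midpoint of the edge)
print(crossing_parameter(0.0, 1.0, 0.1))    # -> 0.1
print(crossing_parameter(0.6, 0.9))         # -> None (no crossing at 0.5)
```

The Volume Fraction Value property plays the role of `value` here, moving the extracted surface toward or away from the material.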


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Double Volume Arrays'''<br>''(AddDoubleVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


| �
|
An array of scalars is required.


|-
| '''Float Volume Arrays'''<br>''(AddFloatVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


| �
|
An array of scalars is required.


|-
| '''Unsigned Character Volume Arrays'''<br>''(AddUnsignedCharVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.


| �
|
An array of scalars is required.


|-
| '''Clip Type'''<br>''(ClipPlane)''
|
This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.


| �
|
The value must be set to one of the following: None, Plane, Box, Sphere.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract CTH Parts filter.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The dataset must contain a cell array with 1 component.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|
The value of this property is the volume fraction value for the surface.


| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.


|}
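The core of what Extract CTH Parts does per array — averaging cell-centered volume fractions to the points, then contouring at the Volume Fraction Value — can be illustrated on a 1D strip of cells. A minimal sketch, not ParaView's implementation; the strip layout and function names are illustrative:

```python
def cell_to_point(cell_vals):
    """Average cell-centered values to the shared points of a 1D strip
    of cells; the two endpoints take the single adjacent cell's value."""
    pts = [cell_vals[0]]
    for i in range(len(cell_vals) - 1):
        pts.append(0.5 * (cell_vals[i] + cell_vals[i + 1]))
    pts.append(cell_vals[-1])
    return pts


def crossings(point_vals, value=0.5):
    """Parametric positions along the strip where the point-centered
    field crosses `value`, found by linear interpolation."""
    out = []
    for i in range(len(point_vals) - 1):
        a, b = point_vals[i], point_vals[i + 1]
        if (a - value) * (b - value) < 0:      # sign change => crossing
            out.append(i + (value - a) / (b - a))
    return out
```

For example, cell fractions `[1.0, 0.8, 0.4, 0.2]` give point values `[1.0, 0.9, 0.6, 0.3, 0.2]` and a single 0.5-crossing between the second and third points.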




==Extract Cells By Region==




This filter extracts cells that are inside/outside a region or at a region boundary.


This filter extracts from its input dataset all cells that are either completely inside or outside of a specified region (implicit function). On output, the filter generates an unstructured grid.<br>
To use this filter you must specify a region (implicit function). You must also specify whether to extract cells lying inside or outside of the region. An option exists to extract cells that are neither inside nor outside (i.e., boundary).<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Extract intersected'''<br>''(Extract intersected)''
|
This parameter controls whether to extract cells that are on the boundary of the region.


| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Extract only intersected'''<br>''(Extract only intersected)''
|
This parameter controls whether to extract only cells that are on the boundary of the region. If this parameter is set, the Extraction Side parameter is ignored. If Extract Intersected is off, this parameter has no effect.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Extraction Side'''<br>''(ExtractInside)''
|
This parameter controls whether to extract cells that are inside or outside the region.


| 1
|
The value must be one of the following: outside (0), inside (1).


|-
| '''Intersect With'''<br>''(ImplicitFunction)''
|
This property sets the region used to extract cells.


| �
|
The value must be set to one of the following: Plane, Box, Sphere.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Cells By Region filter.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|}
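The inside/outside/boundary classification against an implicit function can be sketched in a few lines of Python. This is a conceptual model under simplifying assumptions (cells as tuples of 3D points, a plane through the origin), not ParaView's implementation:

```python
def plane(point, origin=(0.0, 0.0, 0.0), normal=(1.0, 0.0, 0.0)):
    """Implicit plane function: negative on the 'inside' half-space,
    positive on the 'outside'."""
    return sum((p - o) * n for p, o, n in zip(point, origin, normal))


def extract_cells(cells, side="inside", extract_intersected=False,
                  only_intersected=False, func=plane):
    """Classify each cell by the signs of the implicit function at its
    points, mirroring the Extraction Side / Extract intersected options."""
    kept = []
    for cell in cells:
        signs = [func(p) for p in cell]
        if all(s <= 0 for s in signs):
            kind = "inside"
        elif all(s >= 0 for s in signs):
            kind = "outside"
        else:
            kind = "boundary"              # cell straddles the surface
        if only_intersected:
            keep = kind == "boundary"
        else:
            keep = kind == side or (extract_intersected and kind == "boundary")
        if keep:
            kept.append(cell)
    return kept
```

A cell whose points give mixed signs straddles the region boundary; it is kept only when Extract intersected (or Extract only intersected) is enabled.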


==Extract Edges==


Extract edges of 2D and 3D cells as lines.


The Extract Edges filter produces a wireframe version of the input dataset by extracting all the edges of the dataset's cells as lines. This filter operates on any type of data set and produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Edges filter.
 


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|}
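The wireframe extraction the filter performs amounts to collecting each cell's edges exactly once. A minimal sketch for a triangle mesh, assuming cells are given as tuples of point ids (not ParaView's actual data structures):

```python
def extract_edges(triangles):
    """Collect the unique edges of a triangle mesh as sorted point-id
    pairs; the result is the wireframe the filter would output as lines."""
    edges = set()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            edges.add(tuple(sorted(e)))    # undirected edge, deduplicated
    return sorted(edges)
```

Two triangles sharing an edge, `(0, 1, 2)` and `(1, 2, 3)`, yield five unique edges rather than six, since the shared edge `(1, 2)` is emitted once.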


==Extract Level==




This filter extracts a range of groups from a hierarchical dataset.


This filter extracts a range of levels from a hierarchical dataset.<br>
{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Level filter.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.
|-
| '''Levels'''<br>''(Levels)''
|
This property lists the levels to extract
from the input hierarchical dataset.


| �
| �
|}


==Extract Selection==




Extract different type of selections.


This filter extracts a set of cells/points given a selection.<br>
The selection can be obtained from a rubber-band selection<br>
(either cell, visible or in a frustum) or threshold selection<br>
and passed to the filter or specified by providing an ID list.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input from which the selection is extracted.
 


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable.


|-
| '''Preserve Topology'''<br>''(PreserveTopology)''
|
If this property is set to 1 the output preserves the topology of its
input and adds an insidedness array to mark which cells are inside or
out. If 0 then the output is an unstructured grid which contains only
the subset of cells that are inside.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.


|-
| '''Show Bounds'''<br>''(ShowBounds)''
|
For frustum selection, if this property is set to 1 the output is the
outline of the frustum instead of the contents of the input that lie
within the frustum.


| 0
|
Only the values 0 and 1 are accepted.


|}
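The two output modes controlled by Preserve Topology can be sketched in plain Python. An illustrative model only (cells as a flat list indexed by id, selection as a set of cell ids), not ParaView's implementation:

```python
def extract_selection(cells, selected_ids, preserve_topology=False):
    """Mimic the PreserveTopology option: either return only the
    selected cells, or return all cells plus an 'insidedness' mask
    marking which cells are in the selection."""
    if preserve_topology:
        insidedness = [1 if i in selected_ids else 0
                       for i in range(len(cells))]
        return cells, insidedness
    return [c for i, c in enumerate(cells) if i in selected_ids]
```

With Preserve Topology off the output shrinks to the selected subset; with it on the full dataset passes through, annotated with the insidedness array.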




==Extract Subset==




Extract a subgrid from a structured grid with the option of setting subsample strides.


The Extract Grid filter returns a subgrid of a structured input data set (uniform rectilinear, curvilinear, or nonuniform rectilinear). The output data set type of this filter is the same as the input type.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Include Boundary'''<br>''(IncludeBoundary)''
|
If the value of this property is 1, then if the sample rate in any dimension is greater than 1, the boundary indices of the input dataset will be passed to the output even if the boundary extent is not an even multiple of the sample rate in a given dimension.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Grid filter.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkRectilinearGrid, vtkStructuredPoints, vtkStructuredGrid.


|-
| '''Sample Rate I'''<br>''(SampleRateI)''
|
This property indicates the sampling rate in the I dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


| 1
|
The value must be greater than or equal to 1.


|-
| '''Sample Rate J'''<br>''(SampleRateJ)''
|
This property indicates the sampling rate in the J dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


| 1
|
The value must be greater than or equal to 1.


|-
| '''Sample Rate K'''<br>''(SampleRateK)''
|
This property indicates the sampling rate in the K dimension. A value greater than 1 results in subsampling; every nth index will be included in the output.


| 1
|
The value must be greater than or equal to 1.


|-
| '''V OI'''<br>''(VOI)''
|
This property specifies the minimum and maximum point indices along each of the I, J, and K axes; these values indicate the volume of interest (VOI). The output will have the (I,J,K) extent specified here.


| 0 0 0 0 0 0
|
The values must lie within the extent of the input dataset.


|}
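The effect of the VOI and sample-rate properties along one axis is simple striding over the index range. A minimal sketch, assuming inclusive (min, max) index pairs as in the VOI description above; not ParaView's implementation:

```python
def extract_subset(extent, voi, rate, include_boundary=False):
    """Retained point indices along one structured-grid axis.

    extent           -- (min, max) point indices of the input axis
    voi              -- (min, max) volume-of-interest indices
    rate             -- sample rate (stride); 1 keeps every index
    include_boundary -- pass the final boundary index even when it is
                        not an even multiple of the stride
    """
    lo = max(extent[0], voi[0])
    hi = min(extent[1], voi[1])          # clamp VOI to the input extent
    idx = list(range(lo, hi + 1, rate))
    if include_boundary and rate > 1 and idx and idx[-1] != hi:
        idx.append(hi)                   # off-stride boundary index
    return idx
```

For a VOI of (2, 9) at sample rate 3, the retained indices are 2, 5, 8; enabling Include Boundary also appends index 9.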


==Extract Surface==




Extract a 2D boundary surface using neighbor relations to eliminate internal faces.


The Extract Surface filter extracts the polygons forming the outer surface of the input dataset. This filter operates on any type of data and produces polygonal data as output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Surface filter.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.


|-
| '''Nonlinear Subdivision Level'''<br>''(NonlinearSubdivisionLevel)''
|
If the input is an unstructured grid with nonlinear faces, this
parameter determines how many times the face is subdivided into
linear faces.  If 0, the output is the equivalent of its linear
counterpart (and the midpoints determining the nonlinear
interpolation are discarded).  If 1, the nonlinear face is
triangulated based on the midpoints.  If greater than 1, the
triangulated pieces are recursively subdivided to reach the
desired subdivision.  Setting the value to greater than 1 may
cause some point data to not be passed even if no quadratic faces
exist.  This option has no effect if the input is not an
unstructured grid.

| 1
|
The value must be greater than or equal to 0 and less than or equal to 4.


|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
If the value of this property is set to 1, internal surfaces along process boundaries will be removed. NOTE: Enabling this option might cause multiple executions of the data source because more information is needed to remove internal surfaces.

| 1
|
Only the values 0 and 1 are accepted.
|}
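The "neighbor relations" idea behind surface extraction can be shown with a face-counting sketch: a face shared by two cells is internal, while a face used by exactly one cell lies on the boundary. A conceptual model (cells as lists of faces given by point-id tuples), not ParaView's implementation:

```python
from collections import Counter

def extract_surface(cells):
    """Return the boundary faces of a cell complex: faces referenced by
    exactly one cell.  Faces shared by two cells are internal and are
    eliminated, which is how the outer surface emerges."""
    counts = Counter()
    for faces in cells:                        # each cell is a list of faces
        for f in faces:
            counts[tuple(sorted(f))] += 1      # orientation-independent key
    return sorted(f for f, n in counts.items() if n == 1)
```

Two tetrahedra glued along face `(1, 2, 3)` produce six boundary triangles; the shared face is counted twice and dropped.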


==FFT Of Selection Over Time==




Extracts selection over time and plots the FFT


Extracts the data of a selection (e.g. points or cells) over time,<br>
takes the FFT of them, and plots them.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
The input from which the selection is extracted.
 


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet, vtkTable, vtkCompositeDataSet.


|-
| '''Selection'''<br>''(Selection)''
|
The input that provides the selection object.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkSelection.


|}
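The per-point transform this filter applies is a discrete Fourier transform of the extracted time series. A naive DFT in pure Python illustrates the idea (a teaching sketch; ParaView uses an optimized implementation):

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform of a real time series, i.e. the
    values of one selected point/cell sampled over the time steps."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]
```

For a pure sinusoid with 3 cycles over 32 samples, the magnitude spectrum peaks at bin 3 (and its mirror, bin 29) with magnitude N/2 = 16.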




==Feature Edges==


This filter will extract edges along sharp edges of surfaces or boundaries of surfaces.


The Feature Edges filter extracts various subsets of edges from the input data set. This filter operates on polygonal data and produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Boundary Edges'''<br>''(BoundaryEdges)''
|
If the value of this property is set to 1, boundary edges will be extracted. Boundary edges are defined as line cells or edges that are used by only one polygon.

| 1
|
Only the values 0 and 1 are accepted.
|-
| '''Coloring'''<br>''(Coloring)''
|
If the value of this property is set to 1, then the extracted edges are assigned a scalar value based on the type of the edge.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Feature Angle'''<br>''(FeatureAngle)''
|
The value of this property is used to define a feature edge. If the angle between the surface normals of two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. (See the FeatureEdges property.)


| 30
|
The value must be greater than or equal to 0 and less than or equal to 180.


|-
| '''Feature Edges'''<br>''(FeatureEdges)''
|
If the value of this property is set to 1, feature edges will be extracted. Feature edges are defined as edges that are used by two polygons whose dihedral angle is greater than the feature angle. (See the FeatureAngle property.)


| 1
 
|
 
Only the values 0 and 1 are accepted.
 


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Feature Edges filter.
 


| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Manifold Edges'''<br>''(ManifoldEdges)''
|
If the value of this property is set to 1, manifold edges will be extracted. Manifold edges are defined as edges that are used by exactly two polygons.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Non-Manifold Edges'''<br>''(NonManifoldEdges)''
|
If the value of this property is set to 1, non-manifold edges will be extracted. Non-manifold edges are defined as edges that are used by three or more polygons.


| 1
|
Only the values 0 and 1 are accepted.


|}
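The feature-angle test described above compares the normals of the two polygons sharing an edge. A minimal sketch of that test, assuming unit normals (not ParaView's implementation):

```python
import math

def is_feature_edge(n1, n2, feature_angle_deg=30.0):
    """An edge shared by two polygons is a feature edge when the angle
    between their unit normals reaches the feature angle (default 30,
    matching the FeatureAngle default above)."""
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))       # clamp against rounding for acos
    return math.degrees(math.acos(dot)) >= feature_angle_deg
```

Coplanar neighbors (angle 0) never qualify, perpendicular neighbors (angle 90) always do, and a 20-degree crease qualifies only if the feature angle is lowered below 20.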




==Generate Ids==




Generate scalars from point and cell ids.


This filter generates scalars using cell and point ids. That is, the point attribute data scalars are generated from the point ids, and the cell attribute data scalars or field data are generated from the cell ids.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Array Name'''<br>''(ArrayName)''
|
The name of the array that will contain ids.


| Ids
| �
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Generate Ids filter.


| �
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.
|}
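What the filter produces is easy to model: one id-valued scalar array for the points and one for the cells, named by the Array Name property. A trivial sketch with hypothetical dictionary-based attribute data (not ParaView's data model):

```python
def generate_ids(points, cells, array_name="Ids"):
    """Attach id scalar arrays to point and cell attribute data, keyed
    by the filter's Array Name property (default 'Ids')."""
    point_data = {array_name: list(range(len(points)))}
    cell_data = {array_name: list(range(len(cells)))}
    return point_data, cell_data
```

A dataset with three points and two cells gains point scalars [0, 1, 2] and cell scalars [0, 1].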


==Generate Quadrature Points==




Create a point set with data at quadrature points.


"Create a point set with data at quadrature points."<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
| �
| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The dataset must contain a cell array.




The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.


|-
| '''Select Source Array'''<br>''(SelectSourceArray)''
|
Specifies the offset array from which we generate quadrature points.
 


| �
|
An array of scalars is required.


|}


==Generate Quadrature Scheme Dictionary==




Generate quadrature scheme dictionaries in data sets that do not have them.


Generate quadrature scheme dictionaries in data sets that do not have them.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
| �
| �
|
The selected object must be the result of the following: sources (includes readers), filters.




The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.


|}




==Generate Surface Normals==




This filter will produce surface normals used for smooth shading. Splitting is used to avoid smoothing across feature edges.


This filter generates surface normals at the points of the input polygonal dataset to provide smooth shading of the dataset. The resulting dataset is also polygonal. The filter works by calculating a normal vector for each polygon in the dataset and then averaging the normals at the shared points.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Compute Cell Normals'''<br>''(ComputeCellNormals)''
|
This filter computes the normals at the points in the data set. In the process of doing this it computes polygon normals too. If you want these normals to be passed to the output of this filter, set the value of this property to 1.


| 0
|
Only the values 0 and 1 are accepted.
tetrahedra are searched to find the containing one. (In 2D, a "walk"<br
towards the enclosing triangle is performed.) If the triangulation<br
is Delaunay, then an enclosing tetrahedron will be found. However,<br
in degenerate cases an enclosing tetrahedron may not be found and<br
the point will be rejected.<br


{| class="PropertiesTable" border="1" cellpadding="5
|-
| '''Consistency'''<br>''(Consistency)''
|
|
| '''Property''
The value of this property controls whether consistent polygon ordering is enforced. Generally the normals for a data set should either all point inward or all point outward. If the value of this property is 1, then this filter will reorder the points of cells that whose normal vectors are oriented the opposite direction from the rest of those in the data set.
| '''Description''
 
| '''Default Value(s)''
| 1
| '''Restrictions''
|
|
| '''Alpha'''<br>''(Alpha)'
Only the values 0 and 1 are accepted.


This property specifies the alpha (or distance) value to contro
the output of this filter. For a non-zero alpha value, onl
|-
edges, faces, or tetra contained within the circumsphere (o
| '''Feature Angle'''<br>''(FeatureAngle)''
radius alpha) will be output.  Otherwise, only tetrahedra will b
|
output
The value of this property  defines a feature edge. If the surface normal between two adjacent triangles is at least as large as this Feature Angle, a feature edge exists. If Splitting is on, points are duplicated along these feature edges. (See the Splitting property.)


| 30
|
|
The value must be greater than or equal to 0 and less than or equal to 180.


The value must be greater than or equal to 0
|-
| '''Flip Normals'''<br>''(FlipNormals)''
|
If the value of this property is 1, this filter will reverse the normal direction (and reorder the points accordingly) for all polygons in the data set; this changes front-facing polygons to back-facing ones, and vice versa. You might want to do this if your viewing position will be inside the data set instead of outside of it.


| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Input'''<br>''(Input)''
|
|
| '''Bounding Triangulation'''<br>''(BoundingTriangulation)'
This property specifies the input to the Normals Generation filter.
 
This boolean controls whether bounding triangulation points (an
associated triangles) are included in the output. (These ar
introduced as an initial triangulation to begin the triangulatio
process. This feature is nice for debugging output.


| �
|
|
The selected object must be the result of the following: sources (includes readers), filters.


Only the values 0 and 1 are accepted


The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.


|-
| '''Non-Manifold Traversal'''<br>''(NonManifoldTraversal)''
|
|
| '''Input'''<br>''(Input)'
Turn on/off traversal across non-manifold edges. Not traversing non-manifold edges will prevent problems where the consistency of polygonal ordering is corrupted due to topological loops.


This property specifies the input dataset to the Delaunay 3D filter
| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Piece Invariant'''<br>''(PieceInvariant)''
|
|
Turn this option to to produce the same results regardless of the number of processors used (i.e., avoid seams along processor boundaries). Turn this off if you do want to process ghost levels and do not mind seams.


The selected object must be the result of the following: sources (includes readers), filters
| 1
|
Only the values 0 and 1 are accepted.


|-
| '''Splitting'''<br>''(Splitting)''
|
This property controls the splitting of sharp edges. If sharp edges are split (property value = 1), then points are duplicated along these edges, and separate normals are computed for both sets of points to give crisp (rendered) surface definition.


The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet
| 1
|
Only the values 0 and 1 are accepted.


|}
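The Feature Angle test above can be sketched in plain Python (this is an illustrative sketch, not ParaView or VTK code): an edge shared by two triangles is a feature edge when the angle between the triangles' unit normals meets or exceeds the threshold, 30 degrees by default.

```python
import math

def unit(v):
    """Normalize a 3-vector."""
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def is_feature_edge(n1, n2, feature_angle_deg=30.0):
    """True when the angle between the two face normals is >= the threshold."""
    n1, n2 = unit(n1), unit(n2)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    return math.degrees(math.acos(dot)) >= feature_angle_deg

flat = is_feature_edge((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))    # coplanar faces: False
crease = is_feature_edge((0.0, 0.0, 1.0), (0.0, 1.0, 1.0))  # 45-degree crease: True
```

With Splitting on, points along edges that pass this test are duplicated so each side keeps its own normal, which is what produces crisp creases in the rendered surface.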


==Glyph==


This filter generates an arrow, cone, cube, cylinder, line, sphere, or 2D glyph at each point of the input data set.  The glyphs can be oriented and scaled by point attributes of the input dataset.


The Glyph filter generates a glyph (i.e., an arrow, cone, cube, cylinder, line, sphere, or 2D glyph) at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Glyph Transform'''<br>''(GlyphTransform)''
|
The values in this property allow you to specify the transform (translation, rotation, and scaling) to apply to the glyph source.

|
|
The value must be set to one of the following: Transform2.


|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)

| 5000
|
The value must be greater than or equal to 0.

|-
| '''Random Mode'''<br>''(RandomMode)''
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Scalars'''<br>''(SelectInputScalars)''
|
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)

|
|
An array of scalars is required.

|-
| '''Vectors'''<br>''(SelectInputVectors)''
|
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)

| 1
|
An array of vectors is required.

|-
| '''Orient'''<br>''(SetOrient)''
|
If this property is set to 1, the glyphs will be oriented based on the selected vector array.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
|
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.

| 1
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.

The value must lie within the range of the selected data array.

|-
| '''Scale Mode'''<br>''(SetScaleMode)''
|
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.

| 1
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).

|-
| '''Glyph Type'''<br>''(Source)''
|
This property determines which type of glyph will be placed at the points in the input dataset.

|
|
The selected object must be the result of the following: sources (includes readers), glyph_sources.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.

The value must be set to one of the following: ArrowSource, ConeSource, CubeSource, CylinderSource, LineSource, SphereSource, GlyphSource2D.

|-
| '''Mask Points'''<br>''(UseMaskPoints)''
|
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)

| 1
|
Only the values 0 and 1 are accepted.

|}


==Descriptive Statistics==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.


This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.<br>
This filter computes the min, max, mean, raw moments M2 through M4, standard deviation, skewness, and kurtosis for each array you select.<br>
The model is simply a univariate Gaussian distribution with the mean and standard deviation provided. Data is assessed using this model by detrending the data (i.e., subtracting the mean) and then dividing by the standard deviation. Thus the assessment is an array whose entries are the number of standard deviations from the mean that each input point lies.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
Specify which type of field data the arrays will be drawn from.

| 0
|
Valid array names will be chosen from point and cell data.

|-
| '''Input'''<br>''(Input)''
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point or cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData, vtkStructuredGrid, vtkPolyData, vtkUnstructuredGrid, vtkTable, vtkGraph.

|-
| '''Model Input'''<br>''(ModelInput)''
|
A previously-calculated model with which to assess a separate dataset. This input is optional.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkTable, vtkMultiBlockDataSet.

|-
| '''Variables of Interest'''<br>''(SelectArrays)''
|
Choose arrays whose entries will be used to form observations for statistical analysis.

|
|
An array of scalars is required.

|-
| '''Deviations should be'''<br>''(SignedDeviations)''
|
Should the assessed values be signed deviations or unsigned?

| 0
|
The value must be one of the following: Unsigned (0), Signed (1).

|-
| '''Task'''<br>''(Task)''
|
Specify the task to be performed: modeling and/or assessment.
#  "Statistics of all the data," creates an output table (or tables) summarizing the '''entire''' input dataset;
#  "Model a subset of the data," creates an output table (or tables) summarizing a '''randomly-chosen subset''' of the input dataset;
#  "Assess the data with a model," adds attributes to the first input dataset using a model provided on the second input port; and
#  "Model and assess the same data," is really just operations 2 and 3 above applied to the same input dataset.  The model is first trained using a fraction of the input data and then the entire dataset is assessed using that model.


When the task includes creating a model (i.e., tasks 2 and 4), you may adjust the fraction of the input dataset used for training.  You should avoid using a large fraction of the input data for training as you will then not be able to detect overfitting. The ''Training fraction'' setting will be ignored for tasks 1 and 3.

| 3
|
The value must be one of the following: Statistics of all the data (0), Model a subset of the data (1), Assess the data with a model (2), Model and assess the same data (3).

|-
| '''Training Fraction'''<br>''(TrainingFraction)''
|
Specify the fraction of values from the input dataset to be used for model fitting. The exact set of values is chosen at random from the dataset.

| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.

|}
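The Descriptive Statistics assessment step described above (detrend by the mean, divide by the standard deviation) can be sketched in plain Python. This is an illustrative sketch, not ParaView code; the use of the sample (n-1) standard deviation is an assumption of the sketch.

```python
import math

def assess(values):
    """Return the number of standard deviations each value lies from the mean."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return [(v - mean) / std for v in values]

# Mean is 5.0, so scores are centered on zero; the largest value maps
# to the largest positive score.
scores = assess([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

With ''Deviations should be'' set to Unsigned, the absolute value of each score would be reported instead.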




==Glyph With Custom Source==


This filter generates a glyph at each point of the input data set.  The glyphs can be oriented and scaled by point attributes of the input dataset.


The Glyph filter generates a glyph at each point in the input dataset. The glyphs can be oriented and scaled by the input point-centered scalars and vectors. The Glyph filter operates on any type of data set. Its output is polygonal. This filter is available on the Toolbar.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Glyph filter. This is the dataset to which the glyphs will be applied.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Maximum Number of Points'''<br>''(MaximumNumberOfPoints)''
|
The value of this property specifies the maximum number of glyphs that should appear in the output dataset if the value of the UseMaskPoints property is 1. (See the UseMaskPoints property.)

| 5000
|
The value must be greater than or equal to 0.

|-
| '''Random Mode'''<br>''(RandomMode)''
|
If the value of this property is 1, then the points to glyph are chosen randomly. Otherwise the point ids chosen are evenly spaced.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Scalars'''<br>''(SelectInputScalars)''
|
This property indicates the name of the scalar array on which to operate. The indicated array may be used for scaling the glyphs. (See the SetScaleMode property.)

|
|
An array of scalars is required.

|-
| '''Vectors'''<br>''(SelectInputVectors)''
|
This property indicates the name of the vector array on which to operate. The indicated array may be used for scaling and/or orienting the glyphs. (See the SetScaleMode and SetOrient properties.)

| 1
|
An array of vectors is required.

|-
| '''Orient'''<br>''(SetOrient)''
|
If this property is set to 1, the glyphs will be oriented based on the selected vector array.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Set Scale Factor'''<br>''(SetScaleFactor)''
|
The value of this property will be used as a multiplier for scaling the glyphs before adding them to the output.

| 1
|
The value must be less than the largest dimension of the dataset multiplied by a scale factor of 0.1.

The value must lie within the range of the selected data array.

|-
| '''Scale Mode'''<br>''(SetScaleMode)''
|
The value of this property specifies how/if the glyphs should be scaled based on the point-centered scalars/vectors in the input dataset.

| 1
|
The value must be one of the following: scalar (0), vector (1), vector_components (2), off (3).

|-
| '''Glyph Type'''<br>''(Source)''
|
This property determines which type of glyph will be placed at the points in the input dataset.

|
|
The selected object must be the result of the following: sources (includes readers), glyph_sources.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPolyData.

|-
| '''Mask Points'''<br>''(UseMaskPoints)''
|
If the value of this property is set to 1, limit the maximum number of glyphs to the value indicated by MaximumNumberOfPoints. (See the MaximumNumberOfPoints property.)

| 1
|
Only the values 0 and 1 are accepted.

|}
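The Scale Mode options listed in the glyph tables above — scalar (0), vector (1), vector_components (2), off (3) — can be sketched in plain Python (an illustrative sketch, not ParaView code): the per-axis scale comes from the point's scalar, the vector magnitude, the individual vector components, or no attribute at all, and is then multiplied by the Set Scale Factor.

```python
import math

def glyph_scale(mode, scalar, vector, scale_factor=1.0):
    """Return the (sx, sy, sz) scaling applied to one glyph."""
    if mode == 0:                                   # scalar
        s = (scalar,) * 3
    elif mode == 1:                                 # vector (magnitude)
        m = math.sqrt(sum(c * c for c in vector))
        s = (m,) * 3
    elif mode == 2:                                 # vector_components
        s = tuple(vector)
    else:                                           # off
        s = (1.0, 1.0, 1.0)
    return tuple(scale_factor * c for c in s)

print(glyph_scale(1, 2.0, (3.0, 4.0, 0.0)))         # vector magnitude 5.0 on every axis
```

Orienting (the Orient property) is independent of this: it rotates the glyph to point along the selected vector regardless of which scale mode is active.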
 
 
==Elevation==


Create point attribute array by projecting points onto an elevation vector.


The Elevation filter generates point scalar values for an input dataset along a specified direction vector.<br><br>
The Input menu allows the user to select the data set to which this filter will be applied. Use the Scalar range entry boxes to specify the minimum and maximum scalar value to be generated. The Low Point and High Point define a line onto which each point of the data set is projected. The minimum scalar value is associated with the Low Point, and the maximum scalar value is associated with the High Point. The scalar value for each point in the data set is determined by the location along the line to which that point projects.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''High Point'''<br>''(HighPoint)''
|
This property defines the other end of the direction vector (large scalar values).

| 0 0 1
|
The coordinate must lie within the bounding box of the dataset. It will default to the maximum in each dimension.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input dataset to the Elevation filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Low Point'''<br>''(LowPoint)''
|
This property defines one end of the direction vector (small scalar values).

| 0 0 0
|
The coordinate must lie within the bounding box of the dataset. It will default to the minimum in each dimension.

|-
| '''Scalar Range'''<br>''(ScalarRange)''
|
This property determines the range into which scalars will be mapped.

| 0 1
|
|}


==Gradient==


This filter computes gradient vectors for an image/volume.


The Gradient filter computes the gradient vector at each point in an image or volume. This filter uses central differences to compute the gradients. The Gradient filter operates on uniform rectilinear (image) data and produces image data output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Dimensionality'''<br>''(Dimensionality)''
|
This property indicates whether to compute the gradient in two dimensions or in three. If the gradient is being computed in two dimensions, the X and Y dimensions are used.

| 3
|
The value must be one of the following: Two (2), Three (3).

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Gradient filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point array with 1 component.

The selected dataset must be one of the following types (or a subclass of one of them): vtkImageData.

|-
| '''Select Input Scalars'''<br>''(SelectInputScalars)''
|
This property lists the name of the array from which to compute the gradient.

|
|
An array of scalars is required.

|}
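The Gradient filter uses central differences, as noted above. A minimal plain-Python sketch of that scheme along one axis of a uniformly spaced scalar field (illustrative only, not ParaView code): at each interior sample, the derivative is approximated as (f[i+1] - f[i-1]) / (2h).

```python
def central_differences(samples, h=1.0):
    """Central-difference derivative at the interior samples of a 1D field."""
    grad = []
    for i in range(1, len(samples) - 1):
        grad.append((samples[i + 1] - samples[i - 1]) / (2.0 * h))
    return grad

# For f(x) = x**2 sampled at x = 0..4, central differences are exact
# at the interior points: f'(x) = 2x.
g = central_differences([x * x for x in range(5)])  # [2.0, 4.0, 6.0]
```

On image data the same stencil is applied independently along X, Y, and (when Dimensionality is Three) Z to build the gradient vector at each point.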


==Extract AMR Blocks==


This filter extracts a list of datasets from hierarchical datasets.


This filter extracts a list of datasets from hierarchical datasets.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Datasets filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkHierarchicalBoxDataSet.

|-
| '''Selected Data Sets'''<br>''(SelectedDataSets)''
|
This property provides a list of datasets to extract.

|
|
|}


==Gradient Of Unstructured DataSet==


Estimate the gradient for each point or cell in any type of dataset.


The Gradient (Unstructured) filter estimates the gradient vector at each point or cell. It operates on any type of vtkDataSet, and the output is the same type as the input. If the dataset is a vtkImageData, use the Gradient filter instead; it will be more efficient for this type of dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Compute Vorticity'''<br>''(ComputeVorticity)''
|
When this flag is on, the gradient filter will compute the vorticity/curl of a 3-component array.

| 0
|
Only the values 0 and 1 are accepted.


|-
| '''Faster Approximation'''<br>''(FasterApproximation)''
|
When this flag is on, the gradient filter will provide a less accurate (but close) algorithm that performs fewer derivative calculations (and is therefore faster).  The error contains some smoothing of the output data and some possible errors on the boundary.  This parameter has no effect when performing the gradient of cell data.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Gradient (Unstructured) filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point or cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkPointSet.

|-
| '''Result Array Name'''<br>''(ResultArrayName)''
|
This property provides a name for the output array containing the gradient vectors.

| Gradients
|
|-
| '''Scalar Array'''<br>''(SelectInputScalars)''
|
This property lists the name of the scalar array from which to compute the gradient.

|
|
An array of scalars is required.

Valid array names will be chosen from point and cell data.

|}
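The Compute Vorticity option above takes the curl of a 3-component array. Given the gradient tensor J, with J[i][j] = d(v_i)/d(x_j), the vorticity is (J[2][1] - J[1][2], J[0][2] - J[2][0], J[1][0] - J[0][1]). A plain-Python sketch (illustrative only, not ParaView code):

```python
def vorticity(J):
    """Curl of a vector field from its 3x3 gradient tensor J[i][j] = dv_i/dx_j."""
    return (J[2][1] - J[1][2],
            J[0][2] - J[2][0],
            J[1][0] - J[0][1])

# Rigid rotation about z, v = (-y, x, 0): gradient rows are
# (0, -1, 0), (1, 0, 0), (0, 0, 0), and the curl is (0, 0, 2).
w = vorticity([[0.0, -1.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 0.0, 0.0]])
```

In the filter itself, J is first estimated per point or cell (by the exact or faster approximate scheme), and the curl is then assembled from its off-diagonal entries as above.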




==Extract Block==


This filter extracts a range of blocks from a multiblock dataset.


This filter extracts a range of groups from a multiblock dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Block Indices'''<br>''(BlockIndices)''
|
This property lists the ids of the blocks to extract from the input multiblock dataset.

|
|
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract Group filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.

|-
| '''Maintain Structure'''<br>''(MaintainStructure)''
|
This is used only when PruneOutput is ON. By default, when pruning the output, i.e. removing empty blocks, if a node has only 1 non-null child block, then that node is removed. To preserve these parent nodes, set this flag to true.

| 0
|
Only the values 0 and 1 are accepted.

|-
| '''Prune Output'''<br>''(PruneOutput)''
|
When set, the output multiblock dataset will be pruned to remove empty nodes. On by default.

| 1
|
Only the values 0 and 1 are accepted.

|}


==Grid Connectivity==


Mass properties of connected fragments for unstructured grids.


This filter works on multiblock unstructured grid inputs and also works in parallel. It ignores any cells with a cell data Status value of 0. It performs connectivity to distinct fragments separately.  It then integrates attributes of the fragments.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid, vtkCompositeDataSet.

|}


==Group Datasets==


Group data sets.

Groups multiple datasets to create a multiblock dataset.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property indicates the inputs to the Group Datasets filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataObject.

|}
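Several filters above address blocks of a multiblock dataset by flat index (e.g., Extract Block's Block Indices). A common convention is to assign flat indices by pre-order traversal of the composite tree, with the root as index 0; this sketch illustrates that convention in plain Python, and the exact numbering used by any given ParaView version is an assumption here, not a guarantee.

```python
def flat_indices(node, start=0, path=()):
    """Assign pre-order flat indices to a composite tree.

    node: nested lists stand in for composite nodes, strings for leaf
    datasets. Returns (list of (flat_index, child-index path), next_index).
    """
    out = [(start, path)]
    next_id = start + 1
    if isinstance(node, list):
        for i, child in enumerate(node):
            child_out, next_id = flat_indices(child, next_id, path + (i,))
            out.extend(child_out)
    return out, next_id

# Root (0) -> leaf0 (1), inner block (2) -> leaf1a (3), leaf1b (4), leaf2 (5).
tree = ["leaf0", ["leaf1a", "leaf1b"], "leaf2"]
ids, _ = flat_indices(tree)
```

Under this convention, interior (grouping) nodes consume indices too, which is why the leaf ids are not simply 0, 1, 2, ...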


==Extract CTH Parts==


Create a surface from a CTH volume fraction.


Extract CTH Parts is a specialized filter for visualizing the data from a CTH simulation. It first converts the selected cell-centered arrays to point-centered ones. It then contours each array at a value of 0.5. The user has the option of clipping the resulting surface(s) with a plane. This filter only operates on unstructured data. It produces polygonal output.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Double Volume Arrays'''<br>''(AddDoubleVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.

|
|
An array of scalars is required.

|-
| '''Float Volume Arrays'''<br>''(AddFloatVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.

|
|
An array of scalars is required.

|-
| '''Unsigned Character Volume Arrays'''<br>''(AddUnsignedCharVolumeArrayName)''
|
This property specifies the name(s) of the volume fraction array(s) for generating parts.

|
|
An array of scalars is required.

|-
| '''Clip Type'''<br>''(ClipPlane)''
|
This property specifies whether to clip the dataset, and if so, it also specifies the parameters of the plane with which to clip.

|
|
The value must be set to one of the following: None, Plane, Box, Sphere.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Extract CTH Parts filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a cell array with 1 components.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Volume Fraction Value'''<br>''(VolumeFractionSurfaceValue)''
|
The value of this property is the volume fraction value for the surface.

| 0.1
|
The value must be greater than or equal to 0 and less than or equal to 1.

|}


==Histogram==


Extract a histogram from field data.


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Bin Count'''<br>''(BinCount)''
|
The value of this property specifies the number of bins for the histogram.

| 10
|
The value must be greater than or equal to 1 and less than or equal to 256.

|-
| '''Calculate Averages'''<br>''(CalculateAverages)''
|
This option controls whether the algorithm calculates averages of variables other than the primary variable that fall into each bin.

| 1
|
Only the values 0 and 1 are accepted.

|-
| '''Component'''<br>''(Component)''
|
The value of this property specifies the array component from which the histogram should be computed.

| 0
|
|-
| '''Custom Bin Ranges'''<br>''(CustomBinRanges)''
|
Set custom bin ranges to use. These are used only when UseCustomBinRanges is set to true.

| 0 100
|
The value must lie within the range of the selected data array.

|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Histogram filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The dataset must contain a point or cell array.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Select Input Array'''<br>''(SelectInputArray)''
|
This property indicates the name of the array from which to compute the histogram.

|
|
An array of scalars is required.

Valid array names will be chosen from point and cell data.

|-
| '''Use Custom Bin Ranges'''<br>''(UseCustomBinRanges)''
|
When set to true, CustomBinRanges will be used instead of using the full range for the selected array. By default, set to false.

| 0
|
Only the values 0 and 1 are accepted.

|}


==Integrate Variables==


This filter integrates cell and point attributes.


The Integrate Attributes filter integrates point and cell data over lines and surfaces.  It also computes length of lines, area of surfaces, or volume.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Integrate Attributes filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.

The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|}
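The Histogram filter's binning described above — Bin Count equal-width bins over either the selected array's full range or the Custom Bin Ranges — can be sketched in plain Python (illustrative only, not ParaView code):

```python
def histogram(values, bin_count=10, bin_range=None):
    """Count values into bin_count equal-width bins over bin_range
    (or the data's full range when bin_range is None)."""
    lo, hi = bin_range if bin_range else (min(values), max(values))
    width = (hi - lo) / bin_count
    counts = [0] * bin_count
    for v in values:
        i = min(int((v - lo) / width), bin_count - 1)  # clamp the max into the last bin
        counts[i] += 1
    return counts

counts = histogram([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], bin_count=5)  # [2, 2, 2, 2, 2]
```

With Calculate Averages on, the filter additionally accumulates, per bin, the mean of every other selected variable whose primary value falls in that bin.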


==Interpolate to Quadrature Points==


Create scalar/vector data arrays interpolated to quadrature points.


"Create scalar/vector data arrays interpolated to quadrature points."<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkUnstructuredGrid.

|-
| '''Select Source Array'''<br>''(SelectSourceArray)''
|
Specifies the offset array from which we interpolate values to quadrature points.

|
|
An array of scalars is required.

|}
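The interpolation this filter performs can be illustrated outside of ParaView. The sketch below is plain Python, not the VTK implementation; the function name and the two-node linear element are illustrative assumptions. It evaluates nodal data at the 2-point Gauss quadrature locations of a 1D reference element using the linear shape functions N1 = (1 - xi)/2 and N2 = (1 + xi)/2.

```python
import math

# Hypothetical helper, for illustration only: interpolate two nodal values
# to the 2-point Gauss quadrature locations of a 1D element on [-1, 1].
def interpolate_to_gauss_points(v1, v2):
    gauss = [-1 / math.sqrt(3), 1 / math.sqrt(3)]  # 2-point Gauss rule
    # Linear shape functions weight the nodal values at each quadrature point.
    return [((1 - xi) / 2) * v1 + ((1 + xi) / 2) * v2 for xi in gauss]

# Nodal values 0 and 1: the interpolated values sit between the nodes.
vals = interpolate_to_gauss_points(0.0, 1.0)
```

The same idea extends to 2D and 3D cells; only the shape functions and quadrature rule change.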
==Intersect Fragments==


The Intersect Fragments filter performs geometric intersections on sets of fragments.


The Intersect Fragments filter performs geometric intersections on sets of<br>
fragments. The filter takes two inputs, the first containing fragment<br>
geometry and the second containing fragment centers. The filter has two<br>
outputs. The first is geometry that results from the intersection. The<br>
second is a set of points that is an approximation of the center of where<br>
each fragment has been intersected.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Slice Type'''<br>''(CutFunction)''
|
This property sets the type of intersecting geometry and the
associated parameters.

|
|
The value must be set to one of the following: Plane, Box, Sphere.

|-
| '''Input'''<br>''(Input)''
|
This input must contain fragment geometry.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.

|-
| '''Source'''<br>''(Source)''
|
This input must contain fragment centers.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The selected dataset must be one of the following types (or a subclass of one of them): vtkMultiBlockDataSet.

|}
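The Plane, Box, and Sphere slice types are implicit functions: each classifies a point by a signed distance, and the intersection surface lies where that distance is zero. A minimal plain-Python sketch of the plane case (the helper name is illustrative, not a ParaView API):

```python
# Hypothetical helper, for illustration only: signed distance from point p
# to the plane through `origin` with unit normal `normal`. Positive values
# lie on the normal's side, negative values on the opposite side, and the
# intersection (slice) surface is where the distance equals zero.
def plane_signed_distance(p, origin, normal):
    return sum((p[i] - origin[i]) * normal[i] for i in range(3))

# Points on either side of the z = 0 plane
d_above = plane_signed_distance((0, 0, 2), (0, 0, 0), (0, 0, 1))
d_below = plane_signed_distance((0, 0, -3), (0, 0, 0), (0, 0, 1))
```

Box and Sphere work the same way, each with its own distance function.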
==Iso Volume==


This filter extracts cells by clipping cells that have point scalars not in the specified range.


This filter clips away the cells using lower and upper thresholds.<br>


{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Input'''<br>''(Input)''
|
This property specifies the input to the Threshold filter.

|
|
The selected object must be the result of the following: sources (includes readers), filters.


The dataset must contain a point or cell array with 1 component.


The selected dataset must be one of the following types (or a subclass of one of them): vtkDataSet.

|-
| '''Input Scalars'''<br>''(SelectInputScalars)''
|
The value of this property contains the name of the scalar array from which to perform thresholding.

|
|
An array of scalars is required.


Valid array names will be chosen from point and cell data.

|-
| '''Threshold Range'''<br>''(ThresholdBetween)''
|
The values of this property specify the upper and lower bounds of the thresholding operation.

| 0 0
|
The value must lie within the range of the selected data array.

|}
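The threshold test can be sketched in plain Python. This is an illustration of the range criterion only, not ParaView's clipping implementation; the function name and data are hypothetical.

```python
# Hypothetical helper, for illustration only: return the indices of points
# whose scalar value lies within the closed range [lower, upper]. Cells
# whose points all fail the test are discarded; in the real filter, cells
# straddling a bound are clipped rather than dropped whole.
def threshold_points(scalars, lower, upper):
    return [i for i, s in enumerate(scalars) if lower <= s <= upper]

temperature = [0.2, 0.7, 1.3, 0.9, 2.0]
kept = threshold_points(temperature, 0.5, 1.0)  # Threshold Range = (0.5, 1.0)
```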


==K Means==


Compute a statistical model of a dataset and/or assess the dataset with a statistical model.


This filter either computes a statistical model of a dataset or takes such a model as its second input.  Then, the model (however it is obtained) may optionally be used to assess the input dataset.
<br>
This filter iteratively computes the center of k clusters in a space whose coordinates are specified by the arrays you select. The clusters are chosen as local minima of the sum of square Euclidean distances from each point to its nearest cluster center. The model is then a set of cluster centers. Data is assessed by assigning a cluster center and distance to the cluster to each point in the input data set.<br>
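The iteration described above can be sketched in plain Python. This is an illustration of the k-means algorithm, not ParaView's implementation; the function and data are hypothetical.

```python
# Minimal k-means sketch: assign each point to its nearest center, then
# move each center to the mean of its assigned points, and repeat. The
# "model" is the final set of centers; assessment labels each point with
# the index of its nearest center.
def kmeans(points, centers, iterations=10):
    for _ in range(iterations):
        # Assignment step: index of the nearest center for each point
        # (squared Euclidean distance, matching the description above).
        labels = [min(range(len(centers)),
                      key=lambda c: sum((p[d] - centers[c][d]) ** 2
                                        for d in range(len(p))))
                  for p in points]
        # Update step: move each center to the mean of its member points.
        for c in range(len(centers)):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(p[d] for p in members) / len(members)
                              for d in range(len(members[0]))]
    return centers, labels

# Two obvious 1D clusters, around 0.5 and 10.5
pts = [[0.0], [1.0], [0.5], [10.0], [11.0], [10.5]]
centers, labels = kmeans(pts, [[0.0], [5.0]])
```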




{| class="PropertiesTable" border="1" cellpadding="5"
|-
| '''Property'''
| '''Description'''
| '''Default Value(s)'''
| '''Restrictions'''
|-
| '''Attribute Mode'''<br>''(AttributeMode)''
|
Specify which type of field data the arrays will be drawn from.

| 0
|
Valid array names will be chosen from point and cell data.

|-
| '''Input'''<br>''(Input)''
|
The input to the filter.  Arrays from this dataset will be used for computing statistics and/or assessed by a statistical model.