Compiling ParaView3 for Cray supercomputers
Latest revision as of 13:49, 21 July 2008
Objectives and Overview
Our goal is to run pvbatch on Cray massively parallel processor systems. Pvbatch is ParaView's MPI-enabled batch application. It reads batch scripts written in Python and distributes the work across many processors. Pvbatch will be built when you compile ParaView 3. Before you can compile ParaView 3 you must compile CMake, OSMesa, and Python. The entire process takes about three hours, and you should have at least one gigabyte of workspace.
These instructions are intended for Cray MPP systems running the Catamount operating system. Specifically, these instructions have been tested on the Bigben XT3 supercomputer at Pittsburgh Supercomputing Center.
Terminology
These terms are probably self-explanatory, but just to clarify:
- front end node - the computer/shell you log into and work on.
- native operating system - the operating system that runs on the front end node (SuSE Linux, for example)
- native build - software that is built for and executes on front end nodes
- compute node - the computers/processors running parallel computations
- catamount - the operating system that runs on the compute nodes
- catamount build - software that has been cross compiled to execute on the compute nodes
Build steps
You will log into a shell on a front end node. You will download source code and then compile CMake, OSMesa, Python, and ParaView3. Some of these packages must be compiled twice: once as a native build and once as a cross-compiled build. The steps are:
- Compile a CMake native build.
- Compile an OSMesa catamount build.
- Compile a Python native build.
- Compile a Python catamount build.
- Compile ParaView native build tools.
- Compile a ParaView catamount build.
Why are the native builds required?
During the ParaView build process helper binaries are compiled and executed to generate source files for future build targets. When you cross compile ParaView the helper binaries cannot execute since they are non-native to the front end node you are working on. The solution is to build a native version of ParaView first, and then tell CMake to use the native helper binaries while cross compiling.
Additional information
The instructions on this wiki page detail the steps required to build the software but do not provide additional background information. Some concepts used but not explained on this wiki page are TryRunResults and Toolchain files. You may find this page very helpful:
- CMake/CrayXT3 (http://www.cmake.org/Wiki/CmakeCrayXt3)
Compilers
The front end nodes have more than one compiler installed. We will use both the PGI and GNU compilers. At Bigben, the PGI compiler is the default compiler when you log in. You can switch to gcc like this:
<pre>
## switch from PGI to GNU compiler
module switch PrgEnv-pgi PrgEnv-gnu

## switch from GNU to PGI compiler
module switch PrgEnv-gnu PrgEnv-pgi
</pre>
Toolchains
When you cross compile with CMake you will input a toolchain file. The instructions on this wiki page use individual toolchain files for each compiler, but in reality the toolchain files are identical. The instructions assume path names like this:
<pre>
~/
  toolchains/
    Toolchain-Catamount-gcc.cmake
    Toolchain-Catamount-pgi.cmake
</pre>
The contents of the toolchain files can be found on the CMake/CrayXT3 page.
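The toolchain files themselves are short. As a rough sketch only (the authoritative contents are on the CMake/CrayXT3 page, and the compiler names below are assumptions that will differ per site and PrgEnv), a Catamount toolchain file looks something like:

```cmake
# SKETCH of a Catamount cross-compiling toolchain file -- see the
# CMake/CrayXT3 wiki page for the real contents. The compiler names
# here are assumptions and vary by site and loaded PrgEnv module.
SET(CMAKE_SYSTEM_NAME Catamount)  # tells CMake we are cross compiling for Catamount
SET(CMAKE_C_COMPILER cc)          # Cray compiler driver wrapping PGI or GNU
SET(CMAKE_CXX_COMPILER CC)
```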
Directory structure
Set up your directories however you'd like. Path names on this wiki page are usually given in two forms, a general form and an example form, where ~/ is your home directory:
<pre>
General form                   Example form
<install-dir>                  ~/install
<catamount-install-dir>        ~/install-catamount
<toolchain-dir>                ~/toolchains
<paraview-source-dir>          ~/projects/paraview/ParaView3
<paraview-native-build-dir>    ~/projects/paraview/build-native
...                            ...
</pre>
Here is how my directory tree looks:
<pre>
~/
  install/
    bin/
    include/
    lib/
  install-catamount/
    bin/
    include/
    lib/
  toolchains/
  projects/
    cmake/
      cmake-2.6.0/
      build/
    mesa/
      mesa-native/
      mesa-catamount/
    python/
      python-with-cmake/
      build-native/
      build-catamount/
    paraview/
      ParaView3/
      build-native/
      build-catamount/
</pre>
Note: some of these directories will be created automatically when you extract archives or check out code from CVS/SVN. The install directories and their subdirectories are created automatically when you run "make install" commands. Here is a command you could use to set up the directory tree:
<pre>
cd ~/
mkdir toolchains projects projects/cmake projects/cmake/build \
  projects/mesa projects/python projects/python/python-with-cmake \
  projects/python/build-native projects/python/build-catamount \
  projects/paraview projects/paraview/build-native projects/paraview/build-catamount
</pre>
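The same tree can be created from Python if you prefer; a small sketch (the directory names are the ones used on this page, and the helper name make_tree is hypothetical):

```python
import os

# Directories used throughout this page, relative to the home directory.
SUBDIRS = [
    "toolchains",
    "projects/cmake/build",
    "projects/mesa",
    "projects/python/python-with-cmake",
    "projects/python/build-native",
    "projects/python/build-catamount",
    "projects/paraview/build-native",
    "projects/paraview/build-catamount",
]

def make_tree(base):
    """Create the directory tree under base; existing directories are left alone."""
    for sub in SUBDIRS:
        os.makedirs(os.path.join(base, sub), exist_ok=True)
```

Call it as make_tree(os.path.expanduser("~")) to build the tree in your home directory.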
Compiling CMake
You will need CMake version 2.6 or greater. If a suitable CMake is already installed on the front end node, you will not need to compile it. See the CMake home page (http://www.cmake.org/HTML/Index.html) for more information.
Getting the source
You can download the latest release of CMake source code from the CMake homepage.
<pre>
cd ~/projects/cmake
wget http://www.cmake.org/files/v2.6/cmake-2.6.0.tar.gz
tar -zxvf cmake-2.6.0.tar.gz
</pre>
Native build
It shouldn't matter which compiler you use to build CMake. I used the default PGI compiler.
General build command:
<pre>
cd <cmake-build-dir>
<cmake-src-dir>/bootstrap --prefix=<native-install-dir>
make
make install
</pre>
Example build command:
<pre>
cd ~/projects/cmake/build
~/projects/cmake/cmake-2.6.0/bootstrap --prefix=~/install
make
make install
</pre>
Compiling OSMesa
You will download the Mesa source code and compile the OSMesa target. OSMesa (off-screen Mesa) allows rendering with the OpenGL API directly into main memory instead of using system display memory. See the Mesa home page (http://www.mesa3d.org/) for more information.
Getting the source
You can download the Mesa source directly using wget; alternatively, use the Mesa download page (http://sourceforge.net/project/showfiles.php?group_id=3).
<pre>
cd ~/projects/mesa
wget http://easynews.dl.sourceforge.net/sourceforge/mesa3d/MesaLib-7.0.2.tar.gz
tar -zxf MesaLib-7.0.2.tar.gz
</pre>
Catamount build
Use the PGI compiler. You might want to copy the source directory before you start, to preserve a clean original copy.
<pre>
cd ~/projects/mesa
cp -r Mesa-7.0.2 mesa-catamount

## edit mesa-catamount/configs/default
##
## replace line:  INSTALL_DIR = /usr/local
## with:          INSTALL_DIR = ~/install-catamount
## or:            INSTALL_DIR = <catamount-install-dir>

cd mesa-catamount
make catamount-osmesa-pgi
make install
</pre>
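The hand edit of configs/default can also be scripted. A minimal sketch (the helper name set_install_dir is hypothetical, not part of Mesa):

```python
import re

def set_install_dir(config_text, new_dir):
    """Replace the INSTALL_DIR line of a Mesa configs/default file.

    config_text is the file contents as a string; returns the edited string.
    """
    return re.sub(r"(?m)^INSTALL_DIR\s*=.*$",
                  "INSTALL_DIR = " + new_dir,
                  config_text)
```

Read configs/default, pass its contents through set_install_dir, and write it back before running make.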
Compiling Python
Getting the source
These instructions use Python from the Subversion repository. It is possible to use the Python 2.5.1 release and apply a patch; more details are on the BuildingPythonWithCMake page (http://www.cmake.org/Wiki/BuildingPythonWithCMake). These instructions have been tested with Python revision 61085 (future commits could break the CMake files).
The following commands grab the Python source from Subversion and place it into a directory named python-with-cmake. Next, CVS is used to download the CMake build files directly into the python-with-cmake directory.
<pre>
cd ~/projects/python
svn co -r 61085 http://svn.python.org/projects/python/trunk python-with-cmake
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co -d python-with-cmake ParaView3/Utilities/CMakeBuildForPython
</pre>
Native build
Use the GNU compiler. Switch from PGI if you need to:
<pre>
module switch PrgEnv-pgi PrgEnv-gnu
</pre>
When configuring with CMake:
- set the variable MODULE_posix_SHARED to OFF.
General build command:
<pre>
cd <python-build-native-dir>
<native-install-dir>/bin/ccmake <python-source-dir> -DCMAKE_INSTALL_PREFIX=<native-install-dir>
## configure with ccmake
make
make install
</pre>
Example build command:
<pre>
cd ~/projects/python/build-native
~/install/bin/ccmake ~/projects/python/python-with-cmake -DCMAKE_INSTALL_PREFIX=~/install
## configure with ccmake
make
make install
</pre>
Catamount build
Use the GNU compiler. Switch from PGI if you need to:
<pre>
module switch PrgEnv-pgi PrgEnv-gnu
</pre>
When configuring with CMake:
- Confirm all MODULE__*_SHARED options are off
- Turn off MODULE__pwd_ENABLE
- Turn off ENABLE_IPV6
- Turn off WITH_THREAD
General build command:
<pre>
cd <python-build-catamount-dir>
<native-install-dir>/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-gcc.cmake -DCMAKE_INSTALL_PREFIX=<catamount-install-dir> -C <python-source-dir>/CMake/TryRunResults-Python-catamount-gcc.cmake <python-source-dir>
## configure with ccmake
make
make install
</pre>
Example build command:
<pre>
cd ~/projects/python/build-catamount
~/install/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-gcc.cmake -DCMAKE_INSTALL_PREFIX=~/install-catamount -C ~/projects/python/python-with-cmake/CMake/TryRunResults-Python-catamount-gcc.cmake ~/projects/python/python-with-cmake/
## configure with ccmake
make
make install
</pre>
Compiling ParaView3
Getting the source
<pre>
cd ~/projects/paraview
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co ParaView3
</pre>
Native build
You only need to compile the pvHostTools target. Use the PGI compiler. Switch from GNU if you need to:
<pre>
module switch PrgEnv-gnu PrgEnv-pgi
</pre>
When configuring ccmake:
- turn on BUILD_SHARED_LIBS
General build command:
<pre>
cd <paraview-native-build-dir>
<native-install-dir>/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 <paraview-source-dir>
## configure with ccmake
make pvHostTools
</pre>
Example build command:
<pre>
cd ~/projects/paraview/build-native
~/install/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 ~/projects/paraview/ParaView3
## configure with ccmake
make pvHostTools
</pre>
Catamount build
Use the PGI compiler. Switch from GNU if you need to:
<pre>
module switch PrgEnv-gnu PrgEnv-pgi
</pre>
When configuring with CMake:
- turn on PARAVIEW_ENABLE_PYTHON
- turn on PARAVIEW_USE_MPI
- turn OFF VTK_USE_METAIO
- confirm VTK_OPENGL_HAS_OSMESA: ON
- confirm VTK_NO_PYTHON_THREADS: ON
- confirm BUILD_SHARED_LIBS: OFF
- confirm OSMESA_LIBRARY is the one you cross compiled and installed locally. For example, <catamount-install-dir>/lib/libOSMesa.a
- confirm OSMESA_INCLUDE_DIR is set. For example, <catamount-install-dir>/include
- if OPENGL_INCLUDE_DIR is not found, set it to the same path as OSMESA_INCLUDE_DIR
- confirm PYTHON_LIBRARY is the one you cross compiled and installed locally. For example, <catamount-install-dir>/lib/libpython2.6.a
- confirm PYTHON_INCLUDE_PATH is set. For example, <catamount-install-dir>/include/python2.6
- set PYTHON_EXECUTABLE to the native python binary, NOT the cross compiled python binary. For example, <native-install-dir>/bin/python
Before building, test your Python interpreter by following the notes in the Python paths section below.
When cross compiling, don't be surprised to see warning messages such as: "warning: ... is not implemented and will always fail"
General build command:
<pre>
cd <paraview-catamount-build-dir>
<native-install-dir>/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-pgi.cmake -DParaView3CompileTools_DIR=<paraview-native-build-dir> -DPARAVIEW_BUILD_QT_GUI=0 -C <paraview-source-dir>/CMake/TryRunResults-ParaView3-catamount-pgi.cmake <paraview-source-dir>
## configure with ccmake
make
</pre>
Example build command:
<pre>
cd ~/projects/paraview/build-catamount
~/install/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-pgi.cmake -DParaView3CompileTools_DIR=~/projects/paraview/build-native -DPARAVIEW_BUILD_QT_GUI=0 -C ~/projects/paraview/ParaView3/CMake/TryRunResults-ParaView3-catamount-pgi.cmake ~/projects/paraview/ParaView3
## configure with ccmake
make
</pre>
Testing
pvbatch
When ParaView has compiled, we'll want to test pvbatch.
Here is a simple Python script, coloredSphere.py:
<pre>
from paraview.servermanager import *

Connect()

sphere = sources.SphereSource()
sphere.ThetaResolution = 100
sphere.PhiResolution = 100

filter = filters.ProcessIdScalars()
filter.Input = sphere

view = CreateRenderView()
display = CreateRepresentation(filter, view)

lt = rendering.PVLookupTable()
display.LookupTable = lt
display.ColorAttributeType = 0  # Point Data
display.ColorArrayName = "ProcessId"
lt.RGBPoints = [0.0, 0, 0, 1, 1, 1, 0, 0]
lt.ColorSpace = 1  # HSV

view.StillRender()
view.ResetCamera()
view.StillRender()
view.WriteImage("/path/to/home/coloredSphere.png", "vtkPNGWriter")
</pre>
Note that the script above contains an absolute path for its output file, coloredSphere.png. Make sure to use a correct path.
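To avoid hard-coding the output path, it can be built at run time from the home directory; a sketch using only the standard library (the helper name output_path is hypothetical, the file name is the one from the script above):

```python
import os

def output_path(filename="coloredSphere.png"):
    """Build an absolute path for the output image under the home directory."""
    return os.path.join(os.path.expanduser("~"), filename)
```

In the script, the last line would then read view.WriteImage(output_path(), "vtkPNGWriter").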
The script could be run with pvbatch like this:
<pre>
mpirun -np 2 /path/to/pvbatch coloredSphere.py
</pre>
But we want mpirun and pvbatch to execute on the supercomputer, so we write a job script coloredSphere.job:
<pre>
#!/bin/sh
#PBS -l size=2
#PBS -l walltime=30
#PBS -j oe
#PBS -q debug

set echo
pbsyod -size $PBS_O_SIZE ${HOME}/projects/paraview/build-catamount/bin/pvbatch ${HOME}/coloredSphere.py
</pre>
Make sure to use correct path names in the above script. The script is submitted by typing:
<pre>
qsub coloredSphere.job
</pre>
You can check the status of submitted jobs by typing:
<pre>
qstat -a
</pre>
More information about running jobs at Bigben can be found here.
Python paths
The Python interpreter is executed during the ParaView build process. You probably configured the Python interpreter to use shared libraries, in which case you need to set the LD_LIBRARY_PATH environment variable. Depending on your shell, use one of these commands.

For csh/tcsh:
<pre>
setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:<native-install-dir>/lib/
</pre>

For sh/bash:
<pre>
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:<native-install-dir>/lib/
</pre>
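The same path composition can be done from Python when launching the interpreter from a script; a sketch (the helper name with_native_libs is hypothetical, and the default install path is the example path from this page):

```python
import os

def with_native_libs(env, install_dir="~/install"):
    """Return a copy of env with <install-dir>/lib appended to LD_LIBRARY_PATH."""
    env = dict(env)  # do not mutate the caller's environment
    lib_dir = os.path.join(os.path.expanduser(install_dir), "lib")
    current = env.get("LD_LIBRARY_PATH", "")
    env["LD_LIBRARY_PATH"] = (current + ":" + lib_dir) if current else lib_dir
    return env
```

Pass the result as the env argument when spawning the interpreter, e.g. subprocess.call([python, script], env=with_native_libs(os.environ)).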
Test your Python interpreter.
General command:
<pre>
<native-install-dir>/bin/python
</pre>
Example command:
<pre>
~/install/bin/python
</pre>
At the python prompt try to import the compileall module (used by ParaView):
<pre>
>>> import compileall
</pre>
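A quick way to confirm that compileall actually works in your interpreter is to byte-compile a throwaway directory; a small sketch (the helper name can_byte_compile is hypothetical):

```python
import compileall
import os
import tempfile

def can_byte_compile():
    """Return True if compileall can byte-compile a trivial module."""
    workdir = tempfile.mkdtemp()
    with open(os.path.join(workdir, "hello.py"), "w") as f:
        f.write("x = 1\n")
    # compile_dir returns a true value when every file compiled cleanly
    return bool(compileall.compile_dir(workdir, quiet=1))
```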
Finally, be sure that the CMake variable PYTHON_EXECUTABLE is set to the natively built Python interpreter, not the cross-compiled one.
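That check can also be scripted; a sketch (the helper name interpreter_ok is hypothetical, and sys.executable stands in here for <native-install-dir>/bin/python; substitute the real path):

```python
import subprocess
import sys

def interpreter_ok(python_path):
    """Return True if python_path can run and import the compileall module."""
    rc = subprocess.call([python_path, "-c", "import compileall"])
    return rc == 0
```

For example, interpreter_ok(os.path.expanduser("~/install/bin/python")) should print nothing and return True before you start the ParaView build.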