Compiling ParaView3 for Cray supercomputers

From KitwarePublic
Latest revision as of 13:49, 21 July 2008

Objectives and Overview

Our goal is to run pvbatch on Cray massively parallel processor systems. Pvbatch is ParaView's MPI-enabled batch application. It reads batch scripts written in Python and distributes the work across many processors. Pvbatch is built when you compile ParaView 3. Before you can compile ParaView 3 you must compile CMake, OSMesa, and Python. The entire process takes about three hours, and you should have at least one gigabyte of free disk space available as workspace.

These instructions are intended for Cray MPP systems running the Catamount operating system. Specifically, these instructions have been tested on the Bigben XT3 supercomputer at Pittsburgh Supercomputing Center.

Terminology

These terms are probably self explanatory, but just to clarify...

  • front end node - the computer/shell you log into and work on.
  • native operating system - the operating system that runs on the front end node (SuSE linux for example)
  • native build - software that is built for and executes on front end nodes
  • compute node - the computers/processors running parallel computations
  • catamount - the operating system that runs on the compute nodes
  • catamount build - software that has been cross compiled to execute on the compute nodes

Build steps

You will log into a shell on a front end node. You will download source code and then compile CMake, OSMesa, Python, and ParaView3. Some of these packages must be compiled twice- one native version and one cross compiled version. The steps are:

  1. Compile a CMake native build.
  2. Compile an OSMesa catamount build.
  3. Compile a Python native build.
  4. Compile a Python catamount build.
  5. Compile ParaView native build tools.
  6. Compile a ParaView catamount build.

Why are the native builds required?

During the ParaView build process, helper binaries are compiled and executed to generate source files for later build targets. When you cross compile ParaView, the helper binaries cannot execute, since they are not native to the front end node you are working on. The solution is to build a native version of ParaView first, and then tell CMake to use the native helper binaries while cross compiling.

Additional information

The instructions on this wiki page detail the steps required to build the software but do not provide additional background information. Some concepts used but not explained on this wiki page are TryRunResults and Toolchain files. You may find these pages very helpful:

  • CMake/CrayXT3: http://www.cmake.org/Wiki/CmakeCrayXt3

Compilers

The front end nodes have more than one compiler installed. We will use both the PGI (http://www.pgroup.com/) and GNU compilers. At Bigben, the PGI compiler is the default compiler when you log in. You can switch to gcc like this:

## switch from PGI to GNU compiler
module switch PrgEnv-pgi PrgEnv-gnu

## switch from GNU to PGI compiler
module switch PrgEnv-gnu PrgEnv-pgi
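
Since the steps below alternate between the PGI and GNU environments, it helps to confirm which one is loaded before configuring a build. A quick check, assuming the usual Cray modules setup:

```shell
## list loaded modules and show which PrgEnv is active
## (module output often goes to stderr, hence 2>&1)
module list 2>&1 | grep PrgEnv

## example: switch to GNU only if PGI is currently loaded
if module list 2>&1 | grep -q PrgEnv-pgi; then
    module switch PrgEnv-pgi PrgEnv-gnu
fi
```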

Toolchains

When you cross compile with CMake you will input a toolchain file. The instructions on this wiki page use individual toolchain files for each compiler, but in reality the toolchain files are identical. The instructions assume path names like this:

~/
  toolchains/
    Toolchain-Catamount-gcc.cmake
    Toolchain-Catamount-pgi.cmake

The contents of the toolchain files can be found on the CMake/CrayXT3 page (http://www.cmake.org/Wiki/CmakeCrayXt3).

Directory structure

Set up your directories however you'd like. Path names on this wiki page are usually given in two forms, a general form and an example form, where ~/ is your home directory:

General form                      Example form

<install-dir>                     ~/install
<catamount-install-dir>           ~/install-catamount
<toolchain-dir>                   ~/toolchains

<paraview-source-dir>             ~/projects/paraview/ParaView3
<paraview-native-build-dir>       ~/projects/paraview/build-native

...                               ...

Here is how my directory tree looks:

~/
  install/
    bin/
    include/
    lib/

  install-catamount/
    bin/
    include/
    lib/

  toolchains/

  projects/
    cmake/
      cmake-2.6.0/
      build/

    mesa/
      mesa-native/
      mesa-catamount/

    python/
      python-with-cmake/
      build-native/
      build-catamount/

    paraview/
      ParaView3/
      build-native/
      build-catamount/

Note, some of these directories will be created automatically when you extract archives or check out code from cvs/svn. The install directories and subdirectories are created automatically when you run "make install" commands. Here is a command you could use to set up the directory tree:

cd ~/
mkdir -p toolchains \
  projects/cmake/build \
  projects/mesa \
  projects/python/python-with-cmake \
  projects/python/build-native \
  projects/python/build-catamount \
  projects/paraview/build-native \
  projects/paraview/build-catamount

Compiling CMake

You will need CMake version 2.6 or greater. If CMake is already installed on the front end, you will not need to compile it. CMake home page: http://www.cmake.org/

Getting the source

You can download the latest release of CMake source code from the CMake homepage.

cd ~/projects/cmake
wget http://www.cmake.org/files/v2.6/cmake-2.6.0.tar.gz
tar -zxvf cmake-2.6.0.tar.gz

Native build

It shouldn't matter which compiler you use to build CMake. I used the default PGI compiler.

General build command:

cd <cmake-build-dir>
<cmake-src-dir>/bootstrap --prefix=<native-install-dir>
make
make install

Example build command:

cd ~/projects/cmake/build
~/projects/cmake/cmake-2.6.0/bootstrap --prefix=~/install
make
make install
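
Before using the new CMake for the builds that follow, you can sanity check the install (the path assumes the --prefix=~/install example above):

```shell
## confirm the installed binary runs and reports version 2.6.x
~/install/bin/cmake --version
```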

Compiling OSMesa

You will download the Mesa source code and compile the OSMesa target. OSMesa (off screen mesa) allows rendering with the OpenGL API directly into main memory instead of using system display memory.

Mesa home page: http://www.mesa3d.org/

Getting the source

You can download the Mesa source directly using wget. Alternatively, here is the Mesa download page: http://sourceforge.net/project/showfiles.php?group_id=3

cd ~/projects/mesa
wget http://easynews.dl.sourceforge.net/sourceforge/mesa3d/MesaLib-7.0.2.tar.gz
tar -zxf MesaLib-7.0.2.tar.gz

Catamount build

Use the PGI compiler. You might want to copy the source dir before you start (in order to preserve a clean original copy).

cd ~/projects/mesa
cp -r Mesa-7.0.2 mesa-catamount

## edit mesa-catamount/configs/default
##
## replace line:   INSTALL_DIR = /usr/local
## with:           INSTALL_DIR = ~/install-catamount
## or:             INSTALL_DIR = <catamount-install-dir>

cd mesa-catamount
make catamount-osmesa-pgi
make install
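
After make install completes, you can confirm that the cross compiled static library actually landed in the catamount install tree (paths assume INSTALL_DIR = ~/install-catamount as above):

```shell
## the catamount OSMesa build installs a static library
ls -l ~/install-catamount/lib/libOSMesa.a

## optionally list a few of the object files inside the archive
ar t ~/install-catamount/lib/libOSMesa.a | head
```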

Compiling Python

Getting the source

These instructions use Python from the subversion repository. It is possible to use Python release 2.5.1 and apply a patch instead; more details are here: http://www.cmake.org/Wiki/BuildingPythonWithCMake. These instructions have been tested with python revision 61085 (future commits could break the CMake files).

The following commands grab Python source from subversion and place it into a directory named python-with-cmake. Next cvs is used to download CMake files directly into the python-with-cmake directory.

cd ~/projects/python
svn co http://svn.python.org/projects/python/trunk python-with-cmake -r 61085

cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co -d python-with-cmake ParaView3/Utilities/CMakeBuildForPython

Native build

Use the GNU compiler. Switch from PGI if you need to:

module switch PrgEnv-pgi PrgEnv-gnu

When configuring with CMake:

  • set the variable MODULE_posix_SHARED to OFF.

General build command:

cd <python-build-native-dir>
<native-install-dir>/bin/ccmake <python-source-dir> -DCMAKE_INSTALL_PREFIX=<native-install-dir>

## configure with ccmake

make
make install

Example build command:

cd ~/projects/python/build-native
~/install/bin/ccmake ~/projects/python/python-with-cmake -DCMAKE_INSTALL_PREFIX=~/install

## configure with ccmake

make
make install
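
A quick smoke test of the native interpreter after the install finishes (the path assumes -DCMAKE_INSTALL_PREFIX=~/install as above; the print statement is Python 2 syntax, matching the subversion trunk used here):

```shell
## run the freshly installed interpreter and print its version
~/install/bin/python -c "import sys; print sys.version"
```

If this fails to load libpython, see the Python paths section below.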

Catamount build

Use the GNU compiler. Switch from PGI if you need to:

module switch PrgEnv-pgi PrgEnv-gnu

When configuring with CMake:

  • Confirm all MODULE__*_SHARED options are off
  • Turn off MODULE__pwd_ENABLE
  • Turn off ENABLE_IPV6
  • Turn off WITH_THREAD

General build command:

cd <python-build-catamount-dir>
<native-install-dir>/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-gcc.cmake -DCMAKE_INSTALL_PREFIX=<catamount-install-dir> -C <python-source-dir>/CMake/TryRunResults-Python-catamount-gcc.cmake <python-source-dir>

## configure with ccmake

make
make install

Example build command:

cd ~/projects/python/build-catamount

~/install/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-gcc.cmake -DCMAKE_INSTALL_PREFIX=~/install-catamount -C ~/projects/python/python-with-cmake/CMake/TryRunResults-Python-catamount-gcc.cmake ~/projects/python/python-with-cmake/

## configure with ccmake

make
make install
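
You can confirm that the cross compiled artifacts were installed where the ParaView catamount configuration expects them (paths assume -DCMAKE_INSTALL_PREFIX=~/install-catamount as above; the exact python version directory may differ):

```shell
## static library consumed by the ParaView catamount build
ls ~/install-catamount/lib/libpython*.a

## matching headers
ls ~/install-catamount/include/python*/Python.h
```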

Compiling ParaView3

Getting the source

cd ~/projects/paraview
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 login
## respond with empty password
cvs -d :pserver:anoncvs@www.paraview.org:/cvsroot/ParaView3 co ParaView3

Native build

You only need to compile the pvHostTools target. Use the PGI compiler. Switch from GNU if you need to:

module switch PrgEnv-gnu PrgEnv-pgi

When configuring ccmake:

  • turn on BUILD_SHARED_LIBS


General build command:

cd <paraview-native-build-dir>
<native-install-dir>/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 <paraview-source-dir>

## configure with ccmake

make pvHostTools

Example build command:

cd ~/projects/paraview/build-native

~/install/bin/ccmake -DPARAVIEW_BUILD_QT_GUI=0 ~/projects/paraview/ParaView3

## configure with ccmake

make pvHostTools
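
The catamount build later finds these native helper binaries through the ParaView3CompileTools_DIR variable, which points at this native build tree. As a sanity check you can look for the exported CompileTools CMake files (the file names here are illustrative and may vary between ParaView versions):

```shell
## look for the CompileTools package files in the native build tree
find ~/projects/paraview/build-native -name "*CompileTools*"
```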

Catamount build

Use the PGI compiler. Switch from GNU if you need to:

module switch PrgEnv-gnu PrgEnv-pgi

When configuring with CMake:

  • turn on PARAVIEW_ENABLE_PYTHON
  • turn on PARAVIEW_USE_MPI
  • turn off VTK_USE_METAIO
  • confirm VTK_OPENGL_HAS_OSMESA: ON
  • confirm VTK_NO_PYTHON_THREADS: ON
  • confirm BUILD_SHARED_LIBS: OFF


  • confirm OSMESA_LIBRARY is the one you cross compiled and installed locally. For example, <catamount-install-dir>/lib/libOSMesa.a
  • confirm OSMESA_INCLUDE_DIR is set. For example, <catamount-install-dir>/include
  • if OPENGL_INCLUDE_DIR is not found, set it to the same path as OSMESA_INCLUDE_DIR


  • confirm PYTHON_LIBRARY is the one you cross compiled and installed locally. For example, <catamount-install-dir>/lib/libpython2.6.a
  • confirm PYTHON_INCLUDE_PATH is set. For example, <catamount-install-dir>/include/python2.6
  • set PYTHON_EXECUTABLE to the native python binary, NOT the cross compiled python binary. For example, <native-install-dir>/bin/python



Before building, test your python interpreter by following the python paths notes.

When cross compiling, don't be surprised to see warning messages such as: "warning: ... is not implemented and will always fail"

General build command:

cd <paraview-catamount-build-dir>
<native-install-dir>/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=<toolchain-dir>/Toolchain-Catamount-pgi.cmake -DParaView3CompileTools_DIR=<paraview-native-build-dir> -DPARAVIEW_BUILD_QT_GUI=0 -C <paraview-source-dir>/CMake/TryRunResults-ParaView3-catamount-pgi.cmake <paraview-source-dir>

## configure with ccmake

make

Example build command:

cd ~/projects/paraview/build-catamount
~/install/bin/ccmake -DCMAKE_TOOLCHAIN_FILE=~/toolchains/Toolchain-Catamount-pgi.cmake -DParaView3CompileTools_DIR=~/projects/paraview/build-native -DPARAVIEW_BUILD_QT_GUI=0 -C ~/projects/paraview/ParaView3/CMake/TryRunResults-ParaView3-catamount-pgi.cmake ~/projects/paraview/ParaView3

## configure with ccmake

make

Testing

pvbatch

When ParaView has compiled, we'll want to test pvbatch.


Here is a simple python script, coloredSphere.py:

from paraview.servermanager import *

Connect()

sphere = sources.SphereSource()
sphere.ThetaResolution = 100
sphere.PhiResolution = 100
filter = filters.ProcessIdScalars()
filter.Input = sphere
view = CreateRenderView()

display = CreateRepresentation(filter, view)
lt = rendering.PVLookupTable()
display.LookupTable = lt
display.ColorAttributeType = 0  # Point Data
display.ColorArrayName = "ProcessId"
lt.RGBPoints = [0.0, 0, 0, 1, 1, 1, 0, 0]
lt.ColorSpace = 1  # HSV
view.StillRender()
view.ResetCamera()
view.StillRender()

view.WriteImage("/path/to/home/coloredSphere.png", "vtkPNGWriter")

Note that the script above contains an absolute path for its output file, coloredSphere.png; be sure to substitute a path that is valid on your system.


The script could be run with pvbatch like this:

mpirun -np 2 /path/to/pvbatch coloredSphere.py

But we want mpirun and pvbatch to execute on the supercomputer, so we write a job script coloredSphere.job:

#!/bin/sh
#PBS -l size=2
#PBS -l walltime=30
#PBS -j oe
#PBS -q debug

set echo

pbsyod -size $PBS_O_SIZE ${HOME}/projects/paraview/build-catamount/bin/pvbatch ${HOME}/coloredSphere.py

Make sure to use correct path names in the above script. The script is submitted by typing:

qsub coloredSphere.job

You can check the status of submitted jobs by typing:

qstat -a

More information about running jobs at Bigben can be found here.

Python paths

The python interpreter is executed during the ParaView build process. You probably configured the python interpreter to use shared libraries, in which case you need to set the LD_LIBRARY_PATH environment variable. Depending on your shell, use one of the following commands (setenv for csh/tcsh, export for sh/bash):

setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:<native-install-dir>/lib/
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:<native-install-dir>/lib/
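
Note that if LD_LIBRARY_PATH starts out unset, the commands above leave a stray leading colon in the path (and the csh form can fail with an undefined-variable error). A small sh sketch that handles both cases, assuming <native-install-dir> is ~/install:

```shell
## Append the native install lib directory to LD_LIBRARY_PATH,
## handling the case where the variable starts out unset.
## NATIVE_LIB is an example; substitute your <native-install-dir>/lib.
NATIVE_LIB="$HOME/install/lib"
if [ -z "$LD_LIBRARY_PATH" ]; then
    LD_LIBRARY_PATH="$NATIVE_LIB"
else
    LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$NATIVE_LIB"
fi
export LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"
```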

Test your python interpreter...

General command:

<native-install-dir>/bin/python

Example command:

~/install/bin/python


At the python prompt try to import the compileall module (used by ParaView):

>>> import compileall
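
The same check can be scripted non-interactively, which is convenient to run right before configuring the catamount build. The PY default below is only a stand-in; point it at your native interpreter instead:

```shell
## import check without starting an interactive session;
## override PY with your native interpreter, e.g. PY=~/install/bin/python
PY="${PY:-python3}"
"$PY" -c "import compileall" && echo "compileall OK"
```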

Finally, be sure that the CMake variable PYTHON_EXECUTABLE is set to the natively built python interpreter, not the cross compiled one.