Software and Libraries

From oldwiki.scinet.utoronto.ca
Revision as of 19:44, 3 January 2012 by Ljdursi (talk | contribs)

Software Module System

All the software listed on this page is accessed through a module system. This means that much of the software is not accessible by default but has to be loaded using the module command. This is because

  • it allows us to easily keep multiple versions of software for different users on the system;
  • it allows users to easily switch between versions.

The module system works similarly on the GPC and the TCS, although different modules are installed on these two systems.

Note that, generally, if you compile a program with a module loaded, you will have to run it with that same module loaded, to make dynamically linked libraries accessible.
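For example, if a program was compiled with the intel and openmpi modules loaded, a job script should load those same modules before running it; the sketch below is an illustration only (the program name and process count are hypothetical):

```shell
#!/bin/bash
# Load the same modules the program was compiled with, so that
# its dynamically linked libraries are found at run time.
module load intel
module load openmpi
mpirun -np 16 ./myprogram
```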

List available software packages:
  $ module avail
  • If a module is not listed here, it is not supported.
  • The flag "(default)" is never part of the name.

Use a particular software package:
  $ module load [module-name]
  • If possible, specify only the short name (the part before the "/").
  • When the short name is ambiguous, this loads the default version.

List available versions of a specific software package:
  $ module avail [short-module-name]

List currently loaded modules:
  $ module list
  • For reproducibility, it is a good idea to put this command in your job scripts, so that you know exactly which modules (and versions) were used.

Get a description of a particular module:
  $ module help [module-name]

Remove a module from your shell:
  $ module unload [module-name]

Remove all modules:
  $ module purge

Replace one loaded module with another:
  $ module switch [old-module-name] [new-module-name]
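A typical interactive session using these commands might look as follows (the gcc and intel modules are used purely as an illustration):

```shell
# Which versions of gcc are installed? The default is marked "(default)".
module avail gcc

# Load the default version and verify what is loaded
module load gcc
module list

# Replace gcc with the intel compiler module
module switch gcc intel
```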

Modules that load libraries define environment variables pointing to the location of the library files and include files, for use in Makefiles. These environment variables follow the naming convention

SCINET_[short-module-name]_BASE
SCINET_[short-module-name]_LIB
SCINET_[short-module-name]_INC

for the base location of the module's files, the location of the library binaries, and the location of the header files, respectively.

So to compile and link against the library, you will have to add -I${SCINET_[short-module-name]_INC} to the compile line and -L${SCINET_[short-module-name]_LIB} to the link line, in addition to the usual -l[libname].
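For example, with the gsl module loaded, compiling and linking a (hypothetical) program myprog.c against the GNU Scientific Library would look like this:

```shell
# Compile: -I points the compiler at the module's header files
gcc -c -I${SCINET_GSL_INC} myprog.c

# Link: -L points the linker at the module's libraries, -l names them
gcc -o myprog myprog.o -L${SCINET_GSL_LIB} -lgsl -lgslcblas
```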

Errors in loaded modules can arise for a few reasons, for instance:

  • A module by that name may not exist.
  • Some modules require other modules to have been loaded; if this requirement is not met when you try to load that module, an error message will be printed explaining what module is needed.
  • Some modules cannot be loaded together: an error message will be printed explaining which modules conflict.

It is recommended to load frequently used modules in the file .bashrc in your home directory.
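For example, a user who always needs the Intel compilers and OpenMPI could add lines like the following to the .bashrc file in their home directory (the modules shown are just an illustration):

```shell
# ~/.bashrc: modules loaded automatically in every new shell
module load intel
module load openmpi
```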

Default and non-default modules

When you load a module by its 'short' name, you will get the default version, which is (usually) the most recent, recommended version of that library or piece of software. In general, using the short module name is the way to go. However, you may have code that depends on the intricacies of a non-default version. For that reason, the most common older versions are also available as modules. You can find all available modules using the module avail command.

Naming convention

For modules that access applications, the full name of a module is as follows.

  [short-module-name]/[version-number]

To have all modules conform to this convention, a number of modules' names were changed on Nov 3, 2010:

old name                  new name          remarks
autoconf/autoconf-2.64    autoconf/2.64     short name unchanged
cuda/cuda-3.0             cuda/3.0          default's short name unchanged
cuda/cuda-3.1             cuda/3.1
debuggers/ddd-3.3.12      ddd/3.3.12
debuggers/gdb-7.1         gdb/7.1
editors/nano/2.2.4        nano/2.2.4
emacs/emacs-23.1          emacs/23.1.1      short name unchanged
gcc/gcc-4.4.0             gcc/4.4.0         short name unchanged
graphics/ncview           ncview/1.93
graphics/graphics         grace/5.1.22
                          gnuplot/4.2.6
svn/svn165                svn/1.6.5         short name unchanged
visualization/paraview    paraview/3.8
amber10/amber10           amber/10.0.30
gamess/gamess             gamess/May2209    default's short name unchanged

modulefind - Finding modules by name

The module avail command will only show you modules whose names start with the argument that you give it, and it will also return modules that you cannot load due to conflicts with already loaded modules.

A little SciNet utility called modulefind (one word) addresses both limitations. It lists all installed modules whose names contain the given argument, and it determines whether each of those modules is loaded, could be loaded, cannot be loaded because of a conflict with an already loaded module, or has unresolved dependencies (i.e., other modules that need to be loaded first). This is especially useful in cases like the "boost" libraries, whose module names are cxxlibraries/boost/1.47.0-gcc and cxxlibraries/boost/1.47.0-intel, for the gcc and intel compilers, respectively. modulefind boost will find these, whereas module avail boost will not.

Note that just 'modulefind' will list all top-level modules.

Making your own modules

Making your own modules (e.g. for local installations, or to access optional perl modules) is possible; this is described on the Installing your own modules page.

Deprecated modules

Some older software modules for which newer versions exist get deprecated, which means they are no longer maintained. Since deprecated modules should only be needed in rare, exceptional cases, they are not listed by the module avail command. However, if you have a piece of legacy code that really depends on a deprecated version of a library (and we urge you to check that it does not work with newer versions!), you can load the deprecated version with

  $ module load use.deprecated [deprecated-module-name]
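For example, to load the deprecated gcc module from the list below (loading use.deprecated first makes the deprecated module tree visible):

```shell
# use.deprecated exposes the deprecated modules; then load the old module
module load use.deprecated
module load gcc/gcc-4.3.2
```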

Currently (Oct 5, 2010), the following modules are deprecated on the GPC:

gcc/gcc-4.3.2          hdf5/184-v16-serial     intel/intel-v11.1.046               openmpi/1.3.3-intel-v11.0-ofed
hdf5/183-v16-openmpi   hdf5/184-v18-intelmpi   intelmpi/impi-3.2.1.009             openmpi/1.3.2-intel-v11.0-ofed.orig
hdf5/183-v18-openmpi   hdf5/184-v18-openmpi    intelmpi/impi-3.2.2.006             pgplot/5.2.2-gcc.old            
hdf5/184-v16-intelmpi  hdf5/184-v18-serial     intelmpi/impi-4.0.0.013             pgplot/5.2.2-intel.old
hdf5/184-v16-openmpi   intel/intel-v11.0.081   intelmpi/impi-4.0.0.025               

On the TCS, currently (Oct 5, 2010) the only deprecated module is:

ncl/5.1.1old

Before using any of these deprecated modules, make sure that there is not a regular module that satisfies your needs, likely by a very similar name.

Commercial software

Apart from the compilers on our systems and the ddt parallel debugger, we generally do not provide licensed application software, e.g., no Gaussian, IDL, Matlab, etc. See the FAQ.

Other software and libraries

If you want to use a piece of software or a library that is not on the list, you can in principle install it yourself in your /home directory. Note, however, that building libraries and software from source often uses a lot of files. To avoid running out of disk space, building software is therefore best done in /scratch, from which you can then copy/install only the libraries, header files and binaries to your /home directory.

If you suspect that a particular piece of software or a library would be of use to other users of SciNet as well, contact us, and we will consider adding it to the system.

Software lists

ARC/GPU Software

The CPUs in the GPU nodes of the ARC cluster are of the same kind as those of the GPC, so all modules available on the GPC are available on the GPU nodes with a CentOS 6 image. This also means that the different cuda variants available as modules can be loaded on regular GPC nodes as well, although they are of little use on that system.

GPC Software

The majority of the GPC nodes have CentOS 6.0 as their operating system, while some still run CentOS 5.6 (so that users can run final tests before migrating). In the near future, all nodes will move to CentOS 6. The set of available modules on a node depends on the OS (although as much as possible the differences are restricted to version numbers).


GPC software on CentOS 6 nodes

Software Version Comments Command/Library Module Name
Compilers
Intel Compiler 12.1 includes MKL library icpc,icc,ifort intel
GCC Compiler 4.6.1 gcc,g++,gfortran gcc
IntelMPI 4.0.2 MPICH2 based MPI mpicc,mpiCC,mpif77,mpif90 intelmpi
OpenMPI 1.4.4*, 1.5.4 mpicc,mpiCC,mpif77,mpif90 openmpi
UPC 2.12.2 Berkeley Unified Parallel C Implementation upcc upc
Editors
Nano 2.2.4 Nano's another editor nano nano
Emacs 23.1.1 New version of popular text editor emacs emacs
XEmacs 21.4.22 XEmacs editor xemacs xemacs
Development tools
CMake 2.8.6 cross-platform, open-source build system cmake cmake
Scons 2.0 Software construction tool scons scons
Git 1.7.1 Revision control system git,gitk git
Intel tools 2011 Intel Code Analysis Tools Vtune Amplifier XE, Inspector XE inteltools
Mercurial 1.8.2 Version control system (part of the python module!) hg python
Debug and performance tools
DDT 3.1 Allinea's Distributed Debugging Tool ddt ddt
DDD 3.3.12 Data Display Debugger ddd ddd
GDB 7.3.1 GNU debugger (the intel idbc debugger is available by default) gdb gdb
MPE2 2.4.5 Multi-Processing Environment with intel + OpenMPI mpecc, mpefc, jumpshot mpe
OpenSpeedShop 2.0.1 sampling and MPI tracing openss, ... openspeedshop
Scalasca 1.2 SCalable performance Analysis of LArge SCale Applications (Compiled with OpenMPI) scalasca scalasca
IPM 0.983 Integrated Performance Monitors (http://ipm-hpc.sourceforge.net/) ipm, ipm_parse, ploticus,... ipm
Valgrind 3.6.1 Memory checking utility valgrind,cachegrind valgrind
Padb 3.2 examine and debug parallel programs padb padb
Visualization tools
Grace 5.1.22 Plotting utility xmgrace grace
Gnuplot 4.2.6 Plotting utility gnuplot gnuplot
ParaView 3.12.0 Scientific visualization, server only pvserver,pvbatch,pvpython paraview
VMD 1.9 Visualization and analysis utility vmd vmd
NCL/NCARG 6.0.0 NCARG graphics and ncl utilities ncl ncl
ROOT 5.30.00 ROOT Analysis Framework from CERN root ROOT
ImageMagick 6.6.7 Image manipulation tools convert,animate,composite,... ImageMagick
PGPLOT 5.2.2 Graphics subroutine library libcpgplot,libpgplot,libtkpgplot pgplot
Storage tools and libraries
NetCDF 4.1.3 Scientific data storage and retrieval ncdump,ncgen,libnetcdf netcdf
Ncview 2.1.1 Visualization for NetCDF files ncview ncview
NCO 4.0.8 NCO utilities to manipulate netCDF files ncap, ncap2, ncatted, etc. nco
CDO 1.5.1 Climate Data Operators cdo cdo
UDUNITS 2.1.11 unit conversion utilities libudunits2 udunits
HDF4 4.2.6 Scientific data storage and retrieval h4fc,hdiff,...,libdf,libsz hdf4
HDF5 1.8.7-v18* Scientific data storage and retrieval, parallel I/O h5ls, h5diff, ..., libhdf5 hdf5
EncFS 1.74 EncFS provides an encrypted filesystem in user-space, (works ONLY on gpc01..04) encfs encfs
Applications
AMBER 10 Amber 10 + Amber Tools 1.3 Amber Molecular Dynamics Package sander, sander.MPI amber
antlr 2.7.7 ANother Tool for Language Recognition antlr, antlr-config, libantlr, antlr.jar, antlr.py antlr
GAMESS (US) August 18, 2011 R1 General Atomic and Molecular Electronic Structure System rungms gamess
GROMACS 4.5.5 GROMACS molecular mechanics, single precision, MPI grompp, mdrun gromacs
NAMD 2.8 NAMD - Scalable Molecular Dynamics namdmpiexec, namd2 namd
NWChem 6.0 NWChem Quantum Chemistry nwchem nwchem
BLAST 2.2.23+ Basic Local Alignment Search Tool blastn,blastp,blastx,psiblast,tblastn... blast
CPMD 3.13.2 Car-Parrinello molecular dynamics, MPI cpmd.x cpmd
R 2.13.1 statistical computing R R
Octave 3.4.3 Matlab-like environment octave octave
Libraries
PETSc 3.1* Portable, Extensible Toolkit for Scientific Computation (PETSc) libpetsc, etc.. petsc
BOOST 1.47.0* C++ Boost libraries libboost... cxxlibraries/boost
GotoBLAS 1.13 Optimized BLAS implementation libgoto2 gotoblas
GSL 1.13 GNU Scientific Library libgsl, libgslcblas gsl
FFTW 3.3 fast Fourier transform library (be careful combining fftw3 and MKL: you need to link fftw3 first, with -L${SCINET_FFTW_LIB} -lfftw3, then link MKL) libfftw3 fftw
LAPACK Provided by the Intel MKL library See http://software.intel.com/en-us/articles/intel-mkl-link-line-advisor/ intel
RLog 1.4 RLog provides a flexible message logging facility for C++ programs and libraries. librlog cxxlibraries/rlog
Programming/scripting languages
Cuda 4.0* NVIDIA's extension to C for GPGPU programming nvcc cuda
GNU Parallel 2010 execute commands in parallel parallel gnu-parallel
Python 2.7.2 Python programming language python python
Ruby 1.9.1 Ruby programming language ruby ruby
Java 1.6.0 IBM's Java JRE and SDK java, javac java
Extras A collection of standard linux and home-grown tools bc, screen, xxdiff, modulefind, ish, ... extras

* Several versions of this module are installed; listed is the default version.

GPC software on CentOS 5 nodes

Software Version Comments Command/Library Module Name
Compilers
Intel Compiler 11.1,update 6* includes MKL library icpc,icc,ifort intel
GCC Compiler 4.4.0* gcc,g++,gfortran gcc
IntelMPI 4.0.0 mpicc,mpiCC,mpif77,mpif90 intelmpi
OpenMPI 1.4.1* mpicc,mpiCC,mpif77,mpif90 openmpi
Editors
Nano 2.2.4 Nano's another editor nano nano
Emacs 23.1 New version of popular text editor emacs emacs
XEmacs 21.4.22 XEmacs editor xemacs xemacs
Development tools
Autoconf 2.64 system to automatically configure software source code package autoconf, ... autoconf
CMake 2.8.0 cross-platform, open-source build system cmake cmake
Git 1.6.3 Revision control system git,gitk git
Mercurial 1.3.1 Version control system (part of the python module!) hg python
Scons 1.3.0 Software construction tool scons scons
Subversion 1.6.5 Version control system svn svn
Debug and performance tools
DDT 3.0 Allinea's Distributed Debugging Tool ddt ddt
DDD 3.3.12 Data Display Debugger ddd ddd
GDB 7.1 GNU debugger (the intel idbc debugger is available by default) gdb gdb
MPE2 2.4.5 Multi-Processing Environment with intel + OpenMPI mpecc, mpefc, jumpshot mpe
OpenSpeedShop 1.9.3.4* sampling and MPI tracing openss, ... openspeedshop
Scalasca 1.2 SCalable performance Analysis of LArge SCale Applications (Compiled with OpenMPI) scalasca scalasca
Valgrind 3.5.0* Memory checking utility valgrind,cachegrind valgrind
Padb 3.2 examine and debug parallel programs padb padb
Visualization tools
Grace 5.1.22 Plotting utility xmgrace grace
Gnuplot 4.2.6 Plotting utility gnuplot gnuplot
VMD 1.8.6 Visualization and analysis utility vmd vmd
Ferret 6.4 Plotting utility ferret ferret
NCL/NCARG 5.1.1 NCARG graphics and ncl utilities ncl ncl
ROOT 5.26.00 ROOT Analysis Framework from CERN root ROOT
ParaView 3.8.0 Scientific visualization, server only pvserver,pvbatch,pvpython paraview
PGPLOT 5.2.2* Graphics subroutine library libcpgplot,libpgplot,libtkpgplot pgplot
ImageMagick 6.6.7 Image manipulation tools convert,animate,composite,... ImageMagick
Storage tools and libraries
NetCDF 4.0.1* Scientific data storage and retrieval ncdump,ncgen,libnetcdf netcdf
Parallel netCDF 1.1.1 Scientific data storage and retrieval using MPI-IO libpnetcdf.a parallel-netcdf
Ncview 1.93g Visualization for NetCDF files ncview ncview
NCO 4.0.8 NCO utilities to manipulate netCDF files ncap, ncap2, ncatted, etc. nco
CDO 1.5.1 Climate Data Operators cdo cdo
UDUNITS 2.1.11 unit conversion utilities libudunits2 udunits
HDF4 4.2r4* Scientific data storage and retrieval h4fc,hdiff,...,libdf,libsz hdf4
HDF5 1.8.4-v18* Scientific data storage and retrieval, parallel I/O h5ls, h5diff, ..., libhdf5 hdf5
EncFS 1.74 EncFS provides an encrypted filesystem in user-space, (works ONLY on gpc01..04) encfs encfs
Applications
GAMESS (US) January 12, 2009 R3 General Atomic and Molecular Electronic Structure System rungms gamess
NWChem 5.1.1 NWChem Quantum Chemistry nwchem nwchem
GROMACS 4.5.1 GROMACS molecular mechanics, single precision, MPI grompp, mdrun gromacs
CPMD 3.13.2 Car-Parrinello molecular dynamics, MPI cpmd.x cpmd
BLAST 2.2.23+ Basic Local Alignment Search Tool blastn,blastp,blastx,psiblast,tblastn... blast
AMBER 10 Amber 10 + Amber Tools 1.3 Amber Molecular Dynamics Package sander, sander.MPI amber
NAMD 2.8 NAMD Molecular Dynamics Package namd2, charmrun namd
GDAL 1.7.1 Geospatial Data Abstraction Library gdal_contour,gdal_rasterize,gdal_grid, libgdal gdal
MEEP 1.1.1* MIT Electromagnetic Equation Propagation meep, meep-mpi meep/1.1.1-serial, meep/1.1.1-intelmpi, meep/1.1.1-openmpi
MPB 1.4.2 MIT Photonic Bands mpb, mpb-data, mpb-split mpb
Octave 3.2.4 GNU Octave - MATLAB-like environment octave octave
R 2.11. statistical computing R R
Libraries
PETSc 3.0.0* Portable, Extensible Toolkit for Scientific Computation (PETSc) libpetsc, etc.. petsc
BOOST 1.40 C++ Boost libraries libboost... cxxlibraries/boost
GotoBLAS 1.13 Optimized BLAS implementation libgoto2 gotoblas
GSL 1.13 GNU Scientific Library libgsl, libgslcblas gsl
FFTW 3.2.2* fast Fourier transform library (be careful combining fftw3 and MKL: you need to link fftw3 first, with -L${SCINET_FFTW_LIB} -lfftw3, then link MKL) libfftw3 fftw
LAPACK Provided by the Intel MKL library See http://software.intel.com/en-us/articles/intel-mkl-link-line-advisor/ intel
extras Full set of X11 libraries and others not installed on compute nodes bc, dmidecode, gv, iostat, lsof, tkdiff, zip, libXaw,...,libjpeg extras
RLog 1.4 RLog provides a flexible message logging facility for C++ programs and libraries. librlog clog
Programming/scripting languages
Cuda 3.2* NVIDIA's extension to C for GPGPU programming nvcc cuda
GNU Parallel 2010 execute commands in parallel parallel gnu-parallel
Guile + ctl 1.8.7 + 3.1 guile + libctl scheme interpreter libguile, libctl guile
Java 1.6.0 IBM's Java JRE and SDK java, javac java
Python 2.6.2 Python programming language python python
Ruby 1.9.1 Ruby programming language ruby ruby

* Several versions of this module are installed; listed is the default version.

TCS Software

Software Version Comments Command/Library Module Name
Compilers
IBM compilers 10.1 (c/c++), 12.1 (fortran) See TCS Quickstart xlc,xlC,xlf,xlc_r,xlC_r,xlf_r standard available
IBM MPI library See TCS Quickstart mpcc,mpCC,mpxlf,mpcc_r,mpCC_r,mpxlf_r standard available
UPC 1.2 Unified Parallel C xlupc upc
IBM fortran compiler 13.1 newer version xlf,xlf_r xlf/13.1
IBM c/c++ compilers 11.1 new versions xlc,xlC,xlc_r,xlC_r vacpp
Debug/performance tools
MPE2 1.0.6 Performance Visualization for Parallel Programs libmpe mpe
Scalasca 1.2 SCalable performance Analysis of LArge SCale Applications scalasca, ... scalasca
Storage tools and libraries
HDF4 4.2.5 Scientific data storage and retrieval h4fc, hdiff, ..., libdf, libsz hdf4
HDF5 Scientific data storage and retrieval, parallel I/O (part of the extras module on the TCS: compile with -I${SCINET_EXTRAS_INC}, link with -L${SCINET_EXTRAS_LIB}) libhdf5 extras
NetCDF + ncview 4.0.1* Scientific data storage and retrieval ncdump, ncgen, libnetcdf, ncview netcdf
parallel netCDF 1.1.1* Scientific data storage and retrieval using MPI-IO libpnetcdf.a parallel-netcdf
NCO 3.9.6* NCO utilities to manipulate netCDF files ncap, ncap2, ncatted, etc. nco
Libraries
FFTW 3.2.2 Fast Fourier transform library (part of the extras module on the TCS: compile with -I${SCINET_EXTRAS_INC}, link with -L${SCINET_EXTRAS_LIB}) libfftw, libfftw_mpi, libfftw3 extras
GSL 1.13 GNU Scientific Library libgsl, libgslcblas gsl
extras Adds paths to a fuller set of libraries to your user environment (compile with -I${SCINET_EXTRAS_INC}, link with -L${SCINET_EXTRAS_LIB}) libfftw, libfftw_mpi, libfftw3, libhdf5, liblapack, ... extras
Other
antlr 2.7.7 ANother Tool for Language Recognition antlr, antlr-config, libantlr, antlr.jar, antlr.py antlr
NCL 5.1.1 NCAR Command Language ncl, libncl, ... ncl

* Several versions of this module are installed; listed is the default version.

P7 Software

Soon...