Software Available on Lattice

Each entry below gives the software title and version, followed by usage notes.
64-bit Linux CentOS release 6.4 (Final) - kernel 2.6.32-358.el6.x86_64 #1 SMP
ABAQUS 6.12-2

(U of A only) See /global/software/abaqusua .

(U of C only) See /global/software/abaqus .

ABAQUS 6.11-1

(U of C only) See /global/software/abaqus .

ABAQUS 6.10-1

(U of C only) See /global/software/abaqus .

ABINIT 6.8.1

/global/software/abinit

Amber 11

See detailed Lattice instructions.

Amber 12

See detailed Lattice instructions.

Amber 14

This is version 14 of Amber (patched to level 8) and AmberTools 14 (patched to level 21), compiled with the Intel 12.1 compilers and the MKL libraries. It has been installed on Lattice in /global/software/amber/amber14p8_at14p21. See the more detailed Lattice instructions.
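
A minimal environment-setup sketch, assuming the standard Amber 14 layout in which an amber.sh setup script is shipped in the top-level directory (an assumption, not confirmed here):

export AMBERHOME=/global/software/amber/amber14p8_at14p21   # installation directory from above
source $AMBERHOME/amber.sh                                  # assumed standard Amber setup script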

ANSYS simulation software, including Fluent and CFX Release 16.1

A suite of ANSYS Release 16.1 products including Structural Mechanics, Fluid Dynamics (CFX, Fluent, TurboGrid and CFD-Post) and Composite PrepPost have been installed on Lattice in /global/software/ansys/ansys16/v161.  Look in subdirectories for the specific products.  See the main WestGrid ANSYS page for details.

ANSYS simulation software, including Fluent and CFX Release 16.2

A suite of ANSYS Release 16.2 products including Structural Mechanics, Fluid Dynamics (CFX, Fluent, TurboGrid and CFD-Post) and Composite PrepPost have been installed on Lattice in /global/software/ansys/ansys16/v162.  Look in subdirectories for the specific products.  See the main WestGrid ANSYS page for details. This installation is for approved University of Calgary researchers only.

bash (default) 3.2.25 or later
BEAGLE (beagle-lib) 1.0.1

/global/software/beaglelib/beaglelib_1086

BLAS In MKL
C - GNU GCC, Intel, Portland Group (PGI) compilers Portland Group 14.6

Initialize your environment to use version 14.6 of the Portland Group compilers with

module load pgi/14.6

C - GNU GCC, Intel, Portland Group (PGI) compilers Portland Group 12.1-0

(pgcc - module load pgi)

C - GNU GCC, Intel, Portland Group (PGI) compilers Intel 11.1

(icc - Default)

C - GNU GCC, Intel, Portland Group (PGI) compilers Intel 12.1.10.319

(icc - module load intel/2011-sp1.10.319)

C++ - GNU GCC, Intel, Portland Group (PGI) compilers Intel 12.1-10.319

(icpc- module load intel/2011-sp1.10.319)

C++ - GNU GCC, Intel, Portland Group (PGI) compilers GNU 4.4.0

(g++44)

C++ - GNU GCC, Intel, Portland Group (PGI) compilers Portland Group 14.6

Initialize your environment to use the Portland Group version 14.6 compilers with

module load pgi/14.6

C++ - GNU GCC, Intel, Portland Group (PGI) compilers GNU 4.1.2

(g++)

C++ - GNU GCC, Intel, Portland Group (PGI) compilers Portland Group 12.10

(pgCC- module load pgi)

CMake 2.8.11

See /global/software/cmake/cmake2811/bin (or links from /global/software/bin).

CMake 3.4.1

Pre-compiled binaries (cmake, ccmake, cmake-gui, cpack and ctest) for CMake 3.4.1 have been installed in /global/software/cmake/cmake341/bin.

CMake 2.8.4

/global/software/bin/cmake

Compilers - C, C++ and Fortran Intel 11.1

Initialize your environment to use the Intel 11.1 compilers with

module load intel/11.1

Compilers - C, C++ and Fortran Intel 12.0.2

Initialize your environment to use the Intel 12.0.2 compilers with

module load intel/12

Compilers - C, C++ and Fortran Intel 14.0.3

Initialize your environment to use the Intel 14.0.3 compilers with

module load intel/14.0.3

Compilers - C, C++ and Fortran Intel 16.0.0

Initialize your environment to use the Intel 16.0.0 compilers with

module load intel/16.0.0

Compilers - C, C++ and Fortran Intel 12.1.4

The Intel 12.1.4 compilers are on the default command PATH, but you can also explicitly initialize your environment to use them with

module load intel

Compilers - C, C++ and Fortran PGI 12.1-0

Initialize your environment to use the Portland Group 12.1-0 compilers with

module load pgi/12.1

Compilers - C, C++ and Fortran PGI 14.6-0

Initialize your environment to use the Portland Group 14.6-0 compilers with

module load pgi/14.6

Compilers - C, C++ and Fortran GCC 4.4.7

The GCC 4.4.7 compilers are on the default command PATH.

Compilers - C, C++ and Fortran GCC 4.9.0 20140406 (experimental)

Initialize your environment to use the GCC 4.9.0 20140406 (experimental) compilers with

module load gcc/gcc-4.9-20140406-beta

Compilers - C, C++ and Fortran GCC 4.8.4

Initialize your environment to use the GCC 4.8.4 compilers with

module load gcc/4.8.4

Compilers - C, C++ and Fortran GCC 5.1.0

Initialize your environment to use the GCC 5.1.0 compilers with

module load gcc/5.1.0

CP2K 2.4.0

Version 2.4.0, the latest stable release at the time of installation, was built on Lattice using the Intel 12.0 compiler with MKL for linear algebra and FFTs.  This is the MPI-only version with no optional features (such as Hartree-Fock exchange support).  Call the program with:

module unload intel
module load intel/12
mpiexec /global/software/cp2k/cp2k240/bin/cp2k input_file.inp

CPMD 3.15.3p1

Located in /global/software/cpmd/CPMD-3.15.3.p1_ifort_mpi_mkl_defaultfft

CPMD 3.13.2

MKL and GotoBLAS builds are available under /global/software/cpmd .

CPMD 3.17.1

The CPMD executable, version 3.17.1 with 7 patches applied, was built with the Intel 14 compiler, Math Kernel Libraries (MKL) and default FFT implementation.  It is /global/software/cpmd/CPMD-3.17.1.p7_ifort14_mpi_mkl_defaultfft/bin/cpmd.x.
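
A minimal run sketch (input.inp is a hypothetical input file name; consult the CPMD documentation for the full command line, including the optional pseudopotential path argument):

mpiexec /global/software/cpmd/CPMD-3.17.1.p7_ifort14_mpi_mkl_defaultfft/bin/cpmd.x input.inp > output.out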

csh Link to tcsh
DL_POLY 4.02

See /global/software/dl_poly .

DL_POLY CLASSIC 1.7
Elk 2.1.25

If NODES is the number of complete Lattice nodes requested, then use:

module unload intel
module load intel/12
export OMP_NUM_THREADS=8
mpiexec -pernode $NODES /global/software/elk/elk-2.1.25/bin/elk

See the Elk user manual for additional advice on using a combination of OpenMP and MPI parallelism.

Emacs 23.1.1
FFmpeg (video encoding) 20150429

See the main WestGrid FFmpeg page.

FFTW 3.3.4

/global/software/fftw/fftw-3.3.4-gcc447
/global/software/fftw/fftw-3.3.4-intel12
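
A minimal compile sketch against the GCC build, assuming the standard FFTW layout with include and lib subdirectories (myfft.c is a hypothetical source file):

gcc -I/global/software/fftw/fftw-3.3.4-gcc447/include myfft.c \
    -L/global/software/fftw/fftw-3.3.4-gcc447/lib -lfftw3 -lm -o myfft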

FFTW 2.1.5

Look in /global/software/fftw.

FFTW 3.2.2

Look in /global/software/fftw.

Fortran - GNU GCC, Intel, Portland Group (PGI) and Absoft compilers Portland Group 14.6

Initialize your environment to use the Portland Group 14.6 compilers with

module load pgi/14.6

Fortran - GNU GCC, Intel, Portland Group (PGI) and Absoft compilers Portland Group 12.10

(pgf77, pgf90, pgf95 - module load pgi)

Fortran - GNU GCC, Intel, Portland Group (PGI) and Absoft compilers Intel 12.1.10.319

(ifort - module load intel/2011-sp1.10.319)

Gerris Flow Solver 2013-12-06

See the main WestGrid Gerris software page.

Git 1.7.8.3

module load git

GMT - Generic Mapping Tools 4.5.6

/global/software/gmt/gmt456

Gnuplot 4.6p0

/global/software/bin/gnuplot

Grace/Xmgrace 5.1.22

/global/software/bin/xmgrace

GROMACS 4.6.3

Version 4.6.3 was installed incidentally as part of a PLUMED request; you may want to skip this installation in favour of a newer version.  See /global/software/gromacs/gromacs_463_intel1214_mkl_openmp_nogpu_sse41/bin.

GROMACS 4.6.5

Installed GROMACS 4.6.5 in /global/software/gromacs/gromacs_465_intel1214_mkl_openmp_nogpu_sse41/bin

Initialize your environment with:

module load gromacs/4.6.5

GROMACS 4.6.1

Initialize environment with:

module unload intel
module load intel/12
module load openmpi/old
source /global/software/gromacs/gromacs461_openmp_mkl_nogpu/bin/GMXRC

Normally one would use mdrun_mpi, but a double-precision version, mdrun_mpi_d, is also available.

GROMACS 5.0.4

Single and double precision versions of GROMACS 5.0.4 have been installed on Lattice.

module load gromacs/5.0.4

See the main WestGrid GROMACS page for an example script.

GROMACS 4.6.7

There are two versions of GROMACS 4.6.7 available on Lattice, one compiled with GNU compilers for use with g_mmpbsa and one with Intel compilers for use with PLUMED 2.2.0.

To set up your environment to use the GNU-compiled version use:

module load gromacs/4.6.7-gnu

This version of GROMACS also includes g_mmpbsa (http://rashmikumari.github.io/g_mmpbsa), a tool for evaluating binding free energies. After loading the module, two g_mmpbsa-related commands, g_mmpbsa and energy2bfac, will be available.

To set up your environment to use the Intel-compiled version use (bash shell example):

module load plumed/2.2.0
BINDIR=/global/software/gromacs/gromacs_467_intel1214_mkl_openmp_nogpu_sse41_plumed220/bin
. ${BINDIR}/GMXRC

GROMACS 4.6.3 with PLUMED 2.0.1

Installed GROMACS 4.6.3 with patches for use with PLUMED 2.0.1 into /global/software/gromacs/gromacs_463_intel1214_mkl_openmp_nogpu_sse41_plumed201/bin.  Initialize your environment with

source /global/software/gromacs/gromacs_463_intel1214_mkl_openmp_nogpu_sse41_plumed201/bin/GMXRC

GROMACS 4.6.2

Initialize the environment with:

module unload intel
module load intel/12
module load openmpi/old
source /global/software/gromacs/gromacs462_openmp_mkl_nogpu/bin/GMXRC

Normally one would use mdrun_mpi, but a double-precision version, mdrun_mpi_d, is also available.

GROMACS 4.0.7

See /global/software/gromacs/gromacs407_intel111.

GROMACS 4.5.4

See /global/software/gromacs .

GROMACS 4.6

See /global/software/gromacs .

GSL 1.16

Both GCC- and Intel-compiled versions of the GSL 1.16 library are available, installed in /global/software/gsl/gsl116_gcc and /global/software/gsl/gsl116_intel respectively.  Performance testing has not been done, but the Intel compiler usually produces faster code.  See the main WestGrid GSL page for more instructions.
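
A minimal compile sketch against the GCC build, assuming the usual GSL layout with include and lib subdirectories (myprog.c is a hypothetical source file); GSL programs normally link with -lgsl -lgslcblas -lm:

gcc -I/global/software/gsl/gsl116_gcc/include myprog.c \
    -L/global/software/gsl/gsl116_gcc/lib -lgsl -lgslcblas -lm -o myprog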

HDF 4 4.2.5

/global/software/hdf/hdf425

HDF 5 1.8.5-patch1

/global/software/lib

HDF 5 1.8.16

HDF5 1.8.16 built with Intel 12.1.4 compilers has been installed to /global/software/hdf/hdf5-1.8.16 .
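
Assuming this installation includes the standard h5cc compiler wrapper in its bin directory (not confirmed here), a C program using HDF5 could be built with:

/global/software/hdf/hdf5-1.8.16/bin/h5cc myprog.c -o myprog   # myprog.c is a hypothetical source file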

idb 11.1
ImageMagick 6.9.1-10

Usage:

module load imagemagick/6.9.1

This makes ImageMagick available on the compute nodes and replaces an older version on the login node.
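
For example, after loading the module (the file names here are placeholders):

convert input.png -resize 50% output.jpg   # convert an image to JPEG at half size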

ImageMagick 6.7.0.0

/global/software/imagemagick/ImageMagick-6.7.0-0

Java - Sun Java SE JRE (Oracle HotSpot) 1.6.0_23

Running module unload intel will revert the default Java to GCC gij 4.1.2.

/global/software/java/jre1.6.0_23

Julia 0.3.7

Julia can be run in a batch job (bash script example) with

export PATH=/global/software/julia/julia037/julia-cb9bcae93a/bin:$PATH
julia --machinefile $PBS_NODEFILE <other arguments>

Note that the default versions of git and Python are too old for some Julia packages (including the package manager) to work. Set up your environment to use newer versions with

module unload intel
module load intel/12
module load python
export PATH=/global/software/git/bin:$PATH

ksh Y (ksh93)
LAMMPS 4Dec12

A relatively old version of the software is available as /global/software/lammps/bin/4Dec12/lmp_lattice .

LAMMPS 7Dec15

The 7 Dec. 2015 version of LAMMPS (Intel 12.1.4 compiler with FFTW3 and MKL) has been installed on Lattice as /global/software/lammps/bin/7Dec15/lmp_lattice .

LAMMPS 31Mar17

The 31 March 2017 version of LAMMPS (built with the Intel 12.1.4 compiler with FFTW3 and MKL) has been installed on Lattice as /global/software/lammps/bin/31Mar17/lmp_lattice .
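
A minimal run sketch from within a batch job (in.lj is a hypothetical LAMMPS input script):

mpiexec /global/software/lammps/bin/31Mar17/lmp_lattice -in in.lj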

LAPACK In MKL
LIKWID - Performance analysis tools 3.1.1

The software has been installed under /global/software/likwid/likwid311 but not extensively tested. A typical build of software using the likwid library would use an include directive -I/global/software/likwid/likwid311/include and specify the library with -L/global/software/likwid/likwid311/lib -llikwid.
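
Putting those directives together, a build command would look something like this (myprog.c is a hypothetical source file):

gcc -I/global/software/likwid/likwid311/include myprog.c \
    -L/global/software/likwid/likwid311/lib -llikwid -o myprog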

MATLAB R2011a

(U of C only)

MATLAB Release 2011b (7.13.0)

(U of C only) /global/software/bin/matlab

MATLAB R2010b

(U of C only)

MATLAB Compiler Runtime (MCR) 7.14

For use with standalone applications compiled with the version of the MATLAB compiler distributed with R2010b of MATLAB.

Installed to /usr/local/matlab-mcr/v714.

See full instructions for creating standalone applications.
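
As a sketch: the MATLAB compiler normally generates a run_<appname>.sh wrapper alongside the standalone application, which takes the MCR root as its first argument. With a hypothetical application myapp, a run might look like the following; the same pattern applies to the other MCR versions below, with the corresponding MCR root:

./run_myapp.sh /usr/local/matlab-mcr/v714 arg1 arg2   # myapp, arg1 and arg2 are placeholders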

MATLAB Compiler Runtime (MCR) 7.15

For use with standalone applications compiled with the version of the MATLAB compiler distributed with R2011a of MATLAB.

Installed to /global/software/matlab/mcr/v715.

See full instructions for creating standalone applications.

MATLAB Compiler Runtime (MCR) 7.13

For use with standalone applications compiled with the version of the MATLAB compiler distributed with R2010a of MATLAB.

Installed to /global/software/matlab/mcr/v713.

See full instructions for creating standalone applications.

MATLAB Compiler Runtime (MCR) 7.17

For use with standalone applications compiled with the version of the MATLAB compiler distributed with R2012a of MATLAB.

Installed to /global/software/matlab/mcr/v717.

See full instructions for creating standalone applications.

MATLAB Compiler Runtime (MCR) 8.0

For use with standalone applications compiled with the version of the MATLAB compiler distributed with R2012b of MATLAB.

Installed to /global/software/matlab/mcr/v80.

See full instructions for creating standalone applications.

MATLAB Compiler Runtime (MCR) 7.16

For use with standalone applications compiled with the version of the MATLAB compiler distributed with R2011b of MATLAB.

Installed to /global/software/matlab/mcr/v716.

See full instructions for creating standalone applications.

METIS 4.0.1

/global/software (bin, lib, include)

Migrate (Migrate-n) - Population genetics program to estimate migration rates and population sizes 3.6

Migrate has been installed on Lattice as /global/software/migrate/migrate36/bin/migrate-n-mpi.

See the generic Running Instructions section of the WestGrid Migrate page for more information.

MKL 10.3.x

module load intel/12

MKL 10.2.6

(default with Intel 11.1 compiler)

Mono (C# environment) 4.2.1

See the instructions for running Mono on Lattice on the main WestGrid mono page.  However, it is expected that most Mono applications will not be appropriate for running on Lattice.  This installation is merely a by-product of sharing file systems with Parallel, where Mono was installed for use on GPU-enabled nodes.

NAMD 2.9.1

For information about NAMD installations on Lattice see the main WestGrid NAMD page.

NAMD 2.7b3 (with PLUMED 1.3, legacy Open MPI)

The namd2 and associated binaries have been installed to /global/software/namd/namd27b3_plumed .

Set up the environment to use a legacy version of Open MPI:

module load openmpi/old

NAMD 2.11

Note: These are beta versions, b2, not the final release.

There are two variants of NAMD installed based on the binary distributions provided by the developers:

For the multinode MPI parallel version:

export PATH=/global/software/namd/NAMD_2.11b2_Linux-x86_64-ibverbs:$PATH

For the multinode MPI and shared-memory (SMP) version:

export PATH=/global/software/namd/NAMD_2.11b2_Linux-x86_64-ibverbs-smp:$PATH

NAMD 2.10

There are two variants of NAMD installed based on the binary distributions provided by the developers:

For the multinode MPI parallel version:

export PATH=/global/software/namd/NAMD_2.10_Linux-x86_64-ibverbs:$PATH

For the multinode MPI and shared-memory (SMP) version:

export PATH=/global/software/namd/NAMD_2.10_Linux-x86_64-ibverbs-smp:$PATH

NAMD 2.7 (ibverbs)
NAMD 2.7b3 (OpenMPI)
nano 2.0.9
NCAR Graphics and NCL 6.3.0

Initialize your environment (NCARG_ROOT and PATH) to use NCAR Command Language 6.3.0 on Lattice with

module load ncl/6.3.0

Please note that this installation provides NCL only. NCAR Graphics does not work because the Cairo library on Lattice is too old.

NCAR Graphics and NCL 6.0.0

Set NCARG_ROOT to /global/software/ncl/ncl_ncarg_600 .

NEdit 5.5
NetCDF 4.1.1

/global/software/netcdf/netcdf4.1.1 (also /global/software/lib)

Netpbm 10.35.80

/global/software/netpbm/netpbm-10.35.80

NEURON 7.4

Initialize your environment with

module load python/2.7.9

before running NEURON, which has been installed under /global/software/neuron/nrn-7.4 .

Note that this is not a release version of NEURON, but a development snapshot.  It was built with InterViews, MPI and Python support.

NEURON 7.1

See /global/software/neuron/nrn-7.1. This is the non-graphical version only.

NEURON 7.3

See /global/software/neuron/nrn-7.3.

OpenFOAM, OpenFOAM+ and OpenFOAM Extend 2.2.1

See the main WestGrid OpenFOAM page.

OpenFOAM, OpenFOAM+ and OpenFOAM Extend foam-extend-3.1

Initialize your environment with

export FOAM_INST_DIR=/global/software/openfoam
source $FOAM_INST_DIR/foam-extend-3.1_20140710/etc/bashrc

See the main WestGrid OpenFOAM page for a sample batch job script.

OpenFOAM, OpenFOAM+ and OpenFOAM Extend 2.3.0

See the main WestGrid OpenFOAM page for a sample batch job script.

OpenFOAM, OpenFOAM+ and OpenFOAM Extend 4.1

See the main WestGrid OpenFOAM page for a sample batch job script.

ORCA 2.9.0

/global/software/orca/orca_2_9_0_linux_x86-64

ORCA 2.8.0.2 is also available.
ORCA 3.0.3

Binaries have been installed in /global/software/orca/orca_3_0_3_linux_x86-64/ .  See the main WestGrid ORCA page for a sample job script.

ParaView 3.14.1

/global/software/ParaView/ParaView-3.14.1

ParaView 4.0.1

ParaView 4.0.1 has been installed on Lattice under /global/software/ParaView/ParaView-4.0.1-Linux-64bit. It is preferable to run it on Parallel to take advantage of the GPU support there.

ParaView 4.3.1

ParaView 4.3.1 has been installed on Lattice under /global/software/ParaView/ParaView-4.3.1-Linux-64bit. It is preferable to run it on Parallel to take advantage of the GPU hardware there.

pdflib 7.05

/global/software/pdf/pdflib_lite_705

Perl 5.10.1

Perl 5.10.1 is the default on Lattice, but does not have many add-on modules.

Python 2.7.2

Initialize your environment with

module unload intel
module load intel/12
module load python

This Python installation includes NumPy 1.6.1 and Matplotlib 1.1.0.

Python 2.4.3
Python 2.7.9

Version 2.7.9 of Python has been installed for a project that required the bz2 support missing from the previous versions of Python on Lattice.  However, very few additional modules have been installed for use with this version, so unless you specifically need bz2 support, or can make do with a bare-bones version of Python, please consider using version 2.7.2 instead.  To use version 2.7.9, initialize your environment with:

module load python/2.7.9

Qbox DFT 1.56.2
Qt 4.8.2

/global/software/qt4/qt4.8.2

Quantum ESPRESSO 5.0.3

Quantum ESPRESSO was installed using Intel 12.1.4 compilers, MKL and FFTW 3.2.2 libraries. Before running ESPRESSO binaries, initialize your environment with:

module load espresso/5.0.3
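
A subsequent run might look like the following sketch, assuming the module places the ESPRESSO binaries on your PATH (scf.in is a hypothetical pw.x input file):

mpiexec pw.x -in scf.in > scf.out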

Ray 2.3.1

The Ray binary compiled for the default maximum k-mer length of 32 is /global/software/ray/ray231/Ray .  A second version with a maximum k-mer length of 201 is /global/software/ray/ray231_maxkmer201/Ray.  Use the latter version only if necessary as it will use more memory than the one with the more restricted maximum k-mer length.
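
A minimal sketch of a paired-end assembly run (the read files and output directory name are placeholders):

mpiexec /global/software/ray/ray231/Ray -k 31 -p reads_1.fastq reads_2.fastq -o RayOutput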

Ray 2.3.0

The Ray binary is /global/software/ray/ray230/Ray.

ROOT (data analysis framework) 5.34.32

See the main WestGrid ROOT page.

Ruby 1.9.3p0

/global/software/ruby/ruby193p0

ScaLAPACK MKL

ScaLAPACK is available through the Math Kernel Library (MKL).

SCons 2.5.1

SCons 2.5.1 has been installed on Lattice as /global/software/scons/scons-2.5.1/bin/scons .

sh Link to bash
Subversion 1.4.2

/usr/bin/svn

Subversion 1.6.15

/opt/CollabNet_Subversion/bin/svn

SUNDIALS (Suite of Nonlinear and Differential/Algebraic Equation Solvers) 2.5.0

The SUNDIALS software has been installed in /global/software/sundials/sundials250.  There is a Makefile for each program in the examples subdirectory that illustrates how programs using the SUNDIALS libraries can be built with Intel compilers and the Intel Math Kernel Library for optimized linear algebra support.
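
Along the lines of those example Makefiles, a hypothetical link line might look like this; the particular solver libraries needed (CVODE is used here) depend on your code, and -mkl asks the Intel compiler to link against MKL:

icc -I/global/software/sundials/sundials250/include myprog.c \
    -L/global/software/sundials/sundials250/lib \
    -lsundials_cvode -lsundials_nvecserial -mkl -o myprog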

Szip 2.1

/global/software/lib

tcsh NA
Vim 7.2.411
VMD 1.9.1

Run VMD with

/global/software/vmd/vmd-1.9.1/bin/vmd

VMD 1.8.7

/global/software/vmd/vmd-1.8.7/bin

VMD 1.9.2

Run VMD with

/global/software/vmd/vmd-1.9.2/bin/vmd

Xerces-C++ 3.1.1

/global/software/xercesc/xercesc311
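
A minimal compile sketch, assuming the usual include and lib subdirectories under the installation path (myparser.cpp is a hypothetical source file):

g++ -I/global/software/xercesc/xercesc311/include myparser.cpp \
    -L/global/software/xercesc/xercesc311/lib -lxerces-c -o myparser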