Available Software

The following scientific packages are installed on Keck Center workstations:

Installed software              Version     Location                    Executable
Amber                           16          /soft/pkg/amber             sander, pmemd, pmemd.cuda
Avogadro                        1.0.3                                   avogadro
APBS                            1.4         /soft/pkg/APBS              apbs
BrownDye                        2016.2.5    /soft/pkg/browndye          bd_top
CalVR                           2013.12.2   /soft/pkg/calvr             CalVR
Cambridge Structural Database   2014        /soft/pkg/csds              cq
charmm                          c41b2       /soft/pkg/charmm            charmm
Dalton                          2016.2      /soft/pkg/dalton            dalton.x
Dirac                           16.0        /soft/pkg/dirac             dirac.x
Gaussian                        g09.d01     /soft/pkg/g09.d01           g09
Gaussview                       5.0.9       /soft/pkg/gv                gview.exe
Gromacs                         5.0.4       /soft/pkg/gromacs           gmx
Kepler                          2.4         /soft/pkg/kepler            kepler.sh
Matlab                          2013b       /soft/pkg/matlab            matlab
mgltools                        1.5.7r1     /soft/pkg/mgltools          pmv
MMV                             2.5.0       /soft/pkg/Molegro           mmv
moe                             2014        /soft/pkg/moe               moe
MSMBuilder                      2.5         /soft/pkg/msmbuilder        misc.
NAMD                            2.12        /soft/pkg/NAMD              namd2
ORCA                            4.0.0       /soft/pkg/orca              orca
Pymol                           1.8.6.0     /soft/pkg/pymol             pymol
PyRx                            0.9         /soft/pkg/PyRx              run.sh
Relion                          2.0         /soft/pkg/relion            relion
Rosetta                         3.5         /soft/pkg/rosetta
Schrodinger                     2017        /soft/pkg/schrodinger       maestro
vina                            1.1.2       /soft/pkg/autodock_vina     vina
VMD                             1.9.2       /soft/pkg/vmd               vmd

The Intel compilers (version 12.1.5) are installed in the /soft/pkg/intel directory. Initialize your environment with the following command before using the compilers:

 module load intel
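
Once the module is loaded, the Intel C and Fortran compilers (icc and ifort) are available on your PATH. A minimal sketch of compiling a serial program, assuming a hypothetical source file hello.c:

module load intel
icc -O2 hello.c -o hello
./hello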

Additionally, a recent version of OpenMPI (v. 1.6.5) is installed in /soft/pkg/openmpi-1.6.5-intel.
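
To compile and run an MPI program against this installation, load the compiler and OpenMPI modules first. A minimal sketch, assuming a hypothetical source file mpi_hello.c and a run on four local cores:

module load intel
module load openmpi/1.6.5
mpicc -O2 mpi_hello.c -o mpi_hello
mpirun -np 4 ./mpi_hello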

How to use it

The modules package is used to manage, enable, and configure the scientific applications on the Keck Center workstations. To see what software is available, run the following command:

 module avail

You should see output similar to this:

# module avail
------------------------------------------- /soft/etc/modules -------------------------------------------
amber/14(default)             gromacs-cuda/5.0-gcc(default) openmpi/1.6.2
apbs/1.4                      gromacs-cuda/5.0.4-gcc        openmpi/1.6.5(default)
autodock/4.2.6                intel/12.1.5                  openmpi-gcc/1.6.2
browndye/2016.2.5(default)    kepler/2.4                    orca/2.9.1
calvr/2013.12.2               matlab/2013b(default)         orca/3.0.0
chem185/1.0                   mgltools/1.5.7r1(default)     orca/3.0.1
csds/2014                     mmv/2.5.0                     orca/3.0.2
cuda/5.0.35(default)          moe/2013                      orca/3.0.3(default)
cuda/5.5.22                   moe/2014(default)             pymol/1.5
gaussian/g09.a02              msmbuilder/2.5                pyrx/0.9
gaussian/g09.c01              msms/2.6.1                    rosetta/3.5
gaussian/g09.d01(default)     namd/2.10                     schrodinger/2015u4
gaussview/5.0.9               namd/2.11(default)            schrodinger/2016u1
gcc/4.9.3                     namd-cuda/2.10                schrodinger/2016u2
glove/2015                    namd-cuda/2.11(default)       schrodinger/2016u3(default)
gromacs/4.6b2-gcc             namd-cuda/2.9                 vina/1.1.2
gromacs/4.6b2-icc             nbo/6.0                       vmd/1.9.1
gromacs/5.0-gcc(default)      nvidia/4.2.9                  vmd/1.9.2(default)
gromacs/5.0-intel             nvidia/5.0.35(default)
gromacs-cuda/4.6.1-gcc        nvidia/5.5.22

To use one of the installed applications, you first need to load the application's environment and then start the application. For example, to run Pymol you need to execute these two steps:

module load pymol
pymol

To get help about a specific package (NAMD in this example):

module help namd

You can find more information about the modules package on its help page and in the module man page.
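
Beyond module avail and module load, a few other subcommands are useful in day-to-day work. The commands below are standard module subcommands; the package names are just examples:

module list              # show the modules currently loaded in this shell
module show namd         # display what a module sets in your environment
module unload pymol      # remove a previously loaded module
module purge             # unload all currently loaded modules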

Application specific notes

Amber

Before running the CUDA-enabled version of pmemd, initialize your environment with these two commands:

module load amber/16
module load cuda/7.5.18
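
With both modules loaded, a GPU run is started with pmemd.cuda. A minimal sketch; the file names (md.in, system.prmtop, system.inpcrd, and the output files) are hypothetical placeholders for your own files:

# -O overwrites any existing output files
pmemd.cuda -O -i md.in -p system.prmtop -c system.inpcrd \
           -o md.out -r md.rst -x md.nc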

Cambridge Structural Database

To use CSDS execute the following:

module load csds
cq

Gaussian/Gaussview

If you want to run Gaussian or Gaussview, you have to be added to the Gaussian Unix group; please contact keck-help@keck2.ucsd.edu for assistance.
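
Once you have been added to the group, a typical command-line Gaussian run looks like the sketch below; the input file name water.com is a hypothetical placeholder:

module load gaussian
g09 < water.com > water.log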

Running non-interactive long jobs

The Keck II workstations can be used to run non-interactive, long, computationally intensive jobs. All such jobs must be submitted through the SGE queue manager. More information can be found here.
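
As a rough sketch of an SGE submission (the script name run_md.sh, the job name, the file names, and the NAMD example are hypothetical placeholders; adjust them to your own workflow), save a script such as the following:

#!/bin/bash
#$ -N my_job             # job name
#$ -cwd                  # run in the submission directory
#$ -j y                  # merge stdout and stderr
#$ -o my_job.log         # SGE output file

module load namd
namd2 input.namd > output.log

Submit it with qsub run_md.sh and monitor it with qstat.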
