Running Gromacs

From oldwiki.scinet.utoronto.ca
Submitting an IB GPC job using openmpi

<source lang="sh">

#!/bin/bash
#PBS -l nodes=10:ib:ppn=8,walltime=40:00:00,os=centos53computeA
#PBS -N 1

if [ "$PBS_ENVIRONMENT" != "PBS_INTERACTIVE" ]; then

 if [ -n "$PBS_O_WORKDIR" ]; then
   cd $PBS_O_WORKDIR
 fi

fi

/scinet/gpc/mpi/openmpi/1.3.2-intel-v11.0-ofed/bin/mpirun -np $(wc -l $PBS_NODEFILE | gawk '{print $1}') \
    -machinefile $PBS_NODEFILE \
    /scratch/cneale/exe/intel/gromacs-4.0.5/exec/bin/mdrun_openmpi \
    -deffnm pagp -nosum -dlb yes -npme 24 -cpt 120

# To submit type: qsub this.sh

</source>
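The `-np` value above is computed by counting lines in `$PBS_NODEFILE`, which PBS writes with one host entry per requested core (10 nodes × 8 ppn = 80 lines, hence 80 MPI ranks). A minimal sketch of that idiom, using a throwaway file with a made-up hostname instead of a real nodefile:

```shell
#!/bin/bash
# Sketch: count MPI ranks from a nodefile (one line per rank).
# NODEFILE and the hostname are illustrative, not from a real PBS job.
NODEFILE=$(mktemp)
for i in 1 2 3 4 5 6 7 8; do
    echo "gpc-f101n001" >> "$NODEFILE"
done
# Same pipeline as the job script (the script uses gawk; plain awk
# behaves identically for this one-field print).
NP=$(wc -l "$NODEFILE" | awk '{print $1}')
echo "-np $NP"    # prints: -np 8
rm -f "$NODEFILE"
```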

-- cneale 18 August 2009

Submitting an IB GPC job using mvapich2-1.4rc1

Note that mvapich2-1.4rc1 is not configured to fall back to ethernet, so this will not work on the non-IB nodes, even for 8 cores.

<source lang="sh">

#!/bin/bash
#PBS -l nodes=4:ib:ppn=8,walltime=30:00:00,os=centos53computeA
#PBS -N 1

if [ "$PBS_ENVIRONMENT" != "PBS_INTERACTIVE" ]; then

 if [ -n "$PBS_O_WORKDIR" ]; then
   cd $PBS_O_WORKDIR
 fi

fi

module purge
module load mvapich2 intel

/scratch/cneale/exe/mvapich2-1.4rc1/bin/mpirun_rsh -np $(wc -l $PBS_NODEFILE | gawk '{print $1}') \
    -hostfile $PBS_NODEFILE \
    /scratch/cneale/exe/intel/gromacs-4.0.5/exec/bin/mdrun_mvapich2 \
    -deffnm pagp -nosum -dlb yes -npme 12 -cpt 120

# To submit type: qsub this.sh

</source>

-- cneale 18 August 2009
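Since this mvapich2 build cannot fall back to ethernet, a job that lands on a non-IB node will simply fail at launch. One defensive option (a sketch, not part of the original script; the sysfs path is where OFED normally exposes IB devices) is to check for an InfiniBand device before calling mpirun_rsh:

```shell
#!/bin/bash
# Sketch: succeed only if an InfiniBand device is visible in sysfs.
# The optional argument exists mainly so the check can be exercised
# against a test directory instead of the live system.
has_ib() {
    local dir="${1:-/sys/class/infiniband}"
    [ -d "$dir" ] && [ -n "$(ls -A "$dir" 2>/dev/null)" ]
}

if has_ib; then
    echo "InfiniBand device found; safe to launch mdrun_mvapich2"
else
    echo "no InfiniBand device; this mvapich2 build would fail here" >&2
fi
```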

Submitting a non-IB GPC job using openmpi

<source lang="sh">

#!/bin/bash
#PBS -l nodes=1:compute-eth:ppn=8,walltime=40:00:00,os=centos53computeA
#PBS -N 1

if [ "$PBS_ENVIRONMENT" != "PBS_INTERACTIVE" ]; then

 if [ -n "$PBS_O_WORKDIR" ]; then
   cd $PBS_O_WORKDIR
 fi

fi

/scinet/gpc/mpi/openmpi/1.3.2-intel-v11.0-ofed/bin/mpirun -mca btl self,sm -mca btl_sm_num_fifos 7 \
    -np $(wc -l $PBS_NODEFILE | gawk '{print $1}') \
    -machinefile $PBS_NODEFILE \
    /scratch/cneale/exe/intel/gromacs-4.0.5/exec/bin/mdrun_openmpi \
    -deffnm pagp -nosum -dlb yes -cpt 120

# To submit type: qsub this.sh

</source>
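`-mca btl self,sm` restricts Open MPI to loopback and shared-memory transport, which is all a single-node job needs, and `btl_sm_num_fifos 7` appears to be chosen as ppn − 1, giving each of a rank's 7 on-node peers its own shared-memory FIFO to reduce contention. A sketch of deriving that value (the hard-coded PPN is illustrative):

```shell
#!/bin/bash
# Sketch: derive the shared-memory FIFO count from cores per node.
# PPN is hard-coded for illustration; a job script could instead count
# entries in $PBS_NODEFILE as the launch line does.
PPN=8
NUM_FIFOS=$((PPN - 1))   # one FIFO per on-node peer rank
echo "-mca btl self,sm -mca btl_sm_num_fifos $NUM_FIFOS"
```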

This is VERY IMPORTANT! Please read the [relevant user tips section] for information that is essential for your single-node (up to 8-core) MPI GROMACS jobs.

-- cneale 14 September 2009

Submitting an IB GPC job using openmpi that resubmits itself prior to completing

<source lang="sh">

#!/bin/bash
#PBS -l nodes=6:ib:ppn=8,walltime=46:00:00,os=centos53computeA
#PBS -N 1

if [ "$PBS_ENVIRONMENT" != "PBS_INTERACTIVE" ]; then

 if [ -n "$PBS_O_WORKDIR" ]; then
   cd $PBS_O_WORKDIR
 fi

fi

/scinet/gpc/mpi/openmpi/1.3.2-intel-v11.0-ofed/bin/mpirun -np $(wc -l $PBS_NODEFILE | gawk '{print $1}') \
    -machinefile $PBS_NODEFILE \
    /scratch/cneale/GPC/exe/intel/gromacs-4.0.5/exec/bin/mdrun_openmpi \
    -deffnm X3_LDAO_0.5MNaCl -cpi X3_LDAO_0.5MNaCl \
    -nosum -dlb yes -cpt 120 -maxh 46 -npme 16

num=$NUM
if [ $num -lt 100 ]; then
    num=$(($num+1))
    ssh gpc01 "cd $PBS_O_WORKDIR; qsub this.sh -v NUM=$num"

fi

# Initial submission with: qsub this.sh -v NUM=0

</source>
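The resubmission counter can be exercised without a scheduler by isolating it into a function; `next_submission` and `MAX_RESUBMITS` are illustrative names, and the real script runs qsub on gpc01 via ssh rather than echoing the command:

```shell
#!/bin/bash
# Sketch: the chain-resubmission counter, with qsub replaced by echo so
# the logic can run anywhere. MAX_RESUBMITS mirrors the "-lt 100" test
# in the job script above.
MAX_RESUBMITS=100

next_submission() {
    # Print the qsub command for the next job in the chain, or nothing
    # once the chain has reached its limit.
    local num="$1"
    if [ "$num" -lt "$MAX_RESUBMITS" ]; then
        num=$((num + 1))
        echo "qsub this.sh -v NUM=$num"
    fi
}

next_submission 0     # prints: qsub this.sh -v NUM=1
next_submission 100   # prints nothing; the chain stops
```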

-- Holyoake 14 September 2009