This is a first tutorial on submitting jobs to the cluster with Slurm.

First, create the file that will do your work. To demonstrate, we use this simple script, which waits 20 seconds and prints a message:

#!/bin/bash
echo "waiting on 20 seconds..."
sleep 20
echo "this is the end of the work"

Call this script whatever you like; the examples below assume it was saved as myJob.sh.
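A quick way to create the file and make it executable, all from the command line (the filename myJob.sh is only illustrative):

```shell
#!/bin/bash
# Write the example job script to myJob.sh (filename is illustrative)
# and make it executable so it can be submitted as ./myJob.sh.
cat > myJob.sh <<'EOF'
#!/bin/bash
echo "waiting on 20 seconds..."
sleep 20
echo "this is the end of the work"
EOF
chmod +x myJob.sh
ls -l myJob.sh
```

The heredoc quoting (`'EOF'`) keeps the `$`-free script verbatim; `chmod +x` is what lets sbatch run it via a relative path.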

To run this job (assuming the script was saved as myJob.sh):

cassini> sbatch -o myOut.txt -e myError.txt --mem=16G -p medium ./myJob.sh

  • Make sure you are in the same directory as myJob.sh, or sbatch will not find it.
  • Best practice is to use complete paths in your scripts: e.g. /home/username/path/
  • Output of this job goes to myOut.txt (-o myOut.txt)
  • Errors produced by this job go to myError.txt (-e myError.txt)

This will run in the medium queue (-p medium), asking for 16G of memory.

You can also put these parameters into a single file, along with the execution of it, to make things easier and more consistent:

#!/bin/bash
#SBATCH -o myOut.txt
#SBATCH -e myError.txt
#SBATCH --mem=16G
#SBATCH -p medium

cd /home/username/path
echo "waiting on 20 seconds..."
sleep 20
echo "Done! This message will appear in the output file: myOut.txt"

Name this single file; the example below assumes it was saved as myBatch.sh.

To run it:

cassini> sbatch myBatch.sh

This will do exactly the same work as the job above that had all parameters placed on the command line.

Viewing your job and job parameters

To see the status of this job (and of all running jobs):

cassini> qstat

You can also isolate just your own jobs (or any user's) using the -u flag:

cassini> qstat -u username

(On Slurm systems, squeue -u username is the native equivalent.)

To view information about the queues themselves:

cassini> sinfo

You can also specify the number of processors (tasks) you want with the -n flag:

#SBATCH -o output.txt
#SBATCH -e errors.txt
#SBATCH -J /home/username/path/
#SBATCH -n 4

This will run your script with 4 tasks (processes). Strictly speaking, -n sets the number of tasks; to give a single task multiple CPU cores, use -c (--cpus-per-task) instead.
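As a sketch, the two ways of asking for more processors can be written side by side as SBATCH directives (the values 4 and 2 are only illustrative):

```shell
#SBATCH -n 4   # 4 tasks: separate processes, possibly on different nodes
#SBATCH -c 2   # 2 CPU cores per task (--cpus-per-task), for multithreaded programs
```

Use -n for programs that run as multiple cooperating processes, and -c for a single program that uses multiple threads.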

You can also pass your own inputs/parameters to a single script, so one script serves many different runs:

#!/bin/bash
#SBATCH -o myout.out
#SBATCH -e myerror.out
#SBATCH --mem=16G
#SBATCH -p medium
# There are three arguments available:
echo "waiting on 20 seconds..."
echo "1: $1"
echo "2: $2"
echo "3: $3"
sleep 20

    • # denotes a commented line; lines beginning with #SBATCH are comments to the shell, but sbatch reads them as options

Save this script (the example below assumes paramJob.sh) and run it like this:

cassini> sbatch -o ParamOut -e ParamError -p medium paramJob.sh First 2222 three33

This runs the job with the three inputs appended after the script name, and writes them to the output file (ParamOut); options given on the command line override the matching #SBATCH directives in the script:

waiting on 20 seconds...
1: First
2: 2222
3: three33
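You can preview how the positional parameters expand without touching the cluster at all. This minimal local sketch (the function name show_args is made up for illustration) mimics what the submitted script prints:

```shell
#!/bin/bash
# Mimic the argument handling of the submitted script locally:
# inside the function, $1, $2, $3 behave exactly like the script's arguments.
show_args() {
  echo "1: $1"
  echo "2: $2"
  echo "3: $3"
}
show_args First 2222 three33
```

Running it prints the same three numbered lines the job wrote to ParamOut.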

Array Tasks/Jobs

To run array jobs that make use of parameters for input, create this script:

#!/bin/bash
#SBATCH --mem=32G
#SBATCH -o array.out
#SBATCH -e array.error
#SBATCH -p long
/home/username/path/arrayJob $SLURM_ARRAY_TASK_ID

arrayJob uses the value assigned to $SLURM_ARRAY_TASK_ID as input. Submit the batch script like this:

cassini> sbatch --array=0-19 -N1 arrayJob

The -N flag indicates the number of nodes and should be used with array tasks; with -N1, each array task runs on a single node.
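To see what each array task would receive, you can simulate $SLURM_ARRAY_TASK_ID locally; Slurm sets this variable for every task in a real --array job, and the input_N.txt naming here is only an illustration:

```shell
#!/bin/bash
# Locally simulate the task IDs that sbatch --array=0-19 would generate
# (only 0-2 shown); each task could select its own input file by ID.
for SLURM_ARRAY_TASK_ID in 0 1 2; do
  echo "task $SLURM_ARRAY_TASK_ID -> input_${SLURM_ARRAY_TASK_ID}.txt"
done
```

Deriving a per-task filename from the ID like this is a common way to let all 20 tasks share one script while each processes different data.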