Previous System News
The current month's changes can be found on the wiki front page.
Updated in January 2015:
Fri Jan 9 01:27:49 EST 2015: Scheduler glitch. Most jobs were killed about an hour ago. The scheduler is back to normal; please resume submitting jobs. We apologize for the inconvenience!
Updated in December 2014:
Thu Dec 18 10:12 EST 2014: Both BGQ systems are back up. Please resubmit your jobs (Note that only the BGQ was affected, all other systems are fine).
Thu Dec 18 9:30 EST 2014: Both BGQ systems were shut down due to a cooling issue, killing all running jobs. The systems are being brought back up. Check here for updates.
Updated in November 2014:
- Nov 7: Archive jobs are on hold because the system is nearing capacity. They will run only once they are reviewed and released by SciNet staff (HPSS)
- Nov 6: Python 2.7.8, a popular scripting environment, installed as module (GPC)
Updated in October 2014:
- Oct 30: BGQ devel system upgraded from half a rack (8,192 cores) to 1 full rack (16,384 cores)
- Oct 17: R 3.1.1, a statistical package, installed as a module (GPC)
- Oct 7: Bedtools, a powerful toolset for genome arithmetic, installed as a module (GPC)
- Oct 2: Parallel Debugger DDT upgraded to version 4.2.1 (TCS/P7)
Updated in September 2014:
- Sept 25: Two new "Haswell" test nodes available, hasw01 and hasw02, each with 2x E5-2660 v3 CPUs @ 2.60GHz (20 cores total) and 128GB RAM.
- Sept 10: Job arrays re-enabled (GPC/ARC/GRAVITY/SANDY)
- Sept 10: Email notifications by the scheduler re-enabled (GPC/ARC/GRAVITY/SANDY).
- Sept 9: Scheduler upgraded (GPC/ARC/GRAVITY/SANDY).
- Sept 2: Intel Compilers 15.0 and IntelMPI 5.0 installed (GPC)
Updated in July 2014:
- Jul 25: qsub now checks your submission scripts (GPC/ARC/GRAVITY/SANDY).
- Jul 17: Paraview 4.1 server installed (GPC)
- Jul 16: VNC now works on arc01 as well as gpc01,2,3,4.
Updated in May 2014:
- May 26: gcc 4.9.0 installed as an experimental module (module load use.experimental gcc/4.9.0).
- May 23: CUDA 6.0 installed as a module and Nvidia driver updated to 331.67 on ARC and GRAVITY
- May 21: Allinea DDT/MAP updated on the GPC to version 4.2.1.
- May 15: 4 nodes have been upgraded with more memory (2 with 128GB, 2 with 256GB), see GPC Memory Configuration for usage details.
Updated in Mar 2014:
- Mar 12: Petsc 3.4.4, the Portable, Extensible Toolkit for Scientific Computation, installed (GPC)
- Mar 11: HPN-SSH, a high-performance variant of ssh, installed (BGQ)
- Mar 7: Vapor 2.3.0, the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers, installed (P7)
- Mar 4: Python 3.3.4 installed (GPC)
- Mar 3: FFmpeg v2.1.3, an audio and video processing toolkit, installed (GPC)
Updated in Feb 2014:
- Feb 28: Cuda 5.5 module installed (ARC/Gravity)
- Feb 28: New version of valgrind modules installed (valgrind/3.9.0_intelmpi and valgrind/3.9.0_openmpi) (GPC)
- Feb 25: New version of CP2K (latest trunk) installed as a module (GPC).
- Feb 19: Ray 2.3.1 installed as a module (GPC)
- Feb 19: Ghostscript added to the Xlibraries module (GPC)
Updated in Jan 2014:
- Jan 24: As a precaution, emails by the Moab/Torque scheduler have been disabled because of a potential security vulnerability (GPC).
Updated in Dec 2013:
- discovar, a genome assembler, installed as a module on GPC.
- allpaths-lg, a short read genome assembler, installed as a module on GPC.
- gamess (version of May 1, 2013) installed as a module on GPC.
- HDF4 file format library version 4.2.6 installed on TCS.
- Newest IBM compilers (xlf 14.1 and xlc 12.1) now the default on TCS.
- zlib and slib compression libraries installed as module 'compression' on TCS.
- cmake version 2.8.12.1 installed as a module on BGQ.
- BGQ: serial and parallel HDF5 v1.8.12 libraries installed as modules on BGQ.
Updated in Nov 2013:
- Nov 14, Octopus 4.1.1 module installed on GPC
- Nov 13, PAPI 5.2.0 module installed on BGQ
- Nov 13, PFFT 1.0.7 module installed on GPC
- Nov 13, Samtools 0.1.19 module installed on GPC
- Nov 12, Trilinos 11.4.2 module installed on GPC
- Nov 8, Open Babel 2.3.2 module installed on GPC
- Nov 7, Ncview 2.1.2 module installed on GPC
- Nov 6, NetCDF 4.2.1.1 modules installed on GPC and TCS
- Nov 5, HDF5 1.8.11 modules installed on GPC and TCS
Updated in Oct 2013:
- Oct 30, Parallel netCDF 1.3.1 modules for intelmpi and openmpi installed on GPC, TCS, P7 and BGQ
- Oct 30, GDAL 1.9.2 installed as a module on GPC
- Oct 24, MemP 1.0.3, a memory profiling tool, installed on BGQ
- Oct 22, Gcc 4.8.1 installed on P7
- Oct 15, Rsync 3.1.0 installed as a module on GPC
- Oct 15, User-space MySQL module installed on GPC
- Oct 7, Python 2.7.5 module installed on P7
Updated in Sep 2013:
- Sep 13: FFTW 3.3.3 with openmpi support installed as a module on GPC
- Sep 13: IntelMPI 4.1.1.036 installed as a module on GPC
- Sep 12: Paraview server 2.14.1 installed as a module on GPC
- Sep 11: Intel compiler 14.0.0 available as a module on GPC.
- Sep 11: Armadillo 3.910.0, a C++ linear algebra library, available as a module on GPC
- Sep 10: git-annex 1.8.4, a tool to manage files using git, available as a module on GPC
- Sep 10: cmake 2.8.8 module installed on TCS
- Sep 4: Storage offloading from BGQ to HPSS enabled
Updated in Aug 2013:
- Aug 26: CP2K, a molecular simulations package, installed as a module on GPC and BGQ
- Aug 15, Vim editor v7.4.5 module on GPC
- Aug 6, GROMACS v4.6.3 module on GPC
Updated in Jul 2013:
- July 22, Latest version of quantum espresso, an ab initio electronic-structure package, available as module espresso/trunk on GPC and BGQ
- July 18, MAP 4.1 available on the GPC as part of the ddt/4.1 module. This version can also do IO profiling.
- July 18, DDT 4.1 installed (BGQ, GPC, P7, TCS).
- July 12, Ray de-novo assembler v2.2.0 module (GPC)
- July 4, GDB 7.6 available as a module (GPC).
Updated in Jun 2013:
- June 25, GCC 4.8.1 available on BGQ as a module.
- June 11, GROMACS 4.5.7, 4.6.2 available on GPC, with modules.
Updated in May 2013:
- May 31, GCC 4.8.1 available on GPC as a module.
- May 28, New version of GNU Parallel, 20130422, available as module on the GPC.
- May 16, BGQ maximum job length reduced to 12 hours (bgq devel) and 24 hours (bgq production)
- May 2, PetSc 3.3 installed as a module. Uses intel 13 and openmpi 1.6.4
- May 1, OpenMPI 1.6.4 installed as module openmpi/intel/1.6.4.
Updated in Apr 2013:
- Apr 11, GPC operating system upgraded from CentOS 6.3 to 6.4.
- Apr 4, Allinea DDT 4.0 now available (and default version) for all our systems, with 128 task license.
- Apr 4, Allinea MAP is now available on the GPC (part of the ddt module).
Updated in Mar 2013:
- Mar 21, BGQ systems are running RHEL6.3 and V1R2M0 driver.
- Mar 19, P7 xlf and vacpp compilers patched to latest versions, and the default modules refer to these latest versions.
- Mar 18, Ray de-novo assembler v2.1.0 available on GPC
- Mar 7, PGI compilers v13.2 available on ARC and GPC
- Mar 7, Gnuplot 4.6.1 available on GPC
- Mar 5, P7 Linux Cluster expanded by 3 nodes.
Updated in Feb 2013:
- Feb 25, PetSc 3.2 available on TCS
- Feb 21, FFTW 3.3.3 for intelmpi available on GPC
- Feb 15, Armadillo 3.6.2 installed on GPC
- Feb 7, Intel MPI 4.1 available on GPC
- Feb 6, 2013: Intel compilers 13.1 available on GPC
Updated in Jan 2013:
- Jan 8, 2013: GCC 4.7.2 available on GPC
Updated in Nov 2012:
- Nov 13, 2012: GNU Parallel 20121022 installed on the GPC.
- Nov 9, 2012: DDT upgraded to 3.2.1 (GPC,ARC,TCS,P7)
- Nov 8, 2012: Gnuplot 4.6.1 installed on the P7.
Updated in Oct 2012:
- Oct 19, 2012: Cuda 5.0 installed on ARC.
- Oct 2, 2012: Perl-CPAN installed on the GPC.
Updated in Sep 2012:
- Sep 19, 2012: Users now get an email alert when they reach 90% and 95% of their allowed disk usage, or of their allowed number of files.
- Sep 5, 2012: GPC/HPSS: A parallel implementation of gzip called 'pigz' has been installed as part of the "extras" module.
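The pigz entry above can be sketched as a short session; the module name comes from the entry, while the file name and thread count are illustrative placeholders:

```shell
# Load the "extras" module that provides pigz (per the entry above)
module load extras

# Compress with 8 parallel threads; produces largefile.gz
# (largefile and the thread count are placeholders)
pigz -p 8 largefile

# Decompress
unpigz largefile.gz
```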
Updated in Aug 2012:
- Aug 15, 2012, ARC: PGI compilers for OpenACC and Cuda Fortran upgraded to 12.6 as module pgi/12.6, which is the new default
- Aug 7, 2012, P7: new version of the IBM Fortran (14.1) and C/C++ compiler (12.1) available in non-default modules
Updates in Jul 2012:
- Jul 13, 2012, TCS: new version of the IBM Fortran (14.1) and C/C++ compiler (12.1) available in non-default modules
- Jul 11, 2012, TCS: new module gmake/3.82
- Jul 10, 2012, GPC: new version of GNU parallel in module gnu-parallel/20120622
- Jul 7, 2012, GPC: queues except debug got a minimum 15 minutes walltime
- Jul 5, 2012, ARC: cluster now integrated into GPC scheduler
- Jul 4, 2012, ARC: PGI compilers for OpenACC and Cuda Fortran installed
Updates in Jun 2012:
- GPC: A new version of the Intel compilers is available as intel/12.1.5. Version 12.1.3 is still the default.
- Scratch purging: the allowed time is still three months, but now files that were modified in the last three months will not get purged, even if they were never read in that period.
- ARC: cuda/4.1 is now the default CUDA module. The module cuda/4.2 is available as well, and will work with the newer gcc 4.6.1 compiler.
Updates in May 2012:
- GPC: a newer git version 1.7.10 is now available as a module (the default is still 1.7.1).
- GPC: silo is installed as a module
- GPC: gcc 4.7.0 available as module (version 4.6.1 is still the default)
- HPSS: Jobs will now run automatically.
- ARC: cuda 4.1 and 4.2 are available as modules (Note: 4.2 is not supported by the ddt debugger).
- P7: ncl available as a module
- P7: scons available as a module
Updates in Apr 2012:
- GPC: The GPC has been upgraded to a low-latency, high-bandwidth InfiniBand network throughout the cluster. The temporary mpirun settings previously recommended for multinode ethernet runs are no longer needed, as all MPI traffic now goes over InfiniBand. In most cases, mpirun -np X [executable] will work.
Updates in Mar 2012:
- New Blue Gene/Q system announced.
Updates in Feb 2012:
- GPC: A new version of the Intel compiler suite has become the default module. The C/C++ and fortran compilers in this suite are at version 12.1.3, while the MKL library is at version 10.3.9.
- GPC: New versions of parallel-netcdf and mpb have been installed.
Updates in Jan 2012:
- The new Resource Allocations will take effect on January 9, for groups who were awarded an allocation.
- On January 30th, CentOS 5 was phased out.
- The "diskUsage" command has been improved and its output has been simplified.
- GPC: Due to some changes we are making to the GigE nodes, multinode ethernet MPI jobs must now explicitly request the ethernet interface in the mpirun command. For OpenMPI: mpirun --mca btl self,sm,tcp; for IntelMPI: mpirun -env I_MPI_FABRICS shm:tcp. There is no need to do this if you run on InfiniBand, or if you run single-node MPI jobs on the ethernet (GigE) nodes.
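The two invocations in the entry above can be put together as a sketch; the executable name and process count are placeholders:

```shell
# Multinode ethernet MPI runs on the GigE nodes (per the entry above).

# OpenMPI: explicitly select the loopback, shared-memory and TCP transports
mpirun --mca btl self,sm,tcp -np 16 ./my_app    # ./my_app is a placeholder

# IntelMPI: select shared-memory plus TCP fabrics
mpirun -env I_MPI_FABRICS shm:tcp -np 16 ./my_app
```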
Updates in December 2011:
- GPC transition from CentOS 5 to CentOS 6 completed. A few nodes still have the old CentOS 5 for validation purposes.
Updates in Nov 2011:
- Disks added to the scratch file system, and scratch now spans both of our DDN controllers. The performance of /scratch should improve as a result of more spindles and the use of a second controller, while the available space has increased by about 40%.
- The home, scratch, project and hpss file systems have been restructured (note: not all users have access to the latter two). As a consequence, users' files reside in different locations than before. The home and scratch file systems are now group-based, and groups are furthermore clustered by the initial letter of the group name. For instance, the current home directory of user 'resu' in group 'puorg' is now /home/p/puorg/resu. The predefined variables $HOME, $SCRATCH, $PROJECT and $ARCHIVE point to the new directories.
- The High-Performance Storage System (HPSS) goes into full production with a concurrent change in /project policies. Users with storage allocations greater than 5 TB will find that all their former /project files now reside in HPSS, and their /project quotas have been reduced to 5 TB.
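The group-based layout described above can be sketched as follows, using the 'resu'/'puorg' example from the entry (the $SCRATCH path is assumed analogous to the $HOME one; check your own values):

```shell
# New layout: /home/<initial letter of group>/<group>/<user>
# For user 'resu' in group 'puorg':
echo $HOME      # -> /home/p/puorg/resu
echo $SCRATCH   # assumed analogous, e.g. /scratch/p/puorg/resu

# Scripts should use the predefined variables, not hard-coded paths:
cd $SCRATCH
echo $PROJECT $ARCHIVE   # defined only for users with access to those systems
```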
Updates in Oct 2011:
- GPC: an OS update from CentOS 5.6 to CentOS 6 is being prepared, which will include updates to other programs (perl, gcc, python) as well. The ARC already uses the newer OS, a few GPC nodes are running it as a test, and we are in the process of porting all the modules to the new OS.
Updates in Sep 2011:
- File system: In the near future, the home, scratch, project and hpss file systems will be restructured (note: not all users have access to the latter two). To facilitate the transition, we ask for users' cooperation in making sure all their scripts and applications use only relative paths or the predefined variables $HOME, $SCRATCH and $PROJECT.
Updates in Aug 2011:
- GPC: an OS update from CentOS 5.6 to CentOS 6 is being prepared, which will include updates to other programs (perl, gcc, python) as well. A few nodes are already running it as a test, and we are in the process of porting all the modules to the new OS. We encourage users willing to try out the new environment to contact us. Note that the ARC already uses the newer OS.
- GPC: "Climate Data Operator" versions 1.4.6 and 1.5.1 are available as modules cdo/1.4.6 and cdo/1.5.1, respectively.
- GPC: The "Climate Model Output Rewriter" is installed as module cmor/2.7.1.
- GPC: a newer version of R can now be used by explicitly loading the module R/2.13.1, while R/2.11.1 remains the default.
- GPC: ffmpeg has been added to the ImageMagick module.
Updates in Jul 2011:
- Extensive updates and tightening of security measures were performed. Users were required to change their passwords and to regenerate (pass-phrase protected) ssh keys if they used them. We also updated the operating system on the GPC to close the security hole.
- GPC: nedit installed as a module.
- P7: any user with access to the Power-6 cluster (TCS) can now give the Power-7 cluster (P7) a try.
Updates in Jun 2011:
- HPSS, the new tape-backed storage system that expands the current storage capacity of SciNet, has entered its pilot phase. This means that the installation is complete, and select users are trying out the system. HPSS will be one of the ways in which storage allocation will be implemented.
- New IBM Power-7 cluster: The P7 cluster currently consists of 5 IBM Power 755 servers (at least 3 more servers to be added later this year). Each has four 8-core 3.3GHz Power7 CPUs and 128GB RAM, and features 4-way Simultaneous MultiThreading, giving 128 threads per server. Linux is the operating system. Both the GCC and IBM compilers are installed, as well as POE and OpenMPI. LoadLeveler is used as the scheduler. Instructions on usage are on the wiki, but you will first have to ask us for access (support@scinet.utoronto.ca).
- GPC: The Berkeley compiler for Unified Parallel C (UPC) has been installed as the module upc. The compiler command is 'upcc'.
- GPC: Bugs in the gnuplot module were fixed.
- GPC: qhull support was added to octave.
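The UPC entry above can be sketched as a compile-and-run session; hello.upc is a placeholder source file, and upcrun is the standard Berkeley UPC launcher (an assumption, since the entry only names upcc):

```shell
# Load the Berkeley UPC module (per the entry above)
module load upc

# Compile a UPC source file (hello.upc is a placeholder)
upcc -o hello hello.upc

# Run with 4 UPC threads using the Berkeley UPC launcher (assumed available)
upcrun -n 4 ./hello
```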
Updates in May 2011:
- DDT, a parallel debugging program from Allinea, has been installed on the GPC, TCS, and ARC. DDT stands for "Distributed Debugging Tool" and is available as the module "ddt". It supports debugging OpenMP, MPI and CUDA applications in a graphical environment.
Updates in Apr 2011:
- GPC: Two versions of Goto Blas were installed, a single and multi-threaded one. They can be loaded as modules gotoblas/1.13-singlethreaded and gotoblas/1.13-multithreaded, respectively.
- Accelerator Research Cluster (ARC): An 8-node GPU test cluster has been set up with a total of 64 Nehalem CPUs and 16 GPUs (NVIDIA, Cuda capability 2.0).
Updates in Mar 2011:
- TCS: The bug in the showstart command was fixed, and showstart may be used again to estimate the start time of your job.
- GPC: Issues regarding simultaneously loading the gcc/4.4.0 and the Intel compiler modules were resolved.
- GPC: A newer version of the gcc compiler suite, v4.6.0, has been installed. The default version is still 4.4.0.
- GPC: Octave version 3.2.4 has been installed on the GPC. You should for now consider this an experimental module.
Updates in Feb 2011:
- GPC: The temporary location of the standard error/output file for GPC jobs has changed.
- TCS: The showstart command has been disabled as it appears to contain a bug that puts jobs in a 'hold' state.
Updates in Jan 2011:
- Users can now request Network Switch Affinity for GPC ethernet jobs at runtime.
- For groups who were allocated compute time in this RAC allocation round, the new RAPs took effect on Jan 17th.
- File system servers were reconfigured to improve performance and stability. File access should be better, especially for writing.
Updates in Dec 2010:
- Addition of ImageMagick software packages on GPC.
- GPC: EncFS, an encrypted filesystem in user space, was installed. It works only on gpc01..04.
- GPC: Version 12 of the Intel compilers has been installed as module 'intel/intel-v12.0.0.084'.
- GPC: The corresponding code analysis tools for these compilers are available as the module 'inteltools'.
Updates in Nov 2010:
- A number of module names have been changed on the GPC.
- GPC: A module for R was installed.
- GPC: padb was installed as a module.
- GPC: GNU parallel was installed as module 'gnu-parallel'.
- TCS: CDO (Climate Data Operators) was installed as module 'cdo/1.4.6'
- TCS: compilers xlf 13.1 and xlc 11.1 are available as modules xlf/13.1 and vacpp/11.1, respectively.
Updates in Oct 2010:
- Further enhancements to diskUsage. You may also generate plots of your usage over time (with the -plot option)
- CPMD 3.13.2 installed on the GPC
Updates in Sept 2010:
- The diskUsage command has been enhanced, and now you may get information on how much your usage has changed over a certain period with the -de option.
- IntelMPI 3.x has been deprecated.
- GPFS file system was upgraded to 3.3.0.6; Stricter /scratch quotas of 10TB were implemented; check yours with /scinet/gpc/bin/diskUsage.
- GPC: The quantum chemistry software package NWChem 5.1.1 installed.
- GPC: CPMD, a Car-Parrinello molecular dynamics package, was installed.
- GPC: Gromacs 4.5.1 (single precision), a molecular simulation package was installed.
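The enhanced diskUsage command described above can be sketched as follows; the path comes from the quota entry, and any output shown by the tool is not reproduced here:

```shell
# Check current usage against the stricter /scratch quotas (per the entry above)
/scinet/gpc/bin/diskUsage

# Show how usage has changed over a certain period (-de option, per the entry)
/scinet/gpc/bin/diskUsage -de
```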
Updates in Aug 2010:
- GPC: A number of versions of PetSc 3.1 were installed.
- GPC: OpenSpeedShop v1.9.3.4 was installed.
Updates in Jul 2010:
- Started the pilot project on Hierarchical Storage Management (HSM)
- GPC: The intel module no longer automatically loads the gcc module. Users who use both should have "module load gcc intel" in their .bashrc.
- GPC: The default intel compiler v11.1 was changed to Update 6 (module intel/11.1.0.72).
- TCS: OpenDX, a visualisation software package, was installed as a module.
- GPC: MEEP, a finite difference simulation software for electromagnetic systems with mpi support, was installed.
- GPC & TCS: A number of old modules have been deprecated.
- Recurring file system issues were mitigated as much as possible.
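The compiler-module change noted above amounts to loading both suites explicitly, for example from ~/.bashrc:

```shell
# The intel module no longer pulls in gcc automatically (per the entry above),
# so load both explicitly:
module load gcc intel

# Verify that both modules are loaded
module list
```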
Updates in Jun 2010:
- Hyper-threading was enabled on GPC.