SciNet User Support Library
 

Revision as of 15:50, 31 October 2014

System Status

System status can now be found at docs.scinet.utoronto.ca


Mon 23 Apr 2018 GPC-compute is decommissioned; GPC-storage remains available until 30 May 2018.

Thu 18 Apr 2018 The Niagara system will undergo an upgrade to its InfiniBand network between 9 am and 12 pm. The upgrade should be transparent to users; however, there is a chance of network interruption.

Fri 13 Apr 2018 The HPSS system will be down for a few hours starting Monday, Apr 16, at 9 AM for hardware upgrades, in preparation for the eventual move to the Niagara side.

Tue 10 Apr 2018 Niagara is open to users.

Wed 4 Apr 2018 We are very close to the production launch of Niagara, the new system installed at SciNet. While the RAC allocation year officially starts today, April 4, 2018, the Niagara system is still undergoing some final tuning and software updates, so the plan is to officially open it to users next week.

All active GPC users will have their accounts, $HOME, and $PROJECT transferred to the new Niagara system. Those of you who are new to SciNet but got RAC allocations on Niagara will have your accounts created and ready for you to log in.

We are planning an extended Intro to SciNet/Niagara session, available in person at our office and webcast on Vidyo (and possibly by other means), on Wednesday, April 11, at noon EST.


System Updates

  • Oct 30: BGQ devel system upgraded from half a rack (8,192 cores) to a full rack (16,384 cores).
  • Oct 2: Parallel debugger DDT upgraded to version 4.2.1 (TCS/P7).
  • Sept 25: Two new "Haswell" test nodes available, hasw01 and hasw02: 2x E5-2660 v3 @ 2.60GHz (20 cores total) with 128GB RAM.
  • Sept 10: Job arrays re-enabled (GPC/ARC/GRAVITY/SANDY).
  • Sept 10: Email notifications by the scheduler re-enabled (GPC/ARC/GRAVITY/SANDY).
  • Sept 9: Scheduler upgraded (GPC/ARC/GRAVITY/SANDY).
  • Sept 2: Intel Compilers 15.0 and IntelMPI 5.0 installed (GPC).
  • Jul 25: qsub now checks your submission scripts (GPC/ARC/GRAVITY/SANDY).
  • Jul 17: ParaView 4.1 server installed (GPC).
  • Jul 16: VNC now works on arc01 as well as gpc01,2,3,4.

(Previous updates)
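As a sketch of picking up the newly installed Intel toolchain on the GPC through the modules system (the exact module names and versions shown are assumptions; check `module avail` for what is actually installed):

```shell
# Module names below are assumptions -- verify with: module avail intel intelmpi
module load intel/15.0
module load intelmpi/5.0

# Compile and launch an MPI program with the Intel MPI wrappers
mpicc -O2 hello.c -o hello
mpirun -np 8 ./hello
```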

QuickStart Guides

Tutorials and Manuals

What's New On The Wiki

  • Sept 2014: Improved information on the Python versions installed on the GPC, and which modules are included in each version.
  • Sept 2014: A description of how to use job arrays on the GPC, on the Scheduler page.
  • Sept 2014: Instructions on using Hadoop (for the Hadoop workshop held in September).
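The job arrays described above can be sketched as a Torque/Moab array submission; the script name, resource request, and index range here are hypothetical, and the Scheduler page has the authoritative options:

```shell
#!/bin/bash
# Hypothetical array job script, myjob.sh; submit 10 copies with:
#   qsub -t 0-9 myjob.sh
#PBS -l nodes=1:ppn=8,walltime=1:00:00
#PBS -N array-example

cd $PBS_O_WORKDIR
# Torque exposes this task's index as $PBS_ARRAYID;
# each array element processes its own (hypothetical) input file.
./process_chunk input.${PBS_ARRAYID}
```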

Previous new stuff can be found in the What's new archive.