SciNet User Support Library

System Status

System status can now be found at docs.scinet.utoronto.ca


Mon 23 Apr 2018 GPC-compute is decommissioned; GPC-storage remains available until 30 May 2018.

Thu 18 Apr 2018 The Niagara system will undergo an upgrade to its InfiniBand network between 9 am and 12 pm. The work should be transparent to users, but there is a chance of a network interruption.

Fri 13 Apr 2018 The HPSS system will be down for a few hours on Mon 16 Apr, starting at 9 am, for hardware upgrades in preparation for its eventual move to the Niagara side.

Tue 10 Apr 2018 Niagara is open to users.

Wed 4 Apr 2018 We are very close to the production launch of Niagara, the new system installed at SciNet. While the RAC allocation year officially starts today, April 4, 2018, the Niagara system is still undergoing some final tuning and software updates, so the plan is to officially open it to users next week.

All active GPC users will have their accounts, $HOME, and $PROJECT transferred to the new Niagara system. Those of you who are new to SciNet but got RAC allocations on Niagara will have your accounts created and ready for you to log in.

We are planning an extended Intro to SciNet/Niagara session, available in person at our office and webcast over Vidyo (and possibly other means), on Wednesday, April 11, at noon EST.


System News

  • May 3: GPC: Versions 15.0.6 and 16.0.3 of the Intel Compilers are installed as modules (see the module sketch after this list).
  • Feb 12: GPC: Version 6.0 of Allinea Forge (DDT Debugger, MAP, Performance Reports) installed as a module.
  • Jan 11: The 2016 Resource Allocations for compute cycles are now in effect.
  • Nov 23: The quota for home directories has been increased from 10 GB to 50 GB.
  • Nov 23, GPC: Two Visualization Nodes, viz01 and viz02, are being set up. They are 8-core Nehalem nodes with 2 graphics cards each, 64 GB of memory, and about 60 GB of local hard disk. For now, you can log directly into viz01 to try it out (see the login sketch after this list). We would value users' feedback, requests for suitable software, and help with visualization projects.
  • Nov 16: ARC is being decommissioned. During a transition period, the ARC head node and two compute nodes will be kept up. Users are encouraged to start using Gravity instead.
  • Nov 12, GPC: The number of GPC devel nodes has been doubled from 4 to 8, and the new ones can be accessed using gpc0[5-8] (see the login sketch after this list).
  • Sept 7, GPC: The number of nodes with 32 GB of RAM has been increased from 84 to 205.
  • July 24, GPC: GCC 5.2.0, with Coarray Fortran support, installed as a module (see the compile sketch after this list).
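
A minimal sketch of loading one of the newly installed compiler modules, for the module items above; the exact module name (intel/15.0.6 here) is an assumption, so check module avail for the names on your system:

  # Minimal sketch; module names are assumptions, verify with 'module avail'
  module avail intel            # list the Intel compiler modules actually installed
  module load intel/15.0.6      # hypothetical name for the 15.0.6 release
  icc --version                 # confirm which compiler is now in your path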
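
A hedged compile sketch for the Coarray Fortran item above; the module name gcc/5.2.0 is an assumption, and -fcoarray=single builds a single-image program (multi-image runs need a coarray library such as OpenCoarrays):

  # Hedged sketch: build a Coarray Fortran program with GCC 5.2.0
  module load gcc/5.2.0                          # module name is an assumption
  gfortran -fcoarray=single hello.f90 -o hello   # single-image coarray build
  ./hello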
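
A login sketch for the devel and visualization node items above; the hostnames come from the announcements, but access details (SSH keys, which login node to start from) are site-specific:

  # From a GPC login node, hop to one of the new devel nodes or to viz01
  ssh gpc05        # new devel nodes are gpc05 through gpc08
  ssh viz01        # visualization node; viz02 is the second one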

(Previous System News)

QuickStart Guides

Tutorials and Manuals

What's New On The Wiki

  • Dec 2014: Updated GPC Quickstart with info on email notifications from the scheduler (see the job-script sketch below).
  • Dec 2014: Hdf5 compilation page updated.
  • Sept 2014: Improved information on the Python versions installed on the GPC, and which modules are included in each version.
  • Sept 2014: Description of using job arrays on the GPC on the Scheduler page (see the job-array sketch below).
  • Sept 2014: Instructions on using Hadoop (for the Hadoop workshop held in September).
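
A rough job-script sketch for the scheduler email notifications mentioned above, assuming the Torque/Moab directives the GPC scheduler used; the resource request and address are placeholders:

  #!/bin/bash
  # Hedged sketch of a job script with scheduler email notifications
  #PBS -l nodes=1:ppn=8,walltime=1:00:00   # placeholder resource request
  #PBS -m abe                              # mail on abort (a), begin (b), end (e)
  #PBS -M you@example.com                  # placeholder address
  cd $PBS_O_WORKDIR                        # start in the submission directory
  ./my_program                             # hypothetical executable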
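
And a job-array sketch for the Scheduler page item above, assuming Torque-style array syntax; the Scheduler page remains the authoritative reference:

  # Submit ten instances of the same script as a job array (Torque-style)
  qsub -t 0-9 myjob.sh
  # Inside myjob.sh each instance sees its own index, e.g. to pick an input file:
  echo "processing input.$PBS_ARRAYID"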

Previous new stuff can be found in the What's new archive.