SciNet User Support Library
Latest revision as of 14:59, 15 May 2018

WARNING: SciNet is in the process of replacing this wiki with a new documentation site. For current information, please go to https://docs.scinet.utoronto.ca


System Status

System status can now be found at docs.scinet.utoronto.ca


Mon 23 Apr 2018 GPC-compute is decommissioned; GPC-storage remains available until 30 May 2018.

Thu 18 Apr 2018 The Niagara system will undergo an upgrade to its InfiniBand network between 9 am and 12 pm. The upgrade should be transparent to users, but there is a chance of network interruption.

Fri 13 Apr 2018 The HPSS system will be down for a few hours starting Mon 16 Apr at 9 am for hardware upgrades, in preparation for the eventual move to the Niagara side.

Tue 10 Apr 2018 Niagara is open to users.

Wed 4 Apr 2018 We are very close to the production launch of Niagara, the new system installed at SciNet. While the RAC allocation year officially starts today, April 4, 2018, the Niagara system is still undergoing final tuning and software updates, so the plan is to officially open it to users next week.

All active GPC users will have their accounts, $HOME, and $PROJECT transferred to the new Niagara system. Those of you who are new to SciNet but received RAC allocations on Niagara will have your accounts created and ready for you to log in.
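Once you can log in, a quick sanity check is to confirm that the environment variables the migration relies on are defined in your session. This is a minimal sketch: HOME is standard, but PROJECT and SCRATCH are assumptions based on the announcement above and may be named differently on your system.

```shell
# Report which migration-related environment variables are set.
# PROJECT and SCRATCH are assumed names; adjust to your site's conventions.
for var in HOME PROJECT SCRATCH; do
  value=$(printenv "$var" || true)
  if [ -n "$value" ]; then
    echo "$var=$value"
  else
    echo "$var is not set"
  fi
done
```

On a correctly migrated account you would expect all three to print a path; `du -sh "$HOME"` is then a simple way to confirm your files arrived.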

We are planning an extended Intro to SciNet/Niagara session, available in person at our office and webcast via Vidyo (and possibly other means), on Wednesday, April 11 at noon EST.


System News

  • April 23, 2018: GPC & Sandy decommissioned.
  • April 10, 2018: Niagara commissioned.
  • March 20, 2018: Gravity decommissioned.
  • Dec 4, 2017: scratchtcs decommissioned.
  • Nov 28, 2017: The GPC has been reduced from 30,912 to 16,800 cores to make room for the installation of Niagara.
  • Sept 29, 2017: The TCS has been decommissioned.

(Previous System News)

QuickStart Guides

  • SciNet Command Line Utilities
  • SciNet User Tutorial
  • P7: Power 7 Linux cluster
  • BGQ: BlueGene/Q clusters
  • FAQ (frequently asked questions)
  • Acknowledging SciNet
  • SciNet education (https://courses.scinet.utoronto.ca)
  • SciNet YouTube Channel (https://www.youtube.com/channel/UC42CaO-AAQhwqa8RGzE3daQ)

Tutorials and Manuals

  • Applications
  • Manuals (compilers etc.)
  • 2016 Ontario Summer School for High Performance Computing Central
  • 2015 Ontario Summer School for High Performance Computing Central

What's New On The Wiki

  • Dec 2014: Updated GPC Quickstart with info on email notifications from the scheduler.
  • Dec 2014: HDF5 compilation page updated.
  • Sept 2014: Improved information on the Python versions installed on the GPC, and which modules are included in each version.
  • Sept 2014: Description on using job arrays on the GPC on the Scheduler page.
  • Sept 2014: Instructions on using Hadoop (for the Hadoop workshop held in September).

Previous new stuff can be found in the What's new archive.