Difference between revisions of "Oldwiki.scinet.utoronto.ca:System Alerts"

From oldwiki.scinet.utoronto.ca
'''Previous revision:'''

== System Status: <span style="color:#00bb11">'''UP'''</span>==

<div style="background-color:#000000; color:#ffffff; padding: 1em">
<h2 style="color:#ffffff">GPC Upgrade to Infiniband - what you need to know</h2>

The GPC network has been upgraded to a low-latency, high-bandwidth Infiniband network throughout the cluster. Several significant benefits over the old ethernet/Infiniband mixed setup are expected, including:

*better I/O performance for all jobs,
*better job performance for what used to be multi-node ethernet jobs (as they will now make use of Infiniband), and
*for users who were already using Infiniband, improved queue throughput (there are now 4x as many available nodes) and the ability to run larger IB jobs.

NOTE 1: Our wiki is NOT completely up-to-date after this recent change. For the time being, you should first check this current page and the temporary [https://support.scinet.utoronto.ca/wiki/index.php/Infiniband_Upgrade Infiniband Upgrade] page for anything related to networks and queueing.

NOTE 2: The temporary mpirun settings that were recommended for multi-node ethernet runs are no longer in effect, as all MPI traffic now goes over Infiniband.

NOTE 3: Though we have been testing the new system since last night, a change of this magnitude is likely to result in some teething problems, so please bear with us over the next few days. Please report any issues/problems that are not explained/resolved after reading this current page or our [https://support.scinet.utoronto.ca/wiki/index.php/Infiniband_Upgrade Infiniband Upgrade] page to support@scinet.utoronto.ca.
</div>

Thu 19 Apr 2012 19:43:46 EDT

([[Previous_messages:|Previous messages]])

'''Latest revision:'''

== System Status ==

<!--
  Notes for updating the system status:

  - When removing system status entries, please archive them to:

        http://wiki.scinethpc.ca/wiki/index.php/Previous_messages:

    (yes, the trailing colon is part of the url)

  - The 'status circles' can be one of the following files:

        down.png  for down
        up25.png  for 25% up
        up50.png  for 50% up
        up75.png  for 75% up
        up.png    for 100% up

{|
|[[File:up.png|up|link=https://docs.scinet.utoronto.ca/index.php/Main_Page]][https://docs.scinet.utoronto.ca Niagara]
|-
|[[File:up.png|up|link=BGQ]][[BGQ]]
|[[File:up.png|up|link=P7 Linux Cluster]][[P7 Linux Cluster|P7]]
|[[File:up.png|up|link=P8]][[P8]]
|-
|[[File:up.png|up|link=SOSCIP_GPU]][[SOSCIP_GPU|SGC]]
|[[File:up.png|up|link=Knights Landing]][[Knights Landing|KNL]]
|[[File:down.png|up|link=HPSS]][https://docs.scinet.utoronto.ca/index.php/HPSS HPSS]
|-
|[[File:up.png|up|]]File System
|[[File:up.png|up|]]External Network
|
|}
-->

System status can now be found at [https://docs.scinet.utoronto.ca docs.scinet.utoronto.ca]

<b>Mon 23 Apr 2018</b> GPC-compute is decommissioned; GPC-storage remains available until <font color=red><b>30 May 2018</b></font>.

<b>Thu 18 Apr 2018</b> The Niagara system will undergo an upgrade to its Infiniband network between 9am and 12pm. The upgrade should be transparent to users; however, there is a chance of network interruption.

<b>Fri 13 Apr 2018</b> The HPSS system will be down for a few hours on <b>Mon, Apr/16, 9AM</b> for hardware upgrades, in preparation for the eventual move to the Niagara side.

<b>Tue 10 Apr 2018</b> Niagara is open to users.

<b>Wed 4 Apr 2018</b> We are very close to the production launch of Niagara, the new system installed at SciNet. While the RAC allocation year officially starts today, April 4/18, the Niagara system is still undergoing some final tuning and software updates, so the plan is to officially open it to users next week.

All active GPC users will have their accounts, $HOME, and $PROJECT transferred to the new Niagara system. Those of you who are new to SciNet but received RAC allocations on Niagara will have your accounts created and ready for you to log in.

We are planning an extended [https://support.scinet.utoronto.ca/education/go.php/370/index.php Intro to SciNet/Niagara session], available in person at our office and webcast on Vidyo and possibly other means, on Wednesday April 11 at noon EST.

<!-- [https://support.scinet.utoronto.ca/wiki/index.php/Previous_messages:] -->
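The editing notes in the commented-out section above map an availability level to one of five status-circle images (down.png, up25.png, up50.png, up75.png, up.png). A minimal sketch of that convention as a helper function; `status_icon` is hypothetical and not part of the wiki, it only illustrates the file-naming scheme:

```python
def status_icon(fraction_up):
    """Map the fraction of a system that is up (0.0 to 1.0) to a
    status-circle file name, per the wiki's editing notes."""
    if fraction_up <= 0.0:
        return "down.png"   # fully down
    if fraction_up >= 1.0:
        return "up.png"     # 100% up
    # Snap partial availability to the nearest supported quarter (25/50/75).
    quarter = min(max(round(fraction_up * 4), 1), 3)
    return "up%d.png" % (quarter * 25)
```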

Latest revision as of 14:23, 7 May 2018

System Status

System status can now be found at docs.scinet.utoronto.ca


Mon 23 Apr 2018 GPC-compute is decommissioned; GPC-storage remains available until 30 May 2018.

Thu 18 Apr 2018 The Niagara system will undergo an upgrade to its Infiniband network between 9am and 12pm. The upgrade should be transparent to users; however, there is a chance of network interruption.

Fri 13 Apr 2018 The HPSS system will be down for a few hours on Mon, Apr/16, 9AM for hardware upgrades, in preparation for the eventual move to the Niagara side.

Tue 10 Apr 2018 Niagara is open to users.

Wed 4 Apr 2018 We are very close to the production launch of Niagara, the new system installed at SciNet. While the RAC allocation year officially starts today, April 4/18, the Niagara system is still undergoing some final tuning and software updates, so the plan is to officially open it to users next week.

All active GPC users will have their accounts, $HOME, and $PROJECT transferred to the new Niagara system. Those of you who are new to SciNet but received RAC allocations on Niagara will have your accounts created and ready for you to log in.

We are planning an extended Intro to SciNet/Niagara session, available in person at our office, and webcast on Vidyo and possibly other means, on Wednesday April 11 at noon EST.