Hardware

A2C2 System Resources

The A2C2 represents ASU’s investment in centralized research computing. It provides high-performance and high-throughput computing with approximately 4,000 cores connected by high-speed InfiniBand interconnects, as well as high-memory computing on a 32-core, 1 TB RAM symmetric multiprocessing (SMP) machine. All compute nodes have access to 400 TB of high-speed scratch storage and 1.5 PB of spinning-disk primary storage. These compute and storage resources are connected via a 10 Gbps fiber link to the rest of campus and to ASU’s 100 Gbps Internet2 Innovation Platform network. The data center in Goldwater 167 is a 1,600 sq ft facility with 750 kW of power, 54 °F / 46 gpm chilled water, and 69 °F / 16,000 cfm chilled air. In addition to the physical infrastructure, the A2C2 hosts a team of advanced operations professionals who coordinate the hardware, software stack, and network cyberinfrastructure.

 

New SMP Machine Added in March 2012

A2C2 has added a new computer to the Saguaro cluster. It has 1 TB of RAM and 32 processors. All users who need access to large memory are welcome to use it. A few things to keep in mind when using the SMP computer:

  • Add a PBS directive to your qsub script that names the queue, e.g.
    #PBS -q smp
  • You may reserve either 16 processors, which allows your program to access 500 GB of RAM, or all 32 processors, allowing access to all 1 TB. Keep in mind that your account will be charged for the number of processors (16 or 32) multiplied by the amount of time your program runs.
  • There is a maximum wall-clock time of 4 days for this queue.
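Putting the points above together, a minimal qsub script for the SMP queue might look like the following sketch. The job name and the final command are placeholders, not A2C2-specific names; substitute your own program.

```shell
#!/bin/bash
# Submit to the SMP queue described above.
#PBS -q smp
# Reserve 16 of the 32 processors (grants access to 500 GB of RAM);
# use ppn=32 if the job needs the full 1 TB.
#PBS -l nodes=1:ppn=16
# Stay under the 4-day wall-clock limit for this queue.
#PBS -l walltime=96:00:00
#PBS -N large-memory-job

# Run from the submission directory. PBS sets PBS_O_WORKDIR on the cluster;
# default to "." so the script also behaves outside the scheduler.
cd "${PBS_O_WORKDIR:-.}"

# Placeholder for your actual large-memory program.
msg="job would launch here"
echo "$msg"
```

Submit it with `qsub myjob.pbs`; the #PBS lines are read by the scheduler and ignored by the shell.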

Please do not submit a job that requests 1 processor and expect to use all 1 TB of RAM: doing so will crash other users’ jobs, and you will be responsible for bringing down the machine. Keep in mind that there is only one of these machines, so the wait for it could be long.

If you have any questions or concerns, please drop a note to support@hpchelp.asu.edu.

Saguaro System Upgrades – 2012

A2C2’s Saguaro system currently contains over 5,000 processor cores; 215 TB of high-speed parallel Lustre scratch space (scalable to petabytes); 11 TB of aggregate RAM and 1.5 GB of aggregate L2 cache; a double fat-tree DDR InfiniBand network; and more than 40 teraFLOPS of computational power.

A2C2 recently purchased capital equipment from Dell to upgrade the system. With thirty-two new Dell M610 computers, each with dual 6-core Intel Westmere processors and 24 GB of RAM, A2C2 hopes to help researchers maximize their supercomputing potential. The new computers are connected to a quad data rate (QDR) InfiniBand switch. By comparison, Saguaro’s older computers have dual 4-core processors and 16 GB of RAM and are connected to dual data rate (DDR) InfiniBand.

In addition, A2C2 purchased two new Dell PowerEdge R710 computers that will be used as “data movers.” These computers are connected to both the QDR and DDR InfiniBand networks, with 4 Gbps Ethernet uplinks to the University’s wide area network (WAN) and 10 Gbps uplinks to ASU’s research network. They also have other network connection capabilities for future use. They will be used primarily to transfer data to and from the Saguaro high-speed scratch storage area, users’ home directories, and ETS storage shares.


 