High-Performance Center Overview

The HPC Advisory Council High-Performance Center offers an environment for developing, testing, benchmarking, and optimizing products based on clustering technology. The center, located in Sunnyvale, California, provides on-site technical support and enables secure sessions on site or remotely.

The High-Performance Center provides unique access to the latest systems, CPUs, and InfiniBand/Ethernet networking technologies, often before they reach public availability, and serves as an environment for developing, testing, and tuning applications.
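As an illustration of the kind of application development and benchmarking work the center is set up for, below is a minimal MPI ping-pong latency sketch of the sort commonly used to exercise an InfiniBand fabric. It assumes an MPI implementation (such as Open MPI or MPICH) is available on the clusters; the message size and iteration count are arbitrary illustrative choices, not settings prescribed by the center.

    /*
     * Minimal MPI ping-pong latency sketch (illustrative only).
     * Assumes an MPI library such as Open MPI or MPICH is installed;
     * module names and launch commands are site-specific.
     */
    #include <mpi.h>
    #include <stdio.h>

    #define ITERATIONS 1000
    #define MSG_SIZE   8              /* bytes per message */

    int main(int argc, char **argv)
    {
        int rank, nranks;
        char buf[MSG_SIZE] = {0};

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nranks);

        if (nranks < 2) {
            if (rank == 0)
                fprintf(stderr, "run with at least 2 ranks\n");
            MPI_Finalize();
            return 1;
        }

        MPI_Barrier(MPI_COMM_WORLD);
        double t0 = MPI_Wtime();

        /* Rank 0 and rank 1 bounce a small message back and forth. */
        for (int i = 0; i < ITERATIONS; i++) {
            if (rank == 0) {
                MPI_Send(buf, MSG_SIZE, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, MSG_SIZE, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(buf, MSG_SIZE, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(buf, MSG_SIZE, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
            }
        }

        double t1 = MPI_Wtime();
        if (rank == 0)
            printf("avg one-way latency: %.2f us\n",
                   (t1 - t0) * 1e6 / (2.0 * ITERATIONS));

        MPI_Finalize();
        return 0;
    }

Such a test would typically be compiled with mpicc and launched with mpirun (or the site scheduler's equivalent) across two nodes, so that the measured latency reflects the InfiniBand fabric rather than shared memory.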


Currently Available Systems


Rome


  • Daytona_X AMD 8-node cluster
  • Dual Socket AMD EPYC 7742 64-Core Processor @ 2.25GHz
  • Mellanox ConnectX-6 HDR 200Gb/s InfiniBand/Ethernet
  • Mellanox Quantum QM8700 40-Port 200Gb/s HDR InfiniBand switch
  • Memory: 256GB DDR4 2666MHz RDIMMs per node
  • Storage: Lustre, NFS

Venus


  • Supermicro AS -2023US-TR4 8-node cluster
  • Dual Socket AMD EPYC 7551 32-Core Processor @ 2.00GHz
  • Mellanox ConnectX-5 EDR 100Gb/s InfiniBand/Ethernet
  • Mellanox Switch-IB 2 SB7800 36-Port 100Gb/s EDR InfiniBand switches
  • Memory: 256GB DDR4 2666MHz RDIMMs per node
  • 240GB 2.5" SSD per node

Helios


  • Supermicro SYS-6029U-TR4 / Foxconn Groot 1A42USF00-600-G 32-node cluster
  • Dual Socket Intel(R) Xeon(R) Gold 6138 CPU @ 2.00GHz
  • Mellanox ConnectX-6 HDR/HDR100 200/100Gb/s InfiniBand/VPI adapters with Socket Direct
  • Mellanox Quantum QM8700 40-Port 200Gb/s HDR InfiniBand switch
  • Memory: 192GB DDR4 2666MHz RDIMMs per node
  • 1TB 7.2K RPM 2.5" hard drive per node

Telesto


  • IBM S822LC POWER8 8-node cluster
  • Dual Socket IBM POWER8 10-core CPUs @ 2.86 GHz
  • Mellanox ConnectX-4 EDR 100Gb/s InfiniBand adapters
  • Mellanox Switch-IB SB7700 36-Port 100Gb/s EDR InfiniBand switch
  • Memory: 256GB DDR3 PC3-14900 RDIMMs per node
  • 1TB 7.2K RPM 6.0 Gb/s SATA 2.5" hard drive per node
  • GPU: NVIDIA Kepler K80 GPUs

Thor


  • Dell PowerEdge R730/R630 36-node cluster
  • Dual Socket Intel® Xeon® 16-core CPUs E5-2697A V4 @ 2.60 GHz
  • Mellanox ConnectX-6 HDR100 100Gb/s InfiniBand/VPI adapters
  • Mellanox Switch-IB 2 SB7800 36-Port 100Gb/s EDR InfiniBand switches
  • Mellanox Connect-IB® Dual FDR 56Gb/s InfiniBand adapters
  • Mellanox SwitchX SX6036 36-Port 56Gb/s FDR VPI InfiniBand switches
  • Memory: 256GB DDR4 2400MHz RDIMMs per node
  • 1TB 7.2K RPM SATA 2.5" hard drives per node

Odin


  • Colfax CX2660s-X6 2U 4-node cluster
  • Dual Socket Intel® Xeon® 14-core CPUs E5-2697 V3 @ 2.60 GHz
  • Mellanox ConnectX-4® EDR InfiniBand and 100Gb/s Ethernet VPI adapters
  • Mellanox Switch-IB SB7700 36-Port 100Gb/s EDR InfiniBand switches
  • GPU: NVIDIA Kepler K80 GPUs
  • Memory: 64GB DDR4 2133MHz RDIMMs per node

Ops


  • Colfax CX1350s-XK5 1U 4-node cluster
  • Based on Supermicro SYS-1027GR-TRF
  • Dual Socket Intel® Xeon® 10-core E5-2680 V2 CPUs @ 2.80 GHz
  • NVIDIA Tesla P100 PCIe-3 x16, 16GB HBM2
  • Mellanox ConnectX-4 EDR 100Gb/s InfiniBand adapters
  • Mellanox Switch-IB SB7700 36-Port 100Gb/s EDR InfiniBand switches
  • Mellanox ConnectX®-3 56Gb/s FDR InfiniBand and Ethernet VPI HCA
  • Mellanox SwitchX SX6036 36-Port 56Gb/s FDR InfiniBand switch
  • 500GB 7.2K RPM SATA 2.5" 6Gbps hard drive
  • Memory: 32GB DDR3 1600MHz dual-rank DIMMs

Jupiter

  • Dell™ PowerEdge™ R720xd/R720 32-node cluster
  • Dual Socket Intel® Xeon® 10-core CPUs E5-2680 V2 @ 2.80 GHz
  • Mellanox ConnectX-4 EDR 100Gb/s InfiniBand adapter
  • Mellanox Switch-IB SB7700 36-Port 100Gb/s EDR InfiniBand switches
  • Mellanox ConnectX®-3 VPI 56Gb/s FDR InfiniBand adapters
  • Mellanox Connect-IB® FDR InfiniBand adapters
  • Mellanox SwitchX SX6036 36-Port 56Gb/s FDR InfiniBand switch
  • R720xd: 24x 250GB 7.2K RPM SATA 2.5" hard drives per node
  • R720: 16x 250GB 7.2K RPM SATA 2.5" hard drives per node with 1 GPU
  • Memory: 64GB DDR3 1600MHz RDIMMs per node
  • GPU: NVIDIA Kepler K40, K20x and K20 GPUs

Juno

  • 1x GIGABYTE R270-T64 Chassis
    • 2 x Cavium ThunderX 48-core ARM processors
    • Memory: 64GB DDR4 2400 MHz
    • Mellanox ConnectX-4 EDR 100Gb/s InfiniBand/VPI adapter
    • SSD 480GB SATA 3
  • 2x GIGABYTE MT30-GS0 Chassis
    • 1x Cavium ThunderX 32-core ARM Processor
    • Memory: 128GB DDR4 2400 MHz
    • Mellanox ConnectX-5 EDR 100Gb/s InfiniBand/VPI adapter
    • SSD 1TB SATA 3
  • Switch: Mellanox Switch-IB 2 SB7800 36-Port 100Gb/s EDR InfiniBand switch

Hali

  • 4x Huawei Taishan 2280 Servers
  • Dual Socket Huawei Kunpeng 916 (Hi1616) 32-core ARMv8 processors @ 2.4GHz (85W TDP)
  • Memory: 128GB DDR4 2400 MHz
  • Storage: 2x 4TB 7.2K 6Gbps SATA drives
  • Mellanox ConnectX-5 Socket Direct EDR 100Gb/s InfiniBand/VPI adapter
  • Mellanox Switch-IB 2 SB7800 36-Port 100Gb/s EDR InfiniBand switches

Lustre Storage


  • 2U Intel S2600iP Server
  • Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
  • Memory: 64GB DDR3 1333 MT/s
  • OS: CentOS 7
  • OST: 8TB Intel SSD
  • Network: InfiniBand

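As a companion illustration, the sketch below uses MPI-IO to issue a collective parallel write of the kind that might be used to exercise the Lustre or NFS storage listed above; the output path (/lustre/scratch/io_test.dat) and the per-rank block size are placeholder assumptions, not actual mount points or recommended settings at the center.

    /*
     * Minimal MPI-IO collective write sketch (illustrative only).
     * Each rank writes one contiguous 4 MiB block at a rank-based offset.
     */
    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define BLOCK_INTS (1 << 20)      /* 4 MiB of ints per rank */

    int main(int argc, char **argv)
    {
        int rank, nranks;
        MPI_File fh;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nranks);

        int *buf = malloc(BLOCK_INTS * sizeof(int));
        for (int i = 0; i < BLOCK_INTS; i++)
            buf[i] = rank;

        MPI_Offset offset = (MPI_Offset)rank * BLOCK_INTS * sizeof(int);

        double t0 = MPI_Wtime();
        /* The path below is a placeholder; an actual run would use a
         * directory on the Lustre (or NFS) file system. */
        MPI_File_open(MPI_COMM_WORLD, "/lustre/scratch/io_test.dat",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
        MPI_File_write_at_all(fh, offset, buf, BLOCK_INTS, MPI_INT,
                              MPI_STATUS_IGNORE);
        MPI_File_close(&fh);
        double t1 = MPI_Wtime();

        if (rank == 0)
            printf("%d ranks wrote %.1f MiB in %.3f s\n", nranks,
                   (double)nranks * BLOCK_INTS * sizeof(int) / (1024.0 * 1024.0),
                   t1 - t0);

        free(buf);
        MPI_Finalize();
        return 0;
    }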

The HPC Advisory Council would also like to thank the following equipment providers for their generous donations throughout the High-Performance Center's history.