
Sonic HPC

Next Generation Sonic HPC/AI Implementation

The Next Generation Sonic HPC/AI Implementation was set up to replace end-of-life infrastructure on our existing "Sonic" HPC and to bring increased resources to the UCD research community. HEA funding was secured through the UCD Research Office.

For updates, please see our project page.

The UCD Sonic High Performance Computing (HPC) cluster is a group of high-powered servers networked together to tackle large computational tasks. The Sonic HPC cluster is a shared campus resource open to all researchers.

Research IT provides high performance computing through Sonic. For details on getting an account, the user guide, the software provided, and the cluster hardware, see below.

For help, advice, or additional software requirements, please see the IT Support Hub.

Sonic HPC

Research IT HPC Cluster
Cluster Name: Sonic
Cluster Usage: Real-time information (visible using the staff and research VPNs)
No. of Compute Nodes: 53
Total Number of Cores: 1468 (hyperthreading disabled)
Processor Speed:

- 20 nodes with 2 x Intel Xeon 6152 (2.1 GHz, 22 cores each)
- 4 nodes with 2 x Intel Xeon 6140 (2.3 GHz, 18 cores each)
- 24 nodes with 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores each)
- 4 nodes with 2 x Intel E5-2620 v2 (2.1 GHz, 6 cores each)
- 1 node with 2 x Intel E5-2620 v3 (2.4 GHz, 6 cores each)

Memory per Node: 128 GB (24 nodes), 256 GB (7), 384 GB (20), 786 GB (1), 1.5 TB (1)
Interconnect: InfiniBand QDR, 40 Gb/s
Home Directory Quota: 50 GB
Scratch Space: 180 TB
Additional Node Types:

- MEM2 (high memory): 765 GB RAM, 4 x 2.4 GHz CPUs (6 cores each)
- MEM3 (high memory): 1.5 TB RAM
- 3 GPU servers, each with 2 x NVIDIA Tesla V100, 2.3 GHz CPUs, 256 GB RAM

 

There is no direct external access to the cluster. If you have a @ucd.ie email address you can apply for a UCD VPN account using the VPN request form and then access the cluster over the VPN. Please ensure you are logged into UCD Connect to access the form.
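Once the VPN is up, access to the cluster is typically over SSH. As a convenience, the login details can be kept in a client-side SSH configuration; the host name and alias below are placeholders (the actual login node address and your username are issued with your Sonic account), so treat this as a sketch rather than the documented procedure:

```ssh_config
# ~/.ssh/config -- hypothetical entry; replace HostName and User with
# the details issued by Research IT when your Sonic account is created.
Host sonic
    HostName sonic.ucd.ie      # placeholder address, not confirmed by this page
    User your_ucd_username     # placeholder username
```

With an entry like this in place, `ssh sonic` opens a session whenever the staff or research VPN is connected.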

Sonic Topology

The following is the hardware specification purchased for the upgrade of the Sonic cluster. The interconnect has been upgraded to InfiniBand, and a parallel file system (BeeGFS) has been implemented across four storage servers. New login and head nodes complete the infrastructure environment.

There are 20 new standard compute nodes, which no longer have hyperthreading enabled; each has 384 GB of RAM and 2 Intel Xeon Gold 6152 CPUs (22 cores each). There are 3 GPU servers, each containing 2 NVIDIA Tesla V100s. The School of Computer Science has contributed to the hardware purchase of the cluster, and its users have larger entitlements to that hardware.

Home directories have quotas implemented at 50 GB. The parallel scratch storage is 180 TB in size and is shared across the cluster. This storage is for computational use only, not long-term storage of data. In order for the cluster to remain online, files older than 6 months will be removed from this storage.
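The 6-month retention policy above can be checked from the user side before the cleanup catches you out. A minimal sketch, assuming a POSIX shell with GNU coreutils/findutils and that your scratch data lives somewhere like /scratch/$USER (the actual path is not stated on this page); the example runs against a temporary directory so it is safe to try anywhere:

```shell
#!/bin/sh
# List files not modified in roughly 6 months (180 days) -- candidates for
# removal under the scratch retention policy. Demonstrated on a temporary
# directory; point SCRATCH_DIR at your real scratch area (path hypothetical).
SCRATCH_DIR=$(mktemp -d)

touch "$SCRATCH_DIR/recent.dat"                 # modified just now
touch -d "2020-01-01" "$SCRATCH_DIR/stale.dat"  # backdated well past 180 days

# -mtime +180: last modified more than 180 days ago
find "$SCRATCH_DIR" -type f -mtime +180
```

Only `stale.dat` is listed; anything this command prints in your real scratch area is liable to be removed, so copy it elsewhere if you still need it.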

Storage Nodes | Manufacturer | Model | CPU Spec | Hard Drive Spec | Memory/GB
Storage 1 | Dell | R740XD | 2 x Intel Xeon Gold 6136 | 24 x 2 TB 10K RPM SAS 12Gbps | 384
Storage 2 | Dell | R740XD | 2 x Intel Xeon Gold 6136 | 24 x 2 TB 10K RPM SAS 12Gbps | 384
Storage 3 | Dell | R740XD | 2 x Intel Xeon Gold 6246 | 24 x 2 TB 10K RPM SAS 12Gbps | 384
MetaData 1 | Dell | R740XD | 2 x Intel Xeon Gold 6136 | 24 x 2.4 TB 10K RPM SAS 12Gbps + 2 x 800 GB SSD Write Intensive 12Gbps | 384
MetaData 2 | Dell | R740XD | 2 x Intel Xeon Gold 6136 | 24 x 2.4 TB 10K RPM SAS 12Gbps + 2 x 800 GB SSD Write Intensive 12Gbps | 384
Login Nodes | Manufacturer | Model | CPU Spec | Memory/GB
Login Node | Dell | R640 | 2 x Intel Xeon Gold 5118 | 256
Head Node | Dell | R640 | 2 x Intel Xeon Gold 5118 | 256
Compute Nodes | Manufacturer | Model | CPU Spec | Memory/GB
Sonic 1 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 2 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 3 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 4 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 5 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 6 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 7 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 8 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 21 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2620 v2 (2.1 GHz, 6 cores) | 256
Sonic 22 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2620 v2 (2.1 GHz, 6 cores) | 256
Sonic 23 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2620 v2 (2.1 GHz, 6 cores) | 256
Sonic 24 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2620 v2 (2.1 GHz, 6 cores) | 256
Sonic 25 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 26 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 27 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 28 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 29 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 30 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 31 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 32 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 33 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 34 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 35 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 36 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 37 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 38 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 39 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 40 (to be decommissioned) | Dell | C6220 V2 | 2 x Intel E5-2660 v2 (2.2 GHz, 10 cores) | 128
Sonic 43 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 44 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 45 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 46 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 47 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 48 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 49 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 50 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 51 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 52 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 53 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 54 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 55 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 56 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 57 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 58 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 59 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 60 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 61 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 62 | Dell | R640 | 2 x Intel Xeon Gold 6152 (2.1 GHz, 22 cores) | 384
Sonic 63 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 64 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 65 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 66 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 67 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 68 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 69 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 70 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 71 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 72 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 73 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
Sonic 74 | Dell | R640 | 2 x Intel Xeon Gold 6252 (2.1 GHz, 24 cores) | 384
GPU Nodes | Manufacturer | Model | CPU Spec & GPU Spec | Memory/GB
GPU 1 | Dell | R740XD | 2 x Xeon 6140 (2.3 GHz, 18 cores) & 2 x NVIDIA Tesla V100 (32 GB) | 256
GPU 2 | Dell | R740XD | 2 x Xeon 6140 (2.3 GHz, 18 cores) & 2 x NVIDIA Tesla V100 (32 GB) | 256
GPU 3 | Dell | R740XD | 2 x Xeon 6140 (2.3 GHz, 18 cores) & 2 x NVIDIA Tesla V100 (32 GB) | 256
GPU 4 | Dell | R740 | 2 x Xeon 6240 (2.1 GHz, 18 cores) & 2 x NVIDIA Tesla V100 (32 GB) | 384
GPU 5 | Dell | R740 | 2 x Xeon 6240 (2.1 GHz, 18 cores) & 2 x NVIDIA Tesla V100 (32 GB) | 384
GPU 6 | Dell | R7525 | 2 x AMD 7452 (2.35 GHz, 32 cores) & 2 x NVIDIA A100 (40 GB) | 384
GPU 7 | Dell | R7525 | 2 x AMD 7452 (2.35 GHz, 32 cores) & 2 x NVIDIA A100 (40 GB) | 384
High Memory Node | Manufacturer | Model | CPU Spec | Memory/GB
Mem3 | Dell | R640 | 2 x Intel Xeon 6140 (2.3 GHz, 18 cores) | 1536
Below is a topology diagram of the infrastructure servers on Sonic, illustrating both the compute InfiniBand network and the Ethernet services network.

UCD IT Services

Computer Centre, University College Dublin, Belfield, Dublin 4, Ireland.

Contact us via the UCD IT Support Hub: www.ucd.ie/ithelp