2 H200 DEEP LEARNING AI SERVER
- GPU: 2 × NVIDIA H200 NVL 141GB 600W or 2 × NVIDIA RTX PRO 6000 Blackwell 96GB Server Edition 600W
- CPU: Single AMD EPYC 9965 processor (192 cores / 384 threads, 500W)
- System Memory: 3TB (12 DIMM)
- STORAGE: NVMe Gen 5
About
SYMMATRIX® 2 H200/RTX PRO 6000 GPU Deep Learning System is a scalable, configurable 2U rackmount system supporting two double-width NVIDIA H200 NVL 141GB or RTX PRO 6000 Blackwell Server Edition GPUs, driven by a single AMD EPYC 9965 processor (192 cores / 384 threads, 500W).
SYMMATRIX provides a full range of AI GPU systems, from 1 to 10 GPUs, customized to your requirements.
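For reference, the following is a minimal sketch of confirming that both installed GPUs are visible to a deep learning stack after deployment. It assumes a CUDA-enabled PyTorch build is installed on the system (software is not part of this product listing), and the reported device names and memory figures depend on the GPU option ordered.

```python
# Minimal check that both installed GPUs are visible.
# Assumption: a CUDA-enabled PyTorch build is installed on the deployed system.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GiB")
else:
    print("No CUDA-capable GPU detected")
```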
Specification
GPU
Options:
2 × NVIDIA H200 NVL 141GB 600W or 2 × NVIDIA RTX PRO 6000 Blackwell 96GB Server Edition 600W
CPU
Single AMD EPYC 9965 processor (192 cores / 384 threads, 500W)
System Memory
12 DIMM slots
3TB DDR5-6400 ECC RDIMM
Storage
Type A: 16 × 2.5" NVMe/SATA/SAS4 + 8 × 2.5" SATA/SAS4 drives
Type B: 12 × 3.5" NVMe/SATA/SAS4 drives
Network
2 × 10GbE ports + 1 × IPMI port
Optional 100G to 800G high-speed networking
System Weight
Gross weight: ~25 kg
System Dimension
19" (W) × 2U (H) × 740mm (D)
Maximum Power Requirements
2400W, 1+1 redundant, Titanium efficiency
Operating Temperature Range
0°C to 35°C
Support & Warranty
Three or five years of on-site parts and service, 8x5 with next-business-day (NBD) response
You May Also Like
10 GPU 2 XEON DEEP LEARNING AI SERVER
SKU: SMX-R4051
- GPU: 10 × NVIDIA H100, A100, L40, A40, RTX 6000 (2-slot GPUs)
- CPU: 2 × 4th Generation Intel Xeon Scalable processors
- System Memory: 8TB (32 DIMM)
- STORAGE: NVMe
4 GPU 1 EPYC DEEP LEARNING AI SERVER
SKU: SMXB8021G88
- GPU: 4 × NVIDIA A100, V100, RTX A6000, A40, RTX 8000, T4
- CPU: 64 cores (1 × AMD EPYC Rome)
- PCIe Gen 4.0 support
- System Memory: 2TB (16 DIMM)
- 2 × 2.5" SATA SSD hot-swap bays, 2 × NVMe M.2 SSDs
- 1U Rackmount
RTX PRO SERVER
SKU: SMX-R4051-1
- GPU: 8 × NVIDIA RTX PRO 6000 Blackwell Server Edition
- GPU Memory: 8 × 96GB GDDR7 with ECC = 768GB