
AMD Instinct MI300X

Instinct MI300 · CDNA 3 Architecture

The AMD Instinct MI300X is AMD's flagship data center GPU accelerator, built on the CDNA 3 architecture. With an industry-leading 192GB of HBM3 memory and 5.3 TB/s of bandwidth, it is purpose-built for training and serving the largest AI models. The large memory capacity allows many large language models to fit on a single accelerator, reducing the need for multi-GPU model parallelism.
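As a rough illustration of the single-accelerator claim, the sketch below estimates weight-only memory footprints at common precisions against the 192GB capacity. This is a back-of-envelope assumption: it ignores KV cache, activations, and framework overhead, which shrink the usable headroom in practice.

```python
# Back-of-envelope check: which model sizes fit in 192 GB of HBM3,
# counting weights only (KV cache and activations are ignored).

HBM_GB = 192  # MI300X memory capacity

def weights_gb(params_billions: float, bytes_per_param: int) -> float:
    """Approximate weight footprint in GB (1 GB taken as 1e9 bytes)."""
    return params_billions * bytes_per_param

for params in (70, 180, 405):
    for precision, nbytes in (("FP16", 2), ("INT8", 1)):
        size = weights_gb(params, nbytes)
        verdict = "fits" if size <= HBM_GB else "needs multi-GPU"
        print(f"{params}B @ {precision}: {size:.0f} GB -> {verdict}")
```

By this estimate a 70B-parameter model fits comfortably in FP16 (about 140 GB), while a 180B model only fits after quantizing to 8-bit.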

Key Features

192GB HBM3 memory
5.3 TB/s memory bandwidth
Infinity Fabric interconnect
CDNA 3 matrix cores
OAM form factor
ROCm software stack

Full Specifications

Compute

Architecture CDNA 3
Process Node 5nm TSMC
Compute Units 304
Base Clock 1000 MHz
Boost Clock 2100 MHz
FP32 Performance 163.4 TFLOPS
FP16 Performance 1307.4 TFLOPS
BF16 Performance 1307.4 TFLOPS
INT8 Performance 2614.8 TOPS
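The ratio of peak compute to memory bandwidth indicates how many floating-point operations a kernel must perform per byte moved before it can become compute-bound rather than bandwidth-bound. A minimal roofline sketch using the peak figures from the table above:

```python
# Roofline "ridge point" for the MI300X: the minimum arithmetic
# intensity (FLOPs per byte of HBM traffic) needed to saturate the
# matrix cores instead of the memory bus. Peak figures from the spec.

PEAK_FP16_TFLOPS = 1307.4   # dense FP16 matrix throughput
PEAK_BW_TBPS = 5.3          # HBM3 memory bandwidth

crossover = PEAK_FP16_TFLOPS / PEAK_BW_TBPS  # FLOPs per byte
print(f"FP16 ridge point: {crossover:.0f} FLOPs/byte")
```

Kernels below roughly 247 FLOPs/byte (e.g. elementwise ops, small GEMMs) are limited by the memory system; large matrix multiplies exceed it and are limited by compute.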

Memory

Memory Size 192 GB
Memory Type HBM3
Memory Bus 8192-bit
Memory Bandwidth 5300 GB/s
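The headline bandwidth follows from the bus width and the per-pin data rate. The per-pin rate below is an assumption (around 5.2 Gb/s is the HBM3 speed commonly quoted for the MI300X), used only to show that the figures in the table are mutually consistent:

```python
# Deriving the memory bandwidth from the bus specs above.
# Assumption: ~5.2 Gb/s per pin, the HBM3 data rate typically
# associated with the MI300X's eight stacks.

bus_bits = 8192          # total interface width across the HBM3 stacks
pin_rate_gbps = 5.2      # per-pin data rate in Gb/s (assumed)

bandwidth_gbps = bus_bits / 8 * pin_rate_gbps  # -> GB/s
print(f"{bandwidth_gbps:.0f} GB/s")
```

This yields about 5,325 GB/s, matching the rounded 5.3 TB/s specification.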

Power & Physical

TDP 750W
Form Factor OAM
Power Connectors OAM connector

Features & Connectivity

PCIe Version PCIe 5.0
NVLink Support No
Multi-GPU Support Yes

Availability

MSRP (USD) Contact for pricing
Release Date Dec 2023
Status Available

Use Cases

LLM Training
Large-scale Inference
HPC Simulation
Generative AI
Scientific Computing

Interested in the AMD Instinct MI300X?

Get pricing, availability, and bulk discount information from our team.

Related GPUs

AMD Instinct MI300A: 128GB HBM3 · FP32 122.6 TFLOPS · FP16 981 TFLOPS · TDP 760W · Available

AMD Instinct MI250X: 128GB HBM2e · FP32 47.87 TFLOPS · FP16 383 TFLOPS · TDP 560W · Available

AMD Instinct MI250: 128GB HBM2e · FP32 45.26 TFLOPS · FP16 362.1 TFLOPS · TDP 500W · Available

AMD Instinct MI210: 64GB HBM2e · FP32 22.63 TFLOPS · FP16 181 TFLOPS · TDP 300W · Available