
NVIDIA A10

A-Series · Ampere Architecture

The NVIDIA A10 is a versatile single-slot data center GPU for AI inference, virtual desktop, and graphics workloads. Its compact 150W design enables dense deployments in standard servers.

Key Features

Single-slot design
150W TDP
RT Cores for ray tracing
4x DisplayPort 1.4 outputs
NVENC/NVDEC hardware video engines

Full Specifications

Compute

Architecture: Ampere
Process Node: 8nm Samsung
CUDA Cores: 9,216
Tensor Cores: 288
RT Cores: 72
Base Clock: 885 MHz
Boost Clock: 1695 MHz
FP32 Performance: 31.24 TFLOPS
FP16 Tensor Performance: 125 TFLOPS
BF16 Tensor Performance: 125 TFLOPS
INT8 Tensor Performance: 250 TOPS
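The peak FP32 figure follows directly from the CUDA core count and boost clock, since each core can retire one fused multiply-add (2 FLOPs) per cycle. A quick sanity check in Python:

```python
# Peak FP32 = CUDA cores x 2 FLOPs/cycle (FMA) x boost clock
cuda_cores = 9216
boost_clock_hz = 1695e6  # 1695 MHz

fp32_tflops = cuda_cores * 2 * boost_clock_hz / 1e12
print(f"{fp32_tflops:.2f} TFLOPS")  # 31.24 TFLOPS
```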

Memory

Memory Size: 24 GB
Memory Type: GDDR6
Memory Bus: 384-bit
Memory Bandwidth: 600 GB/s
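The bandwidth figure is consistent with the bus width and an effective per-pin data rate of 12.5 Gbps (the rate implied by the listed 600 GB/s; not stated explicitly in the spec). A minimal check:

```python
# Bandwidth = bus width (bits) x per-pin data rate / 8 bits-per-byte
bus_width_bits = 384
data_rate_gbps = 12.5  # effective GDDR6 rate implied by the listed bandwidth

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gbs:.0f} GB/s")  # 600 GB/s
```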

Power & Physical

TDP: 150W
Form Factor: PCIe
Slot Width: 1-slot
Card Length: 267 mm
Power Connectors: 1x 8-pin

Features & Connectivity

PCIe Version: PCIe 4.0 x16
NVLink Support: No
Multi-GPU Support: No
Display Outputs: 4x DisplayPort 1.4

Availability

MSRP (USD): Contact for pricing
Release Date: April 2021
Status: Available

Use Cases

AI Inference
Virtual Desktop
Graphics Rendering
Video Encoding

Interested in the NVIDIA A10?

Get pricing, availability, and bulk discount information from our team.


Related GPUs

NVIDIA H100 SXM: 80GB HBM3 | FP32 66.91 TFLOPS | FP16 989.4 TFLOPS | TDP 700W | Available
NVIDIA H100 PCIe: 80GB HBM3 | FP32 51.22 TFLOPS | FP16 756 TFLOPS | TDP 350W | Available
NVIDIA H200 SXM: 141GB HBM3e | FP32 66.91 TFLOPS | FP16 989.4 TFLOPS | TDP 700W | Available
NVIDIA B200: 192GB HBM3e | FP32 90 TFLOPS | FP16 1800 TFLOPS | TDP 1000W | Available
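For inference-focused deployments, peak FP16 throughput per watt is one way to compare the A10 against the related GPUs listed above. The sketch below uses only the spec values on this page; TFLOPS/W from peak numbers and TDP is a rough proxy, not a benchmark result.

```python
# FP16 Tensor TFLOPS per watt, computed from the spec values on this page
gpus = {
    "NVIDIA A10":       (125.0, 150),
    "NVIDIA H100 SXM":  (989.4, 700),
    "NVIDIA H100 PCIe": (756.0, 350),
    "NVIDIA H200 SXM":  (989.4, 700),
    "NVIDIA B200":      (1800.0, 1000),
}

for name, (fp16_tflops, tdp_w) in gpus.items():
    print(f"{name}: {fp16_tflops / tdp_w:.2f} TFLOPS/W")
# NVIDIA A10: 0.83 TFLOPS/W
# NVIDIA H100 PCIe: 2.16 TFLOPS/W (highest of the cards listed here)
```

Note the A10's advantage is density and cost per slot rather than raw efficiency: at 150W and single-slot width, several cards fit where one 700W SXM module would.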