Image may not exactly match the product.

NVIDIA Tesla A100 - GPU Computing Processor - NVIDIA Tesla A100 - 40 GB - For P/N: HX240C-M5SX-RF, HX-C240-M6L, HX-C240-M6N, HX-C240-M6S, HX-C240-M6SN, HX-C240-M6SX - HX-GPU-A100

$52,046.94 $21,719.19
(You save $30,327.75)


Out of Stock.

Request a quote for re-stock information.

Currently out of stock; pre-order may be available.

Product Line: NVIDIA Tesla
Graphics Processor Manufacturer: NVIDIA
Graphics Processor: NVIDIA Tesla A100
Video Memory Installed Size: 40 GB


The NVIDIA A100 Tensor Core GPU builds on the capabilities of the prior NVIDIA Tesla V100 GPU, adding many new features while delivering significantly faster performance for HPC, AI, and data-analytics workloads. Powered by the NVIDIA Ampere architecture-based GA100 GPU, the A100 scales strongly for GPU compute and deep learning applications running in single- and multi-GPU workstations, servers, clusters, cloud data centers, edge systems, and supercomputers, enabling elastic, versatile, high-throughput data centers.
The A100 GPU includes a Multi-Instance GPU (MIG) virtualization and GPU-partitioning capability that is particularly beneficial to cloud service providers (CSPs). When configured for MIG operation, the A100 lets CSPs improve the utilization of their GPU servers, delivering up to 7x more GPU instances at no additional cost, while robust fault isolation allows customers to partition a single A100 GPU safely and securely. The A100 also adds powerful third-generation Tensor Cores that boost throughput over V100 and add comprehensive support for DL and HPC data types, together with a Sparsity feature that delivers a further doubling of throughput.

Specs Overview

Detailed Specs