New AMD Instinct MI300X and MI300A chips are now available to customers through a number of OEM partners.
The chips are aimed at organizations that want to build their own hardware for running artificial intelligence (AI) workloads and training large language models (LLMs).
The MI300X accelerators in particular offer high memory bandwidth – which AMD claims is “industry leading” based on its own tests against rivals’ published figures – and are optimized for LLM training and inference. Compared with the previous-generation AMD Instinct MI250X accelerators, the MI300X, built on the AMD CDNA 3 architecture, delivers 1.5 times more memory capacity, 1.7 times more peak theoretical memory bandwidth, and nearly 40% more compute units.
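In practice, frameworks reach these accelerators through AMD's ROCm software stack. The sketch below is an illustration rather than AMD or OEM sample code: it assumes a ROCm-enabled PyTorch build, under which an MI300X is exposed through PyTorch's standard "cuda" device interface, so existing LLM inference code typically needs little modification.

```python
# Minimal sketch (assumption: a ROCm-enabled PyTorch build on an AMD GPU).
import torch

def pick_device() -> torch.device:
    # torch.cuda.is_available() reports True for ROCm GPUs as well as NVIDIA ones.
    if torch.cuda.is_available():
        print(f"Accelerator found: {torch.cuda.get_device_name(0)}")
        # torch.version.hip is set only on ROCm builds (None on CUDA builds).
        print(f"ROCm/HIP runtime: {torch.version.hip}")
        return torch.device("cuda")
    print("No accelerator found, falling back to CPU")
    return torch.device("cpu")

device = pick_device()

# Illustrative only: move a layer to the accelerator and run a forward pass in
# half precision, a common setup for LLM inference on high-bandwidth GPUs.
dtype = torch.float16 if device.type == "cuda" else torch.float32
layer = torch.nn.Linear(4096, 4096).to(device, dtype=dtype)
x = torch.randn(1, 4096, device=device, dtype=dtype)
with torch.no_grad():
    y = layer(x)
print(y.shape)
```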
The MI300A, meanwhile, is an accelerated processing unit (APU) that uses AMD’s 3D packaging and 4th-generation Infinity Architecture, combining AMD CDNA 3 GPU cores with Zen 4 CPU cores and 128GB of HBM3 memory. According to AMD, the MI300A delivers approximately 1.9 times the performance per watt on FP32 HPC and AI workloads versus the Instinct MI250X.
Current OEM partners offering products containing MI300X and MI300A chips include:
- Dell Technologies, which showcased the Dell PowerEdge XE9680 server containing eight AMD Instinct MI300 Series accelerators at AMD's Advancing AI event in December.
- Hewlett Packard Enterprise (HPE). Its Cray Supercomputing EX255a supercomputing accelerator blade, shown at the same event, is the first to be powered by AMD Instinct MI300 APUs.
- Supermicro. Its new H13 generation of accelerated servers features AMD Instinct MI300 Series accelerators and 4th generation AMD EPYC CPUs.
- Oracle, which plans to add AMD Instinct MI300X-based instances to support its Oracle Cloud Infrastructure (OCI) Supercluster with ultrafast RDMA networking.
“AMD Instinct MI300 Series accelerators are designed with our most advanced technologies, delivering leadership performance, and will be in large scale cloud and enterprise deployments,” said Victor Peng, president of AMD.
“By leveraging our leadership hardware, software and open ecosystem approach, cloud providers, OEMs and ODMs are bringing to market technologies that empower enterprises to adopt and deploy AI-powered solutions.”