Advanced Micro Devices said it is unveiling its AMD Instinct MI300X and MI300A accelerator chips for AI processing in data centers.
AMD CEO Lisa Su announced the new AMD Instinct chips at the company’s data center event today. In the third-quarter analyst call on October 31, Su said she expected MI300 to be the fastest product in AMD history to ramp to $1 billion in sales.
AMD is also introducing its AMD Ryzen 8040 Series processors, previously code-named Hawk Point, for AI-based laptops.
AMD also touted the neural processing units (NPUs) in its chips for on-device AI processing. The Santa Clara, California-based company said millions of Ryzen-based AI PCs have shipped in 2023 across big computer makers.
“It’s another huge moment for PCs,” said Drew Prairie, head of communications at AMD. “The AI performance will take a step up from the performance in the market now.”
On Llama 2, the 8040 series will deliver 1.4 times the performance of the current Ryzen chips that began shipping in the second quarter. AMD is also working on next-gen Ryzen processors, code-named Strix Point, with the AMD XDNA 2 architecture and an NPU for generative AI; they are slated to ship in 2024.
AMD Instinct MI300 Series
AMD showed off hardware from Dell Technologies, Hewlett Packard Enterprise, Lenovo, Meta, Microsoft, Oracle, Supermicro and others.
And AMD said its ROCm 6 open software ecosystem, combined with the next-gen hardware, delivers roughly an eight-fold generational performance increase, powers advances in generative AI and simplifies deployment of AMD AI solutions.
The AMD Instinct MI300X accelerators have industry-leading memory bandwidth for generative AI and leadership performance for large language model (LLM) training and inference. The AMD Instinct MI300A accelerated processing unit (APU) combines a CPU and a GPU in the same package, pairing the latest AMD CDNA 3 architecture with Zen 4 CPU cores to deliver performance for high-performance computing (HPC) and AI workloads.
“AMD Instinct MI300 Series accelerators are designed with our most advanced technologies, delivering leadership performance, and will be in large scale cloud and enterprise deployments,” said Victor Peng, president at AMD, in a statement. “By leveraging our leadership hardware, software and open ecosystem approach, cloud providers, OEMs and ODMs are bringing to market technologies that empower enterprises to adopt and deploy AI-powered solutions.”
Customers leveraging the latest AMD Instinct accelerator portfolio include Microsoft, which recently announced the new Azure ND MI300x v5 Virtual Machine (VM) series, optimized for AI workloads and powered by AMD Instinct MI300X accelerators.
Additionally, El Capitan – a supercomputer powered by AMD Instinct MI300A APUs and housed at Lawrence Livermore National Laboratory – is expected to be the second exascale-class supercomputer powered by AMD, delivering more than two exaflops of double-precision performance when fully deployed. Dell, HPE and Supermicro announced new systems, and Lenovo said its support for the accelerators will be ready in the first half of 2024.
Today’s LLMs continue to increase in size and complexity, requiring massive amounts of memory and compute. AMD Instinct MI300X accelerators feature a best-in-class 192 GB of HBM3 memory capacity as well as 5.3 TB/s of peak memory bandwidth to deliver the performance needed for increasingly demanding AI workloads.
The AMD Instinct Platform is a leadership generative AI platform built on an industry-standard OCP design with eight MI300X accelerators, offering an industry-leading 1.5 TB of total HBM3 memory capacity. That standard design lets OEM partners build MI300X accelerators into existing AI offerings, simplifying deployment and accelerating adoption of AMD Instinct accelerator-based servers.
Compared to the Nvidia H100 HGX, the AMD Instinct Platform can offer a throughput increase of up to 1.6 times when running inference on LLMs like BLOOM 176B, and it is the only option on the market capable of running inference for a 70B-parameter model, like Llama 2 70B, on a single MI300X accelerator, simplifying enterprise-class LLM deployments and delivering outstanding total cost of ownership, AMD said.
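As a rough back-of-envelope check (an illustration, not AMD’s methodology), a 70B-parameter model stored in 16-bit precision needs about 140 GB for its weights alone, which fits within a single MI300X’s 192 GB of HBM3 with headroom left for the KV cache and activations:

# Rough, illustrative estimate: weights-only memory for a 70B-parameter
# model at 16-bit precision versus one MI300X's 192 GB of HBM3.
PARAMS = 70e9                 # Llama 2 70B parameter count
BYTES_PER_PARAM = 2           # FP16/BF16 weights
GB = 1e9

weights_gb = PARAMS * BYTES_PER_PARAM / GB   # ~140 GB of weights
mi300x_hbm_gb = 192                          # HBM3 capacity per MI300X
headroom_gb = mi300x_hbm_gb - weights_gb     # left for KV cache, activations

print(f"weights: ~{weights_gb:.0f} GB, headroom: ~{headroom_gb:.0f} GB")

By the same arithmetic, the eight-accelerator Instinct Platform’s 8 x 192 GB works out to the roughly 1.5 TB of HBM3 cited above.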
Energy efficiency is of utmost importance for the HPC and AI communities, yet these workloads are extremely data- and resource-intensive. AMD Instinct MI300A APUs benefit from integrating CPU and GPU cores on a single package, delivering a highly efficient platform while also providing the compute performance to accelerate training of the latest AI models.
AMD is setting the pace of innovation in energy efficiency with the company’s 30×25 goal, which aims to deliver a 30x energy-efficiency improvement in server processors and accelerators for AI training and HPC between 2020 and 2025.
Ryzen 8040 Series
With millions of AI PCs shipped to date, AMD launched the latest AMD Ryzen 8040 Series mobile processors, which deliver even more AI compute capability.
AMD also launched Ryzen AI 1.0 Software, a software stack that enables developers to easily deploy apps that use pretrained models to add AI capabilities to Windows applications.
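Ryzen AI Software is built around ONNX Runtime, so a minimal, hypothetical sketch of what such a deployment might look like is loading a pretrained ONNX model and requesting the Vitis AI execution provider with a CPU fallback. The file name model.onnx is a placeholder, and real deployments may need additional provider options (such as a configuration file) per AMD’s documentation:

# Illustrative sketch: run a pretrained ONNX model through ONNX Runtime,
# asking for the Ryzen AI (Vitis AI) execution provider and falling back
# to CPU. Assumes the Ryzen AI build of onnxruntime, which registers
# VitisAIExecutionProvider; "model.onnx" is a placeholder file name.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
)

# Build a dummy input from the model's own metadata (dynamic dims -> 1).
input_meta = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
dummy = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {input_meta.name: dummy})
print("output shapes:", [o.shape for o in outputs])

Listing the CPU provider after the NPU provider keeps the same application code working on machines without Ryzen AI hardware, which is the kind of portability the stack is pitched on.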
AMD also previewed that the upcoming next-gen “Strix Point” CPUs, slated to begin shipping in 2024, will include the XDNA 2 architecture to deliver more than a three-fold increase in AI compute performance over the prior generation, enabling new generative AI experiences. Microsoft also joined to discuss how it is working closely with AMD on future AI experiences for Windows PCs.
With the integrated Ryzen AI NPU on-die on select models, AMD is bringing more state-of-the-art AI PCs to market, with up to 1.6 times more AI processing performance than prior AMD models.
To further enable great AI experiences, AMD is also making Ryzen AI Software widely available for users to easily build and deploy machine learning models on their AI PCs.
AMD Ryzen 8040 Series processors are the latest to join the powerful Ryzen Series processors line and are expected to be broadly available from leading OEMs including Acer, Asus, Dell, HP, Lenovo, and Razer, beginning in Q1 2024.
“We continue to deliver the highest-performance and most power-efficient NPUs with Ryzen AI technology to reimagine the PC,” said Jack Huynh, senior vice president and general manager of AMD’s computing and graphics business, in a statement. “The increased AI capabilities of the 8040 series will now handle larger models to enable the next phase of AI user experiences.”
The Ryzen 9 8945HS offers up to 64% faster video editing and up to 37% faster 3D rendering than competing processors, and up to 77% faster gaming, AMD said.