The AI Book
Daily AI News

    Nvidia set to hop AI forward with next-gen Grace Hopper Superchip

8 August 2023 · 3 Mins Read




    Today is a busy day of news from Nvidia as the AI leader takes the wraps off a series of new developments at the annual SIGGRAPH conference.

On the hardware front, one of the biggest developments from the company is the announcement of a new version of the GH200 Grace Hopper platform, powered by next-generation HBM3e memory technology. The GH200 announced today is an update to the existing GH200 chip announced at the Computex show in Taiwan in May.

    “We announced Grace Hopper recently several months ago, and today we’re announcing that we’re going to give it a boost,” Nvidia founder and CEO Jensen Huang said during his keynote at SIGGRAPH. 

    What’s inside the new GH200 

    The Grace Hopper Superchip has been a big topic for Nvidia’s CEO since at least 2021 when the company revealed initial details.


    The Superchip is based on an Arm architecture, which is widely used in mobile devices and competitive with x86-based silicon from Intel and AMD. Nvidia calls it a “superchip” as it combines the Arm-based Nvidia Grace CPU with the Hopper GPU architecture.

    With the new version of the GH200, the Grace Hopper Superchip gets a boost from the world’s fastest memory: HBM3e. According to Nvidia, the HBM3e memory is up to 50% faster than the HBM3 technology inside the current generation of the GH200.

    Nvidia also claims that HBM3e memory will allow the next-generation GH200 to run AI models 3.5 times faster than the current model.

    “We’re very excited about this new GH200. It’ll feature 141 gigabytes of HBM3e memory,” Ian Buck, VP and general manager, hyperscale and HPC at Nvidia, said during a meeting with press and analysts. “HBM3e not only increases the capacity and amount of memory attached to our GPUs, but also is much faster.”

Faster silicon means faster inference and training for larger AI applications

Nvidia isn’t just making faster silicon; it’s also scaling it in a new server design.

    Buck said that Nvidia is developing a new dual-GH200-based Nvidia MGX server system that will integrate two of the next-generation Grace Hopper Superchips. He explained that the new GH200 will be connected with NVLink, Nvidia’s interconnect technology.

    With NVLink in the new dual-GH200 server, both CPUs and GPUs in the system will be connected with a fully coherent memory interconnect.

“CPUs can see other CPUs’ memory, GPUs can see other GPU memory, and of course the GPU can see CPU memory,” Buck said. “As a result, the combined supersized super-GPU can operate as one, providing a combined 144 Grace CPU cores, over 8 petaflops of compute performance, with 282 gigabytes of HBM3e memory.”
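Buck's aggregate figures follow directly from doubling the per-chip numbers. A minimal back-of-envelope sketch, assuming 72 Grace CPU cores per Superchip (a figure not stated in this article) and the 141 GB of HBM3e per GH200 that it does quote:

```python
# Sanity-check the quoted dual-GH200 MGX server aggregates.
PER_CHIP_CORES = 72      # Grace CPU cores per Superchip (assumption, not from this article)
PER_CHIP_HBM3E_GB = 141  # HBM3e capacity per GH200, as quoted by Buck

chips = 2  # the new MGX server pairs two next-gen Grace Hopper Superchips via NVLink

total_cores = chips * PER_CHIP_CORES        # 2 x 72 = 144 cores
total_memory_gb = chips * PER_CHIP_HBM3E_GB  # 2 x 141 = 282 GB

print(f"{total_cores} Grace CPU cores, {total_memory_gb} GB HBM3e")
# → 144 Grace CPU cores, 282 GB HBM3e
```

Both results line up with the "144 Grace CPU cores" and "282 gigabytes of HBM3e" Buck cited for the coherent dual-chip system.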

While the new Nvidia Grace Hopper Superchip is fast, it will take some time before it’s available for production use cases. The next-generation GH200 is expected to be available in the second quarter of 2024.

    VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.
