
Nvidia’s newest Blackwell chips have significantly advanced the training of large artificial intelligence systems, dramatically reducing the number of chips needed to train massive language models, new data released Wednesday shows.

The results, published by MLCommons, a nonprofit that issues benchmark performance results for AI systems, detail improvements across chips from Nvidia and Advanced Micro Devices (AMD), among others, Reuters reported. The benchmarks focus on AI training, the phase in which systems learn from vast datasets, which remains a key competitive frontier despite the market’s growing focus on AI inference, or responding to user queries.

One major finding was that Nvidia and its partners were the only ones to submit data for training a large-scale model like Llama 3.1 405B, an open-source AI model from Meta Platforms with 405 billion parameters. The model is complex enough to stress-test modern chips and expose their true training capabilities.

According to the data, Nvidia’s new Blackwell chips are more than twice as fast per chip as the previous-generation Hopper chips. In the fastest result, a cluster of 2,496 Blackwell chips completed the training task in just 27 minutes. By contrast, more than three times as many Hopper chips were needed to match or better that performance.

At a press conference, Chetan Kapoor, chief product officer at CoreWeave, which collaborated with Nvidia on the benchmark tests, highlighted a broader industry trend.

“Using a methodology like that, they’re able to continue to accelerate or reduce the time to train some of these crazy, multi-trillion parameter model sizes,” Kapoor said, referring to a shift from massive monolithic systems to modular training infrastructures made up of smaller chip groups.

The benchmark results underline Nvidia’s dominance in the AI training space, even as rivals like China’s DeepSeek claim competitive performance with fewer chips. As the race to power ever-larger AI systems intensifies, chip efficiency in training tasks remains a critical metric.