
Nvidia: what’s so good about the tech firm’s new AI superchip?

by Elijah

Chipmaker Nvidia has extended its lead in artificial intelligence by unveiling a new ‘superchip’, a quantum computing service and a new suite of tools to help develop the ultimate science fiction dream: general-purpose humanoid robotics. Here we take a look at what the company is doing and what it could mean.

What is Nvidia doing?

The headline announcement at the company’s annual developer conference on Monday was the “Blackwell” series of AI chips, used to power the incredibly expensive data centers that drive cutting-edge AI models such as the latest generations of GPT, Claude and Gemini.

One of them, the Blackwell B200, is a fairly straightforward upgrade to the company’s existing H100 AI chip. Training a massive AI model the size of GPT-4 would currently require around 8,000 H100 chips and 15 megawatts of power, Nvidia said – enough to power around 30,000 typical UK homes.

With the company’s new chips, the same training run would require just 2,000 B200s and 4 MW of power. That could mean a cut in the AI industry’s electricity consumption – or the same electricity being used to power much larger AI models in the near future.
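For anyone who wants to check the sums, the comparison comes down to some simple arithmetic. The sketch below (in Python) uses only the figures Nvidia quoted above – they are the company’s own numbers, not independent measurements.

```python
# Arithmetic behind Nvidia's claim, using only the figures quoted above.
h100_chips, h100_mw = 8000, 15   # H100s and megawatts for a GPT-4-scale training run
b200_chips, b200_mw = 2000, 4    # the same job on Blackwell, per Nvidia

print(h100_chips / b200_chips)      # 4.0  -> a quarter as many chips
print(h100_mw / b200_mw)            # 3.75 -> just over a quarter of the power
print(h100_mw * 1000 / h100_chips)  # ~1.9 kW drawn per H100
print(b200_mw * 1000 / b200_chips)  # 2.0 kW drawn per B200 - similar power per chip,
                                    # but, on these figures, each chip does roughly
                                    # four times the work for the same job
```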

What makes a chip “super”?

Alongside the B200, the company announced a second part of the Blackwell range – the GB200 “superchip”. It packs two B200 chips onto a single board alongside the company’s Grace processor, creating a system that Nvidia says delivers “30x more performance” for the server farms that run, rather than train, chatbots such as Claude or ChatGPT. The system also promises to cut energy consumption by up to 25 times, the company said.

Putting everything on the same board improves efficiency by cutting the time the chips spend communicating with each other, letting them spend more of their processing time crunching the numbers that make chatbots sing – or, at least, chat.

Huang demonstrates new chip products at the Nvidia GTC conference. Photograph: Josh Edelson/AFP/Getty Images

What if I want to go bigger?

Nvidia, which has a market value of over $2 trillion (£1.6 trillion), would be very happy to oblige. Take the company’s GB200 NVL72: a single server rack with 72 B200 chips installed, connected by nearly two miles of cabling. Not enough? Then look at the DGX Superpod, which combines eight of those racks into a single AI data-center-in-a-box the size of a shipping container. Pricing wasn’t disclosed at the event, but it’s safe to say that if you have to ask, you can’t afford it: even the last generation of chips cost around $100,000 apiece.


Huang reveals details of Nvidia’s “Blackwell” platform. Photograph: Justin Sullivan/Getty Images

What about my robots?

Project GR00T – apparently named after Marvel’s tree-like alien, though not explicitly linked to it – is a new foundation model from Nvidia developed to control humanoid robots. A foundation model, such as GPT-4 for text or Stable Diffusion for image generation, is the underlying AI model on which specific use cases can be built. They are the most expensive part of the entire industry to create, but they drive all further innovation because they can be “tuned” to specific use cases down the line.

Nvidia says its foundation model for robots will help them “understand natural language and imitate movements by observing human actions – quickly learning coordination, dexterity and other skills in order to navigate, adapt and interact with the real world”.
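In practice, “tuning” a foundation model usually means keeping the expensive pretrained network and training only a small new layer on top of it for the job at hand. Below is a minimal sketch of that general idea in Python using PyTorch, with a small off-the-shelf vision model standing in for the base – GR00T itself is not publicly available, and the six “robot commands” are a made-up example.

```python
# A generic sketch of fine-tuning a pretrained base model for a new task.
# The backbone and the six "robot commands" are stand-ins, not Nvidia's GR00T.
import torch
from torch import nn
from torchvision.models import resnet18, ResNet18_Weights

base = resnet18(weights=ResNet18_Weights.DEFAULT)  # pretrained stand-in "base model"
for p in base.parameters():
    p.requires_grad = False                        # freeze the expensive pretrained weights

base.fc = nn.Linear(base.fc.in_features, 6)        # new head: 6 hypothetical robot commands

optimizer = torch.optim.Adam(base.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One toy training step on fake data: camera frames in, desired commands out.
frames = torch.randn(8, 3, 224, 224)
commands = torch.randint(0, 6, (8,))
loss = loss_fn(base(frames), commands)
loss.backward()
optimizer.step()
```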

GR00T teams up with another piece of Nvidia technology (and another Marvel reference) in Jetson Thor, a system-on-a-chip designed specifically to be the brain of a robot. The ultimate goal is an autonomous machine that can be taught, using normal human speech, to perform general tasks, including those for which it has not been specifically trained.

A robot takes the stage at the Nvidia GTC conference. Photograph: Josh Edelson/AFP/Getty Images

Quantum?

One of the few hot sectors that Nvidia doesn’t yet have its hands on is quantum cloud computing. The technology, which is still at the research frontier, has already been folded into cloud offerings from Microsoft and Amazon, and now Nvidia is entering the game.

But Nvidia’s cloud won’t actually be connected to a quantum computer. Instead, the offering is a service that uses its AI chips to simulate one, ideally letting researchers test their ideas without needing access to the (rare and expensive) real thing. Eventually, though, Nvidia will provide access to third-party quantum computers through the platform, the company said.
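Simulating a quantum computer on ordinary chips is less exotic than it sounds: the quantum state is just a long list of complex numbers, and applying a quantum gate is a matrix multiplication – exactly the kind of maths AI chips are built for. The toy sketch below (plain Python/NumPy, purely illustrative and not Nvidia’s actual service) simulates a two-qubit machine; the catch is that the list doubles in length with every extra qubit, which is why serious simulations need racks of accelerators.

```python
# Toy classical simulation of a 2-qubit quantum computer (illustrative only).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT, control = qubit 0
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits start in |0>
state = np.kron(H, I) @ state                  # put qubit 0 into superposition
state = CNOT @ state                           # entangle the two qubits

print(np.abs(state) ** 2)                      # [0.5, 0, 0, 0.5] - a Bell state
```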
