
World’s biggest chip created to meet demands of AI

The race among semiconductor makers to gain an edge in the booming market for specialised AI processors has just given rise to the world’s biggest computer chip.

While chip circuitry continues to get smaller, the slab of silicon, developed by Californian start-up Cerebras, has a surface area slightly larger than a standard iPad and is more than 80 times bigger than its closest competitor. It also eats up as much electricity as all the servers contained in one and a half racks — the towers of computers in data centres that stand more than six feet tall.

The mammoth chip, due to be unveiled on Monday after nearly four years of development, is the starkest sign yet of how traditional thinking is being turned on its head as the chip industry struggles with the demands of artificial intelligence.

It also highlights giant leaps in the amount of computing power that are being thrown at the most complex AI problems — something that prompted US research group OpenAI to raise $1bn from Microsoft last month, hoping to ride the exponential hardware curve to reach human-level AI.

Most chipmakers have been looking to create smaller, modular elements, known as “chiplets”, out of which today’s most advanced chips are assembled, according to Patrick Moorhead, a US chip analyst. Cerebras, by contrast, has jettisoned that conventional approach and instead come up with what is in effect an entire computing cluster on a single chip, he says.

The race to build a new generation of specialised AI chips, under way for several years, is finally reaching a critical point, with several companies — including Intel, Habana Labs and UK start-up Graphcore — either just starting or promising to deliver their first chips to customers before the end of this year.

Cerebras did not name the customers it said were already receiving its chips, although they are likely to be best suited to the massive computing tasks undertaken by the biggest internet companies.

More than 50 companies have been trying to develop specialised chips for AI. Most of these are designed for inference, the task of applying a trained AI system to real-world examples, rather than for the far more data-intensive job of training the deep learning models in the first place. That challenge has been taken on by a handful of start-ups like Cerebras, Graphcore and Wave Computing, as well as Chinese challenger Cambricon.

The length of time it has taken for companies like these to start shipping products shows that the technical challenges were much greater than most had expected, said Linley Gwennap, principal analyst at the Linley Group, a US chip research firm. That has not prevented some of the productless start-ups from attracting high valuations. Cerebras has raised more than $200m in venture capital, with its latest round, late last year, valuing it at around $1.6bn, said Andrew Feldman, chief executive.