The prevailing wisdom in the computer industry is that smaller and more compact is always better, but not this time. A California-based startup has built the biggest processor in the world, containing 1.2 trillion transistors.
This chip is about 100 times the size of a typical processor: it is as big as a dinner plate and would barely fit in your lap. Compared with Nvidia's largest general processing unit, it is almost 57 times bigger.
The name of this silicon monster is the Cerebras Wafer Scale Engine (WSE), and it measures almost 22 centimeters, roughly 9 inches, on each side.
But when every other company these days is trying to build small, compact AI processor chips, why is there a need for such a huge one?
Well, the answer, according to Cerebras, is that hooking lots of small chips together creates latencies that slow down the training of AI models, and that is a huge industry bottleneck.
The company’s chip boasts 400,000 cores, the units that handle processing, tightly linked to one another to speed up data crunching. It can also shift data between processing and memory incredibly fast.
The firm suggests this gives it an advantage in handling complex machine learning challenges with less lag and lower power requirements than clusters of conventional chips.
Cerebras claims the Wafer Scale Engine will reduce the time it takes to process some complex data from months to minutes.
Its founder and chief executive Andrew Feldman said the company had “overcome decades-old technical challenges” that had limited chip size.
Feldman says his oversized design also benefits from the fact that data can move within a chip roughly 1,000 times faster than it can between separate chips that are linked together.
One of the biggest challenges for such a huge processing chip is making it energy efficient and keeping it cool. Cerebras has designed a system of water pipes that run close to the chip to prevent it from overheating.
To build a chip of this size, Cerebras worked closely with contract chip manufacturer TSMC, whose other customers include Apple and Nvidia. To make Cerebras’s giant chip, TSMC adapted its equipment to produce one continuous design instead of a grid of many separate ones.
Cerebras’ chip is the largest square that can be cut from a 300-millimeter wafer, Paulsen says. “I think people are going to see this and say ‘Wow, that’s possible? Maybe we need to explore in that direction,’” he adds.
While the chip processes information much faster, Dr. Ian Cutress, senior editor at the news site AnandTech, said the advances in technology would come at a cost.
“One of the advantages of smaller computer chips is they use a lot less power and are easier to keep cool,” he explained. “When you start to deal with bigger chips like this, companies need specialist infrastructure to support them, which will limit who can use it practically.
“That’s why it’s suited for AI development as that’s where the big dollars are going at the moment.”
The engineers behind the chip believe it can be used in giant data centers and help accelerate the progress of artificial intelligence in everything from self-driving cars to talking digital assistants like Amazon’s Alexa.
Some experts believe these chips will play a key role in the race to create artificial intelligence, potentially shifting the balance of power among tech companies and even nations.
They could feed the creation of commercial products and government technologies, including surveillance systems and autonomous weapons.
According to reports, Cerebras Systems is not planning to sell the chip on its own, because it is very difficult to connect and cool such a huge piece of silicon. Instead, the WSE will be built into a new server and installed in data centers.
Feldman says “a handful” of customers are trying the chip, including on drug design problems. He plans to sell complete servers built around the chip, rather than chips on their own, but declined to discuss price or availability.
Cerebras is a three-year-old company backed by more than $200 million in funding.