IBM may have just unveiled a blueprint for the future of AI development: an analog AI chip said to be up to 14 times more power-efficient than current industry-leading components.
One of the biggest problems with generative AI is how power-hungry the technology currently is, and how much hungrier it could become. The cost of training models and running the infrastructure will only climb as the space matures. ChatGPT, for example, reportedly costs more than $700,000 a day to operate, according to Insider.
IBM’s prototype chip, which the company presented in Nature, aims to ease the pressure on companies that develop and operate generative AI platforms such as Midjourney or GPT-4 by reducing energy consumption.
This comes down to how the analog chip is built: unlike digital chips, analog components can manipulate analog signals and represent gradations between 0 and 1. Digital chips, the most common today, work only with discrete binary states, and the two also differ in functionality, signal processing, and areas of application.
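If it helps to picture that distinction, here is a minimal Python sketch (not IBM's implementation; the weight value and 8-bit step count are illustrative assumptions) contrasting a digitally quantized value with an analog one:

```python
# Illustrative only: how a digital chip vs. an analog chip might
# represent the same neural-network weight.

weight = 0.7317  # an example weight (assumed value)

# Digital: the value is snapped to one of a fixed number of binary
# levels (here, 8-bit, i.e. 256 steps).
levels = 2**8
digital = round(weight * (levels - 1)) / (levels - 1)

# Analog: the value is stored as a continuous physical quantity
# (e.g., the conductance of a memory cell), so any gradation
# between 0 and 1 is representable, at the cost of device noise.
analog = weight

print(f"digital (8-bit): {digital:.6f}")  # prints 0.733333 (quantized)
print(f"analog:          {analog:.6f}")  # prints 0.731700 (exact)
```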
Nvidia’s chips, including the H100 Tensor Core GPU and the A100 Tensor Core GPU, are the components powering many of today’s generative AI platforms. However, should IBM continue developing the prototype and prepare it for the mass market, the chip could well one day displace Nvidia as the industry’s mainstay.
IBM claims that its 14nm analog AI chip, which packs 35 million phase-change memory cells per component, can model up to 17 million parameters. The company also said its chip mimics how a human brain works, with the microchip performing calculations directly in memory.
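To give a rough intuition for what "calculating directly in memory" means, here is a hedged sketch, assuming an idealized crossbar of analog memory cells: weights live in the cells as conductances, inputs arrive as voltages, and the matrix-vector product emerges as summed currents rather than from shuttling data to a separate processor. The array sizes and noise level below are invented for illustration, not IBM's published figures.

```python
import numpy as np

rng = np.random.default_rng(0)

weights = rng.normal(size=(4, 8))  # conductances programmed into the cells
inputs = rng.normal(size=8)        # voltages applied across the rows

# Analog in-memory compute: the multiply-accumulate happens in place,
# but each stored value is perturbed by device noise.
noisy_weights = weights + rng.normal(scale=0.02, size=weights.shape)
analog_out = noisy_weights @ inputs

# Conventional digital reference: exact, but the weights must be moved
# between memory and processor on every pass.
digital_out = weights @ inputs

print(np.max(np.abs(analog_out - digital_out)))  # small analog error
```

The trade-off the sketch captures is the one the article describes: the analog result is slightly noisy, but it avoids the energy cost of constantly moving weights between memory and compute units.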
The benefits of using such a chip have been demonstrated in several experiments, including one in which a system transcribed audio recordings of people speaking with accuracy very close to that of digital hardware setups.
In that test, the IBM prototype was about 14 times more efficient per watt, while previously published simulations suggest such hardware could be between 40 and 140 times more power-efficient than today’s leading GPUs.
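As a rough, hedged illustration of what those multipliers would mean in practice (the baseline workload figure below is invented, not from IBM or TechRadar):

```python
# Hypothetical daily energy for a fixed AI workload on today's GPUs.
gpu_energy_kwh = 1000.0  # assumed baseline, for illustration only

# 14x is the measured prototype figure; 40x-140x is the simulated range.
for speedup in (14, 40, 140):
    analog_energy_kwh = gpu_energy_kwh / speedup
    print(f"{speedup:>3}x per watt -> {analog_energy_kwh:6.1f} kWh/day")
```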
Source: www.techradar.com