AI Energy Crisis Boosts Interest in Chips That Do It All

April 2, 2024

Bloomberg’s Jane Lanhee Lee discusses the energy-intensive nature of AI and how in-memory computing solutions can make AI inference more sustainable.

The intensive power consumption of Nvidia’s main product, a type of chip known as a graphics processing unit, makes it a relatively inefficient choice for inference, says Sid Sheth, founder and CEO of d-Matrix, a Silicon Valley-based chip startup that has raised $160 million from investors including Microsoft Corp. and Singaporean state-owned investor Temasek Holdings Pte.

Read the full article on Bloomberg