Forget GPUs, d-Matrix uses DIMC to win AI Inference

November 26, 2024

A Cerebral Valley ‘Deep Dive’ with CEO Sid Sheth exploring our DIMC architecture as a new way forward for AI, the competitive landscape of AI inference, and Sid’s personal journey as a multi-time entrepreneur.

Today, we’re talking with Sid Sheth, co-founder and CEO of d-Matrix.

d-Matrix is tackling AI inference with its novel Digital In-Memory Compute (DIMC) architecture to boost efficiency and speed for generative AI workloads in cloud data centers. Founded in 2019, d-Matrix is targeting the growing demand for AI inference, building what Sid describes as “the most efficient computing platform for AI inference” by focusing on customer-driven innovation and first-principles engineering.

Key Takeaways:

Inference Dominance: d-Matrix bet early on inference becoming the dominant AI computing challenge, a vision validated by the explosion of generative AI demand.

DIMC Innovation: The DIMC architecture combines digital accuracy with in-memory compute efficiency, chiplet-based scaling, and block floating-point numerics, achieving industry-leading performance, energy efficiency, and throughput (see the sketch after these takeaways).

Future Expansion: While currently focused on cloud data centers, d-Matrix envisions scaling its chiplet-based platform into workstations and client PCs in the future.
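The interview doesn’t detail d-Matrix’s exact numerics, but the idea behind block floating point is simple to illustrate: a block of values shares a single exponent while each value keeps only a short integer mantissa, cutting memory traffic and multiplier cost relative to full floating point. Below is a minimal NumPy sketch of that general idea; the block_size and mantissa_bits values are illustrative assumptions, not d-Matrix’s actual format.

import numpy as np

def bfp_quantize(x, block_size=16, mantissa_bits=8):
    # Illustrative block floating-point round-trip (not d-Matrix's format):
    # each block of values shares one exponent, and each value keeps
    # only a short signed-integer mantissa.
    x = np.asarray(x, dtype=np.float64)
    pad = (-len(x)) % block_size
    blocks = np.pad(x, (0, pad)).reshape(-1, block_size)

    # One shared exponent per block, chosen so the largest magnitude
    # in the block fits within the mantissa range.
    max_mag = np.abs(blocks).max(axis=1, keepdims=True)
    exp = np.ceil(np.log2(np.maximum(max_mag, 1e-38)))  # guard log2(0)
    scale = 2.0 ** (exp - (mantissa_bits - 1))

    # Integer mantissas, clipped to the signed representable range.
    qmax = 2 ** (mantissa_bits - 1) - 1
    mantissa = np.clip(np.round(blocks / scale), -qmax - 1, qmax)

    # Dequantize to see what values the format can actually represent.
    return (mantissa * scale).reshape(-1)[:len(x)]

x = np.random.randn(64).astype(np.float32)
xq = bfp_quantize(x, block_size=16, mantissa_bits=8)
print("max abs error:", np.abs(x - xq).max())

The trade-off this sketch shows is the one the takeaway alludes to: within a block, only mantissas are stored and multiplied, so hardware gets fixed-point-like efficiency while the shared exponent preserves much of floating point’s dynamic range.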

Read the full article on Cerebral Valley
