The Next 100x

Follow our journey as we aim to disrupt the economics of AI compute.
Overhead view of a large warehouse filled with server racks.
Explore Blog

D-Matrix lands $44M to build AI-specific chipsets

By:
SiliconAngle

Three-year-old startup d-Matrix Corp. said today that it has closed a $44 million funding round to support its efforts to build a new type of computing platform that supports transformer artificial intelligence workloads.

Read Article
News

d-Matrix Explores 'In-Memory' AI Compute Method

By:
VMBlog

Recently, a company called d-Matrix launched out of stealth mode with a $44M Series A round. The co-founders are proven, long-time Silicon Valley tech innovators. They've developed the first 'baked' in-memory AI model which, for a change, is genuinely different from what's been out there.

Read Article
Conference Presentation

Developing Scalable AI Inference Chip with Cadence Flow in Azure Cloud

By:
Farhad Shakeri (Sr. Director IT/Cloud)

How d-Matrix set up a productive Azure cloud infrastructure running the Cadence flow, the lessons learned, and the key success factors that led to delivering its first AI chip within 14 months.

Watch Video
News

Optimal and Efficient AI Inference: A Better Approach to Hyperscale Computing

By:
Playground Global

Why We Invested in d-Matrix

Read Article
News

D-Matrix Debuts With $44 Million and an AI Solution

By:
Futuriom

A four-year-old startup named d-Matrix has scored $44 million in funding for a silicon solution tailored to massive artificial intelligence (AI) workloads. Investors include M12 (Microsoft’s venture fund) and Marvell Technology (Nasdaq: MRVL).

Read Article
News

d-Matrix Gets Funding to Build SRAM ‘Chiplets’ for AI Inference

By:
Datanami

Hardware startup d-Matrix says the $44 million it raised in a Series A round today will help it continue development of a novel “chiplet” architecture that uses 6-nanometer chips embedded in SRAM memory modules for accelerating AI workloads.

Read Article

News

D-Matrix’s new chip will optimize matrix calculations

By:
VentureBeat

Today, D-Matrix, a company focused on building accelerators for complex matrix math supporting machine learning, announced a $44 million series A round. Playground Global led the round with support from Microsoft’s M12 and SK Hynix. The three join existing investors Nautilus Venture Partners, Marvell Technology and Entrada Ventures.

Read Article
Conference Presentation

Accelerating Transformers for Efficient Inference of Giant NLP Models

By:
Sudeep Bhoja (Co-founder / CTO)

Large Transformer Models are finding uses across speech, text, video, and images. In this presentation, we explain the challenges of accelerating these large models in hardware.

Read Article
White Paper

Designing Next-Gen AI Inferencing Chips Using Azure's Scalable IT Cloud Infrastructure

By:
d-Matrix, Microsoft & Six Nines

A case study describing how d-Matrix built its first proof-of-concept AI chip entirely in Microsoft's Azure cloud.

Read Article