d-Matrix CTO and co-founder Sudeep Bhoja spoke at Hot Chips about how in-memory compute and our Corsair and JetStream accelerators are shaping the future of efficient AI inference.
The explosive growth of AI-powered apps, and their immense compute requirements, is leading us to rethink the way we power inference. d-Matrix CTO and co-founder Sudeep Bhoja dove into in-memory compute and how our AI accelerator, Corsair, uses this approach to make AI growth sustainable while keeping up with performance demands. He also introduced JetStream, a purpose-built I/O accelerator for AI inference.