While not as popular as its Snapdragon SoCs, Qualcomm has been offering its Cloud AI line of accelerators for scalable AI inference. The current flagship is the Qualcomm Cloud AI 100 Ultra, a 150 Watt PCIe Gen4 x16 card delivering up to 870 TOPS of INT8 performance with 576MB of SRAM and 128GB of LPDDR4x memory. But given the latest open-source Linux driver patch activity, Cloud AI 200 “AIC200” wares are on the way.
For several years now Qualcomm has been maintaining the QAIC Linux “accel” driver for its cloud accelerator hardware. To date this open-source, upstream driver support has covered the Cloud AI 100 series and, more recently, the cut-down (and cheaper) Cloud AI 80 Ultra accelerator card. But it now turns out a Cloud AI 200 series is coming.
Since spotting the AIC200 patches surface on Friday, I haven’t been able to find any Qualcomm Cloud AI 200 accelerator information on their product site or anywhere else. Qualcomm engineers posted the initial bring-up patches for the Cloud AI 200 hardware against the existing QAIC accelerator driver:
“Initial support to the driver to boot up AIC200. AIC200 uses BHIe without BHI, which is something that the MHI bus has not supported until now. While the MHI changes are listed first to facilitate cross-tree merging, they are not needed until the last change in the series.
Also, AIC200 is a different product from AIC100 with MSI-X, different BARs, and different MHI configuration so we finally need some infrastructure in the driver to be able to handle product differences. This is expected to evolve more over time.”
The patches don’t shed light on any performance expectations for the Qualcomm Cloud AI 200 hardware or any other product details of the next-gen Cloud AI inference accelerators, but given this patch activity, expect to hear more about the Qualcomm AIC200 wares in 2025.