Qualcomm might be best known for the chips inside consumer smartphones, but today the company is announcing a new chipset designed to bring power-efficient AI inference processing to data centers, and in turn, the cloud. Specifically, the new Qualcomm Cloud AI 100 is built to slot into an enterprise customer’s data center.
The 7nm-based Qualcomm Cloud AI 100 ships with a full software stack and toolset for customers. Qualcomm believes it has a compelling offering here thanks to the chip’s low power consumption and low latency. With power draw in data centers reportedly doubling every year, that efficiency should be very appealing.
When it comes to raw performance, Qualcomm claims the Cloud AI 100 offers up to 50x the AI inference performance of a Snapdragon 820 chipset.
According to Qualcomm, the market for AI inference in data centers will be worth around $17 billion by 2025, so there is no better time than the present to jump into this burgeoning market. That said, Qualcomm has been invested in artificial intelligence for over a decade, so in many ways this announcement shouldn’t come as a complete surprise.