October 27, 2025
Qualcomm Technologies Inc. announced its strategic entry into the competitive data center artificial intelligence market on Monday, introducing two new AI accelerator chips, the AI200 and AI250. The announcement spurred a sharp market reaction, with the company's stock surging as much as 15% before settling at an 11% gain.
The newly announced processors are designed as cost-efficient solutions for AI inference workloads, the phase in which already-trained models are run to generate outputs rather than being trained. They will be sold as accelerator cards for integration into server systems, with Qualcomm emphasizing that they will deliver high performance per dollar per watt.
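As a rough illustration of how such a composite efficiency metric can be evaluated, the sketch below divides throughput by both price and power draw. All names and numbers are hypothetical placeholders, since Qualcomm has not disclosed pricing, power, or throughput figures for either chip.

```python
# "Performance per dollar per watt" is a composite efficiency metric: useful work
# delivered, normalized by both purchase price and power draw. Every number below
# is a hypothetical placeholder; Qualcomm has not published pricing, power, or
# throughput figures for the AI200 or AI250.

def perf_per_dollar_per_watt(tokens_per_second: float,
                             price_usd: float,
                             power_watts: float) -> float:
    """Higher is better: throughput divided by both cost and power."""
    return tokens_per_second / (price_usd * power_watts)

# Hypothetical comparison of two unnamed accelerator cards (illustrative values only).
card_a = perf_per_dollar_per_watt(tokens_per_second=10_000, price_usd=20_000, power_watts=600)
card_b = perf_per_dollar_per_watt(tokens_per_second=12_000, price_usd=35_000, power_watts=1_000)
print(f"card A: {card_a:.2e}, card B: {card_b:.2e}")
# Card A scores higher despite lower raw throughput, because it costs less and draws less power.
```

On a metric like this, a cheaper, lower-power card can come out ahead even with lower raw throughput, which is the pitch Qualcomm is making for inference deployments.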
While the company withheld specific details on power consumption and processing speeds, it confirmed the chips are based on its proprietary Hexagon architecture. This is the same architecture that powers the Neural Processing Units (NPUs) in Qualcomm's consumer-grade Snapdragon systems-on-chip, such as the flagship Snapdragon 8 Elite Gen 5 for smartphones.
The data center chips represent a significant scaling up of this technology. The AI200, the less advanced of the two, is equipped with 768 gigabytes of LPDDR memory. This mobile-optimized memory type is power-efficient but offers lower bandwidth than the DDR5 memory typically used in servers. In a key differentiator, the higher-end AI250 is stated to provide more than ten times the memory bandwidth of the AI200, a performance leap that industry observers suggest may be achieved through the use of High Bandwidth Memory (HBM), a standard in data center AI processors from competitors like Nvidia.
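For context on why observers point to HBM, peak memory bandwidth scales with interface width times per-pin transfer rate. The figures below are illustrative, drawn from public LPDDR5X and HBM3 specifications; the actual memory interfaces of the AI200 and AI250 have not been disclosed.

```python
# Peak theoretical bandwidth of a memory interface: (bus width in bytes) x (transfer rate).
# The configurations below are illustrative, drawn from public LPDDR5X and HBM3 specs;
# the actual memory interfaces of the AI200 and AI250 have not been disclosed.

def bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s for a single memory interface."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

# A 64-bit LPDDR5X interface at 8.533 GT/s (a typical mobile-class configuration).
lpddr5x = bandwidth_gb_s(bus_width_bits=64, transfer_rate_gt_s=8.533)

# A single HBM3 stack: 1024-bit interface at 6.4 GT/s.
hbm3_stack = bandwidth_gb_s(bus_width_bits=1024, transfer_rate_gt_s=6.4)

print(f"LPDDR5X, 64-bit @ 8.533 GT/s:   {lpddr5x:.0f} GB/s")      # ~68 GB/s
print(f"HBM3 stack, 1024-bit @ 6.4 GT/s: {hbm3_stack:.0f} GB/s")  # ~819 GB/s
print(f"ratio: ~{hbm3_stack / lpddr5x:.0f}x")                     # roughly 12x
```

A single HBM3 stack delivers roughly an order of magnitude more bandwidth than a single LPDDR5X channel, which is consistent with the "more than ten times" claim for the AI250.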
Both new accelerators will feature a confidential computing capability, a security technology that partitions memory into encrypted, isolated sections accessible only by authorized applications. The feature is also present in Nvidia's flagship Blackwell Ultra processor.
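Conceptually, confidential computing ties each protected memory region to a key that only its owning workload can use. The toy Python sketch below (using the third-party cryptography package) illustrates that access rule only; it is not how Qualcomm or Nvidia implement the feature, which is handled in hardware within the memory subsystem.

```python
# A toy model of confidential computing's access rule: each memory region is encrypted
# with its own key and bound to a single owning workload. Purely conceptual; it does
# not reflect Qualcomm's or Nvidia's hardware implementation. Requires the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

class IsolatedRegion:
    """An encrypted memory region readable and writable only by its owner."""

    def __init__(self, owner: str):
        self.owner = owner
        self._cipher = Fernet(Fernet.generate_key())  # per-region key, never shared
        self._blob = b""

    def _authorize(self, app: str) -> None:
        if app != self.owner:
            raise PermissionError(f"{app} is not authorized for this region")

    def write(self, app: str, data: bytes) -> None:
        self._authorize(app)
        self._blob = self._cipher.encrypt(data)       # stored only in encrypted form

    def read(self, app: str) -> bytes:
        self._authorize(app)
        return self._cipher.decrypt(self._blob)

region = IsolatedRegion(owner="inference-service")
region.write("inference-service", b"model weights and activations")
print(region.read("inference-service"))               # the owner can read its data back

try:
    region.read("other-tenant")                       # any other workload is rejected
except PermissionError as err:
    print(err)
```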
Qualcomm plans to ship the AI200 and AI250 within water-cooled compute racks that use PCIe for connectivity inside the rack and Ethernet to link multiple racks together. These racks may eventually incorporate the server-grade central processing units Qualcomm is known to be developing, mirroring the integrated appliance strategy of competitors.
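The split Qualcomm describes, PCIe inside a rack and Ethernet between racks, is the familiar scale-up versus scale-out pattern. The minimal model below sketches that topology; card counts, slot numbers, and names are hypothetical, since Qualcomm has not published rack configurations.

```python
# A minimal sketch of the scale-up vs. scale-out split described above: accelerator
# cards sit on PCIe inside a rack, and racks are joined over Ethernet. Card counts,
# slot numbers, and names are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class AcceleratorCard:
    model: str        # e.g. "AI200" or "AI250"
    pcie_slot: int    # scale-up: intra-rack connectivity over PCIe

@dataclass
class Rack:
    name: str
    cooling: str = "water"
    cards: list[AcceleratorCard] = field(default_factory=list)

@dataclass
class Cluster:
    racks: list[Rack] = field(default_factory=list)   # scale-out: racks linked over Ethernet

    def add_rack(self, rack: Rack) -> None:
        self.racks.append(rack)

rack = Rack(name="rack-0", cards=[AcceleratorCard("AI200", slot) for slot in range(8)])
cluster = Cluster()
cluster.add_rack(rack)
print(f"{len(cluster.racks[0].cards)} cards in {cluster.racks[0].name}")
```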
The company has outlined an aggressive release schedule, with the AI200 slated for a 2026 launch and the more powerful AI250 following in 2027. Qualcomm also pledged an annual refresh cycle for its data center AI processor lineup going forward, signaling a long-term commitment to challenging incumbent players in the high-stakes AI hardware market.
SOURCE: SiliconANGLE