Today, Qualcomm announced a new data center chip designed to accelerate AI processing. The Qualcomm Cloud AI 100 targets the most important aspects of cloud AI inference, including power consumption, ease of processing, and signal-processing performance.
Qualcomm executive Keith Kressin said the company will draw on its expertise in the mobile space and its ability to design chips on the latest manufacturing processes; the chip's key selling point is energy efficiency. The market for such chips is expected to reach $17 billion by 2025. Qualcomm's growth has stalled amid slowing smartphone sales, intensifying competition, and mounting legal disputes, prompting the company to look for new markets to reverse the situation. Rapidly growing services such as image recognition, speech recognition, and data storage are driving demand for chips of this kind.
Semiconductor companies are racing to optimize traditional chips or offer new alternatives to cloud providers such as Google, Amazon, and Facebook, which have even begun designing chips of their own. Facebook product manager Joe Spisak said the company makes 200 trillion predictions a day; workloads of that scale are straining data centers' ability to meet growing demand and underscore the urgent need for new solutions.
Intel, the leader in the data center processor market, has acquired a number of smaller companies developing alternative chips in the hope of better handling artificial intelligence tasks, and it has added other features to enhance data-processing capability. NVIDIA, for its part, has built a large business supplying graphics processing chips to data centers.
Kressin said Qualcomm will disclose more details about the Cloud AI 100 later this year. The chip is intended to make decisions by analyzing streams of digital voice or image data. He stressed that it is not a simple revision of a mobile phone processor: its AI processing capacity is 50 times that of the company's flagship smartphone chip. The product is slated to enter production in 2020.
This is what the Qualcomm Cloud AI 100 promises:
- 10x performance per watt over the industry’s most advanced AI inference solutions deployed today
- A new and highly efficient chip specifically designed for processing AI inference workloads
- 7nm process node bringing further performance and power advantages
- Available support for industry-leading software stacks, including PyTorch, Glow, TensorFlow, Keras, and ONNX
- Power-efficient signal processing expertise across major areas: Artificial Intelligence, eXtended Reality, Camera, Audio, Video, Gestures