The companies are trying to build chips that run their AI applications faster and at lower cost, a shift that could affect the handful of vendors, such as Nvidia, that currently make these graphics processors.
Alibaba told CNBC April 19 it recently established a research and development division, called the Academy for Discovery, Adventure, Momentum and Outlook, which has been developing an AI chip that anyone can access through the company's public cloud. Alibaba hopes the effort will strengthen its cloud business and enable commerce and AI across a multitude of industries.
Other tech giants are carving out similar paths. Google, a subsidiary of Alphabet, uses its custom-built tensor processing units to run its own machine learning tasks. Last year, Google released a second-generation TPU capable of more demanding computing work, and in February it began letting the public use the chip through its cloud, according to CNBC.
Facebook is also exploring an AI chip to improve operations for its internal researchers, and Apple built a "neural engine" into the chips in its iPhone X. Microsoft hopes to add an AI chip to the next version of its mixed-reality headset, HoloLens, and Tesla is building an AI chip for its vehicles. However, these chips would run in data center servers as opposed to a public cloud, which could offer more power, direct network connectivity and more data storage.