Despite competitors arriving in droves, Nvidia has maintained its leading position in the global market for Artificial Intelligence (AI) chips used in cloud and data centers.
Nvidia has also kept a wide lead over the rest of the field, according to a new report from technology research firm Omdia, which estimates that the company took an 80.6% share of global revenue in this market in 2020.
Last year, the company generated $3.2 billion in revenue from these chips, up from $1.8 billion the year before. The bulk of its earnings came from GPU-derived chips, which Omdia says are the leading type of AI processor used in cloud and data center equipment.
Whether Nvidia keeps its dominant position remains to be seen, as Omdia expects the market for AI processors to grow quickly and attract many new suppliers. Global market revenue for cloud and data center AI processors rose 79% last year, hitting $4 billion.
By 2026, Omdia expects that revenue to increase roughly ninefold, to $37.6 billion.
According to Jonathan Cassell, principal analyst for advanced computing at Omdia, one advantage Nvidia has over the competition is developers' familiarity with its tools.
“NVIDIA’s Compute Unified Device Architecture (CUDA) Toolkit is used nearly universally by the AI software development community, giving the company’s GPU-derived chips a huge advantage in the market,” he noted.
“However, Omdia predicts that other chip suppliers will gain significant market share in the coming years as market acceptance increases for alternative GPU-based chips and other types of AI processors.”
Omdia sees Xilinx, Google, Intel and AMD as the biggest contenders for a larger piece of the AI pie. Xilinx offers field-programmable gate array (FPGA) products; Google’s Tensor Processing Unit (TPU), an AI application-specific integrated circuit (ASIC), is employed extensively in its own hyperscale cloud operations; while Intel’s entry comes in the form of its Habana proprietary-core AI application-specific standard products (ASSPs) and its FPGA products for AI cloud and data center servers.
AMD, currently ranked fifth, offers GPU-derived AI ASSPs for cloud and data center servers.