Artificial intelligence makes GPUs a must-have


Traditional processors, also known as CPUs, are designed to execute a wide range of general-purpose tasks quickly, one after another. GPUs, by contrast, are purpose-built to run many simpler computations in parallel, which makes them highly efficient for data-heavy workloads and, in turn, popular for AI training and inference.
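The data-parallel model described above can be sketched in a few lines. The example below is purely illustrative (plain Python, not real GPU code): it models how a GPU assigns one lightweight thread to each data element, so a vector addition becomes many tiny, independent computations rather than one long serial loop.

```python
# Illustrative sketch only: models the GPU idea of "one thread per element".
# On real hardware, each per-element computation runs concurrently on a
# separate GPU core; here we simply show that the elements are independent.

def gpu_style_vector_add(a, b):
    n = len(a)
    out = [0.0] * n

    def thread(i):
        # Each "thread" does simple arithmetic on its own element only,
        # with no dependence on any other element.
        out[i] = a[i] + b[i]

    # A GPU would launch all of these at once; we loop for illustration.
    for i in range(n):
        thread(i)
    return out

print(gpu_style_vector_add([1, 2, 3], [10, 20, 30]))  # [11, 22, 33]
```

Because every element is computed independently, the same workload scales naturally across thousands of GPU cores, which is exactly the property that makes matrix-heavy AI training and inference a good fit.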

Nearly US$16 billion worth of GPUs went into AI acceleration-related use cases worldwide in 2022. Nvidia is the dominant supplier, holding nearly 80% of the market for GPU-based AI acceleration. Nvidia’s investments in software frameworks, including the CUDA platform, ensure that developers and engineers can harness the computational efficiency of its GPUs, giving the company a sustained edge over competitors that lack comparable software support.

Outside of GPUs, field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs) from companies such as Infineon and Texas Instruments make up the rest of the AI chip market. Together, FPGAs and ASICs generated roughly US$19.6 billion in sales in 2022, growing an estimated 50% year over year. Over the next eight years, combined spending is expected to grow at nearly 30% annually to top US$165 billion by 2030.

Rising chip demand will spur investments

Developers and engineers are rushing to bring AI-first products to market amid a shortage of AI processing hardware. Orders for new GPUs face a six-month backlog, and prices for Nvidia’s A100 and H100 line-up of GPUs have risen sharply, with chips selling on secondary marketplaces at hefty premiums.

Given such high prices and demand, we believe the entire semiconductor value chain, including foundries, chip designers, and semiconductor equipment suppliers, will benefit as AI spreads. Multi-trillion-dollar markets such as advertising, e-commerce, digital media and entertainment, online services, communications, and productivity are all likely to ramp up spending on turnkey AI hardware setups.

There’s no AI without specialized AI hardware

The AI boom will likely spur a wave of data center upgrades that, in turn, gives rise to a new semiconductor investment cycle with the GPU at its core. The rapid proliferation of LLMs will likely drive exponential demand for AI processing and accelerated spending on specialized chips, which could open a market worth more than a hundred billion dollars for the semiconductor industry in the near future. Meanwhile, large cloud hyperscalers will likely continue to invest in R&D to build and deploy chips of their own in a bid to reduce both their dependence on large chip providers and their costs. While AI chip demand may be lumpy for now, Global X believes that the semiconductor value chain is well positioned to capture this opportunity and create a potential investment alternative as AI penetrates new markets.
