Nvidia’s market value jumped $207 billion (roughly Rs. 17 lakh crore) in the two days after May 24, when the American chip designer gave a surprisingly strong revenue outlook following a season of bad news for the semiconductor industry. Yet a handful of other tech companies could benefit even more from the race toward artificial intelligence.
There are many ways to put this forecast and the reaction to it into context. The revenue outlook was 53 percent higher than analysts’ expectations and 33 percent higher than the company’s previous record, set in March last year. The first-day jump was the third-largest single-day gain in market value in U.S. history, while the two-day gain eclipsed the market capitalization of all but 48 stocks worldwide.
Among the companies eclipsed by Nvidia’s $200 billion rise in value are two of the most important catalysts of the AI revolution. Between them, Korea’s SK Hynix and Boise-based Micron Technology control 52 percent of the global dynamic RAM market. Together, they are worth only $140 billion (around Rs. 11 lakh crore). Their only rival, Samsung Electronics, accounts for 43 percent of the DRAM industry – one of at least four sectors it dominates globally – while it trades at $317 billion (around Rs. 26 lakh crore).
If the generative AI sector is to take off, as Nvidia and its customers believe, then established giants like Microsoft and newcomers like OpenAI should be knocking on the doors of Samsung, SK Hynix and Micron.
Machines that process reams of data, analyze video, audio and text patterns and spit out replicas of human-created content will need memory chips. In fact, AI companies will likely buy more DRAM than any other slice of the tech sector in history.
The reason for this demand for memory chips is quite simple: Nvidia’s AI chips differ from standard processors by inhaling huge amounts of data in one go, crunching the numbers, then spitting out the results in one shot. But for this power advantage to be realized, they need the information to be fed into the computer quickly and without delay. This is where memory chips come into play.
Processors don’t read data directly from a hard drive: that’s too slow and inefficient. The first choice is to keep it temporarily on the chip itself. But there isn’t enough room there to hold much of anything: chipmakers prefer to devote that precious space to computing functions. So the second-best option is to use DRAM.
When you’re processing billions of pieces of information at once, you need that data at your fingertips and delivered quickly. A lack of adequate DRAM in a system will significantly slow down a computer, neutralizing the value of spending $10,000 (around Rs. 8.2 lakh) on the best processors to run sophisticated chatbots. That’s why up to 1 terabyte of DRAM can be installed for every high-end AI processor purchased, which is 30 times more than in a high-end laptop.
Such a thirst for memory means that DRAM sold for use in servers is expected to exceed that installed in smartphones this year, according to Taipei-based researcher TrendForce.
These systems also need to be able to save large quantities of their results nearby so they can be read and written quickly. This is done on NAND flash, the same chips used in smartphones and most modern laptops. Samsung is the world leader in this field, followed by Japan’s Kioxia Holdings Corp. (a spin-off of Toshiba Corp.) and SK Hynix.
Together, DRAM and NAND accounted for $8.9 billion (roughly Rs. 73,000 crore) in revenue at Samsung last quarter, far surpassing the $4.3 billion (roughly Rs. 35,000 crore) that Nvidia generated from its data center business, which includes products used for AI. To put that in context, this was the worst performance for Samsung’s memory division in seven years, and its AI-related memory sales represent only a fraction of total revenue.
These two figures are expected to grow. For every high-end AI chip sold to customers, a dozen more DRAM chips will be shipped, meaning more revenue for Samsung, SK Hynix and Micron. As Nvidia grows, so will these three companies, which collectively control 95 percent of the DRAM market.
There’s no doubt that the AI revolution is here, with the creators of cool chatbots, ubiquitous search engines, and high-powered processors among the biggest winners. But those who produce boring old memory chips won’t be left behind either.
© 2023 Bloomberg LP