BLOG | GENERATIVE AI

Nvidia’s Rise To The Top Should Surprise No-One

Mar 21, 2024 | AI

If you’ve been following the rapid rise of Nvidia’s stock recently and, like me, lamented not investing early on, the truth is we probably should have seen this coming.

Graphics Processing Units (GPUs) have undergone a remarkable transformation from hiding in the bowels of gamer PCs to becoming the brain behind generative artificial intelligence (AI) and subsequently a Wall Street darling.

For Nvidia, the primary GPU protagonist, this meteoric rise seems to have come from nowhere. It has left the company holding a market position that is the envy of every organisation on the planet, a position akin to being the only organisation that can create oxygen.

But if you dive deeper into the company’s story, you can see this overnight success has been building for years. And for Australian organisations looking to dive into the AI profit pool, that market dominance, combined with an emerging chip scarcity, risks leaving us lapped.


How we – well, Nvidia – got here

Initially designed to accelerate image rendering in video games, GPUs have emerged as the lynchpin in the advancement of AI and machine learning (ML), propelling a myriad of applications from deep learning to complex data analytics.

Nvidia had built an enviable reputation for delivering market-leading products with incredible parallel processing power, and that’s no accident.

CEO Jensen Huang’s foresight in steering Nvidia towards AI was grounded in the belief that GPUs were ideal for AI’s computational demands.

This wasn’t just about creating a new product but about envisioning a future where Nvidia’s GPUs would be at the heart of AI.

So they went about making it happen.

Nvidia pushed its chips (no pun intended) into the middle of the table when it developed CUDA, a comprehensive platform that extended GPU capabilities beyond gaming to support complex computations needed for AI and deep learning.
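To make that concrete, here is a minimal sketch (not from the original article) of what CUDA programming looks like: a kernel that adds two vectors, with each GPU thread handling one element in parallel. The array size and launch configuration are illustrative only.

```cuda
// Illustrative sketch of the CUDA programming model: each GPU thread adds one
// pair of elements, so the whole array is processed in parallel.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void vectorAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global index of this thread
    if (i < n) c[i] = a[i] + b[i];                  // one element per thread
}

int main() {
    const int n = 1 << 20;                          // ~1 million elements (arbitrary)
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    float *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(float));
    cudaMalloc(&db, n * sizeof(float));
    cudaMalloc(&dc, n * sizeof(float));
    cudaMemcpy(da, ha.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough blocks of 256 threads to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(da, db, dc, n);
    cudaDeviceSynchronize();

    cudaMemcpy(hc.data(), dc, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("hc[0] = %.1f\n", hc[0]);                // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```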

A key cog in Nvidia’s game plan was ensuring CUDA was closely tied to its hardware, which meant that any advancements in AI and deep learning fuelled by CUDA drove concurrent demand for Nvidia’s GPUs.

This closed, integrated ecosystem approach, similar to Apple’s product strategy, created a deep competitive moat and revenue stream.

Beyond just hardware, Nvidia also invested in creating an entire ecosystem around CUDA, including libraries, SDKs, and tools that made it easier for developers to build AI applications.
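For example, rather than writing kernels by hand, a developer can lean on a CUDA library such as cuBLAS to run a matrix multiplication on the GPU. The sketch below is illustrative only (sizes and values are arbitrary) and assumes the CUDA Toolkit is installed.

```cuda
// Illustrative sketch: multiplying two matrices on the GPU via cuBLAS,
// one of the CUDA libraries mentioned above.
#include <cublas_v2.h>
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

int main() {
    const int n = 512;                               // square matrices for simplicity
    std::vector<float> hA(n * n, 1.0f), hB(n * n, 2.0f), hC(n * n, 0.0f);

    float *dA, *dB, *dC;
    cudaMalloc(&dA, n * n * sizeof(float));
    cudaMalloc(&dB, n * n * sizeof(float));
    cudaMalloc(&dC, n * n * sizeof(float));
    cudaMemcpy(dA, hA.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);

    // C = alpha * A * B + beta * C, computed entirely on the GPU.
    const float alpha = 1.0f, beta = 0.0f;
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                &alpha, dA, n, dB, n, &beta, dC, n);

    cudaMemcpy(hC.data(), dC, n * n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0][0] = %.1f\n", hC[0]);               // expect 1024.0 (512 * 1 * 2)

    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}
```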

The significance of Nvidia’s shift became starkly apparent with the success of AlexNet in 2012, an AI model that dramatically outperformed existing technologies in image recognition.

This success, powered by CUDA on Nvidia GPUs, marked a “big bang” moment for AI, validating Nvidia’s bet on CUDA and setting the stage for the company’s explosive growth in the AI field.

This brought to the fore pivotal researchers in the industry, including Alex Krizhevsky, the lead of the winning AlexNet team and then a PhD student, who collaborated with Ilya Sutskever (whom you may recognise from OpenAI) and Geoff Hinton, ironically now one of the main proponents of slowing AI development while global guard rails are codified.

Strategic Acquisitions

Underpinning Nvidia’s strategy to dominate the AI ecosystem is a series of calculated acquisitions, in particular the purchase of Mellanox Technologies for approximately $6.9 billion in 2020, which enhanced Nvidia’s data centre capabilities.

Nvidia’s vision extended beyond individual GPUs to an integrated data centre architecture optimised for AI.

By offering a comprehensive stack of home-grown and acquired hardware and software, Nvidia enables organisations to leverage AI at scale, facilitating advancements in cloud computing, autonomous technology, and more.

But is their position insurmountable?

Who else is emerging in this critical race – some would say for humanity – to innovate fastest?

The Competitive Horizon and Future Directions

While Nvidia has established an incredible foothold, the GPU and AI hardware sector remains dynamic, with emerging players such as Google, with its TPUs, and innovative startups challenging the status quo.

These companies are pushing AI hardware’s potential to its limits, seeking to bring to market solutions that address the ever-evolving requirements of AI applications.

As recently as last week, Sam Altman of OpenAI was reported to be seeking in the vicinity of $7 trillion (yes, TRILLION – amazingly, that’s not a misprint) to design and manufacture AI chips and reduce reliance on Nvidia.

In reality, what he’s told people privately, according to The Information, is that the quoted figure represents the sum total of investments that investors would need to make, from real estate and power for data centres to the manufacturing of the chips, over time.

But this rise of potential competitors has not dented Nvidia’s stock; in fact, its valuation surpassed that of the entire Chinese stock market this week, much to my extreme sadness, having considered investing in the company in April last year and ultimately not doing so.

What Does It Mean To Us In Australia?

The market opportunity for AI in the coming years is enormous, with a projected CAGR of 21.6 per cent and a value of US$2 trillion by 2030 – and it requires AI chips to fuel that growth.

But already we see a lag in AI chip availability in some quarters of the industry in Australia, and this chip scarcity threatens Australia’s ability to benefit economically from AI development.

Now, Nvidia has mega competitors, like Meta, Google and Amazon, and smaller, newer entrants, nipping at its heels.

And Databricks’ CEO Ali Ghodsi has gone ahead and predicted that prices for these chips will plummet over the next year.

But this is a far more complex equation than trying to create the fastest, most efficient chips available.

Nvidia has built an industry platform, akin to Apple’s formidable competitive moat.

More entrants are needed, more chips need to become available and, most importantly, they need to be equivalent in speed and efficiency to what Nvidia provides; otherwise, Australia could be ranked among the have-nots, competing with the haves.

We need to watch this space carefully and pay attention to the new entrants and their end-to-end propositions, or our AI lights in Australia may go out before they can fully shine.

Original article source: Information Age
