BLOG | GENERATIVE AI
Nvidia’s AI-fuelled Rocket Ship Draws Acclaim and Doubters
The explosive growth of Nvidia, which made Wall Street history this week, has invited questions from investors about how long its dominance of the AI chip market can last.
In a staggering post-earnings charge on Friday AEDT, Nvidia shares catapulted 16.4 per cent, adding $US277 billion to its market capitalisation – the most ever achieved in one day for a Nasdaq stock.
The company, which started out in 1993 selling chips to make graphics work better in video games, confirmed its position as the most influential stock of the AI era, with its expectations-beating earnings sending markets to record highs around the world. Its success dates back to a Steve Jobs-like vision from the company’s co-founder and chief executive, Jensen Huang, in the early 2000s.
Nvidia CEO Jensen Huang has positioned Nvidia at the sweet spot of the AI boom. Bloomberg
The FTSE All-World index, Stoxx Europe 600 and Japan’s Nikkei 225 all reset records in the wake of Nvidia’s blockbuster quarter. The ASX added 0.4 per cent on Friday to 7643.
“This last year, we’ve seen generative AI really becoming a whole new application space, a whole new way of doing computing,” Mr Huang said on its earnings call. “A whole new industry is being formed, and that’s driving our growth.”
Nvidia’s revenue more than tripled to $US22.1 billion ($33.7 billion) for the last quarter, beating analyst expectations by 12 per cent. Its market capitalisation now sits at $US1.94 trillion, and it blindsided analysts for the second time in as many years by forecasting that sales will again triple year-on-year, to $US24 billion, in the March quarter.
While its earnings impressed investors around the world, how long its valuation will be sustained depends on how long its dominance of the AI universe will endure.
“It’s rare to have one company profiting so much from a global theme,” said Theo Maas, portfolio manager at Northcape Capital. “Nvidia is taking the lion’s share of generative AI investment at the moment.”
Inflection Point
In 2006, Mr Huang decided to expand Nvidia’s ambitions beyond selling chips for video games and target its graphics processing units (GPUs) for broader use in science, research and latterly artificial intelligence applications.
It launched Compute Unified Device Architecture, or CUDA, that year: a parallel computing platform and programming model that lets customers in a range of industries harness the power of its GPUs for research and development.
Alongside its hardware, Nvidia poured money into building an ecosystem around CUDA, which included software development kits, code libraries, and other tools that made it easier for developers to build AI applications.
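To make that concrete, below is a minimal, illustrative sketch of what CUDA code looks like – it is not drawn from Nvidia’s materials, and the kernel name scale_and_add is invented for this example. A small function is written once and then launched across thousands of GPU threads at the same time, which is the essence of the “parallel computing platform and programming model” described above.

```cuda
// Minimal CUDA sketch: one GPU thread per array element.
// Illustrative only; scale_and_add is an invented example kernel.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale_and_add(const float *x, const float *y, float *out,
                              float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) out[i] = a * x[i] + y[i];            // each thread does one element
}

int main() {
    const int n = 1 << 20;                  // about one million elements
    size_t bytes = n * sizeof(float);
    float *x, *y, *out;
    cudaMallocManaged(&x, bytes);           // unified memory visible to CPU and GPU
    cudaMallocManaged(&y, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_and_add<<<blocks, threads>>>(x, y, out, 3.0f, n);  // launch across many threads
    cudaDeviceSynchronize();                // wait for the GPU to finish

    printf("out[0] = %f\n", out[0]);        // expect 5.0
    cudaFree(x); cudaFree(y); cudaFree(out);
    return 0;
}
```

Each thread handles a single element of the arrays; the same pattern scales from toy arithmetic like this up to the matrix operations that underpin deep learning, and CUDA’s development kits and code libraries package those operations for developers.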
“Nvidia made an iPhone-sized bet by developing CUDA … A key aspect of Nvidia’s strategy was keeping CUDA closely tied to its hardware, ensuring that the advancements in AI and deep learning powered by CUDA drove demand for Nvidia’s GPUs,” said Aiden Heke, chief executive of the Australian arm of data and analytics consultancy Decision Inc.
“This closed, integrated approach created a strong competitive moat and revenue stream, akin to Apple’s product strategy.”
Big Name Competition
As tech companies from Netflix to Canva deepen their forays into AI, which requires powerful chips to crunch through boundless data, investors are focused on how much competition may emerge. Nvidia has deep-pocketed rivals.
Nvidia doesn’t manufacture the GPUs itself; rather, it designs them before they are fabricated at facilities in Taiwan, mostly by the dual-listed Taiwan Semiconductor Manufacturing Company.
Fund manager Thomas Rice, a technology specialist, underscored its Apple-like stranglehold on its market: “The platform locks people in, it’s been an incredible competitive advantage over the last decade,” he said.
Mr Heke said Nvidia’s ongoing dominance of the market was far from assured. Sam Altman’s OpenAI is currently trying to raise $US7 trillion to design and manufacture AI chips and reduce its reliance on Nvidia.
He named Google’s Tensor Processing Units (TPUs) as a major threat, along with chip efforts at Meta and Amazon and numerous well-funded start-ups.
“These entities are not merely competing; they are pushing the boundaries of AI hardware, seeking to offer differentiated solutions that address the evolving needs of AI applications,” Mr Heke said.
In its favour, Nvidia’s earnings showed that at least 40 per cent of its data centre revenue comes from “inference” workloads – that is, running already-trained AI models to produce answers, rather than training them in the first place.
One of Nvidia’s many Australian customers, Daniel Roberts, chief executive of Iren Energy (formerly Iris Energy), predicted high demand for the products Nvidia is selling.
“They’re the market leading chip,” said Mr Roberts, whose company sells computing power to businesses experimenting with AI, as well as bitcoin mining. “We’re conscious of booms and busts and we’re conscious that this AI phase might be a craze, but there’s no doubt there’s going to be a continuous demand for computing power.”
The unbridled demand for computing power has left Nvidia’s supply chain struggling to keep pace. Last June, TSMC announced a $US2.9 billion investment to expand its capacity, but analysts are unsure whether it will be enough.
Speaking to US tech industry publication The Information earlier this month, Ali Ghodsi, chief executive of AI software company Databricks, which is valued at $US43 billion, said he anticipates a dramatic fall in GPU prices, analogous to the collapse in the cost of internet bandwidth in the 2000s.
This would hit Nvidia – an investor in Databricks – hard.
“It turned out that actually capitalism, supply and demand, takes care of the problem and the price of bandwidth just plummeted and bandwidth was abundant everywhere,” Mr Ghodsi said. “The same thing will happen with GPUs.”
But not everyone is so sure. Nvidia’s next big AI system is called the B100 and is expected to launch later this year. Bank of America’s Vivek Arya has suggested the new B100 products might be priced 10 per cent to 30 per cent higher than the existing H100s.
“In our view, we are at worst in the middle innings of that upfront three-year spending cycle, with demand likely stretching out to at least mid-to-late 2025,” he said.
Original article source: The Financial Review