In the news from November 1: Nvidia replaces Intel on the Dow Jones Industrial Average.
And looking at this year alone, it shouldn’t even have been a surprise: Intel’s shares had dropped 55% year to date (through November 1), making it the worst performer on the index, with the lowest stock price of any component.
Intel’s revenue was $54 billion in 2023 (down a third from 2021), and the company is now valued under $100 billion for the first time in 30 years. It’s expected to post its first annual net loss since 1986, per Reuters.
Compare all this to Nvidia, now valued at $3.32 trillion and currently the world’s second-largest company (it’s been first at times this year).
Nvidia’s value has more than doubled this year alone, and its shares have grown roughly sevenfold over the past two years.
[For more PTP coverage of the Nvidia AI growth story, check out our reports on the domestic AI chip competition and the international struggle to control AI.]
And of course, here’s the stunner: in 2019, Intel was worth twice the value of Nvidia.
So what happened?
Some of it is innovation vs. inaction, yes. Good investments, savvy insight, moving fast and moving smart on one side. But on the other?
Let’s take a closer look.
Intel: Semiconductor Household Name
Do you remember those fancy stickers that used to be put on personal computers, boasting “Intel Inside”?
Or the colorful ads, running on US national TV at a frequency and scale that rivaled even Apple at the time?
Intel was once the most famous name in computing components in the world, controlling a staggering 90% share of the PC microprocessor market.
They were, and still are, unusual as one of the few companies that can both design and manufacture their own chips.
Their domain of excellence was the central processing unit (CPU), and their power there was unmatched. It was at the height of this dominance, in 2005, that they made a fateful decision: passing on acquiring Nvidia.
Nvidia was then a niche player, known for their graphics processing units (GPUs), a staple of video gaming. Intel had struggled with prior acquisitions, and also felt the price was far too high.
(That price, according to The New York Times, was $20 billion.)
There were voices at Intel suggesting these GPUs might have other uses, such as in data centers, but the board overall was not interested.
After all, Intel’s chips were masters of rapid calculation in series, executing one operation after another, while GPUs broke tasks down and spread them across hundreds of smaller cores working in parallel. 3D graphics was simply not Intel’s business.
Yet that very parallelism is what has made the GPU essential in the AI explosion.
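To make that serial-versus-parallel distinction concrete, here is a minimal, illustrative sketch (my own, not anyone’s production code) of the same vector addition written both ways: once as a conventional CPU loop, and once as a CUDA kernel that hands each element to its own GPU thread.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Serial CPU version: one core walks the array element by element.
void add_serial(const float* a, const float* b, float* c, int n) {
    for (int i = 0; i < n; ++i)
        c[i] = a[i] + b[i];
}

// Parallel GPU version: each CUDA thread computes a single element,
// so thousands of additions proceed at once.
__global__ void add_parallel(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);           // unified memory, visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    add_parallel<<<(n + threads - 1) / threads, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();                // wait for the GPU to finish

    printf("c[0] = %.1f\n", c[0]);          // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The CPU loop does a million additions one after another; the kernel launch spreads them across thousands of threads at once. That difference in shape is the whole story.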
Nvidia: 3D Graphics Company
Founded by three engineers with a history in chips, Nvidia saw early promise in accelerating graphics processing for games, and in the late 1990s, they became one of a handful of survivors in the field. They went public in 1999, three years after having to restructure their already-small workforce.
It was around the same time Intel was considering the acquisition that Nvidia first began reaching beyond gaming. Recognizing the broader applications of GPUs, Nvidia pursued general-purpose parallel programming with CUDA (Compute Unified Device Architecture), their proprietary platform and API.
CUDA, as a software layer, showed that GPUs could do more than render graphics; they could also excel at general-purpose parallel work: scientific computing, machine learning, and crypto mining. And while this endeavor wasn’t initially profitable, it became increasingly successful with a growing community of developers.
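As a taste of what “general purpose” means in practice, here is a hedged sketch (again illustrative, not drawn from any Nvidia sample) of a CUDA kernel that sums a large array: the kind of non-graphics building block, like the dot products at the heart of machine learning, that CUDA opened up.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each block sums a 256-element chunk in fast shared memory,
// then atomicAdd merges the per-block results into one total.
__global__ void sum_reduce(const float* x, float* total, int n) {
    __shared__ float partial[256];
    int tid = threadIdx.x;
    int i = blockIdx.x * blockDim.x + tid;
    partial[tid] = (i < n) ? x[i] : 0.0f;
    __syncthreads();

    // Tree reduction: halve the number of active threads each step.
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride)
            partial[tid] += partial[tid + stride];
        __syncthreads();
    }
    if (tid == 0)
        atomicAdd(total, partial[0]);
}

int main() {
    const int n = 1 << 20;
    float *x, *total;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&total, sizeof(float));
    for (int i = 0; i < n; ++i) x[i] = 1.0f;
    *total = 0.0f;

    sum_reduce<<<(n + 255) / 256, 256>>>(x, total, n);
    cudaDeviceSynchronize();
    printf("sum = %.0f\n", *total);         // expect 1048576
    cudaFree(x); cudaFree(total);
    return 0;
}
```

Nothing here has anything to do with drawing triangles, which is precisely the point: CUDA turned a graphics part into a general computing device.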
Today, CUDA is nearly inseparable from Nvidia’s AI chips, and the company’s data center sales (for analytics and AI) dwarf its GPU sales for consumer computers, 3D gaming, and the rest.
Nvidia’s meteoric rise has been almost without parallel, and it comes not only from that initial push to broaden the applications of the GPU, but from a continued effort to innovate and expand their offerings: to stay adaptable rather than working exclusively to dominate their established area of expertise.
That Kodak Moment
Did you know the first prototype for the digital camera came from an employee of Kodak? It was 1975, and that device bore little resemblance to the ubiquitous tech all the world uses today.
Kodak was a corporate giant in those days, enjoying a remarkably Intel-like market share, with 85% of cameras and 90% of the film sold in the US coming from them. And the story sometimes goes that Kodak didn’t want to pivot from their lucrative film business, and thus missed the explosive transformation of photography by digital cameras.
But according to Scott Anthony, writing for the Harvard Business Review, this isn’t quite accurate. Kodak actually invested billions in digital camera technology, and in 2001 acquired a photo-sharing website called Ofoto.
While film was still their bread-and-butter, some at Kodak had correctly guessed that digital cameras were the future, and also that online platforms would become key for sharing photographs.
But where Instagram, a later photo-sharing startup, would be bought by Facebook for $1 billion around the same time Kodak was forced to declare bankruptcy, Kodak had geared Ofoto toward driving people to print out their digital photos.
This may have looked like a near-miss, but it was not. While savvy minds at Kodak saw how the business was transforming, the culture remained inflexible. Their innovation was only allowed to go so far.
They’d correctly identified areas of potential disruption and invested, but still failed to pivot successfully, again and again, until it was too late.
The Impact of AI on the Semiconductor Industry
The parallels here with Intel are obvious, and I’m certainly not the first one to make them. As Nvidia was developing CUDA, identifying the importance of AI and machine learning, and finding new ways to expand the power and usefulness of the GPU, Intel remained entrenched in the CPU.
Even today, with the US government concerned about the proximity of Nvidia’s primary manufacturer, TSMC, to China, and betting heavily on Intel for American AI chip production, Nvidia has continued to dominate.
As with Kodak, Intel’s missed AI opportunity didn’t stem from a failure to enter the ring with Nvidia, but from the rigidity of their approach. From the failure of their so-called Larrabee effort to move into the graphics business, to a push into AI some ten years later, billions were invested without bearing fruit.
Projects were shelved mid-stream, their leaders moved on to other things, while new companies were acquired in failed bids to improve on existing initiatives.
Despite tens of billions in government contracts and incentives as America’s only manufacturer of advanced chips, Intel’s failure to innovate successfully has led to restructuring and delays in breaking ground on new factories. Few have felt the AI disruption in the tech industry more acutely.
Meanwhile, Nvidia’s impact on the AI market can hardly be overstated. Their GPUs, paired with CUDA, are the gold standard for AI worldwide.
Perhaps, then, it can’t rightly be said that Intel failed to see the future of AI in the tech industry, any more than Kodak failed to see the digital photography transformation. In both cases, it was a failure to commit, to adapt, to move sufficiently in new directions.
Conclusion
Recently I wrote about founder mode and its relative strengths and weaknesses vs. manager mode approaches.
And here we see the worst nightmare for manager mode: undone by a slavish devotion to what a company does well. Both Intel and Kodak were international powerhouses, cornering their market and generating enormous profits. And that approach no doubt helped them reach such heights.
Yet both failed, not necessarily due to a lack of recognition of, or even investment in, emerging technologies. Rather, the failure came from an unwillingness or inability to adjust their thinking and behavior.
It’s hard to overstate the current importance of AI innovation in business. But for me the broader lesson is about the critical importance of adaptability.
Fluidity with innovation is essential, and with it the capacity to adjust one’s thinking. Because the leaders of tomorrow will embrace not only the future of technology, but also whole new ways of working.
References
Nvidia to take Intel’s spot on Dow Jones Industrial Average, Reuters
The White House Bet Big on Intel. Will It Backfire?, The New York Times
Visualizing Nvidia’s Revenue, by Product Line (2019-2024), Visual Capitalist
Kodak’s Downfall Wasn’t About Technology, Harvard Business Review
A Brief History of Kodak: The Rise and Fall of a Camera Giant, PetaPixel