Nvidia has emerged as the leader in powering the early days of the artificial-intelligence revolution, but rivals big and small are looking to close the gap.

Heavyweights such as Advanced Micro Devices are spending billions of dollars to enhance their AI offerings, while startups are attracting investors eager to back the next possible chip giant. Meanwhile, cloud-computing companies such as Amazon and Google are developing their own chips and becoming bigger players in this area.

The current AI boom began late last year, when OpenAI’s ChatGPT tool captured the public’s imagination by generating cogent text in response to prompts. The attention led to a surge of investment in chips capable of training and running ChatGPT and other so-called generative AI language systems.

Nvidia was a leader in producing such AI chips, thanks in part to its background in making semiconductors for videogame graphics that were repurposed for AI years ago. The latest wave of enthusiasm juiced sales for Nvidia, skyrocketing its valuation above $1 trillion and prompting Chief Executive Jensen Huang to declare AI a revolutionary technology on par with personal computers and smartphones.

The Santa Clara, Calif.-based company controls more than 80% of the increasingly lucrative business of doing the behind-the-scenes number-crunching that is driving the AI wave, according to analyst estimates.

AMD is seen as the closest rival to Nvidia. The longtime competitor has its own line of AI processors and deep relationships with big data-center operators hungry for computing power.

Forrest Norrod, the head of data-center hardware at AMD, said the generative-AI boom led by ChatGPT caught the company by surprise, but he said the industry wants a competitor for Nvidia. “There are a lot of people who desperately want an alternative,” he said.

AMD had been planning a new generation of its AI chips since last year, but the surge in generative AI led to an addition to the lineup: a version of the chip, unveiled in June, specifically meant for the sophisticated algorithms that power AI models such as ChatGPT.

AMD could reach a 20% market share in AI chips over time, Northland Capital Markets analyst Gus Richard estimated recently, both because of the strength of its products and because the world needs a second source alongside Nvidia.

Intel has also pushed back against what it called the narrative suggesting only Nvidia’s chips can run generative AI. In a blog post late last month, Intel touted its performance in recent AI benchmarks, saying its chips could be “compelling options for customers looking to break free from closed ecosystems.”

AMD and Intel have made purchases in recent years to enhance their AI offerings. In 2022, AMD spent $35 billion on Xilinx, a designer of chips that can be reprogrammed after they are produced and can be made adept at AI computation. Intel bought the Israeli AI startup Habana Labs in 2019 for about $2 billion and is now producing its chips.

The rivals might be able to make inroads against Nvidia because of how Nvidia’s AI software is structured. Nvidia’s software is proprietary, meaning software developers don’t have the freedom to tinker with it. Intel and AMD offer open-source alternatives, an option that some customers may find appealing.


Venture-capital investors are also staking billions of dollars on chip startups, many of them geared toward AI, in hopes of backing winners in the burgeoning market. AI chip startups Mythic and Tachyum raised new funding this year, following a couple of heady years of investment.

Chip startups drew in $8.3 billion in 2021 and $7.9 billion in 2022, according to Crunchbase figures. Those totals are strong but lag behind other hot sectors such as crypto, which attracted $26.8 billion of venture funding last year, according to CB Insights.

Big cloud-computing companies such as Amazon and Google have also made investments in AI computing. With their large scale, these deep-pocketed tech companies can afford to design their own AI chips for their data centers and outsource their production to companies such as Taiwan Semiconductor Manufacturing. The companies use those chips to drive AI features in their products and rent them out to customers who want to do their own AI work. 

Amazon’s cloud-computing unit, Amazon Web Services, is pitching its home-brewed chips as cheaper alternatives to Nvidia’s hardware, and has drawn in ByteDance and Snap, among other large internet companies. ByteDance, the Chinese owner of TikTok, has saved as much as 60% on the cost of deploying its AI models by using Amazon’s chips, said Nafea Bshara, an AWS executive.

Amazon, he said, understood AI’s potential years ago. “We realized then that we would need to help customers keep their [computational] costs under control in order to make AI accessible to customers of all sizes and across industries that want to use it,” Bshara said.

The newfound AI fervor comes at a welcome time for the semiconductor industry, which is feeling the effects of a postpandemic slowdown in sales of tech products such as smartphones and PCs.

This year, worldwide semiconductor revenue is expected to fall 9% to roughly $511 billion, according to International Business Strategies, a chip-industry consulting firm. 

Demand for AI chips is expected to expand the industry in the coming years. IBS had earlier projected that the chip industry’s revenue would double by the end of the decade to roughly $1.1 trillion, fueled by advances in 5G networks, self-driving cars and other technologies. Now, IBS has boosted its projection by another $150 billion, to roughly $1.25 trillion in 2030.


“Generative AI is one of the biggest events that the semiconductor industry has experienced to date,” said Handel Jones, chief executive of International Business Strategies.

The market for AI-computing semiconductors—including Nvidia’s and AMD’s graphics processing units, chips specially designed for conducting AI calculations, and the outsourced manufacturing of those chips—is expected to post around $43 billion in annual sales this year, or roughly 8% of the chip industry’s overall sales, according to estimates by Morgan Stanley analysts.



Within four years, AI-computing semiconductors’ share of the entire industry is projected to roughly double, with revenue hitting $125 billion, according to Morgan Stanley estimates.

Nvidia’s head start isn’t discouraging newcomers.

Nigel Toon, the CEO of U.K.-based Graphcore, which has raised about $750 million from investors, said a more powerful version of his company’s AI processors was in the works as it tries to compete with Nvidia.

Graphcore has been particularly successful in Asia, Mr. Toon said. Chinese companies have been looking for alternatives to U.S. suppliers as U.S. authorities restrict the sale of Nvidia’s most advanced chips to China amid heightened geopolitical tensions.

“They need to share some of that market cap,” he said, referring to Nvidia’s $1 trillion valuation.

Write to Asa Fitch at asa.fitch@wsj.com and Jiyoung Sohn at jiyoung.sohn@wsj.com