3 Super Semiconductor Stocks (Including Nvidia) to Buy Hand Over Fist for the Artificial Intelligence (AI) Revolution

Founded in 1993, The Motley Fool is a financial services company dedicated to making the world smarter, happier, and richer. The Motley Fool reaches millions of people every month through our premium investing solutions, free guidance and market analysis on Fool.com, top-rated podcasts, and non-profit The Motley Fool Foundation.
Artificial intelligence (AI) is creating a trillion-dollar opportunity for chip companies like Nvidia, Advanced Micro Devices, and Micron.
ChatGPT, Claude, and Gemini are just a few of the powerful artificial intelligence (AI) chatbot applications available today. They can interpret and generate text, images, videos, and computer code, which could drive a productivity boom across the global economy.
But those AI applications wouldn’t be possible without data centers filled with advanced semiconductors, which developers use to build, train, and deploy their models. Nvidia (NVDA -3.77%) CEO Jensen Huang estimates the current $1 trillion installed base of data center infrastructure around the world will double over the next five years because of AI.
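For context, doubling a $1 trillion installed base over five years works out to roughly 15% compound annual growth. Here is a quick back-of-the-envelope sketch; the figures simply restate Huang's estimate, not additional data:

```python
# Back-of-the-envelope: if a $1 trillion installed base doubles in five years,
# what annual growth rate does that imply? (Restates the estimate above.)
installed_base_today = 1.0e12    # dollars
installed_base_in_5y = 2.0e12    # dollars, per the "double in five years" forecast
years = 5

implied_cagr = (installed_base_in_5y / installed_base_today) ** (1 / years) - 1
print(f"Implied compound annual growth: {implied_cagr:.1%}")  # roughly 14.9%
```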
That creates a substantial opportunity for leading chip companies Nvidia, Advanced Micro Devices (AMD 0.97%), and Micron Technology (MU -4.02%). Here’s why investors might want to own a stake in each of them to ride the AI boom.
The gaming industry used to be Nvidia’s largest source of revenue, but its graphics processing units (GPUs) for the data center are now driving its business forward. In fiscal 2024 (ended Jan. 28), Nvidia’s H100 GPU sent the company’s data center revenue surging 217% to $47.5 billion, and fiscal 2025 is expected to bring even more growth.
Nvidia’s H100 is the most sought-after data center chip for AI workloads. Many tech giants are lining up to buy it in large volumes, from Amazon to Oracle to Meta Platforms. But Nvidia will ramp up shipments of its new H200 GPU this year, which could fuel the company’s next wave of growth.
The H200 can perform inference (feeding live data to an AI model so it can make predictions) at twice the speed of its predecessor, and it consumes 50% less energy, making it a big cost saver for data center operators.
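If the term is unfamiliar, inference is simply the serving step: a model's weights are already trained and frozen, and live inputs are pushed through them to produce predictions. Here is a minimal, generic illustration using a toy NumPy model; it is not Nvidia's software stack, just a sketch of the concept:

```python
import numpy as np

# Toy model: these weights were "learned" during training and are now frozen.
# Inference just pushes new, live data through them to get a prediction.
weights = np.array([0.8, -0.3, 1.2])
bias = 0.05

def infer(live_input: np.ndarray) -> float:
    """Score one live input with the frozen model (no training involved)."""
    return float(live_input @ weights + bias)

# New data arriving at serving time; the model predicts without retraining.
print(infer(np.array([1.0, 2.0, 0.5])))  # prints 0.85
```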
Longer term, all eyes are on Nvidia’s new Blackwell architecture, which will power its next-generation chips. The company is expected to ship its Blackwell B200 GPUs next year, which will deliver 15 times the inferencing speed of the H100.
Wall Street expects Nvidia’s total revenue to soar 81% to a record $110 billion in the current fiscal year. Despite the 240% gain in its stock over the last 12 months, there is still plenty of room for upside.
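As a quick sanity check on that forecast, an 81% jump to roughly $110 billion implies a prior-year base of about $61 billion. The arithmetic below uses only the figures quoted above:

```python
# Working backward from the Wall Street forecast quoted above.
expected_revenue = 110e9   # dollars, current fiscal year estimate
growth_rate = 0.81

implied_prior_year = expected_revenue / (1 + growth_rate)
print(f"Implied prior-year revenue: ${implied_prior_year / 1e9:.1f} billion")  # ~$60.8 billion
```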
AMD makes some of the world’s most popular chips for consumer electronics. They can be found in the Microsoft Xbox Series X and the Sony PlayStation 5, not to mention a long list of personal computers. But AMD recently launched its MI300 series of data center chips in a quest to chase down Nvidia.
The MI300 is available as a standard GPU called the MI300X, but it’s also available in a combined GPU and CPU configuration called the MI300A. AMD is already shipping the MI300A to the Lawrence Livermore National Laboratory for its El Capitan supercomputer, which will be one of the most powerful in the world when it comes online this year. But a number of leading data center operators, including Microsoft, Oracle, and Meta Platforms, are also lining up to buy the MI300 series.
While AMD is chasing Nvidia in the data center, the company holds a dominant 90% market share in the emerging market for AI-enabled personal computers. Millions of computers from manufacturers like Dell and HP have already shipped with AMD’s Ryzen AI chips, which allow users to run AI workloads on-device. That leads to a faster, more responsive experience, because those workloads don’t have to travel to and from the data center.
In the fourth quarter of 2023, the Ryzen AI series sent the company’s Client segment revenue up 62% year over year. Combined with growing shipments of the MI300 chips, which could generate $3.5 billion in sales in 2024, this could be AMD’s biggest year yet.
Micron doesn’t get as much attention as Nvidia and AMD because it produces memory (DRAM) and storage (NAND) chips, which aren’t as glamorous as GPUs. However, every GPU requires memory, and that memory is critical to extracting the GPU’s maximum performance.
Micron’s latest HBM3E (high-bandwidth memory) data center solution was selected by Nvidia to power its new H200. It’s an obvious choice because it consumes 30% less power than competing HBM hardware, making it far more cost efficient. Micron is currently sampling a new HBM3E product, which packs 50% more memory into each GPU, allowing Nvidia’s customers to bring their AI models to market far more quickly.
Micron’s entire 2024 HBM inventory is already sold out, and much of its 2025 stock is already spoken for, too.
Like AMD, Micron’s AI opportunity is expanding beyond the data center. The AI-enabled chipsets inside new personal computers can require 80% more DRAM content than traditional models, which is great for Micron’s business. Many modern smartphones are also fitted with AI chips, which need twice as much DRAM as traditional devices. Samsung’s new Galaxy S24 comes with a slew of AI features, and it runs on Micron’s hardware.
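To put those multipliers in perspective, here is an illustrative sketch. The baseline capacities are assumed round numbers chosen for the example; only the 80% and 2x multipliers come from the figures above:

```python
# Illustrative only: the baseline capacities below are assumptions.
# The 1.8x (AI PC) and 2x (AI smartphone) multipliers come from the article.
baseline_pc_dram_gb = 8       # assumed traditional PC
baseline_phone_dram_gb = 8    # assumed traditional smartphone

ai_pc_dram_gb = baseline_pc_dram_gb * 1.8      # "80% more DRAM content"
ai_phone_dram_gb = baseline_phone_dram_gb * 2  # "twice as much DRAM"

print(f"AI PC:         {ai_pc_dram_gb:.1f} GB vs. {baseline_pc_dram_gb} GB")
print(f"AI smartphone: {ai_phone_dram_gb:.1f} GB vs. {baseline_phone_dram_gb} GB")
```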
Micron’s revenue jumped 57% year over year during the fiscal 2024 second quarter (ended Feb. 29). The company also delivered net income of $793 million, marking a welcome return to profitability after more than a year grappling with falling prices due to an oversupply of chips.
Micron’s forecast for the upcoming fiscal 2024 third quarter (ending June 1) points to accelerated revenue growth of 76%. Micron stock is trading at an all-time high, but it still carries a cheaper valuation than both Nvidia and AMD, so this could be a great buying opportunity for investors.
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Anthony Di Pizio has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Amazon, HP, Meta Platforms, Microsoft, Nvidia, and Oracle. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.
