5 Super Semiconductor Stocks to Buy Hand Over Fist for the Artificial Intelligence (AI) Revolution – The Motley Fool

Founded in 1993, The Motley Fool is a financial services company dedicated to making the world smarter, happier, and richer. The Motley Fool reaches millions of people every month through our premium investing solutions, free guidance and market analysis on Fool.com, top-rated podcasts, and non-profit The Motley Fool Foundation.
Developing the most advanced artificial intelligence (AI) models wouldn't be possible without the semiconductor industry.
In 2016, Nvidia (NVDA -3.49%) CEO Jensen Huang personally delivered the first AI supercomputer to OpenAI. In hindsight, that was a significant moment in history, because the start-up went on to develop one of the AI industry’s most advanced applications, ChatGPT.
ChatGPT and many of its competitors can already generate text, images, videos, and computer code on command. But Nvidia and its peers are designing hardware that is more powerful and more energy efficient, paving the way for the development of even better AI applications with expanded capabilities.
Here are five semiconductor stocks investors might want to buy during the AI revolution.
Image source: Getty Images.
Nvidia is best known for designing the most powerful data center graphics processing units (GPUs) for AI development. But the Nvidia AI Enterprise platform is an entire cloud-based operating system for AI developers that extends far beyond chips alone.
The platform offers developers a library of ready-made large language models (LLMs) they can use to build AI applications, which can save them substantial amounts of time and money. It’s also home to CUDA, the software layer for programming Nvidia’s GPUs, which allows developers to get the most out of the hardware and build applications more quickly. CUDA can’t be used with any other chips, so data center operators need to stick with Nvidia’s hardware or risk upsetting customers who are proficient with the software.
Nvidia’s H100 GPU set the benchmark in the AI industry, but the company is gearing up to ship a new generation of chips based on its Blackwell architecture. The upcoming GB200 GPU, for example, can perform AI inference at five times the speed of the H100, which will reduce costs for developers, who often pay for computing capacity by the minute.
Nvidia stock has tripled over the past year alone, and it probably isn’t done moving higher. Wall Street thinks the company will bring in over $120 billion in revenue during the current 2025 fiscal year (ending Jan. 30, 2025), which is nearly double its result from fiscal 2024. The majority of that revenue will come from its AI-focused data center segment.
Nvidia is struggling to meet demand for its GPUs, so some of its biggest customers are turning to Advanced Micro Devices (AMD -2.54%) instead. AMD’s MI300 GPU is winning over data center titans like Microsoft, Oracle, and Meta Platforms, which have found that it offers inference performance and cost advantages over the H100.
AMD’s data center revenue surged 80% to $2.3 billion during the recent first quarter of 2024 (ended March 31). The company expects to generate $4 billion in sales from its GPUs alone this year, which is up from a January forecast of $3.5 billion.
But AMD has taken a leadership position in another crucial area of AI: personal computing. It says millions of PCs fitted with its Ryzen AI chips have shipped to date — from leading manufacturers like Dell, HP, and Asus — giving AMD a 90% market share in this segment.
The company is working closely with Microsoft to develop new Ryzen chips capable of processing the growing suite of AI features built into the Windows operating system.
AMD stock is up 51% over the past year, but it remains 18% below its all-time high. Now might be a great time to buy as the company’s AI revenue ramps up in both the data center and PC segments.
Unlike Nvidia and AMD, Axcelis Technologies (ACLS -4.73%) doesn’t produce any chips. It makes ion implantation equipment that is crucial for the fabrication of processors (CPUs), memory chips, and storage chips. AI applications require higher capacities from all three, which presents a big opportunity for this company.
Most of Nvidia’s data center GPUs have built-in memory, and advanced models like the Blackwell GB200 even come with built-in CPUs for higher efficiency.
Plus, AI-enabled PCs and smartphones require more processing power and up to twice as much memory capacity as their predecessors. As a result, Axcelis CEO Russell Low says AI will require significant manufacturing capacity expansions throughout the semiconductor industry, which should lead to more sales of the company’s equipment.
In addition, power devices (chips that regulate the flow of electrical power in high-current workloads) are very implant-intensive to manufacture. Demand is rapidly growing for AI data centers, which is driving up demand for energy generation, distribution, and cooling. And that, in turn, drives up demand for Axcelis’ equipment.
Despite soaring 835% over the last five years, the stock trades at a price-to-earnings (P/E) ratio of just 18.7, which is a 49% discount to the 36.9 P/E of the iShares Semiconductor ETF. In other words, Axcelis stock would have to almost double just to trade in line with its peers in the chip industry. That spells opportunity for investors.
Broadcom (AVGO -2.07%) is a multifaceted AI play. It makes hardware and software networking solutions for data centers, including its switches, which regulate how quickly data travels between servers and devices. They are key components when thousands of GPUs are clustered together to process vast amounts of data in AI development.
During the recent fiscal 2024 second quarter (ended May 5), sales of Broadcom’s Tomahawk 5 and Jericho3 switches doubled compared to the year-ago period.
Many of its subsidiaries are also using AI. Broadcom acquired Symantec for $10.7 billion in 2019, and that business is now using Alphabet’s Vertex AI platform to bring AI to its cybersecurity software. Broadcom also acquired cloud software provider VMware for $69 billion in 2023. VMware’s software allows developers to create virtual machines, so multiple users can plug into one server and use all of its capacity — an important capability at a time when AI infrastructure is in short supply.
Broadcom says its companywide AI revenue soared 280% year over year to $3.1 billion during the second quarter. The company expects to deliver a record high $51 billion in total revenue for the whole of fiscal 2024, with $11 billion coming from AI specifically.
As I mentioned earlier, AI-enabled PCs and smartphones require significantly more memory capacity than non-AI devices. That’s because AI applications need mountains of data to function, and memory chips serve as a brain that temporarily stores that information in a ready state. Micron Technology (MU -3.93%) is a leading provider of memory and storage chips, and it’s experiencing a wave of demand as a result of the AI revolution.
Every tier 1 manufacturer of Android-based AI smartphones (like Samsung) uses Micron’s LPDDR5X memory chip. It offers between 12 gigabytes and 16 gigabytes of capacity, which is 50% to 100% more memory than previous flagship smartphones required. Plus, the minimum memory requirement for Microsoft’s new Copilot+ AI PCs is 16 gigabytes, which is twice the minimum amount its previous Surface lineup needed. This trend spells more revenue for Micron.
On the data center side, Micron’s HBM3e (high-bandwidth memory) solution is used in Nvidia’s new H200 GPU. The H200 can perform AI inference at almost twice the speed of the H100 while consuming half the energy, which translates to substantial cost savings for data center operators.
Micron says HBM3e contributed $100 million to its revenue during the fiscal 2024 third quarter (ended May 30). By the end of fiscal 2024, it expects to have sold hundreds of millions of dollars’ worth — but that is forecast to grow into the billions of dollars in fiscal 2025.
Micron is already completely sold out of HBM3e memory for the next two years, which should give the company an incredible amount of pricing power and, therefore, drive significant growth in its profitability.
Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Anthony Di Pizio has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, HP, Meta Platforms, Microsoft, Nvidia, Oracle, and iShares Trust – iShares Semiconductor ETF. The Motley Fool recommends Broadcom and recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.
Invest better with The Motley Fool. Get stock recommendations, portfolio guidance, and more from The Motley Fool’s premium services.
Making the world smarter, happier, and richer.
© 1995 – 2024 The Motley Fool. All rights reserved.
Market data powered by Xignite and Polygon.io.
