Post-Conflict Recovery Line Emerges: NVIDIA-Led AI Computing Power Team Poised for Strong Rally

Stock News · 11:03

Oppenheimer, a major Wall Street financial firm, recently released a research report stating that NVIDIA (NVDA.US), Broadcom (AVGO.US), Monolithic Power Systems (MPWR.US), and Marvell Technology (MRVL.US) remain its top picks in the global semiconductor sector. The firm cited an "oversold rebound logic based on earnings certainty and high beta attributes," coupled with persistently expanding global artificial intelligence expenditure, as the core rationale for its long-term bullish stance on these preferred semiconductor stocks.

Several veteran Wall Street analysts recently indicated that when global equity markets enter an oversold rebound window, or when clear signals of easing emerge in Middle East geopolitical tensions, chip stocks—which have historically demonstrated market-outperforming characteristics and are largely perceived as having been unduly sold off—are highly likely to become one of the core forces leading the market's counterattack and valuation recovery. They could even serve as the primary engine for a significant rebound in the Nasdaq 100 Index, often regarded as a bellwether for technology stocks. However, this scenario is contingent on a simultaneous decline in international oil prices and long-term Treasury bond yields. The underlying logic is that high-beta chip stocks are most sensitive to the chain reaction of "geopolitical de-escalation → falling oil prices → reduced interest rate pressure." Consequently, when risk appetite recovers, they tend to rebound first and most vigorously.

Within the broad chip stock sector, the sub-segment dominated by NVIDIA and Broadcom—chips closely tied to AI computing infrastructure, which carry the most optimistic earnings growth prospects—is expected to be the most sensitive, the quickest to respond, and the most explosive in rebound magnitude under a "risk-easing rebound" scenario. In other words, in such a scenario, AI computing-related chip stocks are highly likely to be among the market's core targets for offensive positioning.

Jan de Bruijn, portfolio manager of the Robeco Emerging Stars Equities fund, which has outperformed 96% of its peers over the past year, recently stated that chip stocks focused on high-performance AI and advanced process technologies offer the best hedging tools against the prospect of a prolonged conflict involving Iran. The manager revealed that 40% of the fund's risk exposure is concentrated in memory chips and cutting-edge advanced process chip themes. He emphasized that chip giants closely tied to AI will maintain strong pricing power and fundamental expansion potential even during economic downturns or periods of high volatility in global financial markets.

As model sizes, inference chains, and multimodal/agentic AI workloads drive exponential growth in computing resource consumption, the capital expenditure focus of tech giants is increasingly converging on AI computing infrastructure to meet soaring demand. Global investors continue to anchor the "AI bull market narrative"—centered on expectations around NVIDIA's product iterations, Google's TPU clusters, and AMD's deliveries—as one of the most certain growth investment stories in global equity markets. This also implies that investment themes closely tied to AI training and inference, such as power supply, liquid cooling systems, and optical interconnect supply chains, will remain among the stock market's hottest themes, following leaders like NVIDIA, AMD, Broadcom, TSMC, and Micron, even amid uncertainties in Middle East geopolitical tensions.

According to the latest analyst estimates compiled by institutions, Amazon, along with Alphabet (Google's parent), Meta Platforms Inc. (Facebook's parent), Oracle, and Microsoft, are projected to accumulate approximately $650 billion in AI-related capital expenditures by 2026. Some analysts believe total spending could exceed $700 billion—implying a year-over-year increase in AI capital expenditure potentially surpassing 70%. Notably, these five US tech giants are expected to invest a cumulative total of about $1.5 trillion between 2023 and 2026 to build massive AI computing infrastructure. By comparison, their cumulative investment across all years prior to 2022 was approximately $600 billion.
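The growth figures above can be sanity-checked with simple arithmetic. A minimal sketch (the 2025 base-year spend is inferred from the cited numbers, not stated in the source):

```python
# Back-of-the-envelope check of the cited capex figures
# (all amounts in billions of dollars; the 2025 base is inferred, not sourced).
capex_2026_high = 700.0   # upper analyst estimate for 2026 AI capex
yoy_growth = 0.70         # "year-over-year increase potentially surpassing 70%"

# If 2026 spend hits ~$700B after ~70% growth, the implied 2025 base is:
implied_2025 = capex_2026_high / (1 + yoy_growth)
print(f"Implied 2025 AI capex: ~${implied_2025:.0f}B")  # ~$412B
```

This implied 2025 base of roughly $410 billion is consistent in scale with the ~$1.5 trillion cumulative figure cited for 2023–2026.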

From the perspective of Wall Street giants Morgan Stanley, Citigroup, Loop Capital, and Wedbush, the global wave of investment in AI infrastructure, centered on AI computing hardware, is far from over and is merely in its infancy. Driven by an unprecedented "storm of AI inference-side computing demand," this round of global AI infrastructure investment, lasting until 2030, could reach a scale of $3 to $4 trillion.

The AI arms race is in full swing. Oppenheimer maintains long-term positions in four leading chip stocks. "Last week we visited multiple companies within the Asian semiconductor supply chain," wrote Oppenheimer analyst Rick Schafer in a report to clients. "The AI computing arms race continues at full speed. The demand for AI computing infrastructure from cloud service providers is essentially unconstrained and far exceeds supply, a situation expected to persist beyond 2027. Supply constraints related to AI computing infrastructure span multiple dimensions, most notably advanced wafer fabrication, advanced packaging, and high-end HBM memory systems. Delivery lead times continue to lengthen. As near-insatiable AI demand aggressively absorbs supply, prices for chips associated with AI computing infrastructure are rising broadly and are likely to be passed on to more major customers."

Consequently, analyst Schafer stated his preference for chip companies that can deliver "structurally strong growth, enabling significant outperformance throughout the cycle." Furthermore, Schafer noted that AI-focused application-specific integrated circuits (ASICs) continue to ramp up volume, led by Google's Tensor Processing Unit clusters. He also pointed out that "multiple large projects" are steadily increasing to support the growth trend of large language model parameters, with related market participation also rising. Schafer added that many orders announced today may not begin generating actual revenue until mid-2028, owing to challenges with NVIDIA's AI GPU cabinet systems and Google's AI ASIC cabinet interconnection systems, along with other high-speed connectivity issues within data centers, including differences between traditional and new architectures.

Regarding NVIDIA, Schafer stated that a conservative estimate for the number of AI server cabinets based on the Grace Blackwell and Vera Rubin architectures this year might exceed 75,000 units. He also anticipates that the average selling price for Vera Rubin will be 50% or more above Grace Blackwell, with a conservative selling price per AI computing cabinet potentially approaching $7 million. Additionally, Schafer added that Vera Rubin has higher requirements for power management systems than the GB200 or the more advanced GB300 series, which could bring sustained and substantial benefits to Monolithic Power Systems.

NVIDIA, Broadcom, and Marvell Technology are arguably the three biggest winners in the global AI computing infrastructure arms race. Notably, NVIDIA's recent valuation compression to levels near, or even below, that of the S&P 500 index suggests it possesses stronger rebound elasticity than other chip stocks.

Other conclusions from Oppenheimer's research trip included: server CPU shortages have not significantly impacted traditional server growth; cloud service providers and NVIDIA still prefer copper cable-based interconnect technology where possible but firmly adopt co-packaged optics in "certain necessary scenarios"; cloud giants led by Google are widely adopting active copper cables. Schafer also observed the effects of memory chip shortages, noting the smartphone and PC markets were most negatively impacted. While the smartphone market is widely expected to experience an "overall decline," Schafer indicated the low-end and mid-range segments are "bearing the brunt," with Chinese smartphone shipments down nearly 20%. Conversely, Apple's flagship smartphones showed "greater resilience" amid persistently rising memory prices. Regarding PCs, Schafer expects the market size to decline by 11% this year, though higher-priced AI PCs may offset some of the weakness.

The massive wave of AI agents is approaching forcefully, driving continuous explosive expansion in AI computing demand. As Agent AI/Agentic AI, focused on agentic workflows, takes center stage in the digital world, 2026 could become the breakout year for AI agents. This implies global demand for AI computing infrastructure will continue its explosive expansion trajectory. The concentrated emergence of autonomous task-executing AI agents like Anthropic's Claude Cowork and OpenClaw in 2026 is not accidental; it essentially represents the first simultaneous convergence of five curves: "model capabilities, tool protocols, development frameworks, inference costs, and end-context capabilities."

At the AI application layer, AI agents are highly likely to become the dominant commercial interface because they translate "intelligence" directly into "action," signifying AI's progression from "knowing how to answer" to "knowing how to execute, collaborate, and complete highly complex multi-step tasks." Corporate urgency to improve efficiency and reduce operational costs has recently significantly accelerated the widespread adoption of two core categories of AI application software: generative AI applications and AI agents. Among these, AI agents are very likely the ultimate mega-trend in AI applications for the next decade-plus. Their emergence signifies AI's evolution from an information-assistance tool into a highly intelligent productivity tool.

The latest research from MarketsandMarkets indicates the AI agent market could reach $53 billion by 2030, implying a compound annual growth rate of 46% from 2025. Recent research from Omdia shows global semiconductor industry revenue is expected to surge over 30% in 2026, surpassing the historic $1 trillion milestone for the first time. Omdia expects this strong growth to be driven primarily by data center memory chips, AI GPUs/AI ASICs, and data center server CPUs, fueled by persistently exploding AI training and inference computing demand.
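The CAGR projection above implies a specific base-year market size, which a quick calculation can recover. A minimal sketch (the 2025 base figure is inferred from the cited numbers, not stated in the source):

```python
# Sanity-check of the cited projection: ~$53B by 2030 at a 46% CAGR from 2025
# implies a 2025 base of roughly $8B (inferred, not sourced).
target_2030 = 53.0   # $B, projected 2030 AI agent market size
cagr = 0.46          # compound annual growth rate
years = 5            # 2025 -> 2030

implied_2025_base = target_2030 / (1 + cagr) ** years
print(f"Implied 2025 market size: ~${implied_2025_base:.1f}B")  # ~$8.0B
```

Reversing the check, growing an ~$8 billion base at 46% per year for five years compounds back to roughly $53 billion, so the two cited figures are internally consistent.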

Within the broad chip sector, the AI chip/AI computing infrastructure chain, represented by NVIDIA and Broadcom, has proven in recent years to be the group most sensitive to market rebound logic, with the fastest reaction times and often the greatest elasticity. This is because they simultaneously possess the strongest earnings certainty, the clearest AI capital expenditure theme, and significant valuation recovery potential following previous pullbacks. Oppenheimer's continued listing of NVIDIA, Broadcom, Monolithic Power Systems, and Marvell Technology as its top semiconductor picks itself reflects the market's view that the extreme supply-demand tightness and structural growth in AI computing infrastructure, driven by AI agents, remains the core theme for the semiconductor sector.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products; any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is for general information purposes only and does not consider your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy or completeness of the information; investors should do their own research and may seek professional advice before investing.
