AI Investment Shifts Beyond Nvidia as Capital Spending Hits $650 Billion
Analysts track infrastructure expansion and evolving market dynamics as hyperscalers pour record capital into AI buildout.
Big Tech’s four largest companies will deploy $650 billion in capital expenditures during 2026, nearly doubling last year’s AI infrastructure spending in a buildout that now extends well beyond Nvidia’s dominance. The spending surge marks a structural shift in how analysts assess AI investment opportunities, with capital deployment outpacing revenue visibility and forcing a more selective approach to stock picking.
The Infrastructure Arms Race
According to CNBC, Alphabet expects 2026 capital expenditures between $175 billion and $185 billion, more than double its 2025 spend. Microsoft, Alphabet, Amazon, Meta, and Oracle collectively plan to spend $660 billion to $690 billion on infrastructure in 2026, with the vast majority directed at AI compute, data centers, and networking, according to analysis by Futurum Group.
Hyperscaler capex for the “big five” is forecast to exceed $600 billion in 2026, a 36% increase over 2025, with roughly 75% ($450 billion) directly tied to AI infrastructure, reports the IEEE ComSoc Technology Blog. Alphabet plans to invest approximately 60% in servers and 40% in data centers and networking equipment, setting the template for infrastructure allocation across the sector.
Wall Street’s consensus estimate for 2026 capital spending is now $527 billion, up from $465 billion at the start of third-quarter earnings season, according to Goldman Sachs Research. Yet analysts have consistently underestimated AI spending. Consensus capex estimates have proven too low for two years running, with projections of roughly 20% growth at the start of both 2024 and 2025, while reality exceeded 50%.
Market Fragmentation and Selectivity
Investor sentiment around AI remains bullish but increasingly discriminating. US AI-related stocks have beaten earnings expectations, yet many advisors remain underweight: the average technology allocation sits 9% below the S&P 500 weighting across moderate portfolios, even though 60% of advisors say they are bullish on AI stocks, notes BlackRock.
Bloomberg’s MLIV column on February 26 and 27 highlighted that the AI trade has evolved beyond Nvidia, with the excitement now in how the trade is evolving. Since June, the average stock price correlation across large public AI hyperscalers has declined from 80% to just 20%, as investors grow more confident about which AI investments are generating revenue benefits, Goldman Sachs reports.
The AI infrastructure spending cycle differs markedly from past technology booms. Unlike the late-1990s telecom bubble, today’s capital deployment is overwhelmingly funded with cash rather than debt, by companies that regularly generate robust free cash flow, according to Fidelity analysis.
Earlier this year, the biggest AI stocks rose as a group, but since June investors have rotated away from AI infrastructure companies where operating earnings growth is under pressure and capex is debt-funded, while rewarding companies demonstrating a clear link between capex and revenues, Goldman Sachs observes.
Competition Intensifies in AI Chips
The chip landscape is fragmenting as hyperscalers diversify suppliers. Meta’s multiyear deal with AMD involves deploying up to 6 gigawatts of GPUs for AI data centers, with early shipments of MI450 GPUs in Helios rack-scale servers beginning later this year, CNBC reported February 24. The deal came days after Meta committed to deploying millions of Nvidia processors.
Nvidia controls roughly 90% of the AI accelerator market with a $4.66 trillion valuation, while AMD is valued at $320 billion. Yet AMD’s AI accelerator market share is projected to climb from approximately 9% in 2025 to over 15% by year-end 2026, and if Meta’s migration to AMD’s ROCm software ecosystem proves durable, it gives other hyperscalers like Microsoft and Alphabet a green light to follow, according to analysis on Investing.com.
| Company | 2026 Market Share | Key Differentiator |
|---|---|---|
| Nvidia | ~90% | CUDA ecosystem, annual cadence |
| AMD | ~9% rising to 15%+ | ROCm software ecosystem, open standards |
| Broadcom | n/a (custom silicon) | TPU design, cost optimization |
Broadcom designs custom AI chips for others, most notably Google’s TPUs. The big story for 2026 is expansion to external clients such as Anthropic, which placed orders totaling $21 billion; custom silicon offers lower total cost of ownership than standard GPUs, reports TipRanks.
Valuation and Productivity Questions
Morgan Stanley’s Global Investment Committee projects near double-digit percentage returns for the S&P 500, with a target around 7,500. Analysts are projecting 14% to 16% annual EPS growth in 2026; for the 493 stocks excluding the Magnificent 7, that would double the pace of earnings growth compared to 2025, according to Morgan Stanley.
AI spending in North America is expected to accelerate, with global AI infrastructure investment projected at $1.4 trillion and total AI spending exceeding $2.5 trillion in 2026—a 44% year-over-year increase, notes Harbourfront Wealth Management. Yet revenue visibility remains limited. Pure-play AI vendors led by OpenAI and Anthropic are posting rapid revenue growth, though their combined revenues remain a fraction of the infrastructure investment being deployed on their behalf.
“The combination of continued corporate AI adoption and growing concerns about the AI infrastructure complex has increased recent investor focus on the next beneficiaries of the ever-expanding AI trade.”
— Ryan Hammond, Goldman Sachs Research
Goldman’s framework for identifying companies that benefit from AI productivity focuses on labor costs as a share of sales and exposure to AI automation. This group of Productivity Beneficiaries has lagged its earnings trajectory, suggesting attractive risk-reward for investors seeking to expand exposure beyond the infrastructure layer.
What to Watch
Three metrics will determine whether 2026’s record capital deployment translates to sustained stock performance. First, monitor quarterly capex guidance revisions—two years of upward surprises suggest consensus remains conservative. Second, track software ecosystem adoption around AMD’s ROCm and Broadcom’s custom silicon; market share gains here would validate diversification away from Nvidia’s CUDA moat. Third, watch for concrete AI revenue disclosures from hyperscalers, particularly return-on-investment metrics that justify infrastructure spending at 45-57% of revenue.
The shift from infrastructure plays to productivity beneficiaries represents the next phase of AI investment, but timing remains uncertain. Companies demonstrating clear links between AI capex and revenue generation will command premium valuations, while those with spending ahead of monetization face increasing scrutiny. For now, the capital deployment trajectory points to sustained demand for AI infrastructure through 2027, but selectivity has replaced the broad-based enthusiasm that characterized 2024 and early 2025.