Microsoft's earnings reveal AI infrastructure's capital intensity and physical bottlenecks, with nearly all operating profit reinvested and capacity struggling to meet demand due to power and land constraints. Amazon benefits from secured power capacity and exclusive PPAs, positioning it as an energy and compute real estate player with model-neutral contracts. Alphabet's self-developed TPUs give it a compute-cost advantage, enabling lower cloud pricing and margin recovery. Investors should scrutinize Capex efficiency and accounting practices, favoring companies with control over physical resources.

As the Q1 2026 earnings season kicks off, the investment narrative for Big Tech is undergoing a profound paradigm shift. Following Microsoft's earnings release last week, the market responded with a $400 billion single-day wipeout of market capitalization, despite seemingly solid headline numbers. This valuation correction is not a simple fluctuation in growth rates, but a deep reassessment by capital markets of the physical bottlenecks and capital efficiency of the AI infrastructure race. On the eve of Alphabet's and Amazon's earnings disclosures, understanding the "physical wall" effect reflected in Microsoft's results is crucial for gauging the valuation elasticity and potential risks of these two companies.
Microsoft's financial performance last quarter serves as a highly instructive negative case study. An analysis of the earnings summary shows that quarterly Capex reached $37.5 billion, a staggering 66% year-over-year increase, while operating profit for the same period was only $38.3 billion. In other words, for every dollar of operating profit Microsoft currently earns, it must reinvest approximately 98 cents into the bottomless pit of AI infrastructure. This near "break-even" capital intensity signals that AI investment has entered a dangerous zone of diminishing marginal returns. The deeper issue is that even at record levels of capital investment, a significant gap in capacity delivery persists. During the earnings call, Satya Nadella admitted that capacity cannot keep pace with demand; the core pain point is no longer the supply of Nvidia B200 chips, but the power and land permits required to run data centers. In the macro context of 2026, the computing-power moat is giving way to physical-layer energy monopolies.
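The 98-cent figure falls directly out of the two numbers above; the short sketch below simply reproduces the ratio from the Capex and operating-profit values cited here.

```python
# Reproduces the reinvestment ratio implied by the figures cited above.
# Both inputs come from the quarter discussed in the text.
capex = 37.5e9             # quarterly capital expenditure, USD
operating_profit = 38.3e9  # quarterly operating profit, USD

reinvestment_ratio = capex / operating_profit
print(f"Reinvested per dollar of operating profit: ${reinvestment_ratio:.2f}")
# -> Reinvested per dollar of operating profit: $0.98
```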
In contrast to Microsoft's struggles with capacity delivery, Amazon's early positioning in infrastructure is translating into a significant competitive advantage. Over the past year, AWS has secured 3.8 gigawatts (GW) of power capacity reserves, enough to power 1.5 million homes; in effect, it has built out a small nation's worth of grid capacity in a single year. The brilliance of Amazon's strategy lies in securing exclusive Power Purchase Agreements (PPAs), particularly in nuclear-rich regions like Pennsylvania, effectively cutting off competitors' access to incremental power. This logic of "physical hoarding" transforms Amazon from a mere cloud provider into an energy and compute real estate player with immense bargaining power.
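Taken at face value, the comparison implies roughly 2.5 kilowatts of demand per household. The brief sketch below only backs that implied figure out of the two numbers cited; the per-home demand is an inference, not a figure from the reporting.

```python
# Sanity check of the "3.8 GW is enough to power 1.5 million homes" comparison.
# The per-home demand that falls out of it is implied, not stated in the article.
secured_capacity_gw = 3.8
homes_powered = 1.5e6

implied_kw_per_home = secured_capacity_gw * 1e6 / homes_powered  # GW -> kW
print(f"Implied demand per home: {implied_kw_per_home:.1f} kW")
# -> Implied demand per home: 2.5 kW
```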
Regarding its business model, the "model-neutral" strategy Amazon promotes through its Bedrock platform places it in an excellent hedging position in the AI war. Unlike Microsoft, with its deep tie-up with OpenAI, Amazon does not bet on the success of any single model's iteration. Instead, by integrating Claude 3.7, the Llama series, and its own Nova models, it has positioned itself as the "house" in the compute-rental business. This insulates Amazon from the systemic risk of swings in any one model's performance. A significant portion of AWS's $200 billion order backlog comes from these highly flexible, neutral contracts; this steady "landlord mindset" provides stronger valuation resilience in the current high-volatility market environment.
Alphabet (Google) faces a different set of complexities. While concerns persist about its search business being eroded by AI agents, its vertical integration in chip design constitutes its core defensive shield. Wall Street expects Google's Capex to surge to $27.3 billion this quarter, a massive outlay largely intended to counter the structural rise in search costs: in the AI era, the cost of serving each search query is growing non-linearly. By relying on its self-developed TPU v7 chips, however, Google has effectively decoupled its computing costs from the Nvidia ecosystem.
Currently, few companies in Silicon Valley besides Google can achieve complete chip autonomy. While Microsoft and Meta are still paying Nvidia's high gross-margin premiums, Google has significantly optimized internal inference costs through its TPUs and translated that into a price advantage for external customers. Surveys indicate that the computing prices Google offers to major third-party clients are 30% to 50% lower than those of its peers. Consequently, a potential jump in Google Cloud's profit margins may stem less from revenue scale effects than from margin recovery driven by in-house hardware. This defensive war is expensive, but the silicon moat formed by the TPUs has bought Google a critical window to convert brand premium into efficiency premium.
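To see why cheaper in-house silicon can fund both a price cut and a margin recovery, consider the deliberately simplified sketch below. Every dollar figure in it is a hypothetical assumption chosen for illustration, not a disclosed cost; only the 30% to 50% discount range comes from the survey figure above.

```python
# Hypothetical, deliberately simplified illustration of the pricing logic above.
# None of these dollar figures are disclosed costs; they exist only to show how
# cheaper in-house silicon can support lower prices AND higher margins.
peer_hw_cost = 1.00      # peer's assumed cost per unit of compute, buying merchant GPUs
peer_price = 1.40        # peer's assumed customer price per unit of compute
inhouse_hw_cost = 0.45   # assumed cost per unit with in-house accelerators

google_price = peer_price * 0.65   # pricing ~35% below peers, within the surveyed range

peer_margin = (peer_price - peer_hw_cost) / peer_price
google_margin = (google_price - inhouse_hw_cost) / google_price

print(f"Peer gross margin:   {peer_margin:.0%}")    # ~29%
print(f"Google gross margin: {google_margin:.0%}")  # ~51%
```

The point is directional rather than precise: it is the in-house cost base, not the discount itself, that determines whether pricing 30% to 50% below peers restores margin or destroys it.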
A detail that demands close scrutiny in these two companies' earnings is the accounting treatment of server depreciation periods. As hardware Capex accounts for an increasing share of revenue, tech giants have a universal incentive to stretch server depreciation schedules to smooth out reported profits. Extending the assumed useful life from six to eight years can instantly turn a book loss into an "artificial profit." Investors should be wary of this phantom prosperity in Earnings Per Share (EPS) and focus instead on Free Cash Flow (FCF). If FCF continues to shrink while depreciation periods are being extended, it indicates the company is resorting to accounting magic to mask declining capital efficiency.
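A minimal sketch of the mechanism, assuming a hypothetical $60 billion server fleet depreciated straight-line with no salvage value (the fleet size is an assumption for illustration; the six-to-eight-year change is the one described above):

```python
# Hypothetical illustration of how stretching server depreciation flatters EPS.
# The $60B fleet cost is an assumption; the 6 -> 8 year change is the policy
# shift described in the text. Straight-line depreciation, no salvage value.
fleet_cost = 60e9

annual_dep_6yr = fleet_cost / 6   # ~$10.0B of depreciation expense per year
annual_dep_8yr = fleet_cost / 8   # ~$7.5B of depreciation expense per year

paper_profit_boost = annual_dep_6yr - annual_dep_8yr
print(f"Annual pre-tax profit boost: ${paper_profit_boost / 1e9:.1f}B")
# -> Annual pre-tax profit boost: $2.5B, with zero change in cash outflows
```

Because the cash went out the door when the servers were purchased, the change does not touch FCF at all, which is precisely why FCF is the cleaner signal here.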
The latest 13F filings from early 2026 show that top quantitative institutions, including Renaissance Technologies and Bridgewater Associates, strategically realigned their positions ahead of the earnings reports. With interest rates holding at 3.5%, institutional investors are shifting from "chasing stories" to "chasing certainty." The large-scale rotation out of Microsoft and into Amazon, which possesses a physical moat, reflects extreme market caution about the Return on Investment (ROI) of the AI infrastructure race.
In summary, the earnings reports from Google and Amazon this week will serve as the market's primary benchmark for judging whether AI investment has entered a "bubble burst" phase. Investors should not be swayed merely by first-order data such as revenue growth, but should dig into Capex efficiency, the exclusivity of power supply, and the robustness of accounting practices. In this capital-intensive game, the companies that control physical resources and vertically integrated infrastructure are the ultimate "house": the ones able to survive, and to set prices, in the face of massive bills.
This content was translated using AI and reviewed for clarity. It is for informational purposes only.