AI Earnings Report Showdown Night: $650 Billion Invested in AGI
On April 29, 2026, Microsoft, Google, Meta, and Amazon released their first-quarter earnings reports on the same day. The capital expenditure guidance from these four companies alone approaches $650 billion, a scale comparable to Sweden's entire annual GDP.
In other words, the four wealthiest tech giants in the world are preparing to spend the equivalent of a moderately developed country's annual economic output on their tickets to the AGI era.
Everyone's eyes are now fixed on that ticket. But on this night, dubbed the "decisive night" for global AI assets, if we shift our focus away from the grand narratives and into the overlooked corners, we find an underground battle over physical constraints, capital anxiety, and industry restructuring that has already passed the point of no return.
How can a company without earnings reports crash the US stock market?
The true controllers of market sentiment are not necessarily the companies with the most profitable books, but those regarded by everyone as symbols of faith.
April 29 was originally the most important day of the US earnings season. But before the listed companies finished their reports, the market experienced an unanticipated stampede. Data from Goldman Sachs shows that this was the second-worst trading day for AI assets this year.
The trigger was not a company's earnings miss, but a report in The Wall Street Journal the day before stating that OpenAI had missed its 2025 revenue targets and that its goal of 1 billion weekly active users remained far off. What stung the market even more was the mention that OpenAI CFO Sarah Friar had warned internally that if revenue growth continued to underperform, the company might struggle to support its massive $600 billion compute procurement commitments.
A company that is not publicly listed and releases no financial reports, on the strength of a single news report, sent Oracle down 4%, CoreWeave down 5.8%, and even SoftBank, across the Pacific, plunging 12% in over-the-counter trading.
When a $600 billion compute commitment collides with unfulfilled revenue growth, the market suddenly realizes that the most dangerous aspect of the AI narrative is not that no one believes in the future, but that the future is just too expensive.
Over the past two years, OpenAI has become a religion in Silicon Valley.
Graphics card procurement, data center construction, cloud provider expansion, startup valuations—many seemingly scattered decisions are fundamentally based on the same judgment: model capabilities will continue to leap, user scale will keep expanding, and AGI will eventually turn all today’s costly investments into future tickets.
What makes this logic most powerful is its self-reinforcing nature. The more believers there are, the higher the valuation; the higher the valuation, the less anyone dares to disbelieve.
But around April 29, for the first time, the market seriously questioned the cash flow of this belief. Even OpenAI had to face customer acquisition costs, user retention, revenue growth, and compute bills.
Printing Money and Cooling Water
The most fascinating aspect of the internet era is that growth appears nearly limitless.
Write a piece of code, copy it to ten million users, and the marginal cost is spread so thin that it’s almost negligible. Over the past twenty years, Silicon Valley has dared to overturn traditional industries with “burning money for growth,” relying on this belief: as long as network effects are strong enough, scale will swallow costs.
But in the AI era, the digital printing press is being tightly choked by the cooling water pipes of the physical world.
On the April 29 earnings call, despite cloud revenue growing an astonishing 63% (surpassing $20 billion in a quarter for the first time), Google CEO Sundar Pichai's tone betrayed helplessness: "If we could meet demand, cloud revenue could have been even higher."
Behind this statement lies the most peculiar business dilemma of the AI era: demand far exceeds supply, but growth is ruthlessly limited by the physical world.
Google holds a backlog of cloud orders worth up to $462 billion, nearly doubling quarter-over-quarter. AI solution products grew nearly 800% year-over-year, Gemini Enterprise paid users increased 40% quarter-over-quarter, and API token usage soared from 10 billion per minute to 16 billion.
These numbers would be celebrated as growth for any internet company. But in Pichai’s words, we hear a new kind of dilemma emerging in the AI age: customers are queuing, money is on the way, but servers are not yet built, power is not yet connected, and advanced chips are not yet manufactured in wafer fabs.
It’s not that there’s no demand; it’s that demand is so overwhelming that it pulls growth back into the physical realm.
Microsoft faces the same dilemma. Azure’s growth hit 40%, and AI annualized revenue surpassed $37 billion—up from just $13 billion in January 2025, nearly tripling in 15 months.
However, Microsoft’s capital expenditure dropped quarter-over-quarter to $31.9 billion, down nearly $6 billion from the previous quarter’s $37.5 billion. The company explained this as “infrastructure build timing.” The implication is that money can be allocated today, but data centers won’t be built overnight; GPUs can be ordered, but power, land, cooling systems, and construction cycles cannot be hastened by capital markets.
While everyone thought we were rushing toward a virtual world, the ultimate determinants of victory remain the oldest assets—heavy physical assets and physical laws.
Compute power is becoming a new form of “land resource”: limited in the short term, slow to build, location-dependent, and first-come, first-served. In this land grab, the four giants are willing to push capital expenditure to the $650 billion level not because they have all calculated the returns, but because they fear that if they don’t hoard this “land,” they might not even get a seat at the table tomorrow.
The Art of Burning Money
After hours on April 29, despite both companies beating expectations and both raising capital expenditure guidance again, Google's stock rose 7% while Meta's plummeted 7%.
To be fair, Meta delivered an impressive report: revenue of $56.31 billion, up 33% year-over-year, its fastest growth since 2021, and EPS of $10.44, far exceeding Wall Street expectations.
But Zuckerberg broke a taboo: Meta raised its 2026 capital expenditure guidance to between $125 billion and $145 billion. The better the results, the more nervous the market became, because what investors truly worry about is not whether Meta is profitable now, but whether it will drain the cash generated by its advertising business to fund a high-stakes AI gamble with no clear path to payback.
The market's punishment is ruthless, and the difference between the two fates lies in how granular each company's path to monetization is.
Google, Amazon, and Microsoft’s AI investments can at least be incorporated into relatively clear accounting books.
Google has a backlog of $462 billion in cloud orders, Amazon has AI annualized revenue from AWS, and Microsoft has Copilot paid users and high RPO. Every dollar spent may not pay off immediately, but Wall Street at least knows roughly where the money will come from: enterprise clients, cloud contracts, software subscriptions, compute leasing.
This is why capital markets are willing to keep listening to their stories. The story can go far, but the cash flow path cannot be entirely invisible.
Meta’s problem is that it doesn’t have a cloud business to sell externally.
The hundreds of billions it poured in will ultimately be realized through a more convoluted path: Meta AI assistants to increase user stickiness, improved recommendation algorithms to boost ad conversions, AI-generated content to extend user engagement, and future hardware like smart glasses to become new entry points.
This logic is not invalid; the chain is just too long. Cloud providers burn money by putting GPUs behind orders that are already signed; Meta burns money by putting GPUs behind an unproven model of advertising efficiency. The former can be discounted into a valuation; the latter must first be believed. And Wall Street has little patience for belief.
And in capital markets, patience is a luxury. Especially when capital expenditure hits the trillion-dollar level, investors are willing to pay for the future but won’t pay indefinitely for ambiguity.
Even more worrying is the time lag.
Amazon CEO Andy Jassy admitted on the call that most of the investments made in 2026 won’t generate returns until 2027 or even 2028.
This means the giants are pushing today's cash flow toward capacity that will only be realized two years from now, across a chain of data center construction, chip supply, power access, customer demand, and model iteration. A deviation in any link can trigger a revaluation by the capital markets.
The most dangerous aspect of the AI arms race is here: money is spent today, stories are told today, but the answers won’t be revealed until two years later.
Blurring Industry Boundaries
AI has not, as many expected two years ago, rapidly pushed search off the table.
When ChatGPT first appeared, the market believed that search ads would be directly replaced by answers, and companies like Perplexity were thus highly anticipated. But in Google’s April 29 earnings report, search query volume hit a record high, with ad revenue reaching $77.25 billion, up 15% year-over-year.
This is more like the "Jevons Paradox" of the AI era. In 1865, the British economist William Stanley Jevons discovered that improvements in steam engine efficiency did not reduce coal consumption; instead, consumption rose significantly, because efficiency made steam engines affordable to more people and fueled overall demand. Similarly, AI has made asking questions cheaper and easier, and users simply ask more of them.
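Jevons's observation has a simple arithmetic core that can be sketched in a few lines of Python. The numbers and the constant-elasticity demand curve below are illustrative assumptions, not figures from the article; the point is only that when the price elasticity of demand exceeds 1, doubling efficiency raises total consumption instead of cutting it.

```python
# Illustrative sketch of the Jevons Paradox (numbers are made up, not from
# the article): better efficiency lowers the cost per unit of service, and
# if demand is elastic enough, total resource consumption goes UP.

def total_consumption(efficiency, base_demand=100.0, elasticity=1.5):
    """Resource consumed to serve demand under a constant-elasticity
    demand curve: cheaper service -> disproportionately more demand."""
    cost_per_unit = 1.0 / efficiency                       # efficiency halves the cost...
    demand = base_demand * cost_per_unit ** (-elasticity)  # ...demand more than doubles
    return demand / efficiency                             # resource needed to serve it

before = total_consumption(efficiency=1.0)  # baseline engine (or AI model)
after = total_consumption(efficiency=2.0)   # twice as efficient

print(f"before: {before:.1f}, after: {after:.1f}")  # consumption rises ~41%
```

With the elasticity set below 1, the paradox disappears and consumption falls with efficiency; the effect hinges entirely on how hungry the market is for cheaper compute.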
This is also why Google finds it easier than Meta to convince the market. It has both the cash flow from its old entry point and a new ledger in its cloud business; it can profit from advertising and from enterprise compute demand. AI has not dismantled its city walls; so far, it has added a layer of reinforcement.
Similar boundary reconfigurations are happening in the chip industry. On the same day, Qualcomm, the king of mobile chips, reported revenue of $10.6 billion. During the earnings call, CEO Cristiano Amon announced a major decision: Qualcomm is officially entering the data center market, collaborating with a leading large-scale cloud provider on custom chips, expected to start shipping later this year.
Qualcomm’s main battlefield has always been mobile devices. But as AI’s computational load begins redistributing between the cloud and the edge, it must redefine its position.
If future AI is dominated entirely by large cloud models, the value of mobile chips will be compressed; if edge AI becomes standard, Qualcomm must prove it can do more than phones: inference at the edge, terminals, and low-power data centers.
Its move into data centers is more defensive than offensive.
When AI shifts from a “luxury in the cloud” to a “standard on the edge,” all industry boundaries start to blur. Mobile chip companies try to enter data centers, cloud providers begin developing their own chips, and chip companies explore models. Qualcomm’s “defection” is just the tip of this major restructuring.
The Same Gold Rush, Two Valuation Languages
The same AI gold rush has entered a harsh falsification phase in the US stock market, where claimed value must survive testing. Even leading makers of semiconductor process-control and inspection equipment are revalued the moment they reveal geopolitical or tariff risk. After hours on April 29, KLA Corporation reported revenue that exceeded expectations, with non-GAAP EPS of $9.40 against an expected $9.16.
Yet, its stock price fell sharply by 8% afterward.
The reason isn’t poor performance but market concerns over tariffs and China exposure.
KLA’s customer list includes many Chinese wafer fabs. Against the backdrop of US-China tech decoupling, this “China exposure” is like the sword of Damocles hanging overhead. No matter how stellar the results, it cannot offset the market’s instinctive fear of geopolitical risks.
In A-shares, a different language is used.
Here, performance still matters, but often, performance is just fuel; the real ignition is the narrative—whether you hold the ticket called “domestic substitution.”
On the evening of April 29, Cambricon released a remarkable quarterly report: revenue up 159.56% year-over-year, breaking the 2 billion yuan mark in a single quarter for the first time, and net profit up 185.04%. The next day, Cambricon's stock soared, its market cap surpassing 670 billion yuan, a record high, up over 62% since the start of the year.
On the same day, Mu Xi reported revenue of 562 million yuan, up 75% year-over-year, with losses narrowing sharply from 233 million yuan a year earlier to 98.84 million yuan. This was the first quarterly report from the GPU maker, which listed only in December 2025.
Both are involved in AI infrastructure, yet the US and A-shares give completely different valuation responses.
KLA faces a complex global supply chain—performance, orders, tariffs, China exposure, export controls—all factors that can enter valuation models.
Cambricon and Mu Xi face a different narrative environment: as external restrictions tighten, the strategic value of domestic computing power is amplified. In the US, risk is discounted; in A-shares, scarcity is priced in.
Smart Money Exits
But just as the market cheers for Cambricon, one detail stands out.
By the end of 2025, star investor Zhang Jianping still held 6.8149 million shares of Cambricon, worth about 9.2 billion yuan, making him the second-largest individual shareholder. By this quarterly report, he had quietly dropped out of the top ten shareholders.
Roughly estimating from the quarter's stock price range, the divestment likely ran into the billions of yuan. The exact price is unknown, but it is clear that before the blowout results and the record-high stock price, one of the earliest beneficiaries of this narrative chose to cash out.
There are always two types of people in the market: those who buy into the narrative, and those who price it.
Zhang Jianping clearly belongs to the latter. He entered Cambricon before it became a household name, and once it was written into the grand story of the "domestic computing power leader," he turned and left.
On this $650 billion earnings night, Silicon Valley giants are anxious over compute shortages, Wall Street analysts are agonizing over the time lag in realization, and A-shares are busy re-pricing domestic computing power.
In this same AI gold rush, each market speaks its own language. The US talks about return cycles, A-shares about domestic substitution; cloud providers discuss order backlogs, Meta talks about ad efficiency; OpenAI doesn’t release earnings but still influences the entire compute chain.
Everyone is convinced they hold the ticket to the AGI era. But no one knows when this show will end, or where the exit is. The ticket to the AI era is expensive, but more costly than the ticket itself is knowing when to leave.