Nvidia (NVDA.US) Significantly Raises AI Chip Revenue Forecast, Cumulative Sales Could Reach One Trillion Dollars by End of 2027
According to Cailian Press Finance APP, amid explosive growth in demand for artificial intelligence computing, global chip giant NVIDIA (NVDA.US) has once again raised its long-term revenue expectations for its AI chip business. At the annual GTC developer conference, CEO Jensen Huang said the company expects its Blackwell and Rubin generations of AI chips to generate at least $1 trillion in cumulative revenue by the end of 2027, underscoring the enormous market potential of the AI wave. As of Monday’s close, NVIDIA’s stock had risen 1.63% to $183.19.
Previously, NVIDIA had projected that these chips would generate approximately $500 billion in sales by the end of 2026; the latest forecast not only lifts the figure to the trillion-dollar level but also extends the time frame by one year. Huang told the conference that global computing demand has grown at an unprecedented pace over the past two years: “I believe computing demand has increased by 1 million times in the past two years. Not only do we feel this, almost all startups do too.”
At GTC, NVIDIA also announced several new products and technology updates intended to cement its lead in AI infrastructure. The company said it plans to integrate technology from AI chip startup Groq into its product ecosystem and unveiled the Groq 3 LPU (Language Processing Unit). This dedicated chip, designed mainly for inference in large language models, can significantly speed up how quickly AI systems generate text and respond to requests. NVIDIA plans to deploy it as a co-processor alongside its existing AI accelerators to boost overall system performance. The chips will be manufactured by South Korean electronics giant Samsung Electronics, with systems based on the technology expected to launch in the second half of this year.
Meanwhile, NVIDIA also showcased a new general-purpose CPU architecture called “Vera,” marking the company’s further expansion into the traditional data center processor market. Huang said that the CPU business “will definitely become a multi-billion-dollar market opportunity.” As AI data center architectures become increasingly complex, general-purpose CPUs responsible for coordinating different computing tasks are becoming more important.
NVIDIA said the Vera processor will combine strengths of its data-center, gaming-PC, and laptop processors: handling large volumes of data input simultaneously, completing complex calculations quickly, and consuming less power. The company plans to launch server systems built entirely from these CPUs, a new product category for NVIDIA. These machines can operate alongside other NVIDIA systems or run independently.
In recent years, NVIDIA has accelerated its technological updates, launching a new core architecture almost every year. The company’s next-generation flagship AI system is expected to be launched in the second half of 2026, named “Vera Rubin,” in honor of the renowned astronomer Vera Rubin, who provided evidence for the existence of dark matter.
Explosive demand for AI chips has propelled NVIDIA to become one of the world’s most valuable companies, with a current market capitalization of about $4.4 trillion. However, as investors pay closer attention to the payback period on AI investments, the stock’s rally has recently slowed. NVIDIA’s shares have fallen as much as 3.4% since the start of the year, though on GTC day they rose 1.6%, closing at $183.19.
Despite NVIDIA’s leading position in the AI chip market, competitive pressure is also increasing. Competitor AMD (AMD.US) is accelerating the launch of new AI accelerators, and tech giants including Meta (META.US) are developing self-designed chips to reduce reliance on NVIDIA products.
Meanwhile, as AI software matures, some companies are exploring lower-cost, lower-power CPUs to run already-trained AI models, opening new strategic room for NVIDIA to expand its CPU business. The company previously partnered with social media giant Meta (META.US), signaling that its processors may be sold as standalone products in the future.
Industry observers see NVIDIA gradually transforming from a graphics processor manufacturer into a comprehensive AI computing platform provider, with a product ecosystem that now spans processors, networking equipment, software platforms, and AI models. By continuously expanding its technology ecosystem and product lines, NVIDIA is working to build deeper competitive moats in AI infrastructure.