Nvidia CEO: DeepSeek has driven an explosive rise in AI computing demand.
On May 21, NVIDIA CEO Jensen Huang said that the open-source release of DeepSeek R1 has driven the widespread adoption of reasoning AI and accelerated demand for computing. Models such as OpenAI’s o3, DeepSeek R1, and xAI’s Grok 3 show how AI is moving from perception and generative models to reasoning models that think for longer and require significantly more computing power. These models can solve complex problems, make strategic decisions, and apply logical reasoning, but each task can require 100 times more computation than traditional AI inference. “The scale of post-training and model customization is huge, and the compute required can be orders of magnitude greater than pre-training,” Huang said. “Our inference demand is accelerating, driven by test-time scaling and new reasoning models.”