Nvidia Corp. (NASDAQ: NVDA) hosted GTC (GPU Technology Conference) 2025, the company’s major AI conference for developers, in mid-March. Here are some of our key observations post-GTC.
Q1 revenue estimates move higher
Since last quarter’s earnings release, estimates for the B-series chips have moved around 15% higher for Q1 2026, driven by the Blackwell outlook provided in both the earnings release and CEO Jensen Huang’s comments at GTC. Huang highlighted that demand for Blackwell continues to be very strong. This optimism around the B-series chips is captured in Data Center revenues, which have increased in tandem with the Q1 B-series expectations. However, expectations for Q2 have come down, as uncertainty remains around the macro environment and the actual cadence of the Blackwell ramp for the rest of FY 2026. In addition, even though Q1 revenue estimates have been moving higher, gross profit has remained flat on the back of the muted margin guidance the company gave for the first half. Beyond Q1 Data Center revenues, questions remain about the trajectory of B-series revenue growth and Data Center margins.
From January 1, 2023 to March 26, 2025, aggregated Visible Alpha consensus expectations for CY 2025 – CY 2026 Data Center revenues were revised up a massive $370 billion. Despite the significant upward revisions from January 2023 to January 2024, numbers continued to move higher through CY 2024 and Q1 2025. Questions remain about whether the Data Center business segment will continue to see further upward revisions.
According to Huang, Cloud Service Providers (CSPs) continue to drive significant demand for Blackwell, because it supports faster inference speeds and model training while using less energy. He also pointed out that demand is coming from sources beyond the CSPs, which may imply that the total addressable market (TAM) is increasing. While Data Center revenue expectations have continued to increase significantly, it has been challenging for investors to grasp the magnitude of the TAM. By last year's GTC, Data Center revenue estimates had already been taken up $250 billion in aggregate; they have since moved up an additional $120 billion for CY 2025 and CY 2026. Given the strength of demand for Blackwell, questions remain around the optimal amount of capacity and spend for innovation to happen at scale with AI, especially given how much DeepSeek has accomplished at a much lower cost.
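The revision figures above reconcile cleanly; a minimal sketch using only the numbers cited in this article:

```python
# Reconciling the Data Center revenue revision figures cited above.
# All figures are aggregated CY 2025 - CY 2026 consensus revisions, in USD billions.
revised_by_gtc_2024 = 250   # revisions already in place by last year's GTC
revised_since = 120         # additional revisions for CY 2025 and CY 2026

total_revision = revised_by_gtc_2024 + revised_since
print(total_revision)  # 370 -> matches the ~$370 billion total revision cited
```

In other words, roughly two-thirds of the cumulative upward revision was already in place a year ago, with the remaining ~$120 billion added since.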
Nvidia Blackwell demand from CSPs
While there has been significant innovation at the chip and model layers, there has not been much at the application layer. The more tokens generated, the smarter the AI, but these tokens can be very expensive. Ultimately, high costs have been a barrier to monetizing end-users and an obstacle to inferencing at scale. Reducing those costs could be the gateway to new business models emerging and, finally, to ROI and revenue generation. In his GTC presentation, Huang addressed these challenges to the business case. According to Huang, “Blackwell is a giant leap in Inference Performance.” Will Blackwell’s performance improvements remove some of the barriers to innovation and monetization and serve as a catalyst for AI applications and new business models to emerge?
Inferencing at-scale
AI at the Edge
This year, Huang emphasized the importance of Edge computing solutions, which can enable real-time data processing and decision-making because computation happens closer to the actual source of the data. Autonomous vehicles and AI-enhanced laptops and smartphones represent potential growth areas as Edge AI computing continues to expand.
CEO Jensen Huang highlighting the Edge
Focus on autonomous driving
According to Huang, accelerated computing and Generative AI are going to move the world closer to autonomous driving. He emphasized larger models and better performance. With tariffs on foreign autos looming, US automakers may have an opportunity to capture significant share in autonomous vehicles, as tariffs could hand domestic production a pricing advantage.
The move to accelerated computing is laying the groundwork for the auto industry to scale innovative manufacturing solutions. Omniverse Digital Twins will be able to simulate real-world experiences and solutions, helping automotive manufacturers reduce risk and cost while improving efficiency and creativity. The Omniverse enables everything to be manufactured digitally first.
CEO Jensen Huang highlights Nvidia Halos
Longer-term, going higher?
For FY 2026, the range of estimates for the B-series remains substantial, implying significant debate about Nvidia’s growth outlook and whether the company will deliver on its AI revolution dream. Based on Visible Alpha consensus, the B-series, which includes the B100, B200, and B300 GPUs built on the Blackwell architecture, is expected to generate revenues of $66.9 billion in FY 2026. The top-end estimate is currently $128.0 billion, while the low-end estimate is $35.0 billion. Clarity around B-series revenue growth will likely be a key driver of the valuation this year and next.
Non-GAAP diluted consensus EPS for FY 2026 is now projected at $4.60/share, with estimates ranging from $4.22/share to $5.83/share; the consensus P/E stands at 24x, ranging from 26x (on the low-end EPS estimate) to 19x (on the high-end). NVDA stock is down over 17% since the late-February earnings release, due to macro uncertainty around tariffs and concerns that capex spend and expectations may be getting stretched. Will the B-series drive the Data Center business to beat expectations in FY 2026 and FY 2027 and continue to drive upside in the stock, or have expectations gone too high?
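As a back-of-the-envelope check on the valuation figures above (the implied share price is derived from the article's own EPS and P/E numbers, not a quoted market price):

```python
# Sanity-checking the consensus valuation figures cited above.
consensus_eps = 4.60   # FY 2026 non-GAAP diluted EPS, consensus
consensus_pe = 24      # consensus forward P/E

# Implied share price from consensus EPS x consensus multiple.
implied_price = consensus_eps * consensus_pe
print(round(implied_price, 1))  # ~110.4

# The P/E range moves inversely with the EPS range at a fixed price:
# the low-end EPS estimate pairs with the high-end multiple, and vice versa.
low_eps, high_eps = 4.22, 5.83
print(round(implied_price / low_eps, 1))   # ~26.2x, matching the high-end 26x
print(round(implied_price / high_eps, 1))  # ~18.9x, near the low-end 19x
```

This confirms the quoted multiples are internally consistent: the 26x and 19x endpoints correspond to holding the implied price fixed while swinging EPS across its estimate range.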