March 18, 2026
NVIDIA GTC highlights the ‘five-layer cake’ of AI
The conversation about artificial intelligence is shifting from compute and models to building blocks, layer cakes and stacks — all ways to describe the necessities for long-term AI infrastructure.
As Seagate Chief Commercial Officer B.S. Teh puts it, “AI and data-driven workloads have put infrastructure back in the spotlight.”
Why all the attention? Because “AI doesn’t just consume data — it compounds it,” he says. This accelerating data growth is “changing operating models where data isn’t just an asset but the foundation of intelligence, automation and competitive differentiation.”
According to Teh, these new models rely on five essential building blocks: Data storage, compute/memory, networking, infrastructure and power/cooling.
In early coverage of AI, data storage was barely a footnote. But with the explosion of generative and video AI, it’s forced itself into the headlines.
Modern hard drives in particular answer a big question on the mind of CIOs and IT leaders: “How can infrastructure scale so that data growth becomes a long-term asset — rather than a constraint?”
From another perspective, NVIDIA CEO Jensen Huang also stresses the importance of infrastructure. “AI is becoming the foundational infrastructure of the modern world,” he says. It runs on “real hardware, real energy and real economics.”
Huang sees AI as a cake or stack with five interconnected layers: Applications, models, infrastructure, chips and energy.
The infrastructure layer includes AI storage, which he thinks could become “the largest storage market in the world.”
“This conference is going to cover every single layer of the five-layer cake of artificial intelligence,” said Huang during his keynote at NVIDIA GTC 2026.
In his remarks, he also pointed to the foundational value of data.
Structured data — “giant spreadsheets” holding “all of life’s information” — is “the ground truth of AI” and enterprise computing.
Unstructured data — like PDFs and videos — is the “context of AI.” Growing by hundreds of zettabytes a year, it makes up the vast majority of the world’s data.
AI can make use of both, not just to train models but to wield tools, read files and do “productive work.”
“We are now at the beginning of a new platform shift,” he said. “The inference inflection has arrived.”
When it comes to the building blocks for inference and agentic AI, the “winning approach relies on multi-tier, permanent storage architectures,” adds Seagate Chief Systems Technologist Mohamad El-Batal.
A smart AI stack uses each tier for what it does most efficiently.
Enterprises should ask themselves: can they capture the value of new data, or will they be caught off guard?
Bloomberg and CNBC commentator Bob O’Donnell takes a deep look at the democratization of data analytics and what it means for data storage.
Hard drives and SSDs will join GPUs, CPUs, HBM and DRAM as essential components of AI applications.