# Our CEO @jerryjliu0 in @VentureBeat, on what's actually changing in the LLM stack: "We've really ..."

Canonical URL: https://www.traeai.com/articles/d2f2e654-95e7-4c00-9a95-f46f60600d59
Original source: https://x.com/llama_index/status/2050322902923272337
Source name: LlamaIndex 🦙 (@llama_index)
Content type: tweet
Language: Chinese
Score: 7.8
Reading time: 2 minutes
Published: 2026-05-01T21:14:07+00:00
Tags: LLM, RAG, Data Layer, LlamaIndex, AI Architecture

## Summary

The LlamaIndex CEO argues that the core shift in the LLM stack is happening at the data layer: critical enterprise context is still locked in PDFs, contracts, and similar documents; framework abstractions have become a burden; and the real moat is the ability to efficiently extract and supply high-quality context.

## Key Takeaways

- The success of LLM applications depends on context quality, not on the choice of underlying model.
- Unstructured data such as PDFs, contracts, and regulatory filings is the most valuable yet largely untapped source of enterprise context.
- The framework abstractions popular in 2023 are rapidly becoming obsolete; lightweight infrastructure focused on data connectivity is the new priority.

## Outline

- Introduction: a paradigm shift in the LLM stack — the center of gravity in LLM development is moving from the model layer to the data and context-supply layer.
- Core argument: context is the moat — differences in enterprise AI capability stem from context quality, not from the models themselves.
- The data problem: value locked in format containers — PDFs, contracts, and regulatory filings hold high-value context but have long been siloed by their formats.
- Frameworks recede, the data layer rises — the abstraction layers popular in 2023 have become redundant; lightweight, pluggable data-ingestion layers are now critical infrastructure.

## Highlights

- > Ultimately, whether you use OpenAI Codex or Claude Code doesn't really matter. The thing that they all need is context. — quoted from the original
- > The framework abstractions that saved developers months in 2023 are dead weight now. — quoted from the original
- > What survives is the data layer because agents are only as good as the context they get. — quoted from the original

## Citation Guidance

When citing this item, prefer the canonical traeai article URL for the AI-readable summary, and include the original source URL when discussing the underlying source material.