# First open-weight model from @poolsideai! Apache license, and available on Ollama to try. 👇👇👇

- Canonical URL: https://www.traeai.com/articles/35e9bc39-9130-45e8-9c14-3469d943d507
- Original source: https://x.com/ollama/status/2049184817603031463
- Source name: ollama (@ollama)
- Content type: tweet
- Language: Chinese
- Score: 7.2
- Reading time: 1 minute
- Published: 2026-04-28T17:51:47+00:00
- Tags: AI, open-weight, MoE, Ollama, Poolside AI

## Summary

Poolside AI has released its first open-weight model, Laguna XS.2, a Mixture-of-Experts (MoE) model with 33B total parameters and 3B active per step, under the Apache 2.0 license. It runs on a single GPU and is optimized for agentic coding and long-horizon tasks.

## Key Takeaways

- Laguna XS.2 is Poolside AI's first open-weight model, built on an MoE architecture
- The model has 33B total parameters but activates only 3B per step, balancing capability with inference efficiency
- Trained entirely on Poolside's in-house stack; supports single-GPU deployment and is available on Ollama

## Outline

- Announcement: the official Ollama account announced the release of Poolside AI's first open-weight model.
- Core specifications: Laguna XS.2 is a 33B-total / 3B-active MoE model suited to single-GPU inference.
- License and availability: released under Apache 2.0; listed on the Ollama platform with downloadable weights.
- Positioning: focused on agentic coding and long-horizon tasks, with an emphasis on practical engineering deployment.

## Highlights

- > Laguna XS.2 is Poolside's first open-weight model — a 33B total / 3B active MoE model built for agentic coding and long-horizon tasks. — Quote from @poolsideai
- > Trained fully in-house on our own stack. Runs on a single GPU. Released under Apache 2.0. — Quote from @poolsideai
- > First open-weight model from @poolsideai! Apache license, and available on Ollama to try. — Ollama post

## Citation Guidance

When citing this item, prefer the canonical traeai article URL for the AI-readable summary, and include the original source URL when discussing the underlying source material.
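Since the model is listed on Ollama, trying it locally should follow the standard Ollama CLI flow. A minimal sketch, assuming the model is published under the tag `laguna-xs.2` (a guess; the source does not state the exact tag, so check the Ollama library for the published name):

```shell
# Download the model weights from the Ollama registry.
# NOTE: "laguna-xs.2" is a hypothetical tag; verify the actual
# model name on the Ollama library page before running.
ollama pull laguna-xs.2

# Send a one-off prompt (an agentic-coding style task).
ollama run laguna-xs.2 "Write a shell script that counts lines in every .go file."

# List locally installed models to confirm the pull succeeded.
ollama list
```

Because only 3B of the 33B parameters are active per step, inference memory and compute stay close to a small dense model, which is what makes single-GPU deployment practical.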