Rowan Cheung(@rowancheung)

AI Deep Summary
  • Spot can now use its multi-view cameras to autonomously recognize occluded objects and compute gauge readings, reasoning about the physical world.
  • A built-in success-detection mechanism lets the robot judge when a task has failed and decide on its own whether to retry or move on, a step toward true autonomy.
  • Thousands of commercially deployed Spots continuously stream back real-world data, creating a hard-to-replicate AI iteration flywheel.
#Boston Dynamics #Google DeepMind #Embodied AI #Robotics


Boston Dynamics just gave its robot dog a brain that reasons about the physical world. Google DeepMind's Gemini Robotics model is now running inside Spot, the four-legged robot already deployed at thousands of facilities worldwide.

The upgrade centers on something called embodied reasoning. Basically, instead of following rigid instructions, Spot now interprets what it sees and makes decisions on its own.

One standout example: Spot can now look at a pressure gauge, zoom in on the needle, calculate the intervals between tick marks, and deliver an accurate reading. All autonomously.

The model also introduces:

> Multi-view reasoning, combining feeds from multiple cameras to understand its environment even when objects are blocked from one angle
> Built-in success detection, so Spot knows when it's failed and decides whether to retry or move on

That last part is a key requirement for true autonomy.

Most companies building AI-powered robots are still stuck in the demo phase. Boston Dynamics is one of the few with actual paying customers... which means thousands of Spots collecting real-world data that feeds back into making the AI smarter. That flywheel is hard to compete with.
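The gauge-reading example boils down to simple arithmetic once perception has done its job. A minimal sketch, assuming a vision model has already extracted the needle angle and the tick-mark positions (all names and numbers here are hypothetical, not from the actual system):

```python
# Hypothetical sketch of the gauge-reading arithmetic: given the needle's
# angle and the (angle, labeled value) pairs of the tick marks, the reading
# is a linear interpolation between the two nearest ticks.

def read_gauge(needle_deg: float, ticks: list[tuple[float, float]]) -> float:
    """ticks: (angle_in_degrees, labeled_value) pairs for the gauge face."""
    ticks = sorted(ticks)
    # Clamp to the gauge's range.
    if needle_deg <= ticks[0][0]:
        return ticks[0][1]
    if needle_deg >= ticks[-1][0]:
        return ticks[-1][1]
    # Find the bracketing tick marks and interpolate between their values.
    for (a0, v0), (a1, v1) in zip(ticks, ticks[1:]):
        if a0 <= needle_deg <= a1:
            frac = (needle_deg - a0) / (a1 - a0)
            return v0 + frac * (v1 - v0)

# Needle at 135° on a gauge with ticks every 45° from 0 to 100 PSI:
print(read_gauge(135.0, [(0, 0), (45, 25), (90, 50), (135, 75), (180, 100)]))
# → 75.0
```

The hard part, of course, is the perception step the sketch assumes away: locating the needle and ticks across zoom levels is what the embodied-reasoning model contributes.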
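The retry-or-move-on behavior enabled by success detection can be pictured as a bounded verification loop. A hedged sketch, with all function names invented for illustration (the real policy lives inside the Gemini Robotics model):

```python
# Hypothetical control loop implied by "built-in success detection":
# attempt a task, verify the outcome, and retry a bounded number of
# times before giving up and moving on to the next task.

def run_task(task, attempt_fn, check_success_fn, max_retries: int = 2) -> bool:
    """Execute, verify, and retry a task; return whether it succeeded."""
    for _ in range(1 + max_retries):
        attempt_fn(task)            # act on the world
        if check_success_fn(task):  # self-assess the result
            return True             # verified success: continue the mission
    return False                    # failed repeatedly: move on

# Usage: a flaky task that only succeeds on the third attempt.
calls = {"n": 0}
def attempt(task): calls["n"] += 1
def check(task): return calls["n"] >= 3
print(run_task("read the gauge", attempt, check, max_retries=2))  # → True
```

The point of the loop is that the robot, not a human operator, decides when a task is done, which is why the post calls this a key requirement for true autonomy.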
