Taranjeet on X
Conversation

Taranjeet 
It is interesting that the headline here is not just what the model can do, but how many tokens it takes to do it. For a while, model capability mostly meant: can the model handle the task at all? Progress looked like bigger models, bigger context windows, more tokens, and more
Quote

Sam Altman (@sama) · 12h
Replying to @sama
It is smart and fast; per-token speed matches 5.4 and it uses significantly fewer tokens per task. In my experience, it "gets what to do". Rolling out today in ChatGPT and Codex. We are working with API customers on security and safeguards and plan to launch in API very soon.
Relevant people
- Taranjeet (@taranjeetio): Founder & CEO @mem0ai - The Memory Layer for AI agents. Open Source: https://github.com/mem0ai/mem0
- Sam Altman (@sama): AI is cool i guess