Post by Jeff Dean (@JeffDean) on X
It's worth pointing out that we have been pushing on large-scale training and asynchronous techniques for the last ~14 years. Here's our NeurIPS 2012 paper where we demonstrated that this approach could be used to train very large neural networks (for the time: 30X larger than any previous neural network), and to spread the training out across thousands of machines in a fault-tolerant manner. PDF: https://static.googleusercontent.com/media/research.google.com/en//archive/large_deep_networks_nips2012.pdf (This paper doesn't get as much attention because we neglected to put it on arXiv at the time: oops!)
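For readers unfamiliar with the asynchronous approach the post refers to, below is a minimal single-process sketch of parameter-server-style asynchronous SGD, in the spirit of the Downpour SGD idea from that NeurIPS 2012 paper. The class and function names, the threading stand-in for separate machines, and the toy linear-regression objective are all illustrative assumptions, not the actual DistBelief implementation.

```python
# Sketch of asynchronous, parameter-server-style SGD (Downpour-SGD-like).
# Threads stand in for the thousands of machines in the real system;
# all names here are illustrative, not from the paper's codebase.
import threading
import numpy as np


class ParameterServer:
    """Holds the global parameters; workers fetch and push updates asynchronously."""

    def __init__(self, dim, lr=0.05):
        self.w = np.zeros(dim)
        self.lr = lr
        self.lock = threading.Lock()

    def fetch(self):
        # Return a snapshot of the current (possibly stale by the time it is used) parameters.
        with self.lock:
            return self.w.copy()

    def push(self, grad):
        # Apply a gradient as soon as it arrives, without waiting for other workers.
        with self.lock:
            self.w -= self.lr * grad


def worker(ps, X, y, steps):
    """Each worker trains on its own data shard and never synchronizes with the others."""
    rng = np.random.default_rng()
    for _ in range(steps):
        w = ps.fetch()
        i = rng.integers(len(X))
        pred = X[i] @ w
        grad = (pred - y[i]) * X[i]  # gradient of squared error for one example
        ps.push(grad)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = rng.normal(size=5)
    X = rng.normal(size=(1000, 5))
    y = X @ true_w

    ps = ParameterServer(dim=5)
    threads = [threading.Thread(target=worker, args=(ps, X, y, 2000)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print("learned:", np.round(ps.w, 2))
    print("true:   ", np.round(true_w, 2))
```

The key property the sketch illustrates is that workers apply updates against stale parameter snapshots and never block on one another, which is what lets this style of training tolerate slow or failed machines when scaled out.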
Jeff Dean (@JeffDean): Chief Scientist, Google DeepMind & Google Research. Gemini Lead. Opinions stated here are my own, not those of Google. TensorFlow, MapReduce, Bigtable, ...