What this work shows is that if you start with the right raw material (data) in the right distribution (here, a heterogeneous one spanning a bunch of robots) and train a high-capacity neural net on it, you get out something greater than the sum of its parts: a model with surprisingly good out-of-distribution generalization, as if some critical reaction had occurred from your combo of data + architecture + complexity.
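For concreteness, here is a minimal, hypothetical sketch of the recipe described above: pool datasets from several robot embodiments with different state and action dimensions into one training distribution (padding to a shared width is just one naive alignment choice), then fit a single high-capacity network on the mixture. Everything here, the shapes, the padding scheme, the synthetic data, is an assumption for illustration, not the method of the work being discussed.

```python
# Minimal sketch (not the original work's method): pool heterogeneous
# "robot" datasets with different state/action dimensions into one
# training set by padding to a shared width, then fit a single
# high-capacity network on the mixture. All shapes/data are synthetic.
import torch
import torch.nn as nn

MAX_DIM = 16  # shared width; each robot's dims are padded up to this


def make_robot_dataset(state_dim, action_dim, n=512):
    """Synthetic (state, action) pairs for one robot embodiment, padded."""
    states = torch.randn(n, state_dim)
    actions = torch.tanh(states @ torch.randn(state_dim, action_dim))
    pad = lambda x: nn.functional.pad(x, (0, MAX_DIM - x.shape[1]))
    return pad(states), pad(actions)


# Heterogeneous mixture: three embodiments with different dimensions.
datasets = [make_robot_dataset(s, a) for s, a in [(4, 2), (7, 3), (12, 6)]]
states = torch.cat([s for s, _ in datasets])
actions = torch.cat([a for _, a in datasets])

# One high-capacity policy trained on the pooled distribution.
policy = nn.Sequential(
    nn.Linear(MAX_DIM, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, MAX_DIM),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(policy(states), actions)
    loss.backward()
    opt.step()
```

The interesting claim is precisely that this kind of naive pooling should not work this well, and yet the resulting model generalizes beyond any single robot's distribution.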
Sometimes I think that developing AI is more like a chemical process than a machining one.
Josh Beckman