The vast majority of intelligence supply in the future will be consumed by use cases we can’t foresee yet. It won’t be the same intellectual work we do today done a billion times over, or done a billion times faster, but something structurally different.
A very useful analogy when thinking about where to use LLMs/AI in products: where can you guide it to erode something useful, and where can you dam it up to store potential energy?