I was agog, for example, to read that Travis Kalanick recently claimed that he’s come “close to some interesting breakthroughs” in quantum physics just by conversing with Grok–truly the definition of a man who doesn’t know what he doesn’t know:

“I’ll go down this thread with [Chat]GPT or Grok and I’ll start to get to the edge of what’s known in quantum physics and then I’m doing the equivalent of vibe coding, except it’s vibe physics,” Kalanick explained. “And we’re approaching what’s known. And I’m trying to poke and see if there’s breakthroughs to be had. And I’ve gotten pretty damn close to some interesting breakthroughs just doing that.”

What’s strange here is that epistemic humility is in some sense the single most important skill necessary to make good use of these chatbots–an awareness not just of their limitations, but also of your own.

Please remember that [at least current] LLMs are sycophantic, so you really need to test your own assumptions and seek proof in your interactions. Agreement is easy to get, but proof is also easy to get, if you try! You have to try, though, and understand what proof would suffice.

This is the dangerous flip side of high agency and curiosity, and you guard against it by turning that same agency and curiosity on yourself: question your own position, and question the sycophant's initial response.
