Note on Nathan Labenz on AI Pricing via Tyler Cowen
pruning for sparsity – it turns out some LLMs work just as well if you set 60% of the weights to zero (though this likely isn't true if the model was trained Chinchilla-optimally)
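The note doesn't name a specific pruning method, but the simplest version of the idea is unstructured magnitude pruning: zero out the smallest-magnitude weights until the target sparsity is reached. A minimal PyTorch sketch (the function name and the 60% target are illustrative, not from the source):

```python
import torch

def magnitude_prune(weight: torch.Tensor, sparsity: float = 0.6) -> torch.Tensor:
    """Zero out the smallest-magnitude entries so that roughly
    `sparsity` fraction of the weights are exactly zero."""
    k = int(weight.numel() * sparsity)
    if k == 0:
        return weight.clone()
    # The k-th smallest absolute value becomes the pruning threshold.
    threshold = weight.abs().flatten().kthvalue(k).values
    # Keep only weights strictly above the threshold; the rest go to zero.
    return weight * (weight.abs() > threshold)

w = torch.randn(1024, 1024)
w_pruned = magnitude_prune(w, sparsity=0.6)
print((w_pruned == 0).float().mean().item())  # ~0.6
```

The Chinchilla caveat follows from the same intuition: an under-trained, overparameterized model has redundant weights to spare, while a Chinchilla-optimal model packs more information per parameter and so tolerates less pruning.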
Reference
- Permalink (2023.NTE.031)
- In Notes
- Tagged llm, optimization
- From Nathan Labenz on AI Pricing