Note on Nathan Labenz on AI Pricing via Tyler Cowen
pruning for sparsity – it turns out some LLMs perform just as well with 60% of their weights set to zero (though this likely doesn't hold for Chinchilla-optimal training, where parameters are more fully utilized and less redundant)
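The note doesn't say which pruning method is meant; as a minimal sketch, here is simple unstructured magnitude pruning in PyTorch, zeroing the smallest 60% of a weight matrix by absolute value. The function name `magnitude_prune` and the layer size are illustrative, not from the source.

```python
import torch

def magnitude_prune(weight: torch.Tensor, sparsity: float = 0.6) -> torch.Tensor:
    """Zero out the smallest-magnitude fraction of entries (unstructured pruning)."""
    k = int(weight.numel() * sparsity)
    if k == 0:
        return weight
    # The k-th smallest absolute value serves as the pruning threshold.
    threshold = weight.abs().flatten().kthvalue(k).values
    # Keep only weights strictly above the threshold; the rest become zero.
    return weight * (weight.abs() > threshold)

# Illustrative use: prune one linear layer to ~60% sparsity.
layer = torch.nn.Linear(1024, 1024)
with torch.no_grad():
    layer.weight.copy_(magnitude_prune(layer.weight, sparsity=0.6))
print(f"sparsity: {(layer.weight == 0).float().mean().item():.2%}")
```

Production pruning schemes are more sophisticated than this, but the sketch shows the basic idea: most of the accuracy survives because many weights are near zero to begin with.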
Reference
- Type: Notes
- Tags: llm, optimization
- Source: Nathan Labenz on AI Pricing
- Permalink: 2023.NTE.031