In machine learning, double descent is a surprising phenomenon where increasing the number of model parameters causes test performance to get better, then worse, and then better again. It contradicts the classical picture of overfitting, in which a model with too many parameters sees its test error keep getting worse as parameters are added. For a surprisingly wide range of models and datasets, you can just keep adding parameters after you’ve gotten over the hump, and performance starts improving again.
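A minimal sketch of the effect (not from the original note): fit ridgeless random-feature regression models of increasing width with NumPy and watch the test error. The data, noise level, and feature counts below are illustrative assumptions; the error typically drops, spikes near the interpolation threshold (features ≈ training points), then falls again as the model grows further.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples from a linear target function.
n_train, n_test, d = 40, 200, 5
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
w_true = rng.normal(size=d)
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
y_test = X_test @ w_true + 0.5 * rng.normal(size=n_test)

def test_error(n_features):
    """Fit a random-feature least-squares model with n_features features
    and return mean squared error on held-out data."""
    # Random ReLU features; n_features plays the role of "number of parameters".
    W = rng.normal(size=(d, n_features))
    phi_train = np.maximum(X_train @ W, 0)
    phi_test = np.maximum(X_test @ W, 0)
    # Minimum-norm least-squares fit; pinv handles the over-parameterized case.
    beta = np.linalg.pinv(phi_train) @ y_train
    return np.mean((phi_test @ beta - y_test) ** 2)

# Sweep model size past the interpolation threshold (n_features == n_train).
for p in [5, 10, 20, 40, 80, 160, 320, 640]:
    print(f"{p:4d} features: test MSE = {test_error(p):.3f}")
```

The exact shape of the curve depends on the noise level and the random seed, but the hump near 40 features is the classical overfitting regime, and the second descent is what the note describes.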
