Scaling Laws
Scaling laws are empirical formulae that relate two or more measurable quantities of a system. In deep learning, “scaling law” most commonly refers to the relationship between the resources we put into a training run — the amount of training data, the amount of compute, and the model’s capacity — and the generalisation error we get on some held-out test data.
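To make this concrete, here is a minimal sketch of what fitting such a relationship might look like. It assumes a power-law form commonly reported in the scaling-law literature, L(N) = a·N^(−α) + L∞, where N stands for a resource such as model size or dataset size; the function name `scaling_law`, the parameter names, and all the numbers are illustrative, not taken from any real training run.

```python
# A minimal sketch: fit an assumed power-law scaling curve
# L(N) = a * N**(-alpha) + L_inf to synthetic loss measurements,
# then extrapolate it to a larger scale.
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(N, a, alpha, L_inf):
    """Assumed form: held-out loss decays with resource N toward a floor L_inf."""
    return a * N ** (-alpha) + L_inf

# Synthetic "measurements": held-out loss observed at several resource scales.
N = np.array([1e6, 3e6, 1e7, 3e7, 1e8, 3e8])
rng = np.random.default_rng(0)
loss = scaling_law(N, a=50.0, alpha=0.3, L_inf=1.8) + rng.normal(0, 0.01, N.size)

# Fit the three free parameters (a, alpha, L_inf) to the measured points.
(a_hat, alpha_hat, L_inf_hat), _ = curve_fit(
    scaling_law, N, loss, p0=[10.0, 0.5, 1.0], maxfev=10000
)
print(f"a={a_hat:.2f}, alpha={alpha_hat:.3f}, irreducible loss={L_inf_hat:.2f}")

# Extrapolate: predicted loss at a scale 10x larger than anything we measured.
print("predicted loss at N=3e9:", scaling_law(3e9, a_hat, alpha_hat, L_inf_hat))
```

The point of a fit like this is extrapolation: once the curve matches the measurements at small and medium scales, it gives a prediction for the error at scales we have not yet trained at.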