I am reading about algorithm analysis. Here is a snippet from the book:
When n doubles, the running time goes up by a factor of 2 for linear programs, 4 for quadratic programs, and 8 for cubic programs. Programs that run in logarithmic time take only an additive constant longer when n doubles, and programs that run in O(n log n) take slightly more than twice as long to run under the same circumstances.
These increases can be hard to spot if the lower-order terms have relatively large coefficients and n is not large enough.
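If it helps, here is a small sketch I wrote while trying to make sense of the doubling behaviour. The cost functions and the coefficient 1000 are just made up for illustration, not taken from the book; I am only comparing how the ratio T(2n)/T(n) behaves with and without a big lower-order term:

def pure_quadratic(n):
    # "ideal" quadratic cost: n^2
    return n * n

def quadratic_with_big_linear_term(n):
    # same quadratic, plus a lower-order term 1000*n with a large coefficient
    return n * n + 1000 * n

for n in [10, 100, 1000, 10000, 100000]:
    ratio_pure = pure_quadratic(2 * n) / pure_quadratic(n)
    ratio_mixed = quadratic_with_big_linear_term(2 * n) / quadratic_with_big_linear_term(n)
    print(f"n={n:>7}:  pure ratio = {ratio_pure:.2f},  with big lower-order term = {ratio_mixed:.2f}")

When I run this, the pure quadratic always shows a ratio of 4.00, but the version with the 1000*n term shows ratios of roughly 2.0, 2.2, 3.0, 3.8, 4.0 as n grows, so the factor-of-4 behaviour only becomes visible for large n.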
My question is: what does the author mean by the lower-order terms having relatively large coefficients? Can anyone explain this with an example?
Thanks!