I was trying to understand how to compare the big-O growth of two functions by applying an increasing function such as the logarithm (or, say, $\sqrt{f(n)}$). For example, take $2^{(\log_2 n)^2}$ vs. $n^{\log_2 n}$. Seeing the functions as they stand, it's not obvious which one grows faster. However, if we take logs:
$$ \log( f_1(n) ) = \log\left(2^{(\log_2 n)^2}\right) = (\log_2 n)^2\log(2) = O\left((\log_2 n)^2\right)$$
and
$$ \log( f_2(n) ) = \log\left( n^{\log_2 n} \right) = \log_2(n)\log(n) = O\left( (\log_2 n)^2 \right)$$
which shows that the two functions grow at the same speed. So the trick of taking logs can sometimes be useful.
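To convince myself, I wrote a quick numerical check (a sketch in Python; the names `f1` and `f2` are my own). In fact the two functions are not just the same up to constants but identical, since $n^{\log_2 n} = (2^{\log_2 n})^{\log_2 n} = 2^{(\log_2 n)^2}$:

```python
import math

# f1(n) = 2^((log2 n)^2)  and  f2(n) = n^(log2 n).
# Writing n = 2^(log2 n) shows f2(n) = 2^((log2 n)^2) = f1(n) exactly.
def f1(n):
    return 2.0 ** (math.log2(n) ** 2)

def f2(n):
    return float(n) ** math.log2(n)

for n in [4, 16, 256, 4096]:
    # the two values agree up to floating-point rounding
    assert math.isclose(f1(n), f2(n))
```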
However, notice how it can fail, for example:
$$ g_1(n) = n^2$$
vs
$$ g_2(n) = n^3$$
if we take logs we get:
$$ \log( g_1(n) ) = \log( n^2 ) = 2\log(n)$$
and
$$ \log( g_2(n) ) = \log( n^3 ) = 3\log(n)$$
Using the above reasoning, something is clearly wrong: before taking logs, the two functions were growing at different rates (they are not within a constant factor of each other), but after taking logs they appear to be within a constant factor of each other.
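Here is a small numerical illustration of the failure (a sketch in Python, variable names are mine): the ratio $g_2(n)/g_1(n) = n$ is unbounded, while the ratio of the logs is the constant $3/2$ for every $n > 1$.

```python
import math

# g1(n) = n^2 and g2(n) = n^3 are NOT within a constant factor of each
# other: g2(n)/g1(n) = n grows without bound.  Yet their logs differ
# only by the constant factor 3/2, mathematically, for every n > 1.
for n in [10, 100, 1000]:
    ratio = n**3 / n**2                          # equals n
    log_ratio = math.log(n**3) / math.log(n**2)  # 1.5 (up to rounding)
    print(f"n={n}: g2/g1={ratio:.0f}, log(g2)/log(g1)={log_ratio:.3f}")
```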
My question is: when is it correct to take logs to make asymptotic comparisons, and why?
I was told that:

> functions are asymptotically the same only if they are within a constant additive factor of each other.
But I am not sure what that statement means rigorously, so I don't know how to apply it. An explanation and an example would be helpful.
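Here is my own attempt at a rigorous reading (I am not sure this is what was meant), assuming the "constant additive factor" refers to the logs of the functions:

$$ \log f(n) = \log g(n) + O(1) \iff \frac{f(n)}{g(n)} = e^{O(1)} = \Theta(1) \iff f(n) = \Theta(g(n)), $$

so a constant *additive* difference between the logs corresponds to a constant *multiplicative* factor between the original functions, whereas a constant factor between the logs (as with $2\log n$ vs. $3\log n$) does not.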
(For context, I saw this in a CS class teaching big-O notation.)