(log(n))^log(n) and n/log(n), which is faster?

f(n)=(log(n))^log(n)
g(n)= n/log(n)
Is f(n) = O(g(n))?

Take the log of both sides:
log(f(n)) = log(log n) * log n
log(g(n)) = log(n) - log(log(n)) = log(n)(1 - log(log(n))/log(n))
Clearly log(log(n)) → ∞ while (1 - log(log(n))/log(n)) → 1, so log(f(n)) dominates log(g(n)); hence g is O(f) and f is not O(g). Since it's homework, you may need to fill in the details.
It's also fairly easy to get an idea of what the answer should be just by trying it with a large number. 1024 is 2^10, so taking n = 1024:
f(n) = 10^10
g(n) = 1024/10.
Obviously that's not a proof, but I think we can see who's winning this race.
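If you want to push that same sanity check further, here is a minimal Python sketch (my own illustration, not part of the original answer) that evaluates both functions at a few powers of two, using base-2 logs as in the example above:

```python
# Numeric sanity check (not a proof): evaluate f(n) = (log n)^(log n)
# and g(n) = n / log n at growing n, with base-2 logarithms.
import math

for n in [2**10, 2**20, 2**40]:
    lg = math.log2(n)
    f = lg ** lg
    g = n / lg
    print(f"n = 2^{round(lg)}:  f(n) = {f:.3e}  g(n) = {g:.3e}")
```

The gap widens dramatically: at n = 2^40, f(n) is around 10^64 while g(n) is only about 2.7 * 10^10.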

f(n) grows faster than g(n) if and only if f(e^n) also grows faster than g(e^n), since exp is strictly increasing to infinity (prove it yourself).
Now f(e^n) = n^n and g(e^n) = e^n / n, and you can quote the known results.

If Limit[f[x] / g[x], x -> Infinity] = Infinity, then f[x] grows faster than g[x].
Limit[Log[x] ^ Log[x] / (x / Log[x]), x -> Infinity] = + Infinity
So, Log[x] ^ Log[x] grows faster than x / Log[x]

Mathematica gives the limit of f(n) / g(n) as n tends towards infinity as infinity, which means that f grows faster. This means that g(n) belongs to (∈) O(f(n)).
You can compute the limit online (e.g., with Wolfram Alpha) if you don't have Mathematica.
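If you'd rather reproduce the limit yourself, here is a sketch in Python assuming SymPy is installed; its limit() routine implements the Gruntz algorithm, which targets exactly this class of exp-log expressions:

```python
# Symbolically compute lim_{x -> oo} f(x)/g(x) with SymPy.
from sympy import Symbol, log, limit, oo

x = Symbol('x', positive=True)
ratio = log(x)**log(x) / (x / log(x))
print(limit(ratio, x, oo))  # expected: oo, i.e. f grows faster than g
```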

f is vastly bigger: by a factor of n^(log log(n) - 1) * log n.


How to solve a problem on relative asymptotic growth (table) from CLRS?

I struggle to fill in this table even though I took calculus recently and am good at math. The chapter only explains how to deal with lim(n^k/c^n), but I have no idea how to compare the other functions. I checked the solution manual and there's no info on that, only a table with answers, which provides little insight.
When I solve these I don't really think about limits -- I lean on a couple facts and some well-known properties of big-O notation.
Fact 1: for all functions f and g and all exponents p > 0, we have f(n) = O(g(n)) if and only if f(n)^p = O(g(n)^p), and likewise with o, Ω, ω, and Θ respectively. This has a straightforward proof from the definition; you just have to raise the constant c to the power p as well.
Fact 2: for all exponents ε > 0, the function lg(n) is o(n^ε). This follows from l'Hôpital's rule for limits: lim lg(n)/n^ε = lim (lg(e)/n)/(ε n^(ε−1)) = (lg(e)/ε) lim n^(−ε) = 0.
Fact 3:
If f(n) ≤ g(n) + O(1), then 2^f(n) = O(2^g(n)).
If f(n) ≤ g(n) − ω(1), then 2^f(n) = o(2^g(n)).
If f(n) ≥ g(n) − O(1), then 2^f(n) = Ω(2^g(n)).
If f(n) ≥ g(n) + ω(1), then 2^f(n) = ω(2^g(n)).
Fact 4: lg(n!) = Θ(n lg(n)). The proof uses Stirling's approximation.
To solve (a), use Fact 1 to raise both sides to the power of 1/k and apply Fact 2.
To solve (b), rewrite n^k = 2^(lg(n)·k) and c^n = 2^(lg(c)·n), prove that lg(c)·n − lg(n)·k = ω(1), and apply Fact 3.
(c) is special. n^sin(n) ends up anywhere between 0 and n. Since 0 is o(√n) and n is ω(√n), that's a solid row of NO.
To solve (d), observe that n ≥ n/2 + ω(1) and apply Fact 3.
To solve (e), rewrite n^lg(c) = 2^(lg(n)·lg(c)) = 2^(lg(c)·lg(n)) = c^lg(n).
To solve (f), use Fact 4 and find that lg(n!) = Θ(n lg(n)) = lg(n^n).
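As a quick numeric illustration of Facts 2 and 4 (a sanity check, not a substitute for the proofs), here is a short Python sketch; math.lgamma(n + 1) equals ln(n!), which is converted to base 2 below:

```python
# Fact 2: lg(n)/n^eps should tend to 0.  Fact 4: lg(n!)/(n lg n)
# should tend to 1.  Both ratios are evaluated at growing n.
import math

eps = 0.1
for n in [2**10, 2**30, 2**100]:
    lg_n = math.log2(n)
    fact2 = lg_n / n**eps
    fact4 = (math.lgamma(n + 1) / math.log(2)) / (n * lg_n)
    print(f"n = 2^{round(lg_n)}: lg(n)/n^eps = {fact2:.4f}, lg(n!)/(n lg n) = {fact4:.4f}")
```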

If log(n^2) is big theta of log n, is (log n)^2 also big theta of log n?

log(n^2) is equivalent to 2 log n, which grows at the same rate as log n once I disregard factors and constants. But if I were to square the whole term, so that I end up with (log n)^2, is it also big theta of log n?
No. If f is any unbounded function then f(n)^2 is not O(f).
Because f(n)^2 = O(f(n)) means there's a c and N such that n > N implies f(n)^2 <= c·f(n), which (assuming f(n) > 0) implies f(n) <= c, and so f is bounded.
log(n) is unbounded, so (log n)^2 is not O(log(n)).
log(n^2) = 2 log(n),
and as you know x^2 is not in Θ(x).
Think of it this way: let N = log(n). Then f1(N) = N^2 and f2(N) = N. Obviously,
N = o(N^2), and hence N != theta(N^2); i.e., log(n) = o((log n)^2), and hence log(n) != theta((log n)^2).
Also, lim_{n->inf} f2(n) / f1(n) = lim_{n->inf} 1 / log(n) = 0; by the definition of little-o (https://en.wikipedia.org/wiki/Big_O_notation), this implies f2(n) = o(f1(n)).
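A three-line numeric check makes the point concrete (the ratio is exactly log n, which is unbounded):

```python
# (log n)^2 / log n = log n, which grows without bound, so
# (log n)^2 is not O(log n) and therefore not Theta(log n).
import math

for n in [10**3, 10**9, 10**27, 10**81]:
    print(f"n = 1e{round(math.log10(n))}: ratio = {math.log(n)**2 / math.log(n):.1f}")
```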

Comparing O((logn)!) and O(2^n)

I am having a hard time comparing these two functions:
(logn)!
and
2^n
Any good mathematical proof?
You cannot compare O((log n)!) and O(2^n) directly, since big-O notation represents a set. O(g(n)) is the set of all functions f such that f does not grow faster than g; formally, this is the same as saying that there exist C and n0 such that |f(n)| <= C|g(n)| for every n >= n0. The expression f(n) = O(g(n)) is a shorthand for saying that f(n) is in the set O(g(n)). What we can do is check whether 2^n = O((log n)!) or (log n)! = O(2^n) (note that it could be that both are false). Luckily, if we use the Stirling approximation we get that
log((log n)!) = (log n)*(log(log n)) - log n + O(log(log n)) = O(n * log 2)
since n * const grows faster than (log n)*(log(log n)), and (log n)*(log(log n)) is the leading term in (log n)*(log(log n)) - log n + O(log(log n)). So we get that log((log n)!) = O(log(2^n)), which is the same as saying that (log n)! = O(2^n).
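You can see the size of the gap numerically with Python's standard library; math.lgamma(m + 1) equals ln(m!) and extends the factorial to the non-integer values of log n (a sanity check of the Stirling argument, not a proof):

```python
# Compare ln((log n)!) against ln(2^n) = n * ln(2), natural logs throughout.
import math

for n in [2**10, 2**20, 2**40]:
    m = math.log(n)                  # log n
    lhs = math.lgamma(m + 1)         # ln((log n)!)
    rhs = n * math.log(2)            # ln(2^n)
    print(f"n = 2^{round(math.log2(n))}: ln((log n)!) = {lhs:.1f}, ln(2^n) = {rhs:.3e}")
```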
One can easily show that for sufficiently large n it holds that:
log(n)! <= log(n)^{log(n)} <= n^{log(n)} = 2^{log^2(n)}
We can now consider only the exponents of 2 in 2^n and in the expression above: n and log^2(n) respectively (we can do that since we consider only sufficiently large n, and 2^x is strictly increasing for positive x). It is sufficient to show that the limit below diverges in order to prove that log(n)! is, in fact, o(2^n):
lim[n -> inf] (n)/(log^2(n))
Now we apply L'Hospital rule:
= lim [n -> inf] `n/(2log(n))`
And again:
= lim [n -> inf] `n/(2)`
Which diverges to infinity.
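The divergence is easy to confirm numerically (a sketch using natural logs; the base only changes a constant factor):

```python
# n / (log n)^2 grows without bound, matching the L'Hospital argument.
import math

for n in [10**3, 10**6, 10**12, 10**24]:
    print(f"n = 1e{round(math.log10(n))}: n/log^2(n) = {n / math.log(n)**2:.3e}")
```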

(log n)^k = O(n)? For k greater or equal to 1

(log n)^k = O(n)? For k greater or equal to 1.
My professor presented us with this statement in class, however I am not sure what it means for a function to have a time complexity of O(n). Even for stuff like n^2 = O(n^2), how can a function f(x) have a run-time complexity?
As for the statement, how does it equal O(n) rather than O((log n)^k)?
(log n)^k = O(n)?
Yes. The definition of big-Oh is that a function f is in O(g(n)) if there exist positive constants N and c, such that for all n > N: f(n) <= c*g(n). In this case f(n) is (log n)^k and g(n) is n, so if we insert that into the definition we get: "there exist constants N and c, such that for all n > N: (log n)^k <= c*n". This is true so (log n)^k is in O(n).
how can a function f(x) have a run time complexity
It doesn't. Nothing about big-Oh notation is specific to run-time complexity. Big-Oh is a notation to classify the growth of functions. Often the functions we're talking about measure the run-time of certain algorithms, but we can use big-Oh to talk about arbitrary functions.
f(x) = O(g(x)) means f(x) grows slower or comparably to g(x).
Technically this is interpreted as "We can find an x value, x_0, and a scale factor, M, such that the size of f(x) past x_0 is less than the scaled size of g(x)." Or in math:
|f(x)| < M |g(x)| for all x > x_0.
So for your question:
log(x)^k = O(x)? is asking : is there an x_0 and M such that
log(x)^k < M x for all x>x_0.
The existence of such M and x_0 can be shown using various limit results and is relatively simple using L'Hôpital's rule; however, it can also be done without calculus.
The simplest proof I can come up with that doesn't rely on L'Hôpital's rule uses the Taylor series
e^z = 1 + z + z^2/2 + ... = sum z^m / m!
Using z = (N! x)^(1/N) we can see that
e^((N! x)^(1/N)) = 1 + (N! x)^(1/N) + (N! x)^(2/N)/2! + ... + (N! x)^(N/N)/N! + ...
For x>0 all terms are positive so, keeping only the Nth term we get that
e^((N! x)^(1/N)) = N! x / N! + (...)
= x + (...)
> x for x > 0
Taking logarithms of both sides (since log is monotonically increasing), then raising to the Nth power (also monotonically increasing where both sides are positive):
(N! x)^(1/N) > log x for x > 0
N! x > (log x)^N for x > 1 (so that both sides of the previous line are positive)
Which is exactly the result we need: (log x)^N < M x for some M and all x > x_0, with M = N! and x_0 = 1.
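A quick spot-check of the derived bound, with natural logs as in the proof (illustration only; N = 5 is an arbitrary choice):

```python
# Verify (log x)^N <= N! * x at a few sample points with x > 1.
import math

N = 5
for x in [10.0, 1e5, 1e20]:
    lhs = math.log(x)**N
    rhs = math.factorial(N) * x
    print(f"x = {x:.0e}: (log x)^{N} = {lhs:.3e} <= N!*x = {rhs:.3e}? {lhs <= rhs}")
```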

When f(n) = n^.1 and g(n) = log(n)^10, does f(n) = Ω(g)?

I was told that "any exponential trumps any logarithm".
But when the exponent is between zero and one, doesn't the logarithm grow much faster? So by that logic it would be f = O(g).
I'm having trouble choosing whether to follow my intuition or what I've been told, but what I've been told may not have been totally accurate.
Let's try out some math here. One important fact is that the logarithm function is monotonically increasing, which means that if
log f(x) ≤ log g(x)
then
f(x) ≤ g(x)
Now, let's see what that does here. We have two functions, x^0.1 and log^10 x. If we take their logs, we get
log(x^0.1) = 0.1 log x
and
log(log^10 x) = 10 log log x
Since log log x grows much more slowly than log x, intuitively we can see that the function x^0.1 is going to eventually overtake log^10 x.
Now, let's formalize this. We want to find some value of x such that
x^0.1 > log^10 x
Let's suppose that these are base-10 logarithms just to make the math easier. If we assume that x = 10^k for some k, we get that
(10^k)^0.1 > log^10(10^k)
10^(0.1 k) > (log 10^k)^10
10^(0.1 k) > k^10
Now, take k = 1000. Then we have that
10^(0.1 * 1000) > 1000^10
10^100 > 10^30
which is clearly true. Combined with the earlier observation that 0.1 log x eventually dominates 10 log log x, this means that for x ≥ 10^1000, it is true that
x^0.1 > log^10 x
Which means that it is not true that x^0.1 = O(log^10 x).
Hope this helps!
Asymptotic analysis is really focused on the long-run relationship: as n assumes larger values, how do the values of the functions compare? It also disregards constants, which is why you sometimes see strange-looking statements like f(x) = 10000000*x = O(x^2).
For large values of n, f(n) > g(n), which is all that really matters.
Another way to verify that n^0.1 = big omega(log^10(n)) is by using the limit rule.
The limit rule is:
limit as n->infinity f(n)/g(n).
if the limit is positive infinity: f(n) != O(g(n)) and g(n) = O(f(n)), i.e., f(n) = big omega(g(n))
if the limit is 0: f(n) = O(g(n)) and g(n) != O(f(n))
if the limit is a positive real number: f(n) = O(g(n)) and g(n) = O(f(n)), i.e., f(n) = big theta(g(n))
For this problem:
let f(n) = n^0.1 and let g(n) = log^10(n)
That gives us the limit:
limit as n->infinity (n^0.1)/(log^10(n))
Using L'Hospital's rule on the limit 10 times we get:
limit as n->infinity ((0.1)^10 * ln^10(b) * n^0.1)/(10!) where b is the base of the log
Since the n term is only in the numerator, the limit approaches infinity.
By the limit rule
log^10(n) = O(n^0.1) and n^0.1 != O(log^10(n)), i.e., n^0.1 = big omega(log^10(n)).
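To see just how late n^0.1 overtakes log^10(n), you can compare the two on a log scale: with n = 10^k (base-10 logs), the comparison n^0.1 > log^10(n) becomes 0.1*k > 10*log10(k). A small sketch (poly_wins is my hypothetical helper, not from the answer above):

```python
# With n = 10^k: log10(n^0.1) = 0.1*k and log10((log10 n)^10) = 10*log10(k).
import math

def poly_wins(k: float) -> bool:
    return 0.1 * k > 10 * math.log10(k)

for k in [10, 100, 236, 238, 1000]:
    print(f"n = 10^{k}: n^0.1 > (log n)^10 ? {poly_wins(k)}")
```

The crossover sits between n = 10^236 and n = 10^238: the polynomial wins, but only for astronomically large n.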
