Compare growth rate of two functions. (Tricky) - algorithm

I need to compare the growth rate of the following functions:
f(n)=2^n and g(n)=n^log(n) (when n approaches positive infinity).
Is this even possible?

Let n = 2^k. We have:
2^n = 2^(2^k)
n^log(n) = (2^k)^log(2^k) = (2^k)^(k log 2)
= 2^(k^2 log 2)
Now compare 2^k to k^2 log 2. This is a basic comparison: 2^k is bigger for all large enough k.
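As a quick sanity check of that exponent comparison, here is a small Python sketch (my own addition, not part of the argument above) tabulating 2^k against k^2 * log 2:

    import math

    # Compare the two exponents from the substitution n = 2^k:
    # f(n) = 2^(2^k) has exponent 2^k, while g(n) = 2^(k^2 * log 2) has exponent k^2 * log 2.
    for k in [2, 4, 8, 16, 32]:
        print(f"k={k:>2}  2^k={2 ** k:>12}  k^2*log(2)={k * k * math.log(2):10.2f}")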

Taking logs (base 2) of both functions, we get log(f(n)) = n, whereas log(g(n)) = (log(n))^2.
Now, (log(n))^2 = o(n), and since log is a monotonically increasing function, we have
g(n) = o(f(n)), i.e., f(n) grows much faster for large values of n.
Here is another way to prove it more rigorously:
Let L = lim{n->inf} g(n) / f(n) = lim{n->inf} n^(log(n))/2^n.
Hence log(L) = lim{n->inf} (log^2(n) - n)
= lim{n->inf} n * (log^2(n)/n - 1)
= lim{n->inf} n * (lim{n->inf} log^2(n)/n - 1)
= lim{n->inf} n * (0 - 1)
= lim{n->inf} (-n) = -inf
=> L = 2^(-inf) = 0
According to the alternative definition of o(n) (small o, see here: https://en.wikipedia.org/wiki/Big_O_notation),
L = lim{n->inf} g(n) / f(n) = 0
=> g(n) = o(f(n)).
(Figures comparing the growth of f(n) and g(n) on linear and log scales omitted.)
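As a rough stand-in for those figures, here is a small Python sketch (my own addition) tabulating log2 f(n) = n against log2 g(n) = (log2 n)^2; the gap widens quickly once n is past 16:

    import math

    # log2 f(n) = n grows linearly in n, while log2 g(n) = (log2 n)^2 is only polylogarithmic.
    for p in [4, 8, 16, 32, 64]:
        n = 2 ** p
        print(f"n=2^{p:<2}  log2 f(n) = {n:>20}  log2 g(n) = {math.log2(n) ** 2:8.1f}")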

Related

How to solve this task T(n) = 4T(n/4) + n√n?

I'm new to Divide and Conquer. I would like to know how to find the running time of this recurrence.
What exactly do I have to pay attention to, and how do I proceed?
For n = 1, the running time is O(1).
So let's see the calculation for this:
T(n) = 4T(n/4) + n * sqrt(n)
Expanding for k steps, it looks like
T(n) = 4^k * [T(n/4^k)] + n * sqrt(n) * {1 + sqrt(1/4) + sqrt(1/16) + ...}
Here {1 + sqrt(1/4) + sqrt(1/16) + ...} = {1 + 1/2 + 1/4 + ...} is a geometric progression with ratio 1/2, which sums to 2 * {1 - 1/2^k}.
If we take k = log4(n)   // here the base is 4, so 4^k = n and 2^k = sqrt(n)
T(n) = n * [T(1)] + 2 * n * sqrt(n) * {1 - [1/sqrt(n)]}
T(n) = n * [T(1)] + 2 * n * sqrt(n) - 2 * n
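To convince yourself numerically, here is a short Python sketch (my own addition, assuming the base case T(1) = 1) that evaluates the recurrence directly:

    # Evaluate T(n) = 4*T(n/4) + n*sqrt(n) with an assumed base case T(1) = 1.
    def T(n):
        if n <= 1:
            return 1
        return 4 * T(n // 4) + n * n ** 0.5

    # On exact powers of 4 the ratio T(n) / n^1.5 settles toward a constant (about 2 here),
    # which is what a Theta(n^(3/2)) bound predicts.
    for n in [4 ** p for p in range(2, 9)]:
        print(f"n={n:>6}  T(n)/n^1.5 = {T(n) / n ** 1.5:.4f}")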
You can still use the Master theorem for recurrences of the form
T(n) = aT(n/b) + f(n).
If f(n) = Θ(n^d), where d ≄ 0, then
T(n) = Θ(n^d) if a < b^d,
T(n) = Θ(n^d * log n) if a = b^d,
T(n) = Θ(n^(log_b a)) if a > b^d.
Here a = 4, b = 4, and d = 3/2, with a = 4 < b^d = 8, so yes, the answer is O(n^(3/2)).
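For reference, the case check can also be written out mechanically; a tiny Python sketch of my own, using the simplified a/b/d form quoted above:

    import math

    # Decide which case of the simplified Master theorem above applies for
    # T(n) = a*T(n/b) + Theta(n^d).
    def master_case(a, b, d):
        if a < b ** d:
            return f"Theta(n^{d})"
        if a == b ** d:
            return f"Theta(n^{d} * log n)"
        return f"Theta(n^{math.log(a, b):.3f})"

    # For T(n) = 4T(n/4) + n*sqrt(n): a = 4, b = 4, d = 3/2, and 4 < 4^1.5 = 8.
    print(master_case(4, 4, 1.5))   # -> Theta(n^1.5)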

Big O notation: what does 2^log(n) equal in Big O, and can something be greater?

I am trying to figure out if f(n) = O(g(n)).
I understand that:
O(g(n)) = { f(n) : there exist constants c, n0 > 0 such that 0 ≤ f(n) ≤ c Ɨ g(n) for all n ≄ n0 }
So I have:
f(n) = 2^(logn)
g(n) = n^1000
I understand that f(n) most closely resembles O(n) usually. However, it is less than g(n), so would the claim f(n) = O(g(n)) still hold even though the bound is much larger than expected?
If
f(n) = n
g(n) = n^1000
then f = O(g). Broadly, if f=O(g) and h is "bigger" than g, then f=O(h).
But there's a catch here:
f(n) = 2^(logn)
This log has what base? Ordinarily we write something like O(log n) and we don't care about the base; O(log_2 n) and O(log_99 n) are the same thing, because log_a n = k * log_b n, where k is a constant. But what is 2^(log_b n)?
2^(log_b n) = 2^((log_2 n) * (log_b 2))
= (2^(log_2 n))^(log_b 2)
= n^(log_b 2)
How does that compare to n^1000?
Suppose:
b = 2^(1/2000) (just slightly more than 1)
2 = b^2000
log_b 2 = 2000
Then 2^(log_b n) = n^2000, which is not O(n^1000). So in some cases, it is not true that 2^(log n) = O(n^1000).
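To make the effect of the base concrete, here is a small Python sketch (my own addition) computing the effective exponent log_b 2 for a few bases b:

    import math

    # 2^(log_b n) = n^(log_b 2), so the effective exponent of n is log_b 2.
    for b in [2, 10, 2 ** (1 / 2000)]:
        print(f"b = {b:.6f}  ->  2^(log_b n) = n^{math.log(2, b):.3f}")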

Algorithm's complexity : Big O notation

O( sqrt(n) ) = O(n) ?
We should find c and n0 satisfying:
0 ≤ sqrt(n) ≤ c*n, for some c > 0 and all n > n0.
How do I find c and n0? Or should I try another approach?
Thanks
For n > 1, we have √n > 1, hence we have the following inequality:
√n < √n * √n = n, for any n > 1.
So we can take c = 1 and n0 = 2 to prove that √n = O(n).
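A quick brute-force check of those witnesses, as a throwaway Python sketch (my own addition):

    import math

    c, n0 = 1, 2
    # Verify sqrt(n) <= c*n for every integer n from n0 up to a test limit.
    assert all(math.sqrt(n) <= c * n for n in range(n0, 100_000))
    print("sqrt(n) <= c*n holds for all tested n >= n0")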
Remark
Strictly speaking, you should avoid writing down something like O(√n) = O(n). Big-O notation is for describing the asymptotic upper bound of a function, but O(√n) is not a function.
O(√n) = O(n) is an abuse of notation, it really means the following:
If f is a function such that f(n) = O(√n), then f(n) = O(n).
In our case, if for a function f we have f(n) = O(√n), then since √n < n for any n > 1, we have f(n) ≤ c * √n < c * n for all sufficiently large n, and hence f(n) = O(n).

Solving Recurrence relation: T(n) = 3T(n/5) + lgn * lgn

Consider the following recurrence
T(n) = 3T(n/5) + lgn * lgn
What is the value of T(n)?
(A) Θ(n^(log_5 3))
(B) Θ(n^(log_3 5))
(C) Θ(n log n)
(D) Θ(log n)
Answer is (A)
My Approach :
lg n * lg n = Θ(n), since c2 * lg n < 2 * lg lg n < c1 * lg n for some n > n0
Above inequality is shown in this picture for c2 = 0.1 and c1 = 1
log_5 3 < 1,
hence by the Master theorem the answer has to be Θ(n), and none of the answers match. How do I solve this problem?
Your claim that lg n * lg n = Θ(n) is false. Notice that (lg n)^2 / n tends toward 0 as n goes to infinity. You can see this using l'Hopital's rule:
lim{n -> inf} (lg n)^2 / n
= lim{n -> inf} 2 lg n / n   (dropping constant factors)
= lim{n -> inf} 2 / n
= 0
More generally, using similar reasoning, you can prove that lg n = o(n^ε) for any ε > 0.
Let's try to solve this recurrence using the Master theorem. We see that there are three subproblems of size n/5 each, so we should look at the value of log_5 3. Since (lg n)^2 = o(n^(log_5 3)), we see that the recursion is bottom-heavy and can conclude that the recurrence solves to Θ(n^(log_5 3)), which is answer (A) in your list up above.
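If you want a numerical sanity check, here is a small Python sketch of my own (assuming the base case T(1) = 1); the ratio T(n) / n^(log_5 3) stays bounded, consistent with answer (A):

    import math

    # Evaluate T(n) = 3*T(n/5) + (lg n)^2 with an assumed base case T(1) = 1.
    def T(n):
        if n <= 1:
            return 1
        return 3 * T(n // 5) + math.log2(n) ** 2

    alpha = math.log(3, 5)                     # log_5 3 ~ 0.682
    for n in [5 ** p for p in range(2, 10)]:   # exact powers of 5
        print(f"n={n:>8}  T(n)/n^(log_5 3) = {T(n) / n ** alpha:.3f}")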
Hope this helps!
To apply the Master theorem we should check the relation between
n^(log_5 3) ~= n^0.682 and (lg(n))^2
Unfortunately (lg(n))^2 != 2*lg(n): it is lg(n^2) that's equal to 2*lg(n).
Also, there is a big difference, in the Master theorem, between f(n) being O(n^(log_b(a) - ε)) and being Θ(n^(log_b a)): if the former holds we can apply case 1, if the latter holds, case 2 of the theorem.
At a glance, it looks highly unlikely that (lg(n))^2 = Ω(n^0.682), so let's try to prove that (lg(n))^2 = O(n^0.682), i.e.:
∃ n0, c ∈ N+ such that for n > n0, (lg(n))^2 < c * n^0.682
Let's take the square root of both sides (for n > 1 both sides are positive, so the direction of the inequality is preserved):
lg(n) < c1 * n^0.341, where c1 = sqrt(c)
Now we can assume that lg(n) = log_2(n) (otherwise the multiplicative factor could be absorbed by our constant; as you know, constant factors don't matter in asymptotic analysis) and exponentiate both sides:
2^(lg(n)) < 2^(c1 * n^0.341) <=> n < 2^(c1 * n^0.341)
Writing n = 2^m (so m = lg(n)), this becomes m < c1 * 2^(0.341 * m), which holds with c1 = 1 for all m ≄ 16 (the exponential right-hand side outgrows the linear left-hand side), i.e. with n0 = 2^16.
Therefore it does hold that f(n) = O(n^(log_b(a) - ε)), so we can apply case 1 of the Master theorem and conclude that:
T(n) = Θ(n^(log_5 3))
Same result, a bit more formally.
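As a cross-check on the case-1 condition (again just a numerical sketch of my own), you can tabulate (lg(n))^2 against n^0.682; the crossover is why n0 and c matter, but the conclusion doesn't change:

    import math

    alpha = math.log(3, 5)   # log_5 3 ~ 0.682
    # (lg n)^2 starts out larger, but n^0.682 overtakes it for n in the high hundreds.
    for n in [10, 100, 1_000, 10_000, 1_000_000]:
        print(f"n={n:>9}  (lg n)^2 = {math.log2(n) ** 2:8.1f}   n^0.682 = {n ** alpha:10.1f}")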

Can O(log(n*log n)) be considered O(log n)?

Suppose I get f(n) = log(n*log n). Should I say that it's O(log(n*log n))?
Or should I write log(n*log n) = log n + log(log n) and then say that f(n) is O(log n)?
First of all, as you have observed:
log(n*log n) = log(n) + log(log(n))
but think about what happens to log(log N) as N gets large (as Floris suggests).
For example, let N = 1000, then log N = 3 (i.e. a small number) and log(3) is even smaller,
this holds as N gets huge, i.e. way more than the number of instructions your code could ever generate.
Thus, O(log(n * log n)) = O(log n + log(log n)) = O(log n).
Another way to look at this: n * log n << n^2, so in the worst case
log(n^2) > log(n * log n).
So 2*log(n) is an upper bound, and O(log(n * log n)) = O(log n).
Use the definition. If f(n) = O(log(n*log(n))), then there must exist a positive constant M and real n0 such that:
|f(n)| ≤ M * |log(n*log(n))|
for all n > n0.
Now let's assume (without loss of generality) that n0 > 1. Then
log(n) ≄ log(log(n))
for all n > n0.
From this, we have:
log(n*log(n)) = log(n) + log(log(n)) ≤ 2 * log(n)
Substituting, we find that
|f(n)| ≤ 2*M*|log(n)| for all n > n0
Since 2*M is also a positive constant, it immediately follows that f(n) = O(log(n)).
Of course in this case simple transformations show both functions differ by a constant factor asymptotically, as shown.
However, I feel it is worthwhile to recall a classic test for analyzing how two functions relate to each other asymptotically. So here's a little more formal proof.
You can check how f(x) relates to g(x) by analyzing lim f(x)/g(x) as x -> infinity.
There are 3 cases:
lim = infinity <=> O(f(x)) > O(g(x))
inf > lim > 0 <=> O(f(x)) = O(g(x))
lim = 0 <=> O(f(x)) < O(g(x))
So
lim ( log( n * log(n) ) / log n ) =
lim ( (log n + log log(n)) / log n ) =
lim ( 1 + log log(n) / log n ) =
1 + 0 = 1
Note: I took lim log log(n) / log(n) = 0 as trivial, but you can verify it with l'Hopital's rule.
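And a quick numerical confirmation of that limit (just a sketch added here, not part of the original answer):

    import math

    # log(n * log n) / log(n) = 1 + log(log n) / log(n), which tends to 1 as n grows.
    for p in [1, 2, 4, 8, 16, 32, 64]:
        n = 10 ** p
        print(f"n=10^{p:<2}  ratio = {math.log(n * math.log(n)) / math.log(n):.4f}")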
