If n = 100, then n^2 * log(n) = 100^2 * log(100) = 10,000 * 2 = 20,000 (taking log base 10), as opposed to n^2 = 100^2 = 10,000.
I am thinking n^2 * log(n) grows faster because it takes a higher value, but I am also thinking they are similar since both are built on the same n^2.
What exactly is a good way to tell which function grows faster? Is it by plugging a large value of n, such as 100, into each function?
Function f grows faster than g if
lim{n->+inf} f(n) / g(n) = +inf
In your case f(n) = (n^2) * log(n) and g(n) = n^2
lim{n->+inf} f(n)/g(n) = lim{n->+inf} (n^2) * log(n) / n^2 = lim{n->+inf} log(n) = +inf
so (n^2) * log(n) grows faster than n^2.
Let's compare f(n) = n^(2 * log(n)) and g(n) = n^2. We have
lim{n->+inf} n^(2 * log(n)) / n^2 = lim{n->+inf} n^(2 * (log(n) - 1))
    >= /* for log(n) > 2 */
lim{n->+inf} n^(2 * (2 - 1)) = lim{n->+inf} n^2 = +inf
so n^(2 * log(n)) grows faster than n^2.
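If you want a quick numerical sanity check of this limit test, here is a small Python sketch (my own illustration, not part of the proof) that prints f(n)/g(n) for growing n; a ratio that keeps climbing suggests f grows faster:

    import math

    def ratio_table(f, g, ns):
        # Print f(n)/g(n) for growing n; a diverging ratio suggests f grows faster.
        for n in ns:
            print(n, f(n) / g(n))

    ns = [10, 100, 10_000, 10**6]
    # f(n) = n^2 * log(n) vs g(n) = n^2: the ratio is log(n), which diverges.
    ratio_table(lambda n: n**2 * math.log(n), lambda n: n**2, ns)
    # f(n) = n^(2*log(n)) vs g(n) = n^2: the ratio diverges even faster.
    ratio_table(lambda n: n**(2 * math.log(n)), lambda n: n**2, ns)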
I have tried to solve this problem many times, but I still don't have a solution.
The question is as follows:
Use the definition of Big Omega to prove that
n log n - n belongs to Omega(n log n).
Thank you
As follows:
f(n) = n log n - n
>= n log n - 0.5 n log n, for all n > 100 (assuming log base 10, so that log n >= 2 and hence n <= 0.5 n log n)
= 0.5 * n log n,
= c * n log n, for c = 0.5, n > n0 = 100
To summarize:
f(n) >= c * g(n), for all n > n0 = 100, and
with c = 0.5, and
with g(n) = n log n.
When log n > 2, then 0.5 n log n - n > 0. Adding 0.5 n log n to both sides gives n log n - n > 0.5 n log n.
This proves that n log n - n is Omega(n log n) (with the constant 1/2, and N the smallest integer such that log N > 2).
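As a quick sanity check of the constants (illustrative only, not a substitute for the proof), here is a short Python sketch verifying f(n) >= 0.5 * n log n for several n > 100, using base-10 logs as in the answers above:

    import math

    c, n0 = 0.5, 100
    for n in [101, 1_000, 10**6, 10**9]:
        f = n * math.log10(n) - n
        bound = c * n * math.log10(n)
        assert f >= bound, (n, f, bound)
        print(n, round(f, 2), round(bound, 2))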
I've tried calculating this function and I am a bit unsure of my result. I set it to True. Can anyone explain if my answer is correct and why?
(3 log(2n) + 55 log(10n) + 8 log n) · log n = Ω(log(10n))
Your result is correct, but it can be further simplified to Ω(log(n)), since log(10n) = log(10) + log(n) and log(10) is a constant.
To prove that f(n) = Ω(g(n)) you need to show that g(n) is a "lower bound" asymptotically of f(n).
The formal definition is that f(n) = Ω(g(n)) if there exist some c, n0 > 0 s.t. for all n > n0 it holds that f(n) >= c * g(n).
Recall that for every integer n bigger than 2 it holds that log(n) > 1, so
(3 log(2n) + 55 log(10n) + 8 log(n)) · log(n) > 3 log(2n) + 55 log(10n) + 8 log(n) > 8 log(n) > log(n).
Choose c = 1, n0 = 2 and we get that for all n > n0: (3 log(2n) + 55 log(10n) + 8 log(n)) · log(n) > log(n), thus (3 log(2n) + 55 log(10n) + 8 log(n)) · log(n) = Ω(log(n)).
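For what it's worth, a quick numeric check of that chain of inequalities (just an illustration; the proof above already stands on its own, and I use natural logs since the base only affects constants):

    import math

    def f(n):
        return (3 * math.log(2 * n) + 55 * math.log(10 * n) + 8 * math.log(n)) * math.log(n)

    for n in [3, 10, 1_000, 10**6]:
        assert f(n) > math.log(n)  # c = 1, n0 = 2
        print(n, round(f(n), 2), round(math.log(n), 2))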
I need to compare the growth rate of the following functions:
f(n)=2^n and g(n)=n^log(n) (when n approaches positive infinity).
Is this even possible?
Let n = 2^k. We have:
2^n = 2^(2^k)
n^log(n) = (2^k)^log(2^k) = (2^k)^(k log 2)
= 2^(k^2 log 2)
Now compare 2^k to k^2 log 2. This is a basic comparison: 2^k is bigger for all large enough k.
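A quick numeric look at the two exponents after the substitution (my own illustration; natural logs assumed, matching the log 2 factor above):

    import math

    # After n = 2^k: 2^n = 2^(2^k) and n^log(n) = 2^(k^2 * log(2)),
    # so it suffices to compare 2^k against k^2 * log(2).
    for k in [4, 8, 16, 32]:
        print(k, 2**k, k**2 * math.log(2))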
Taking log (base 2) of both functions, we get log(f(n)) = n, whereas log(g(n)) = (log(n))^2.
Now, (log(n))^2 = o(n), and since log is a monotonically increasing function, we have
g(n) = o(f(n)), i.e., f(n) grows much faster for large values of n.
Here is another way to prove it more rigorously:
Let L = lim{n->inf} g(n) / f(n) = lim{n->inf} n^(log(n)) / 2^n.
Hence log(L) = lim{n->inf} (log^2(n) - n)
             = lim{n->inf} n * (log^2(n)/n - 1)
             = lim{n->inf} n * lim{n->inf} (log^2(n)/n - 1)
             = lim{n->inf} n * (0 - 1)
             = lim{n->inf} (-n) = -inf
=> L = 2^(-inf) = 0
According to the alternative definition of o(n) (small o, see here: https://en.wikipedia.org/wiki/Big_O_notation),
L = lim{n->inf} g(n) / f(n) = 0
=> g(n) = o(f(n)).
[Figures comparing the growth of f(n) and g(n) on linear and log scales]
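You can also compare the logs numerically (working in log space avoids overflow); a minimal sketch, assuming base-2 logs as in the answer above:

    import math

    # log2(f(n)) = n and log2(g(n)) = (log2 n)^2; f dominates once n outgrows (log2 n)^2.
    for n in [2**4, 2**8, 2**16, 2**32]:
        log_f = n
        log_g = math.log2(n) ** 2
        print(n, log_f, log_g)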
I am doing an introductory course on algorithms. I've come across this problem which I'm unsure about.
I would like to know which of the two is dominant:
f(n): 100n + log n or g(n): n + (log n)^2
Given the definitions of each of:
Ω, Θ, O
I assumed f(n) dominates, so f(n) = Ω(g(n)).
My reasoning is that n dominates (log n)^2. Is that true?
In this case,
lim{n->inf} f(n) / g(n) = 100.
If you go back to the calculus definition of a limit, this means that, for any ε > 0, there exists some m for which
100 (1 - ε) g(n) ≤ f(n) ≤ 100 (1 + ε) g(n)
for any n > m.
From the definition of Θ, you can infer that these two functions are Θ of each other.
In general, if
lim{n->inf} f(n) / g(n) = c exists, and
0 < c < ∞,
then the two functions have the same order of growth (they are Θ of each other).
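You can watch the ratio settle near 100 numerically; a quick sketch (base-10 logs assumed, though any base gives the same limit):

    import math

    for n in [10, 10**3, 10**6, 10**9]:
        f = 100 * n + math.log10(n)
        g = n + math.log10(n) ** 2
        print(n, f / g)  # tends to 100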
n dominates both log(n) and (log n)^2
A little explanation
f(n) = 100n + log n
Here n dominates log n for large values of n.
So f(n) = Θ(n) .......... [1]
g(n) = n + (log n)^2
Now, (log n)^2 dominates log n.
But n still dominates (log n)^2.
So g(n) = Θ(n) .......... [2]
Now, taking results [1] and [2] into consideration.
f(n) = Θ(g(n)) and g(n) = Θ(f(n))
since they will grow at the same rate for large values of n.
We can say that f(n) = O(g(n)) if there are constants c > 0 and n0 > 0 such that
f(n) <= c*g(n) for all n > n0
This is the case for both directions:
# c == 100
100n + log n <= 100(n + (log n)^2)
= 100n + 100(log n)^2 (n > 1)
and
# c == 1
n + (log n)^2 <= 100n + log n (n > 1)
Taken together, we've proved that n + (log n)^2 <= 100n + log n <= 100(n + (log n)^2), which proves that f(n) = Θ(g(n)), which is to say that neither dominates the other. Both functions are Θ(n).
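A quick script confirming both inequalities over a range of n (illustrative only; the constants c = 1 and c = 100 come from the argument above, and I use natural logs):

    import math

    for n in [2, 10, 10**4, 10**8]:
        f = 100 * n + math.log(n)
        g = n + math.log(n) ** 2
        assert g <= f <= 100 * g, n  # c = 1 one way, c = 100 the other
        print(n, round(g, 2), round(f, 2), round(100 * g, 2))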
g(n) dominates f(n), or equivalently, g(n) is Ω(f(n)), and the same holds vice versa.
Considering the definition, you see that you can drop the factor 100 in the definition of f(n) (since you can multiply by any fixed constant) and you can drop the logarithmic addends, since they are dominated by the linear n.
The above follows from n being Ω(n + log n) and n being Ω(n + log^2 n).
hope that helps,
fricke
Suppose I get f(n) = log(n*log n). Should I say that it's O(log(n*log n))?
Or should I write log(n*log n) = log n + log(log n) and then say that the function f(n) is O(log n)?
First of all, as you have observed:
log(n*log n) = log(n) + log(log(n))
but think about log(log N) as N->large (as Floris suggests).
For example, let N = 1000, then log N = 3 (i.e. a small number) and log(3) is even smaller,
and this holds as N gets huge, i.e., for values far larger than the number of instructions your code could ever execute.
Thus, O(log(n * log n)) = O(log n + log(log n)) = O(log n), since log(log n) grows far more slowly than log n.
Another way to look at this is that n * log n << n^2, so in the worst case:
log(n * log n) < log(n^2) = 2 log(n)
So 2 log(n) is an upper bound, and O(log(n * log n)) = O(log n).
Use the definition. If f(n) = O(log(n*log(n))), then there must exist a positive constant M and real n0 such that:
|f(n)| ≤ M |log(n*log(n))|
for all n > n0.
Now let's assume (without loss of generality) that n0 >= 1. Then
log(n) ≥ log(log(n))
for all n > n0.
From this, we have:
log(n * log(n)) = log(n) + log(log(n)) ≤ 2 * log(n)
Substituting, we find that
|f(n)| ≤ 2 * M * |log(n)| for all n > n0
Since 2*M is also a positive constant, it immediately follows that f(n) = O(log(n)).
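The key inequality log(n * log(n)) ≤ 2 * log(n) is easy to spot-check numerically; a small sketch assuming natural logs and integer n ≥ 2:

    import math

    for n in [2, 10, 10**3, 10**6]:
        lhs = math.log(n * math.log(n))
        rhs = 2 * math.log(n)
        assert lhs <= rhs, n
        print(n, round(lhs, 3), round(rhs, 3))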
Of course in this case simple transformations show that the two functions differ by at most a constant factor asymptotically, as shown.
However, I feel it is worthwhile to recall a classic test for analyzing how two functions relate to each other asymptotically. So here's a slightly more formal proof.
You can check how f(x) relates to g(x) by analyzing lim f(x)/g(x) as x -> infinity.
There are 3 cases:
lim = infinity <=> O(f(x)) > O(g(x))
inf > lim > 0 <=> O(f(x)) = O(g(x))
lim = 0 <=> O(f(x)) < O(g(x))
So
lim ( log(n * log(n)) / log(n) )
= lim ( (log(n) + log(log(n))) / log(n) )
= lim ( 1 + log(log(n)) / log(n) )
= 1 + 0 = 1
Note: I treated lim log(log(n)) / log(n) = 0 as trivial, but you can verify it with l'Hôpital's rule.
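Numerically the ratio creeps toward 1, as the limit predicts; a quick check (natural logs, but the limit is base-independent):

    import math

    for n in [10, 10**3, 10**6, 10**12]:
        print(n, math.log(n * math.log(n)) / math.log(n))  # -> 1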