What is the order of f = (log n)/(log(log n))?
Is f = O(log n)? Why is that?
And what is the order of h = (log n) * (log(log n))?
Is h = O(log n) as well? And why is that correct?
Is f = O(log n)? Yes.
Is h = O(log n)? No.
Proof:
Use the formal definition:
f(n) = O(g(n)) means there are positive constants c and n0, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0. The values of c and n0 must be fixed for the function f and must not depend on n.
f = O(log n) <=> (log n)/(log(log n)) = O(log n)
So, you need to find c and n0 such that 0 ≤ (log n)/(log(log n)) ≤ c*log n for all n ≥ n0. Suppose the logarithm base is b (it doesn't actually matter, but you can take b in {2, e, 10}). If you choose c = 1 and n0 = b^(b^2), then 0 ≤ (log n)/(log(log n)) ≤ log n for all n ≥ b^(b^2).
The first part is true because log n ≥ log(b^(b^2)) = b^2 ≥ 0 and log(log n) ≥ log(log(b^(b^2))) = log(b^2) = 2 > 0.
The second part is also true because, after dividing both sides by log n, it becomes log(log n) ≥ 1, and indeed log(log n) ≥ log(b^2) = 2 ≥ 1.
For h, similarly to the first proof, you need to show that you cannot choose c and n0 so that (log n)*(log(log n)) ≤ c*log n holds for all n ≥ n0. For large n, dividing by log n (which is positive, not zero) reduces this to log(log n) ≤ c, and no fixed c can work because the inequality fails for every n > b^(b^c).
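For a quick empirical sanity check of both claims (not a proof, just an illustration; base-2 logs assumed), the following Python sketch tabulates f(n) = log n / log(log n) and h(n) = log n * log(log n) against log n. The ratio f/log n keeps shrinking, while h/log n keeps growing, so no constant c can cap h by c*log n:

import math

# Illustration only: f(n)/log n shrinks, while h(n)/log n grows without bound.
for exp in (10, 20, 40, 80, 160):
    n = 2 ** exp
    log_n = math.log2(n)
    loglog_n = math.log2(log_n)
    f = log_n / loglog_n
    h = log_n * loglog_n
    print(f"n = 2**{exp:<3}  f/log n = {f / log_n:6.3f}  h/log n = {h / log_n:6.3f}")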
Given an algorithm with time complexity O(N+M) where M < N:
Can we conclude O(N+M) => O(N+N) => O(2N) => O(N)?
Would that be correct?
f(N, M) = O(N + M) means, by definition,
∃ c, N0, M0: ∀ N ≥ N0, M ≥ M0: f(N, M) ≤ c (N + M)
But by your hypothesis, M < N, so that
∃ c, N0, M0: ∀ N ≥ N0, M ≥ M0: f(N, M) ≤ c (N + M) < c·2N
and therefore, taking c' = 2c,
∃ c', N0, M0: ∀ N ≥ N0, M ≥ M0: f(N, M) ≤ c'·N.
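To see the same argument numerically, here is a small sketch (the name work is just a placeholder, not from the question): work(N, M) stands in for any procedure doing about N + M basic steps, and whenever M < N the step count stays within 2N, matching the c' = 2c argument:

# Placeholder for any procedure doing roughly N + M basic steps.
def work(N: int, M: int) -> int:
    steps = 0
    for _ in range(N):   # the O(N) part
        steps += 1
    for _ in range(M):   # the O(M) part
        steps += 1
    return steps

# When M < N, the total N + M is bounded by 2N.
for N, M in [(10, 3), (1000, 999), (10**6, 1)]:
    total = work(N, M)
    assert total <= 2 * N
    print(f"N={N}, M={M}: steps={total} <= 2N={2*N}")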
Show that n^2 is not O(n)
f(n)=n^2
g(n) = n
c = 1
n_0=2
n^2 <= 1*n for all n >= n_0 = 2
At n = 2 this says 4 <= 2.
4 is not less than or equal to 2. Therefore, n^2 is not O(n).
I know I actually need to show that NO c works, but c = 2 with n = 2 does work. So how is n^2 not O(n)?
Let's assume that n² is in O(n).
Then there must be a c and a n₀ such that for all n ≥ n₀, n² ≤ c*n (by the definition of O notation).
Let k = max(c, n₀) + 1. By the above property we have k² ≤ c*k (since k > n₀), from which it follows that k ≤ c.
However, k > c by construction. That's a contradiction.
Therefore our assumption is false and n² cannot be in O(n).
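The contradiction can also be made concrete: for any candidate pair (c, n0), the witness n = max(c, n0) + 1 already violates n^2 ≤ c*n. A small Python sketch (the candidate pairs are arbitrary examples):

# For any proposed (c, n0), n = max(c, n0) + 1 gives n^2 > c*n, so no pair works.
def counterexample(c, n0):
    n = int(max(c, n0)) + 1
    assert n * n > c * n   # n > c implies n^2 > c*n
    return n

for c, n0 in [(1, 2), (2, 2), (100, 10), (10**6, 1)]:
    n = counterexample(c, n0)
    print(f"c={c}, n0={n0}: at n={n}, n^2={n*n} > c*n={c*n}")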
I am doing an introductory course on algorithms. I've come across this problem which I'm unsure about.
I would like to know which of the two is dominant:
f(n): 100n + log n or g(n): n + (log n)^2
Given the definitions of each of:
Ω, Θ, O
I assumed f(n) is dominant, so f(n) = Ω(g(n)).
My reasoning is that n dominates (log n)^2. Is that true?
In this case,
lim (n → ∞) f(n)/g(n) = 100.
If you go over calculus definitions, this means that, for any ε > 0, there exists some m for which
100 (1 - ε) g(n) ≤ f(n) ≤ 100 (1 + ε) g(n)
for any n > m.
From the definition of Θ, you can infer that these two functions are Θ of each other.
In general, if
lim (n → ∞) f(n)/g(n) = c exists, and
0 < c < ∞,
then the two functions have the same order of growth (they are Θ of each other).
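For the two functions in the question, a quick numerical check of that ratio (base-2 logs assumed; the base only changes constants) shows it settling near 100, consistent with f = Θ(g):

import math

f = lambda n: 100 * n + math.log2(n)
g = lambda n: n + math.log2(n) ** 2

# The ratio approaches 100 as n grows.
for exp in (10, 20, 40, 60):
    n = 2 ** exp
    print(f"n = 2**{exp:<2}  f(n)/g(n) = {f(n) / g(n):.4f}")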
n dominates both log(n) and (log n)^2
A little explanation
f(n) = 100n + log n
Here n dominates log n for large values of n.
So f(n) = Θ(n) .......... [1]
g(n) = n + (log n)^2
Now, (log n)^2 dominates log n.
But n still dominates (log n)^2.
So g(n) = Θ(n) .......... [2]
Now, taking results [1] and [2] into consideration.
f(n) = Θ(g(n)) and g(n) = Θ(f(n))
since they will grow at the same rate for large values of n.
We can say that f(n) = O(g(n)) if there are constants c > 0 and n0 > 0 such that
f(n) <= c*g(n) for all n > n0
This is the case for both directions:
# c == 100
100n + log n <= 100n + 100(log n)^2
             = 100(n + (log n)^2)      (n >= 2)
and
# c == 1
n + (log n)^2 <= 100n + log n (n > 1)
Taken together, we've proved that n + (log n)^2 <= 100n + log n <= 100(n + (log n)^2), which proves that f(n) = Θ(g(n)), which is to say that neither dominates the other. Both functions are Θ(n).
g(n) is Ω(f(n)), and the same holds vice versa: f(n) is Ω(g(n)). So neither strictly dominates the other.
Looking at the definition, you can drop the factor 100 in f(n) (since you may multiply by any fixed constant), and you can drop both logarithmic addends, since they are dominated by the linear term n.
The above follows from n = Ω(n + log n) and n = Ω(n + (log n)^2).
Suppose I get f(n) = log(n*log n). Should I say that it is O(log(n*log n))?
Or should I write log(n*log n) = log n + log(log n) and then say that f(n) is O(log n)?
First of all, as you have observed:
log(n*log n) = log(n) + log(log(n))
but think about log(log N) as N->large (as Floris suggests).
For example, let N = 1000 (base-10 log); then log N = 3, a small number, and log(log N) = log(3) ≈ 0.48 is even smaller.
This holds as N gets huge, i.e. far beyond the number of instructions your code could ever execute.
Thus, O(log(n * log n)) = O(log n + log(log n)) = O(log n), since the log(log n) term is dominated by log n.
Another way to look at this: n * log n <= n^2 (for n >= 1), so in the worst case
log(n * log n) <= log(n^2) = 2*log n.
So 2*log n is an upper bound, and O(log(n * log n)) = O(log n).
Use the definition. If f(n) = O(log(n*log(n))), then there must exist a positive constant M and real n0 such that:
|f(n)| ≤ M |log(n*log(n))|
for all n > n0.
Now let's assume (without loss of generality) that n0 ≥ 2. Then
log(n) ≥ log(log(n))
for all n > n0.
From this, we have:
log(n*log(n)) = log(n) + log(log(n)) ≤ 2 * log(n)
Substituting, we find that
|f(n)| ≤ 2*M*|log(n)| for all n > n0
Since 2*M is also a positive constant, it immediately follows that f(n) = O(log(n)).
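If you want a numerical feel for the bound log(n*log n) ≤ 2*log(n) used above (base 2 assumed), here is a quick sketch:

import math

# log(n * log n) sits between log n and 2*log n, so it is Theta(log n).
for exp in (4, 10, 20, 40):
    n = 2 ** exp
    val = math.log2(n * math.log2(n))
    print(f"n = 2**{exp:<2}  log(n*log n) = {val:7.3f}  "
          f"log n = {math.log2(n):5.1f}  2*log n = {2 * math.log2(n):5.1f}")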
Of course in this case simple transformations show both functions differ by a constant factor asymptotically, as shown.
However, I feel it is worthwhile to recall a classic test for analyzing how two functions relate to each other asymptotically. So here's a slightly more formal proof.
You can check how f(x) relates to g(x) by analyzing lim f(x)/g(x) as x -> infinity.
There are 3 cases:
lim = infinity <=> O(f(x)) > O(g(x))
inf > lim > 0 <=> O(f(x)) = O(g(x))
lim = 0 <=> O(f(x)) < O(g(x))
So
lim ( log(n * log n) / log n )
= lim ( (log n + log log n) / log n )
= lim ( 1 + (log log n / log n) )
= 1 + 0 = 1
Note: I took lim (log log n / log n) = 0 as given, but you can derive it with l'Hôpital's rule.
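For completeness, here is that inner limit worked out with l'Hôpital's rule, treating n as a real variable and using natural logs (any other base only changes a constant factor):

lim ( ln(ln n) / ln n )
= lim ( (1/(n ln n)) / (1/n) )     [l'Hôpital: differentiate numerator and denominator]
= lim ( 1 / ln n )
= 0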
A question in one of my past exams is a multi-choice question:
Choose the FALSE statement: 7(log n) + 5n + n(log log n) + 3n(ln n) is
A. O(n^2)
B. Ω(n^2)
C. O(n log^2 n)
D. Ω(n)
E. Θ(n log n)
First I concluded that the running time of the algorithm had to be Θ(n log n), which ruled out option E. I then concluded that option B, Ω(n^2), was false, because I knew that Θ(n log n) was smaller than Θ(n^2), therefore Ω(n^2) could not be true. So I thought B would be the answer.
But I also thought that C couldn't be true either, since Θ(n log n) is a larger running time than Θ(n log^2 n). But it couldn't possibly be that 2 of the answers are correct.
So which is correct: is it B? or is it C? or neither? I'm so confused. :S
The false statement is Ω(n^2).
The expression is exactly Θ(n log n), since 3n(ln n) is the dominant ("highest") term and it is Θ(n log n).
Ω(n^2) says the growth is at least n^2, which is false here.
In addition, in your example the following holds for large n:
7(log n) < 5n < n(log log n) < 3n(ln n)
7 log n = Θ(log n), 5n = Θ(n), n(log log n) = Θ(n log log n), 3n ln(n) = Θ(n log n)
As said, n log n is the highest of these, thus:
7(log n) + 5n + n(log log n) + 3n(ln n) = Θ(n log n)
and all of them are o(n^2) (small o here on purpose), so Ω(n^2) is the false statement.
EDIT: clarifying and adding what I wrote in the comments:
Option C is true because O(n log n) < O(n log^2(n)). Mathematically:
n log^2(n) = n * log(n) * log(n) > n * log(n) for every n > 2.
Example: for n = 1,000,000: n log n ≈ 1,000,000 * 20 = 20,000,000,
while n * log^2(n) ≈ 1,000,000 * 20 * 20 = 400,000,000.
Assuming that log is the base 2 logarithm and ln is the natural logarithm, the following statement is true:
Θ(log(n)) < Θ(n) < Θ(n·log(log(n))) < Θ(n·ln(n))
So the overall complexity is Θ(n·ln(n)).
Now let’s check the statements:
n·ln(n) ∈ O(n²) is true:
n·ln(n) ≤ c·n² for c = 1 and all n ≥ 1, since ln(n) ≤ n.
n·ln(n) ∈ Ω(n²) is false:
it would require n·ln(n) ≥ c·n², i.e. ln(n) ≥ c·n, for some fixed c > 0 and all large n, but ln(n) < c·n for every fixed c > 0 once n is large enough.
n·ln(n) ∈ O(n·(log(n))²) is true:
n·(log(n))² = n·(ln(n)/ln(2))² = n·(ln(n))²/(ln(2))², and n·ln(n) ≤ c·n·(ln(n))²/(ln(2))² for c = (ln(2))² and all n ≥ e.
n·ln(n) ∈ Ω(n) is true:
n·ln(n) ≥ c·n for c = 1 and all n ≥ e.
n·ln(n) ∈ Θ(n·log(n)) is true:
n·log(n) = n·ln(n)/ln(2), so n·ln(n) = c·n·log(n) for c = ln(2) and all n ≥ 1.
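As a final sanity check, here is a small Python sketch (the helper T is just a name I'm giving the expression from the question; log taken as base 2, ln as natural log) that evaluates the expression against each candidate bound at a large n. The ratios illustrate which bounds hold, which are loose, and which fail:

import math

def T(n):  # the expression from the question
    return 7 * math.log2(n) + 5 * n + n * math.log2(math.log2(n)) + 3 * n * math.log(n)

n = 2 ** 30
print("T(n)/n^2         =", T(n) / n ** 2)                   # tiny: O(n^2) holds, Omega(n^2) fails
print("T(n)/(n*log^2 n) =", T(n) / (n * math.log2(n) ** 2))  # well below 1: O(n log^2 n) holds
print("T(n)/n           =", T(n) / n)                        # grows with n: Omega(n) holds
print("T(n)/(n*log n)   =", T(n) / (n * math.log2(n)))       # approaches a constant (3*ln 2): Theta(n log n)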