I have to find whether the following is true or false:
If f(n) ∈ ω(g(n)), then 2^f(n) ∈ ω(2^g(n))
I tried f(n) = 1/n and g(n) = 1/n² and got the answer false.
It should instead be:
If f(n) ∈ ω(g(n)), then 2^f(n) ∈ Θ(2^g(n))
Could someone please verify this?
Statement: f(n) ≥ k⋅g(n) for every k > 0 (and all sufficiently large n) ⇒ 2^f(n) ≥ k⋅2^g(n) for every k > 0 (and all sufficiently large n).
Your counterexample is correct: for every fixed k > 0, 1/n ≥ k/n² holds for all sufficiently large n. We can show this by taking the limit:
lim_{n→∞} (1/n)/(k/n²) = (1/k)⋅lim_{n→∞} n²/n = ∞
However, 2^(1/n) ≥ k⋅2^(1/n²) fails for any k > 1. We can also show this by taking the limit:
lim_{n→∞} 2^(1/n) / (k⋅2^(1/n²))
= (1/k)⋅lim_{n→∞} 2^(1/n − 1/n²)
= (1/k)⋅lim_{n→∞} 2^((n−1)/n²)
= (1/k)⋅2⁰
= 1/k
The statement would only have been true if this limit were infinite for every k.
A single counterexample is enough to prove that a statement is false, so you're done.
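The two limits above can be made concrete with a quick numeric check (a Python sketch; the function names are mine):

```python
def ratio_fg(n, k):
    # (1/n) / (k/n^2) = n/k: diverges, so 1/n is in omega(1/n^2)
    return (1 / n) / (k / n**2)

def ratio_exp(n, k):
    # 2^(1/n) / (k * 2^(1/n^2)) -> 1/k: a finite limit, so no omega
    return 2 ** (1 / n) / (k * 2 ** (1 / n**2))

for n in (10, 1000, 100000):
    print(n, ratio_fg(n, k=5), round(ratio_exp(n, k=5), 6))
```

The first ratio keeps growing with n, while the second flattens out near 1/k, mirroring the two limits computed above.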
Related
Given two functions f = Ω(log n) and g = O(n), consider the following statements. For
each statement, write whether it is true or false. For each false statement, write two
functions f and g that show a counter-example.
1) g(n) = O(f (n))
2) f (n) = O(g(n))
3) f (n) = Ω(log (g(n)))
4) f (n) = Θ(log (g(n)))
5) f (n) + g(n) = Ω(log n)
I know that Big O means "grows no faster than" some function, and Big Omega means "grows no slower than" some function. But I don't know how that makes the above statements true or false.
1) False. A counterexample is g(n) = n ∈ O(n) and f(n) = log(n) ∈ Ω(log(n)). Both assumptions are satisfied, but g(n) is not in O(f(n)).
2) False. A counterexample is g(n) = log(n) ∈ O(n) and f(n) = n ∈ Ω(log(n)), but f(n) is not in O(g(n)).
3) True. Since f(n) ∈ Ω(log(n)), we have lim_{n→∞} f(n)/log(n) > 0. Since g(n) ∈ O(n), we have log(g(n)) ≤ log(c⋅n) = log(c) + log(n) for large n, so lim_{n→∞} f(n)/log(g(n)) > 0 as well. Hence f(n) ∈ Ω(log(g(n))).
4) False. A counterexample is f(n) = n ∈ Ω(log(n)) and g(n) = log(n) ∈ O(n), but f(n) is not in Θ(log(g(n))) = Θ(log(log(n))).
5) True. Since f(n) ∈ Ω(log(n)), lim_{n→∞} f(n)/log(n) > 0. Since g(n) ≥ 0, we have (f(n) + g(n))/log(n) ≥ f(n)/log(n), so lim_{n→∞} (f(n) + g(n))/log(n) > 0, which means f(n) + g(n) ∈ Ω(log(n)).
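These counterexamples can be sanity-checked numerically (a Python sketch; the helper name is mine): if one function is in O of another, their ratio must stay bounded as n grows.

```python
import math

def ratio_samples(top, bottom, points=(10, 10**3, 10**6, 10**9)):
    # Sample the ratio top(n)/bottom(n); an increasing, unbounded-looking
    # sequence suggests top is NOT in O(bottom).
    return [top(n) / bottom(n) for n in points]

# Counterexamples for 1) and 2): the ratio n / log(n) keeps growing,
# so n is not in O(log n).
r = ratio_samples(lambda n: n, math.log)
assert all(a < b for a, b in zip(r, r[1:]))

# Counterexample for 4): f(n) = n vs log(log(n)) -- this ratio also
# diverges, so f is not in Theta(log(g(n))) when g(n) = log(n).
r4 = ratio_samples(lambda n: n, lambda n: math.log(math.log(n)))
assert all(a < b for a, b in zip(r4, r4[1:]))
```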
I was reading an article and came across the following :
Informally, O(g(n)) can be defined as the set of mathematical functions that contains all functions that don’t “grow faster” than g(n). Thus, all functions below are in the set O(n²):
f(n) = n² + 3n + 2, f(n) = n log(n), f(n) = 3n + 1.
Can anyone please tell me how f(n) = n² + 3n + 2 grows faster than g(n)?
Can anyone please tell me how f(n) = n² + 3n + 2 grows faster than g(n)?
Here is one way to understand it (a bit informal, but I find it more intuitive).
Let L be the limit of f(n)/g(n) as n goes to infinity.
If L is infinite, then f(n) grows faster than g(n) (the numerator overwhelms the denominator).
If L is 0, then f(n) grows slower than g(n) (the denominator overwhelms the numerator).
If L is a finite positive number, then they have the same (comparable) growth rate.
We can define O(g(n)) as the following set:
O(g(n)) = { f(n) ∶ ∃ c > 0 and n0 > 0 | 0 ≤ f(n) ≤ c ⋅ g(n), ∀n ≥ n0 }
This means O(g(n)) is the set of all functions f(n) that are eventually bounded above by c⋅g(n) for some constant c > 0 and all n ≥ n0. To find n0 and c we can use a justification like the following:
n²+3n + 2 ≤ n² + 3n² + 2n²
n²+3n + 2 ≤ 6n² for c = 6 and n ≥ 1
Now if you compare f(n) = n² + 3n + 2 with g(n) = n² directly, f(n) is of course larger than g(n) for every n; but by choosing the constant c appropriately, c⋅g(n) bounds f(n) from above for all n ≥ n0. Big-O only cares about growth up to such a constant factor.
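The justification above can be checked directly in code (a minimal Python sketch using the constants found above, c = 6 and n0 = 1):

```python
def f(n):
    return n**2 + 3*n + 2

def c_times_g(n, c=6):
    # c * g(n) with g(n) = n^2 and the constant c = 6 from the derivation
    return c * n**2

# 0 <= f(n) <= 6 n^2 holds for every n >= n0 = 1
assert all(0 <= f(n) <= c_times_g(n) for n in range(1, 10001))

# With c = 1 the bound fails at n = 1, which is why the constant matters:
assert f(1) > 1 * 1**2
```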
I don't really understand what I'm supposed to prove or how to prove it. I've researched each notation, but it's still unclear to me what is expected.
Which of the following statements are true? Prove your answers.
n² ∈ O(n³)
n² ∈ Ω(n³)
2ⁿ ∈ Θ(2ⁿ⁺¹)
n! ∈ Θ((n+1)!)
Any help would be much appreciated!
Since this (probably homework) question is a few days old, I think I can answer it briefly.
The wikipedia page (and hopefully your textbook and/or notes too) says
f(n) ∈ O(g(n)) ⇔ lim sup |f(n)/g(n)| < ∞
f(n) ∈ Ω(g(n)) ⇔ lim sup |f(n)/g(n)| > 0
f(n) ∈ Θ(g(n)) ⇔ f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
To prove the left side you can prove the right side.
n² ∈ O(n³) is true, due to
lim sup |n²/n³| = lim (n²/n³) = lim (1/n) = 0 < ∞
n² ∈ Ω(n³) is false, due to
lim sup |n²/n³| = lim (n²/n³) = lim (1/n) = 0, which is not greater than 0
2ⁿ ∈ Θ(2ⁿ⁺¹) is true, due to
0 < lim sup |2ⁿ/2ⁿ⁺¹| = lim (2ⁿ/(2⋅2ⁿ)) = lim (1/2) = 1/2 < ∞
n! ∈ Θ((n+1)!) is false, due to
lim sup |n!/(n+1)!| = lim (n!/((n+1)⋅n!)) = lim (1/(n+1)) = 0
Notice: all limits are taken as n → ∞.
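The four limits can also be checked numerically (a Python sketch; the sampling approach is mine):

```python
import math

# 1) and 2): n^2/n^3 = 1/n -> 0; finite, so n^2 in O(n^3), but not > 0,
# so n^2 is not in Omega(n^3)
assert 10**2 / 10**3 > 10**6 / 10**9 > 0      # ratio shrinks toward 0

# 3): 2^n / 2^(n+1) is exactly 1/2 for every n, so the Theta bound holds
assert all(2.0**n / 2.0**(n + 1) == 0.5 for n in range(1, 60))

# 4): n!/(n+1)! = 1/(n+1) -> 0, so n! is not in Theta((n+1)!)
ratios = [math.factorial(n) / math.factorial(n + 1) for n in (5, 50, 500)]
assert ratios[0] > ratios[1] > ratios[2]
```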
I have these three questions for an exam review:
If f(n) = 2n - 3 give two different functions g(n) and h(n) (so g(n) doesn't equal h(n)) such that f(n) = O(g(n)) and f(n) = O(h(n))
Now do the same again with functions g'(n) and h'(n), but this time the function should be of the form
g'(n) = Θ(f(n)) and f(n) = o(h'(n))
Is it possible for a function f(n) = O(g(n)) and f(n) = Ω(g(n))?
I know that a function is O of another if it is eventually bounded above by a constant multiple of the other function. So I think for 1. I could have g(n) = 2n²-3 and h(n) = 2n²-10.
I also know that a function is Θ of another if it is basically equal to the other function (we can ignore constant factors), and o of another if it grows strictly slower, so for 2. I think you could have g'(n) = 2n-15 and h'(n) = 2n.
To 3.: It is possible for a function to be both O(g(n)) and Ω(g(n)), because both allow the function to match the given function exactly, so you could have a g(n) that equals f(n), which satisfies the rules for being both O and Ω.
Can someone please tell me if this is correct?
Your answers are mostly right. But I would like to add some points:
Given is f(n) = 2n - 3
With g(n) = 2n²-3 and h(n) = 2n²-10, f(n) is in O(g(n)) and in O(h(n)). But your g(n) and h(n) are basically the same; at least they are both in Θ(n²). Many other functions would also work, e.g.:
f(n) ∈ O(n) ⇒ g(n) = n
f(n) ∈ O(nᵏ) ⇒ g(n) = nᵏ ∀ k ≥ 1
f(n) ∈ O(2ⁿ) ⇒ g(n) = 2ⁿ
g'(n) = 2n-15 reduces to g'(n) = n if we think in terms of complexity classes, and this is right. In fact, up to Θ-equivalence it is the only possible answer.
But f(n) ∈ o(h'(n)) does not hold for h'(n) = 2n. Little-o means that
lim_{n→∞} |f(n)/g(n)| = 0 ⇔ f(n) ∈ o(g(n))
So you can choose h'(n) = n², or more generally h'(n) = nᵏ ∀ k > 1, or h'(n) = cⁿ for a constant c > 1.
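The little-o requirement can be seen numerically (a Python sketch; the variable names are mine): with h'(n) = 2n the ratio f(n)/h'(n) tends to 1, not 0, while h'(n) = n² drives it to 0.

```python
def f(n):
    return 2*n - 3

# f(n)/(2n) approaches 1, so f is NOT in o(2n)
linear = [f(n) / (2*n) for n in (10, 10**3, 10**6)]

# f(n)/n^2 approaches 0, so f IS in o(n^2)
quadratic = [f(n) / n**2 for n in (10, 10**3, 10**6)]

assert abs(linear[-1] - 1) < 1e-5      # tends to a nonzero constant
assert quadratic[-1] < 1e-4            # tends to 0
```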
Yes it is possible and you can take it also as a definition for Θ(g(n)):
f(n) ∈ Θ(g(n)) ⇔ f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
Can the master theorem be applied?
Or say for T(n) = 2T(n/16) + n log n, how is the master theorem applied here?
I get a = 2, b = 16, and I am not sure about c and k.
To solve such a recurrence relation T(n) = a⋅T(n/b) + f(n), you first calculate e = log_b(a).
Then (for some ε > 0):
f(n) ∈ O(n^(e−ε)) ⇒ T(n) ∈ Θ(n^e)
f(n) ∈ Θ(n^e) ⇒ T(n) ∈ Θ(n^e⋅log(n))
f(n) ∈ Ω(n^(e+ε)) ⇒ T(n) ∈ Θ(f(n)), provided the regularity condition a⋅f(n/b) ≤ c⋅f(n) holds for some c < 1 and all large n
For more details see the Master theorem.
So in your case: a = 2, b = 16 ⇒ e = log_16(2) = 0.25. Since f(n) = n log n ∈ Ω(n^(0.25+ε)), case 3 applies; the regularity condition holds as well, because 2⋅(n/16)⋅log(n/16) ≤ (1/8)⋅n log n for large n.
So T(n) is in Θ(n log n).
Even if the log(n) factor were not there, the reduction in work per subproblem at each level would dominate (b is much larger than a, so e = log_b(a) < 1). Hence, in my opinion, the complexity is dictated by the work done at the top level, namely O(n log n).
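To see the Θ(n log n) claim empirically, one can unroll the recurrence directly (a Python sketch; the base case T(n) = 1 for n < 16 is my assumption) and watch the ratio T(n)/(n log n) settle near a constant:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2 T(n/16) + n log n, with an assumed base case T(n) = 1 for n < 16
    if n < 16:
        return 1.0
    return 2 * T(n // 16) + n * math.log2(n)

def ratio(n):
    # If T(n) is Theta(n log n), this ratio stays bounded between constants.
    return T(n) / (n * math.log2(n))

for k in (3, 4, 5, 6):
    print(16**k, round(ratio(16**k), 4))
```

The printed ratios stay close to a fixed constant as n grows, consistent with case 3 of the master theorem.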