For the functions n^k and c^n, what is the asymptotic relationship between them? Assume that k ≥ 1 and c ≥ 1 are constants.
(1) n^k is O(c^n)
(2) n^k is Ω(c^n)
(3) n^k is Θ(c^n)
(4) None of the above
My thought: when c = 1, then n^k > c^n for every value of n, and when c > 1, then c^n > n^k.
So my suggested answer is (3): n^k is Θ(c^n).
Are my reasoning and answer correct? I'd appreciate your input.
There are in fact two answers to this question, depending on the value of c:
If c = 1, then c^n = 1^n = 1. n^k for k ≥ 1 will obviously outgrow this, so the answer would be (2).
If c > 1, then the exponential c^n far outgrows any polynomial term, i.e. the answer would be (1).
Note that (3) is never true.
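For intuition, here is a minimal Python sketch (k = 3 and c = 2 are arbitrary choices satisfying k ≥ 1 and c > 1) showing that the polynomial wins at first but the exponential quickly overtakes it:

    # Compare the polynomial n^k against the exponential c^n.
    # k = 3, c = 2 are arbitrary illustrative choices (k >= 1, c > 1).
    k, c = 3, 2
    for n in [1, 5, 10, 20, 30, 40]:
        poly, expo = n ** k, c ** n
        print(f"n={n:2d}  n^k={poly:8d}  c^n={expo:14d}  n^k/c^n={poly / expo:.6f}")
    # The ratio n^k / c^n tends to 0, illustrating that n^k is O(c^n) for c > 1.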
So I've been solving this exercise, which asks to prove by the big-O definition that
2^(2log(n)) = O(n^2)
I realized that 2^(2log(n)) = n^2,
and I found that c = 1 and n0 = 1 work, because n^2 <= 1*n^2 holds from n >= 1 onward.
But why did the teacher choose n0 = 2 in the answer?
Does it matter? Or can it be 1 as well?
Is there any trick to finding c and n0 easily in this kind of question?
As you correctly noticed, the two are exactly equal by the logarithm/power rules, so we can choose c = 1 and an arbitrary n0, because the definition asks for an n0 > 0 such that for all n > n0:
n^2 <= c*n^2 = n^2. And of course this is true for every value of n0.
So it does not matter whether your teacher chose n0 = 2 or you chose n0 = 1; you are both correct by the definition.
According to the definition of Big O:
f(n) = O(g(n)) if there exists a positive integer n0 and a positive constant c, such that f(n) ≤ c*g(n) ∀ n≥n0
From your question, it is unclear what the base of the log function is.
Let f(n) = 2^(2log(n)) and g(n) = n^2.
Let us consider 3 following cases:
Case 1: base = 2
f(n) evaluates to n^2 and therefore it is clear that c=1 and n0=1.
Case 2: base = 10
f(n) = 2^(2log10(n)) ~ n^(0.602)
In this case, we can also say that c=1 and n0=1.
(Plot of x^2 vs. x^0.602 omitted.) For all x > 1, x^2 > x^0.602, since the exponent 2 exceeds 0.602.
Case 3: base = e
f(n) = 2^(2loge(n)) ~ n^(1.3862)
In this case as well, we can say that c=1 and n0=1.
(Plot of x^2 vs. x^1.3862 omitted.) Again, x^2 > x^1.3862 for all x > 1, since 2 > 1.3862.
Therefore, in all the cases, you are correct.
PS: There is a strong possibility that you and your professor are assuming different values for the base of the logarithm. But even in that case, as long as the base is ≥ 2, I don't see anything wrong with choosing n0 = 1.
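For what it's worth, here is a small sketch verifying numerically that 2^(2*log_b(n)) collapses to n^(2*log_b(2)) for each of the three bases considered above:

    import math

    # 2^(2*log_b(n)) = n^(2*log_b(2)); check the exponent for each base.
    n = 1000.0
    for base, name in [(2, "base 2"), (10, "base 10"), (math.e, "base e")]:
        exponent = 2 * math.log(2, base)          # 2 * log_b(2)
        direct = 2 ** (2 * math.log(n, base))     # 2^(2*log_b(n))
        as_power = n ** exponent                  # n^(2*log_b(2))
        print(f"{name}: exponent = {exponent:.4f}, {direct:.2f} vs {as_power:.2f}")

The printed exponents are 2, 0.602, and 1.386, matching the three cases; since each is at most 2, c = 1 and n0 = 1 work throughout.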
I'm wondering whether an algorithm with exponential worst-case time complexity should always have that stated as O(2^n). For example, if I had an algorithm whose operation count triples for each addition to the input size, would I write its time complexity as O(3^n), or would it still be classified as O(2^n)?
Any formal explanation would be greatly appreciated.
3^n != O(2^n).
Assume for contradiction that it were true.
Then there exist constants c and n0 such that 3^n ≤ c * 2^n for all n ≥ n0.
The last requirement is equivalent to (3/2)^n ≤ c for all n ≥ n0.
However, (3/2)^n → ∞ as n → ∞, so (3/2)^n ≤ c cannot be true for all n ≥ n0 for any constant c.
No, O(2^n) and O(3^n) are different. If 3^n were O(2^n), there would be a constant k such that 3^n <= k * 2^n for all large n. There is no such k, because 3^n / 2^n = (3/2)^n, which grows arbitrarily large.
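As a quick numerical illustration of why no such k can exist (a sketch, not a replacement for the proof above):

    # 3^n / 2^n = (3/2)^n grows without bound, so no constant k
    # can satisfy 3^n <= k * 2^n for all large n.
    for n in [1, 10, 50, 100]:
        print(n, (3 / 2) ** n)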
I am learning about complexity theory and have a question asking me to show the truth or falsehood of a number of Big-O statements.
I've done the first few, e.g. showing 2^(n+1) is in O(2^n) by finding a constant and an N value. But now they are asking more abstract things, for example:
If f(n) is O(g(n)), is log f(n) in O(log g(n))?
Is 2^(f(n)) in O(2^(g(n)))?
These both seem like they should be true, but I don't know how to express them formally with a constant and an N value. If you can show me how to prove these, I can go do the rest of the problems.
The comments are both accurate. Here are some notes along the lines you are probably looking for.
Assume f(n) is O(g(n)). Then there exist n0 and c such that f(n) <= c*g(n) for n >= n0. Take the logarithm of both sides: log(f(n)) <= log(c*g(n)). We can use the laws of logarithms to rewrite this as log(f(n)) <= log(c) + log(g(n)). If g(n) >= 2 (so that log(g(n)) >= 1), then log(c) + log(g(n)) <= (1 + log(c))*log(g(n)), so we can choose c' = 1 + log(c) and get the desired result. (Note the claim needs g(n) to be bounded away from 1 eventually: if g(n) = 1 while f(n) > 1, then log(g(n)) = 0 and no choice of c' works.)
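As a concrete check of the c' = 1 + log(c) construction (using f(n) = 5n^2 and g(n) = n^2 as an arbitrary example, so c = 5, with logs base 2):

    import math

    # f(n) = 5*n^2 <= 5*g(n) with g(n) = n^2, so c = 5.
    # The construction gives c' = 1 + log2(c); check log2(f(n)) <= c' * log2(g(n)).
    c = 5
    c_prime = 1 + math.log2(c)
    for n in [2, 10, 100, 1000]:
        lhs = math.log2(5 * n * n)          # log2(f(n))
        rhs = c_prime * math.log2(n * n)    # c' * log2(g(n))
        print(n, round(lhs, 3), round(rhs, 3), lhs <= rhs)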
The second one is not true. Choose f(n) = 2n and g(n) = n. We see f(n) is O(g(n)) by choosing c = 3. However, 2^(2n) = 4^n is not O(2^n). To see that, assume we had n0 and c such that 4^n <= c*2^n for all n >= n0. Dividing by 2^n gives 2^n <= c. But this can't be correct, since n can increase indefinitely whereas c is fixed.
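To make the counterexample concrete, here is a short sketch showing that 2^(f(n)) / 2^(g(n)) = 4^n / 2^n = 2^n is unbounded:

    # f(n) = 2n is O(n), yet 2^f(n) / 2^g(n) = 4^n / 2^n = 2^n is unbounded,
    # so 2^f(n) is not O(2^g(n)).
    for n in [1, 10, 20, 40]:
        print(n, 4 ** n // 2 ** n)   # equals 2^n, grows without bound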
Is 2^n = Θ(4^n)?
I'm pretty sure that 2^n is not in Ω(4^n), and thus not in Θ(4^n), but my university tutor says it is. This confused me a lot, and I couldn't find a clear answer via Google.
2^n is NOT big-theta (Θ) of 4^n; this is because 2^n is NOT big-omega (Ω) of 4^n.
By definition, we have f(x) = Θ(g(x)) if and only if f(x) = O(g(x)) and f(x) = Ω(g(x)).
Claim
2^n is not Ω(4^n)
Proof
Suppose 2^n = Ω(4^n). Then, by the definition of big-omega, there exist constants c > 0 and n0 such that:
2^n ≥ c * 4^n for all n ≥ n0
By rearranging the inequality, we have:
(1/2)^n ≥ c for all n ≥ n0
But notice that as n → ∞, the left-hand side of the inequality tends to 0, whereas the right-hand side equals c > 0. Hence the inequality cannot hold for all n ≥ n0, and we have a contradiction. Therefore our initial assumption must be wrong, and 2^n is not Ω(4^n).
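To see the contradiction concretely, for any proposed constant c you can compute an explicit n where the inequality fails (a sketch; the c values are arbitrary):

    import math

    # The inequality (1/2)^n >= c fails as soon as n > log2(1/c).
    for c in [0.5, 0.01, 1e-9]:
        n_fail = math.floor(math.log2(1 / c)) + 1
        print(f"c = {c}: fails at n = {n_fail}, since (1/2)^n = {0.5 ** n_fail:.3g} < {c}")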
Update
As mentioned by Ordous, your tutor may be referring to the complexity class EXPTIME; in that frame of reference, both 2^n and 4^n are in the same class. Also note that 2^n = 4^(Θ(n)), which may also be what your tutor meant.
Yes: one way to see this is to notice 4^n = 2^(2n). So 2^n is the same complexity as 4^n (exponential) because n and 2n are the same complexity (linear).
In conclusion, the bases don't affect the complexity here; it only matters that the exponents are of the same complexity.
Edit: this answer only shows that 4^n and 2^n are of the same complexity class, not that 2^n is big-Theta of 4^n. You're correct that the latter is not the case: there is no constant k such that k*2^n >= 4^n for all n. At some point, 4^n will overtake k*2^n. (Acknowledgements to @chiwangc / @Ordous for highlighting the distinction in their answer/comment.)
Yes, in the loose sense that both have exponential complexity (though, as shown above, 2^n is not Θ(4^n)).
Note that theta is impossible when big-omega is not satisfied: by definition, f = Θ(g) requires both f = O(g) and f = Ω(g). Since 2^n is not Ω(4^n), 2^n ≠ Θ(4^n), and the same argument shows 2^n ≠ Θ(3^n).
I came across two asymptotic function proofs.
1. f(n) = O(g(n)) implies 2^f(n) = O(2^g(n))
Given: f(n) ≤ C1 g(n)
So, 2^f(n) ≤ 2^C1 g(n) --(i)
Now, 2^f(n) = O(2^g(n)) → 2^f(n) ≤ C2 2^g(n) --(ii)
From (i), we find that (ii) will be true.
Hence 2^f(n) = O(2^g(n)) is TRUE.
Can you tell me if this proof is right? Is there any other way to solve this?
2. f(n) = O((f(n))^2)
How do I prove the second example? Here I considered two cases: one where f(n) < 1 and the other where f(n) > 1.
Note: None of them are homework questions.
The attempted proof for example 1 is well-intentioned but flawed. First, “2^f(n) ≤ 2^C1 g(n)” reads as 2^f(n) ≤ (2^C1)*g(n), which in general is false; it should have been written 2^f(n) ≤ 2^(C1*g(n)). Second, the claim that “(ii) will be true” is asserted without justification, and in fact no constant C2 makes it work: 2^(C1*g(n)) = (2^g(n))^C1, which outgrows C2*2^g(n) whenever C1 > 1 and g(n) → ∞. Indeed, the claim itself is false in general: f(n) = 2n is O(n), but 2^f(n) = 4^n is not O(2^n).
A function like f(n) = 1/n disproves the claim in example 2 because there are no constants N and C such that for all n > N, f(n) < C*(f(n))². Proof: Let some N and C be given. Choose n>N, n>C. f(n) = 1/n = n*(1/n²) > C*(1/n²) = C*(f(n))². Because N and C were arbitrarily chosen, this shows that there are no fixed values of N and C such that for all n > N, f(n) < C*(f(n))², QED.
Saying that “f(n) ≥ 1” is not enough to prove the second claim, but if you require “f(n) ≥ 1 for all n” (i.e. f() ≥ 1 everywhere), it becomes provable. For example, if f(n) = 1/n for odd n and 1+n for even n, then f(n) > 1 for even n > 0 but f(n) < 1 for odd n. To prove that f(n) = O((f(n))²) is false for this f, use the same proof as in the previous paragraph with the additional provision that n is odd.
Actually, “f(n) ≥ 1 for all n” is stronger than necessary to ensure f(n) = O((f(n))²). Let ε be any fixed positive value. No matter how small ε is, “f(n) ≥ ε for all n > N'” ensures f(n) = O((f(n))²). To prove this, take C = max(1, 1/ε) and N = N'; then C*f(n) ≥ 1, so f(n) ≤ C*(f(n))².
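A tiny sketch of the f(n) = 1/n counterexample: whatever constant C is proposed, f(n) exceeds C*(f(n))² once n > C.

    # For f(n) = 1/n: C*(f(n))^2 = C/n^2, and f(n) > C*(f(n))^2 once n > C.
    C = 100   # arbitrary candidate constant
    for n in [50, 100, 101, 1000]:
        f = 1 / n
        print(n, f > C * f * f)   # True for every n > C, so no C can work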