I'm currently having a problem with big O notation. I have the following question which I am trying to figure out.
I have the statement that T(n) is O(f(n)), and I must use the definition of big O to prove directly that 3n^2 + 11n + 6 is O(n^2).
I was wondering if anybody could help me figure out this problem, as I am having trouble working it out.
I think this may help:
We need a constant, let's name it "c", and a threshold k such that 3n^2 + 11n + 6 ≤ c·n^2 for all n ≥ k.
Let's say we pick k = 1.
We know that for n ≥ 1, n^2 ≥ n ≥ 1.
So:
3n^2 + 11n + 6 ≤ 3n^2 + 11n^2 + 6n^2 = 20n^2
Now, let c = 20.
=> the complexity is O(n^2).
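A quick numeric sanity check of the inequality above; this is a minimal Python sketch, not part of the proof, and the constant 20 is the one derived above:

    # Spot-check that 3n^2 + 11n + 6 <= 20*n^2 holds for n >= 1.
    # The algebra above proves it for every n >= 1; this only samples a range.
    def T(n):
        return 3 * n**2 + 11 * n + 6

    assert all(T(n) <= 20 * n**2 for n in range(1, 10_000))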
I am trying to learn how to prove Big O correctly.
What I am trying to do is find some c and n0 for a given function.
the definition given for Big-O is
Let f(n) and g(n) be functions mapping nonnegative integers to real numbers.
We say that f(n) is O(g(n)) if there is a real constant c > 0 and an integer
constant n0 ≥ 1 such that for all n ≥ n0, f(n) ≤ c g(n).
Given the polynomial (n+1)^5, I need to show that it has a runtime of O(n^5).
My question is: how do I find such c and n0 from the definition above, and how do I continue my algebra to show that it is O(n^5)?
So far, by trying induction, I have
(n+1)^5 = n^5 + 5n^4 + 10n^3 + 10n^2 + 5n + 1
and bounding each term by the highest power gives
n^5 + 5n^4 + 10n^3 + 10n^2 + 5n + 1 <= n^5 + 5n^5 + 10n^5 + 10n^5 + 5n^5 + n^5
(n+1)^5 <= 32n^5
You want a constant c such that (n+1)^5 ≤ c·n^5. For that, you do not need induction, only a bit of algebra, and it turns out you actually already found such a c, but missed the n0 in the process. So let's start from the beginning.
Note that c does not need to be tight, it can be way bigger than necessary and will still prove time-complexity. We will use that to our advantage.
We can first develop the left side as you did.
(n+1)^5 = n^5 + 5n^4 + 10n^3 + 10n^2 + 5n + 1
For n ≥ 1, we have that n, n^2, n^3, n^4 ≤ n^5, and thus
(n+1)^5 ≤ (1 + 5 + 10 + 10 + 5 + 1)·n^5 = 32n^5
And there you have a c such that (n+1)^5 ≤ c·n^5. That c is 32.
And since we stated above that this holds if n ≥ 1, then we have that n0 = 1.
Generalization
This generalizes to any degree. In general, given the polynomial f(n) = (n + a)^b, you know that there exists a number c, found by summing all the coefficients of the polynomial after expansion. It turns out the exact value of c does not matter, so you do not need to compute it; all that matters is that we proved its existence, and thus (n + a)^b is O(n^b).
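As a quick sanity check of the concrete case above (a minimal Python sketch; the constant 32 and threshold n0 = 1 come from the derivation, and only a finite range is sampled):

    # Spot-check (n+1)^5 <= 32*n^5 for n >= 1.
    assert all((n + 1)**5 <= 32 * n**5 for n in range(1, 10_000))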
How exactly do I find the tight upper bound for T(n)?
Here is one example:
T(n) = T(n/2 + n^(1/2)) + n.
I am not that sure how to use the domain or range transform here.
I tried the domain transform:
let
n = 2^(2^k) ==> n/2 = 2^(2^k - 1)
and n^(1/2) = 2^(2^(k-1))
After that, I do not know how to solve this kind of problem with the addition inside T(n).
I hope someone can tell me how to solve these kinds of recurrences.
Thanks Ali Amiri.
As you said, I approximately consider
T(n) = T(n/2) + n.
and let,
n = 2^k,
==> T(2^k) = T(2^(k-1)) + 2^k
suppose a_k = T(2^k)
using domain transform, I get:
a_k = c1·2^k + c2
hence,
T(n) = O(n).
Am I right? or still wrong?
Ali Amiri's intuition is correct, but it's not a formal argument. Really there needs to be a base case like
T(n) = 1 for all 0 ≤ n < 9
and then we can write
n^(1/2) ≤ n/3 for all n ≥ 9
and then guess and check a nondecreasing O(n) solution for the recurrence
T'(n) = T'(n/2 + n/3) + n
and argue that T = O(T') = O(n).
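A small Python sketch of the guess-and-check idea (the base case T(n) = 1 for n < 9 is the one stated above; the constant 7 in the linear bound is my own guess for the check, not derived here):

    import math
    from functools import lru_cache

    # T(n) = T(n//2 + isqrt(n)) + n (integer flooring), with T(n) = 1 for 0 <= n < 9.
    # For n >= 9, n/2 + sqrt(n) <= n/2 + n/3 = 5n/6 < n, so the recursion terminates.
    @lru_cache(maxsize=None)
    def T(n):
        if n < 9:
            return 1
        return T(n // 2 + math.isqrt(n)) + n

    # Check T(n) <= c*n for a guessed constant c = 7 over a finite range.
    assert all(T(n) <= 7 * n for n in range(1, 10_000))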
How do you work this out? Do you get c first, as the ratio of the two functions, and then use that ratio to find the range of n? How can you tell? Please explain, I'm really lost. Thanks.
Example 1: Prove that running time T(n) = n^3 + 20n + 1 is O(n^3)
Proof: by the Big-Oh definition,
T(n) is O(n^3) if T(n) ≤ c·n^3 for all n ≥ n0.
Let us check this condition:
if n^3 + 20n + 1 ≤ c·n^3, then 1 + 20/n^2 + 1/n^3 ≤ c.
Therefore,
the Big-Oh condition holds for n ≥ n0 = 1 and c ≥ 22 (= 1 + 20 + 1). Larger values of n0 result in smaller factors c (e.g., for n0 = 10, c ≥ 1.201, and so on), but in any case the above statement is valid.
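A minimal Python sketch of that trade-off between n0 and c (the sample thresholds are arbitrary illustrations):

    # c(n0) = 1 + 20/n0^2 + 1/n0^3 is the smallest constant that works from
    # threshold n0 onward, since the ratio decreases as n grows.
    def c_needed(n0):
        return 1 + 20 / n0**2 + 1 / n0**3

    for n0 in (1, 10, 100):
        print(n0, c_needed(n0))  # -> 22.0, 1.201, 1.002001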
I think the trick is that you aren't thinking of LARGE numbers. Hence, let's take a counterexample:
T(n) = n^4 + n
and let's assume that we think it's O(n^3) instead of O(n^4). What you would get is
c = n + 1/n^2
which means that c, supposedly a constant, is actually c(n), a function dependent upon n. Taking n to a really big number shows that no matter what, c grows with n, so T(n) can't be O(n^3).
What you want is that, in the limit as n goes to infinity, everything but a constant remains:
c = 1 + 1/n^3
Now you might say: it is still c(n)! But as n gets really, really big, 1/n^3 goes to zero. Hence, for very large n, in the case of declaring T(n) to be O(n^4), c approaches 1, so it is bounded by a constant!
Does that help?
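A quick numeric illustration of the two ratios discussed above (the sample values of n are arbitrary, just to show the trend):

    # (n^4 + n) / n^3 grows without bound, while (n^4 + n) / n^4 settles
    # toward the constant 1.
    for n in (10, 1_000, 100_000):
        T = n**4 + n
        print(n, T / n**3, T / n**4)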
So I have this problem to do and I am not really sure where to start:
Using the definition of Big-O, prove the following:
T(n) = 2n + 3 ∈ O(n)
T(n) = 5n + 1 ∈ O(n^2)
T(n) = 4n^2 + 2n + 3 ∈ O(n^2)
If anyone can point me in the right direction (you don't necessarily have to give me the exact answers), I would greatly appreciate it.
You can use the same trick to solve all of these problems. As a hint, use the fact that
If a ≤ b, then for any n ≥ 1, n^a ≤ n^b.
As an example, here's how you could approach the first of these: If n ≥ 1, then 2n + 3 ≤ 2n + 3n = 5n. Therefore, if you take n0 = 1 and c = 5, you have that for any n ≥ n0 that 2n + 3 ≤ 5n. Therefore, 2n + 3 = O(n).
Try using a similar approach to solve the other problems. For the second problem, you might want to use it twice - once to upper-bound 5n + 1 with some linear function, and once more to upper bound that linear function with some quadratic function.
Hope this helps!
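For instance, here is a quick Python spot check of the first bound worked out above (it samples a finite range; the algebra covers all n ≥ 1):

    # Verify 2n + 3 <= 5n for n >= n0 = 1 over a sample range.
    assert all(2 * n + 3 <= 5 * n for n in range(1, 10_000))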
From what I have studied: I have been asked to determine the complexity of a function with respect to another function, i.e., given f(n) and g(n), determine whether f(n) is O(g(n)). In such cases, I substitute values, compare the two, and arrive at a complexity, using O(), Theta, and Omega notations.
However, in the substitution method for solving recurrences, every standard document has the following lines:
• [Assume that T(1) = Θ(1).]
• Guess O(n^3). (Prove O and Ω separately.)
• Assume that T(k) ≤ c·k^3 for k < n.
• Prove T(n) ≤ c·n^3 by induction.
How am I supposed to find O and Ω when nothing else (apart from f(n)) is given? I might be wrong (I definitely am), and any information on the above is welcome.
Some of the assumptions above are with reference to this problem: T(n) = 4T(n/2) + n
, while the basic outline of the steps is for all such problems.
That particular recurrence is solvable via the Master Theorem, but you can get some feedback from the substitution method. Let's try your initial guess of cn^3.
T(n) = 4T(n/2) + n
<= 4c(n/2)^3 + n
= cn^3/2 + n
Assuming that we choose c so that n <= cn^3/2 for all relevant n,
T(n) <= cn^3/2 + n
<= cn^3/2 + cn^3/2
= cn^3,
so T is O(n^3). The interesting part of this derivation is where we used a cubic term to wipe out a linear one. Overkill like that is often a sign that we could guess lower. Let's try cn.
T(n) = 4T(n/2) + n
<= 4cn/2 + n
= 2cn + n
This won't work. The gap between the right-hand side and the bound we want is cn + n, which is big Theta of the bound we want. That usually means we need to guess higher. Let's try cn^2.
T(n) = 4T(n/2) + n
<= 4c(n/2)^2 + n
= cn^2 + n
At first that looks like a failure as well. Unlike our guess of cn, though, the deficit is little o of the bound itself. We might be able to close it by considering a bound of the form cn^2 - h(n), where h is o(n^2). Why subtraction? If we used h as the candidate bound, we'd run a deficit; by subtracting h, we run a surplus. Common choices for h are lower-order polynomials or log n. Let's try cn^2 - n.
T(n) = 4T(n/2) + n
<= 4(c(n/2)^2 - n/2) + n
= cn^2 - 2n + n
= cn^2 - n
That happens to be the exact solution to the recurrence, which was rather lucky on my part. If we had guessed cn^2 - 2n instead, we would have had a little credit left over.
T(n) = 4T(n/2) + n
<= 4(c(n/2)^2 - 2n/2) + n
= cn^2 - 4n + n
= cn^2 - 3n,
which is slightly smaller than cn^2 - 2n.
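A small Python sketch checking this numerically (the base case T(1) = 1 is an assumption made only for the check; any Θ(1) base case gives the same asymptotics). With this base case, c·n^2 - n with c = 2 matches T exactly on powers of two:

    from functools import lru_cache

    # T(n) = 4*T(n/2) + n on powers of two, with assumed base case T(1) = 1.
    @lru_cache(maxsize=None)
    def T(n):
        if n == 1:
            return 1
        return 4 * T(n // 2) + n

    # With T(1) = 1, the exact solution on powers of two is 2*n^2 - n,
    # matching the c*n^2 - n form derived above.
    for k in range(11):
        n = 2**k
        assert T(n) == 2 * n**2 - n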