How do I apply the Master Theorem to this recurrence?
if T(1) = 1 and
T(n) = T(n-1)+2
What would be the runtime of such a program?
What is the T(1) = 1 for?
Which case is this and why?
Please give a detailed explanation. Thanks.
You cannot use the Master Theorem here (without a variable substitution, at least), since the recurrence is not in the required form.
Your function, however, is easy to analyze directly, and it is in Theta(n).
Proof by induction. Induction hypothesis: T(k) <= 2k for each k < n. Then

T(n) = T(n-1) + 2 <= 2(n-1) + 2 = 2n - 2 + 2 = 2n,

where the inequality applies the induction hypothesis to T(n-1). The base case is T(1) = 1 <= 2*1.
The above shows that T(n) is in O(n): we found c = 2 such that T(n) <= c*n for all n > 0, which is exactly the definition of big-O notation.
Proving similarly that T(n) is in Omega(n) is easy (for example, T(n) >= n for all n >= 1), and from the two bounds you can conclude that T(n) is in Theta(n).
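If you want to convince yourself numerically, here is a minimal Python sketch (the function name T is mine) that unrolls the recurrence iteratively and checks the c = 2 bound from the proof:

def T(n):
    # T(1) = 1; each step adds the constant 2, per T(k) = T(k-1) + 2
    t = 1
    for _ in range(2, n + 1):
        t += 2
    return t

for n in (1, 10, 100, 1000):
    # the closed form is T(n) = 2n - 1, so T(n) <= 2n always holds
    print(n, T(n), T(n) <= 2 * n)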
Thank you for your help. With the help of your answer and a new recursive equation, I think I finally understood it. Let's say I have T(n) = 1*T(n-1) + n^2. The Master Theorem doesn't apply here either, so I start from my base cases:
T(1) = 1
T(2) = 5
T(3) = 14
T(4) = 30
--> Proof by induction. Induction hypothesis: T(k) <= k^3 for each k < n. Then
T(n) = T(n-1) + n^2 <= (n-1)^3 + n^2 <= n^2(n-1) + n^2 = n^3 - n^2 + n^2 = n^3,
where the first inequality applies the induction hypothesis to T(n-1) and the second uses (n-1)^3 <= n^2(n-1).
So this leads to O(n^3)? I'm not sure about this. Why not Omega or Theta?
And when would my induction hypothesis use <, >, or >=?
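A quick numeric check of the n^3 guess (a minimal sketch; the function name is mine) suggests the bound is in fact tight: the ratio T(n)/n^3 settles near 1/3, which hints that Theta(n^3) should be provable, not just O(n^3):

def T(n):
    # T(1) = 1; each step adds k^2, per T(k) = T(k-1) + k^2
    t = 1
    for k in range(2, n + 1):
        t += k * k
    return t

for n in (2, 10, 100, 1000):
    # T(n)/n^3 tends to 1/3, hinting at Theta(n^3)
    print(n, T(n), T(n) <= n**3, T(n) / n**3)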
I am trying to find the time complexity (Big-Θ) of this algorithm:
Recursion(n):
    while n > 1:
        n = floor(n/2)
        Recursion(n)
I have found an upper bound of O(n) by considering the worst case, which is when n is a power of 2.
However, I am having trouble finding a lower bound (Big-Ω) for this. My intuition is that this is Ω(n) as well, but I am not sure how to show this with the floor function in the way.
Any suggestions? Thank you!
EDIT: the main thing I'm not convinced of is that T(n/2) is equivalent to T(floor(n/2)). How would one prove this for this algorithm?
The floor function takes constant time, O(1), so you can ignore it / fold it into the constant work per step. Let's analyze the time complexity of the algorithm below:
T(n) = T(n/2) + 1      (the +1 covers one loop iteration, floor included)
T(n/2) = T(n/4) + 1
...
T(2) = T(1) + 1,  with T(1) = constant

Substituting back:

T(n) = T(n/4) + 2
T(n) = T(n/8) + 3
...
T(n) = T(n/2^k) + k

Setting 2^k = n gives k = log(n), so:

T(n) = T(1) + log(n)
T(n) = log(n) + constant
We can conclude that T(n) ∈ θ(log(n)).
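To address the floor concern directly, here is a minimal sketch (the function name is mine) that counts the "+1" steps of T(n) = T(floor(n/2)) + 1 and compares them with log2(n); the floor changes the count by at most one:

import math

def steps(n):
    # count halving steps with the floor in place, exactly as the loop does
    count = 0
    while n > 1:
        n = n // 2  # floor(n/2)
        count += 1
    return count

for n in (2, 3, 10, 1000, 10**6):
    # steps(n) equals floor(log2(n))
    print(n, steps(n), math.log2(n))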
I am wondering what the time complexity of this recurrence relation is.
I need to include some math here:
T(n) = T(n-1) + f(n)
means

T(n) = T(0) + sum_{i=1}^{n} f(i)
In your case, that's:
T(n) = T(0) + 0^2 + 1^2 + 2^2 + ... + (n-1)^2
If you don't know immediately from discrete calculus that the sum comes out to O(n^3), you can notice that there are n terms, the largest being (n-1)^2, and that more than n/3 of the terms are >= (n/2)^2, which also gives a matching lower bound of (n/3)(n/2)^2 = n^3/12.
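A minimal numeric check of that claim (plain Python, no assumptions beyond the sum itself):

for n in (10, 100, 1000):
    s = sum(i * i for i in range(n))  # 0^2 + 1^2 + ... + (n-1)^2
    # the exact value is (n-1)n(2n-1)/6, so s/n^3 tends to 1/3
    print(n, s, s / n**3)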
Hi, I am having a tough time showing the running time of these three recurrences for T(n). Assumptions include T(0) = 0.
1) This one I know is close to Fibonacci, so I expect it to grow roughly exponentially, but I'm having trouble showing that:
T(n) = T(n-1) + T(n-2) +1
2) This one I am stumped on, but I think it's roughly O(log log n):
T(n) = T([sqrt(n)]) + n, for n >= 1, where [sqrt(n)] is sqrt(n) rounded down.
3) I believe this one is roughly O(n*log log n):
T(n) = 2T(n/2) + (n/(log n)) + n.
Thanks for the help in advance.
T(n) = T(n-1) + T(n-2) + 1
Assuming T(0) = 0 and T(1) = a, for some constant a, we notice that T(n) - T(n-1) = T(n-2) + 1. That is, the growth rate of the function is given by the function itself, which suggests this function has exponential growth.
Let T'(n) = T(n) + 1. Then T'(n) = T'(n-1) + T'(n-2), by the above recurrence relation, and we have eliminated the troublesome constant term. T(n) and T'(n) differ by an additive constant of 1, so assuming they are both non-decreasing (they are), they have the same asymptotic complexity, albeit with different constants n0.
To show T'(n) has asymptotic growth of O(b^n), we would need some base cases, then the hypothesis that the condition holds for all n up to, say, k - 1, and then we'd need to show it for k, that is, c*b^(k-2) + c*b^(k-1) <= c*b^k. We can divide through by c*b^(k-2) to simplify this to 1 + b <= b^2. Rearranging, we get b^2 - b - 1 >= 0; the roots are (1 +- sqrt(5))/2, and we must discard the negative one since we cannot use a negative number as the base of our exponential. So for b >= (1+sqrt(5))/2, T'(n) may be O(b^n). A similar thought experiment shows that for b <= (1+sqrt(5))/2, T'(n) may be Omega(b^n). Thus, for b = (1+sqrt(5))/2 only, T'(n) may be Theta(b^n).
Completing the proof by induction that T(n) = O(b^n) is left as an exercise.
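As a numeric sanity check (a minimal sketch; taking a = 1 for T(1) is my assumption), the growth ratio T(n)/T(n-1) does approach b = (1 + sqrt(5))/2 ~ 1.618:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(0) = 0, T(1) = 1 (i.e. a = 1); T(n) = T(n-1) + T(n-2) + 1
    if n <= 1:
        return n
    return T(n - 1) + T(n - 2) + 1

for n in (10, 20, 30, 40):
    # the ratio converges to the golden ratio, ~1.618
    print(n, T(n), T(n) / T(n - 1))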
T(n) = T([sqrt(n)]) + n
Obviously, T(n) is at least linear, assuming the boundary conditions require T(n) to be nonnegative. We might guess that T(n) is Theta(n) and try to prove it. Base case: let T(0) = a and T(1) = b. Then T(2) = b + 2 and T(4) = b + 6; in both cases, a choice of c >= 1.5 will work to make T(n) <= c*n.

Suppose that whatever our fixed value of c is works for all n up to and including k. We must show that T([sqrt(k+1)]) + (k+1) <= c*(k+1). We know that T([sqrt(k+1)]) <= c*sqrt(k+1) from the induction hypothesis. So T([sqrt(k+1)]) + (k+1) <= c*sqrt(k+1) + (k+1), and c*sqrt(k+1) + (k+1) <= c*(k+1) can be rewritten as c*x + x^2 <= c*x^2 (with x = sqrt(k+1)); dividing through by x (OK since k > 1), we get c + x <= c*x, and solving this for c we get c >= x/(x-1) = sqrt(k+1)/(sqrt(k+1)-1). This approaches 1 as k grows, so for large enough n, any constant c > 1 will work.
Making this proof totally rigorous by fixing the following points is left as an exercise:
making sure enough base cases are proven so that all assumptions hold
distinguishing the cases where (a) k + 1 is a perfect square (hence [sqrt(k+1)] = sqrt(k+1)) and (b) k + 1 is not a perfect square (hence sqrt(k+1) - 1 < [sqrt(k+1)] < sqrt(k+1)).
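A numeric sketch of the Theta(n) claim (the base values T(0) = T(1) = 1 are my assumption; math.isqrt computes the floor of the square root):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(floor(sqrt(n))) + n
    if n <= 1:
        return 1
    return T(math.isqrt(n)) + n

for n in (10, 100, 10**4, 10**8):
    # T(n)/n approaches 1, consistent with T(n) = Theta(n)
    print(n, T(n), T(n) / n)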
T(n) = 2T(n/2) + (n/(log n)) + n
This T(n) is at least 2T(n/2) + n, which we recognize as the recurrence for the running time of Mergesort, which by the Master Theorem is Theta(n log n), so we know our complexity is no less than that.
Indeed, by the Master Theorem: T(n) = 2T(n/2) + (n/(log n)) + n = 2T(n/2) + n(1 + 1/(log n)), so
a = 2
b = 2
f(n) = n(1 + 1/(log n)) is O(n) (for n>2, it's always less than 2n)
f(n) = O(n) = O(n^log_2(2) * log^0 n)
We're in case 2 of the Master Theorem still, so the asymptotic bound is the same as for Mergesort, Theta(n log n).
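As a sanity check on that bound, here is a minimal sketch evaluated on powers of two (the base cases T(1) = 1 and T(2) = 2 are my assumption):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2T(n/2) + n/log(n) + n
    if n <= 2:
        return n
    return 2 * T(n // 2) + n / math.log2(n) + n

for k in (4, 10, 16, 20):
    n = 2**k
    # T(n)/(n log n) stays close to a constant, consistent with Theta(n log n)
    print(n, T(n), T(n) / (n * math.log2(n)))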
For the following recurrence I have to find the big-O complexity.
T(n) = 2T(n/2) + cn
T(1)=c1
I guessed it's T(n) = O(n).
Induction hypothesis: T(k) <= a*k for all k < n.
T(n) = 2T(n/2) + c*n
T(n) <= 2(a*(n/2)) + c*n
     = a*n + c*n
     = O(n)
It looks completely correct to me, but my TA gave me 0 for this. Where do you think I went wrong?
The so-called Master Theorem, case 2, states that
T(n) = Theta(n log n)
for your example. Furthermore, if T(n) = O(n) were true, Quicksort (whose best case satisfies the above recurrence) would have a linear best-case runtime, which is also not the case.
Concerning your argument, apparently you state that there exists a constant a such that
T(n) <= a*n
holds. Consequently, the induction hypothesis should be as follows.
T(k) <= a*k for each k < n
But even if this is assumed, the induction step proves that
T(n) <= (a+c)*n
holds; however, this does not prove the desired property as it does not prove that
T(n) <= a*n
holds.
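To see concretely why no single constant a can work, a minimal sketch (taking c = c1 = 1 is my assumption) shows T(n)/n growing without bound:

def T(n):
    # T(1) = c1 = 1; T(n) = 2T(n/2) + c*n with c = 1, on powers of two
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for k in (1, 4, 8, 12, 16):
    n = 2**k
    # T(n)/n = log2(n) + 1, which is unbounded, so T(n) <= a*n must fail
    print(n, T(n), T(n) / n)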
I have been trying to solve a recurrence relation.
The recurrence is T(n) = T(n/3)+T(2n/3)+n^2
I solved the recurrence and got T(n) = n*T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ].
Can anyone tell me the runtime of this expression?
I think this recurrence works out to Θ(n^2). To see this, we'll show that T(n) = Ω(n^2) and that T(n) = O(n^2).
Showing that T(n) = Ω(n^2) is pretty straightforward - since T(n) has an n^2 term in it, it's certainly Ω(n^2).
Let's now show that T(n) = O(n^2). We have that
T(n) = T(n / 3) + T(2n / 3) + n^2
Consider this other recurrence:
S(n) = S(2n / 3) + S(2n / 3) + n^2 = 2S(2n / 3) + n^2
Since T(n) is increasing and T(n) ≤ S(n), any upper bound for S(n) will also be an upper bound for T(n).
Using the Master Theorem on S(n), we have that a = 2, b = 3/2, and c = 2. Since log_b a = log_{3/2} 2 = 1.709511291... < c, the Master Theorem says that this will solve to O(n^2). Since S(n) = O(n^2), we also know that T(n) = O(n^2).
We've shown that T(n) = Ω(n^2) and that T(n) = O(n^2), so T(n) = Θ(n^2), as required.
Hope this helps!
(By the way - (5/9)^(log n) = (2^(log(5/9)))^(log n) = 2^((log n)(log(5/9))) = (2^(log n))^(log(5/9)) = n^(log(5/9)). That makes it a bit easier to reason about.)
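A numeric check of the Θ(n^2) conclusion (a minimal sketch; rounding the arguments down and using the base case T(n <= 1) = 1 are my assumptions):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + n^2, with floors
    if n <= 1:
        return 1
    return T(n // 3) + T(2 * n // 3) + n * n

for n in (10, 100, 10**4, 10**6):
    # T(n)/n^2 levels off around a constant (~9/4 without the floors),
    # consistent with Theta(n^2)
    print(n, T(n), T(n) / n**2)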
One can't tell the actual runtime from T(n) or from the time complexity! It is simply an estimate of the running time in terms of the order of the input (n).
One thing which I'd like to add is:
I haven't solved your recurrence relation, but assuming your derived relation is correct, then putting n = 1 into your given recurrence relation, we get
T(1)=T(1/3)+T(2/3)+1
So either you'll be provided with the values of T(1/3) and T(2/3) in your question, or you have to infer them from the given problem statement, just as you would know what T(1) should be for the Tower of Hanoi problem!
For this recurrence, the base case is T(1); by definition, its value is the following:
T(1) = T(1/3) + T(2/3) + 1
Now, since T(n) denotes the runtime function, the runtime of any input that will never be processed is always 0; this includes all terms below the base case, so we have:
T(X < 1) = 0
T(1/3) = 0
T(2/3) = 0
T(1) = T(1/3) + T(2/3) + 1^2
T(1) = 0 + 0 + 1
T(1) = 1
Then we can substitute the value:
T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
T(n) = n + ( 9/5 n^2 (5/9)^(log n) )
T(n) = n^2 (9/5)^(1-log(n)) + n
We can bound (9/5)^(1-log(n)) by 9/5 for an asymptotic upper bound, since (9/5)^(1-log(n)) <= 9/5 for n >= 1:
T(n) ~ 9/5 n^2 + n
T(n) = O(n^2)