Determining the running time for recurrence relation T(n) = T(n-1)+n - asymptotic-complexity

How do I determine the running time (in terms of Big-Theta) of an algorithm whose running time T(n) on an input of size n satisfies the recurrence relation T(n) = T(n-1) + n for n >= 1, with the initial condition T(1) = 1?
Edit: I was practicing a past exam paper. Got stuck on this question. Need guidance

Look at it this way: T(n) = T(n-1) + n = T(n-2) + (n-1) + n = T(n-3) + (n-2) + (n-1) + n. Expanding all the way down to the initial condition T(1) = 1 gives T(n) = 1 + 2 + 3 + ... + n. This series sums to n(n+1)/2. Therefore, T(n) ∈ Θ(n^2).
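As a quick sanity check, here is a minimal Java sketch (the class and helper names are hypothetical, not part of the question) that evaluates the recurrence directly and compares it with the closed form n(n+1)/2:
public class SumCheck {
    // Direct translation of T(n) = T(n-1) + n with T(1) = 1.
    static long t(int n) {
        return (n == 1) ? 1 : t(n - 1) + n;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 10; n++) {
            System.out.println("T(" + n + ") = " + t(n) + ", n(n+1)/2 = " + (long) n * (n + 1) / 2);
        }
    }
}
The two columns agree for every n, which is consistent with the Θ(n^2) bound.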

Related

What is the time complexity of the function T(n) = 2T(n/4) + O(1)? (Without the Master Theorem)

Can anybody please explain the time complexity of T(n) = 2T(n/4) + O(1) using a recurrence tree? I saw somewhere that it is O(n^(1/2)).
Just expand the recurrence for a few iterations, then use mathematical induction to prove the observed pattern:
T(n) = 2T(n/4) + 1 = 2(2T(n/4^2) + 1) + 1 = 2^2 T(n/4^2) + 2 + 1
Hence:
T(n) = 1 + 2 + 2^2 + ... + 2^k = 2^(k+1) - 1 ∈ O(2^(k+1))
What is k? From the expansion, 4^k = n, so k = (1/2)·log2(n). Thus, T(n) ∈ O(2^((1/2)·log2(n) + 1)) = O(sqrt(n)). Note that 2^(log2(n)) = n.
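Here is a small Java sketch of the same idea (hypothetical helper name, assuming T(1) = 1 as the base case and treating the O(1) term as 1) that evaluates the recurrence and compares it with 2*sqrt(n):
public class SqrtCheck {
    // T(n) = 2T(n/4) + 1, with T(1) = 1 assumed as the base case.
    static long t(long n) {
        return (n <= 1) ? 1 : 2 * t(n / 4) + 1;
    }

    public static void main(String[] args) {
        for (long n = 4; n <= (1L << 20); n *= 4) {
            System.out.println("T(" + n + ") = " + t(n) + ", 2*sqrt(n) = " + 2 * Math.sqrt(n));
        }
    }
}
For powers of 4 the computed value is exactly 2*sqrt(n) - 1, which matches the O(sqrt(n)) bound.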

Recursive Time Complexity With Fibonacci Numbers?

public static int recurC(int n) {
    if (n <= 1)   // base case covers both n == 0 and n == 1; with only n == 1, recurC(n-2) would recurse forever
        return 1;
    return n + recurC(n - 1) + recurC(n - 2);
}
So I need to find the formal equation for T(n). I set it up as a recurrence relation with T(n) = C + T(n-1) + T(n-2). However, when I tried to evaluate it, I got nowhere. The relationship between the successive recursive calls isn't entirely clear to me. Any help would be appreciated, thanks!
Let's analyze the recurrence relation below:
T(n) = T(n-1) + T(n-2) + C
T(0) = T(1) = 1
Notice that T(n-1) ≈ T(n-2), that is to say:
The number of operations performed for input size n - 1 is approximately equal to the number of operations performed for input size n - 2.
Therefore, we can show that:
T(n) = T(n-1) + T(n-2) + C
T(n) ≈ 2T(n-2) + C
T(n) ≈ 4T(n-4) + 3C
T(n) ≈ 8T(n-6) + 7C
...
T(n) ≈ 2^k*T(n-2k) + (2^k - 1)*C
n - 2k = 0 --> k = n/2
T(n) ≈ 2^(n/2) + (2^(n/2) - 1)*C
T(n) ≈ (1 + C)*2^(n/2) - C
Therefore we can conclude that, T(n) ∈ O(2^(n/2))
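To see the approximation in action, here is a minimal Java sketch (hypothetical names, assuming T(0) = T(1) = 1 and C = 1) that evaluates the simplified recurrence T(n) = 2T(n-2) + 1 for even n (the case n - 2k = 0 above) and compares it with the closed form (1 + C)*2^(n/2) - C:
public class FibApproxCheck {
    // Simplified recurrence T(n) = 2T(n-2) + C with C = 1 and T(0) = T(1) = 1.
    static long t(int n) {
        return (n <= 1) ? 1 : 2 * t(n - 2) + 1;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 40; n += 2) {
            long closedForm = 2 * (1L << (n / 2)) - 1;   // (1 + C)*2^(n/2) - C with C = 1
            System.out.println("T(" + n + ") = " + t(n) + ", closed form = " + closedForm);
        }
    }
}
Both columns roughly double every time n grows by 2, which is the O(2^(n/2)) growth.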

Runtime of following algorithm?

A divide and conquer algorithm solves a problem of size n by dividing it into 2
subproblems, each of size n-1, and takes O(n) time to combine their solutions. What is the runtime of this algorithm?
I'm not quite sure how to structure this recurrence relation and determine what the runtime is. Is the following relation correct?
T(n) = 2T(n-1) + O(n)
How can I get the runtime from this, if so?
Thank you so much!
Yes, your recurrence relation correctly describes your problem. To make things concrete, let's say the recurrence relation is T(n) = 2T(n-1) + n (that is, +n rather than +O(n)).
Then, telescoping the recurrence relation (and assuming T(0) = 0):
T(n) = n + 2(n-1) + 4(n-2) + 8(n-3) + ... + 2^n(n-n)
= (1 + 2 + 4 + ... + 2^n)n - (0*2^0 + 1*2^1 + ... + n*2^n)
= n*(2^(n+1)-1) - 2(n*2^n-2^n+1)
= 2^(n+1) - n - 2
Checking this is correct:
2T(n-1) + n
= 2(2^n - (n-1) - 2) + n
= (2^(n+1) - 2n + 2 - 4) + n
= 2^(n+1) - n - 2
= T(n)
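A short Java sketch (hypothetical helper name, assuming T(0) = 0 as above) that checks the closed form 2^(n+1) - n - 2 against the recurrence; since the closed form is dominated by the 2^(n+1) term, the runtime is Θ(2^n):
public class ClosedFormCheck {
    // T(n) = 2T(n-1) + n with T(0) = 0.
    static long t(int n) {
        return (n == 0) ? 0 : 2 * t(n - 1) + n;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 20; n++) {
            long closedForm = (1L << (n + 1)) - n - 2;   // 2^(n+1) - n - 2
            System.out.println("T(" + n + ") = " + t(n) + ", closed form = " + closedForm);
        }
    }
}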

What is the recurrence relation and big O for T(n) = 2T(n-1) + O(n)?

I thought it would be something like this...
T(n) = 2T(n-1) + O(n)
= 2(2T(n-2)+(n-1)) + (n)
= 2(2(2T(n-3)+(n-2))+(n-1))+(n)
= 8T(n-3) + 4(n-2) + 2(n-1) + n
Which ends up being something like the summation of 2^i * (n-i), and my book says this ends up being O(2^n). Could anybody explain this to me? I don't understand why it's 2^n and not just O(n), as the (n-i) will continue n times.
This recurrence has already been solved on Math Stack Exchange. As I solve this recurrence, I get:
T(n) = n + 2(T(n-1))
= n + 2(n - 1 + 2T(n-2)) = 3n - 2 + 2^2(T(n-2))
= 3n - 2 + 4(n - 2 + 2(T(n-3))) = 7n - 10 + 2^3(T(n-3))
= 7n - 10 + 8(n - 3 + 2(T(n-4))) = 15n - 34 + 2^4(T(n-4))
= (2^4 - 1)n - 34 + 2^4(T(n-4))
...and so on.
Effectively the recurrence boils down to:
T(n) = 2^(n+1) * T(1) − n − 2
See the Math Stack Exchange link for how we arrive at this solution. Taking T(1) to be constant, the dominating term in the above closed form is 2^(n+1).
Therefore, the rate of growth of the given recurrence is O(2^n).
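To see why the linear term does not change the picture, here is a minimal Java sketch (taking T(1) = 1 for concreteness; the class name is made up) that prints T(n)/2^n; the ratio settles near a constant, which is exactly the O(2^n) behaviour:
public class RatioCheck {
    public static void main(String[] args) {
        long t = 1;                      // T(1) = 1, assumed for illustration
        for (int n = 2; n <= 40; n++) {
            t = 2 * t + n;               // T(n) = 2T(n-1) + n
            System.out.println("n = " + n + ", T(n)/2^n = " + t / Math.pow(2, n));
        }
    }
}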

Determination of computational complexity of sample code

I give you three short codes:
First code:
procedure Proc(n: integer);
begin
  if n > 0 then
  begin
    writeln('x');
    Proc(n-2);
    writeln('*');
    Proc(n-2)
  end
end;
Second code:
procedure Proc(n: integer);
begin
  if n > 0 then
  begin
    writeln('*');
    Proc(n-1)
  end
end;
Third code:
procedure Proc(n: integer);
begin
  if n > 0 then
  begin
    writeln('x');
    Proc(n div 2);   { integer division, since n is an integer }
    writeln('*');
    Proc(n div 2)
  end
end;
I would like to know how to determine the computational complexity of each of the codes above, because it will help me understand the process better. Can someone show, step by step, how to determine the computational complexity of sample code, in a way that can also be applied to other examples?
First Question: Let T(n) be the number of calls to Proc when it is invoked with argument n. For n > 0, there is the call itself plus two recursive calls on n - 2, so T(n) = 1 + 2T(n-2). This is a variant of the Tower of Hanoi recurrence T(n) = 1 + 2T(n-1), which solves to 2^n - 1 (there are proofs here: http://en.wikipedia.org/wiki/Tower_of_Hanoi). In your case the argument drops by 2 instead of 1, so the recursion depth is only about n/2, and T(n) = 1 + 2T(n-2) solves to roughly 2^(n/2 + 1) - 1. In other words, skipping every other value of n halves the depth of the recursion rather than the number of calls, giving Θ(2^(n/2)), which is in particular O(2^n).
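A quick call-count check of the same pattern (a Java sketch with made-up names, counting every invocation, including the final ones with n <= 0):
public class CallCountFirst {
    // Each call with n > 0 makes two recursive calls with n - 2.
    static long calls(int n) {
        if (n <= 0) return 1;                      // the call itself, which does nothing
        return 1 + calls(n - 2) + calls(n - 2);
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 20; n += 2) {
            System.out.println("calls(" + n + ") = " + calls(n) + ", 2^(n/2+1) - 1 = " + ((1L << (n / 2 + 1)) - 1));
        }
    }
}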
Second Question: This is easier. T(0) = 1 and T(n) = 1 + T(n-1). You can solve this many different ways, but one is via telescoping:
T(n) = 1 + T(n-1)
T(n-1) = 1 + T(n-2)
...
T(1) = 1 + T(0)
Adding up both sides...
T(n) + T(n-1)+...+T(1) = 1 + T(n-1) + ... + 1 + T(0) = n + T(n-1)+...+T(0)
Subtract out like terms.
T(n) = n + T(0) = n + 1. So this is O(n).
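The same kind of counter for the second procedure (Java sketch, made-up names) confirms the linear count of n + 1 calls:
public class CallCountSecond {
    // One call per decrement of n, plus the final call with n <= 0.
    static long calls(int n) {
        return (n <= 0) ? 1 : 1 + calls(n - 1);
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 10; n++) {
            System.out.println("calls(" + n + ") = " + calls(n));   // prints n + 1
        }
    }
}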
Third Question: Similar to the first. T(0) = 1, and for n > 0 each call does a constant amount of its own work plus two recursive calls on n/2, so T(n) = 1 + 2T(n/2). Note here that T(n) = 1 + 2T(n/2) <= n + 2T(n/2).
So solve T(n) = 2T(n/2) + n by rolling out the recurrence:
T(n) = 2 T(n/2) + n
T(n/2) = 2 T(n/4) + n/2
So T(n) = 4T(n/4) + n + n
T(n/4) = 2T(n/8) + n/4
So T(n) = 8T(n/8) + n + n + n
... It looks like T(n) = 2^kT(n/2^k)+kn for positive k.
Prove it by induction.
k = 1: T(n) = 2 T(n/2)+n which was given. This is our base case.
If true for k-1, show true for k:
T(n) = (2^(k-1))T(n/2^(k-1))+(k-1)n //Inductive hypothesis
T(n/2^(k-1)) = 2T((n/2^(k-1))/2) + n/2^(k-1) //Given recurrence
= 2T(n/2^k) + n/2^(k-1)
=> T(n) = (2^k)T(n/2^k)+ n + (k-1)n = (2^k)T(n/2^k) + kn. So true for k.
T(n) = 2^k T(n/2^k) + kn; choose an appropriate positive k, such as k = log2(n), so that 2^k = n.
T(n) = 2^(log2(n)) T(n/2^(log2(n))) + n log2(n) = n T(1) + n log2(n).
T(1) = 1 here, since Proc(1) does only a constant amount of work of its own. So n T(1) + n log2(n) = n log2(n) + n = O(n log(n)).
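A minimal Java sketch (hypothetical helper name, assuming T(1) = 1 and n a power of two) checking that the solved recurrence T(n) = 2T(n/2) + n does grow like n*log2(n) + n:
public class NLogNCheck {
    // T(n) = 2T(n/2) + n with T(1) = 1.
    static long t(long n) {
        return (n <= 1) ? 1 : 2 * t(n / 2) + n;
    }

    public static void main(String[] args) {
        for (long n = 2; n <= (1L << 20); n *= 2) {
            double nLogN = n * (Math.log(n) / Math.log(2)) + n;   // n*log2(n) + n
            System.out.println("T(" + n + ") = " + t(n) + ", n*log2(n) + n = " + nLogN);
        }
    }
}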
Unfortunately, there is no one-size-fits-all procedure for determining complexity. You have to take it on a case-by-case basis and work out each problem.
