What is the recurrence relation and big O for T(n) = 2T(n-1) + O(n)?

I thought it would be something like this...
T(n) = 2T(n-1) + O(n)
= 2(2T(n-2)+(n-1)) + (n)
= 2(2(2T(n-3)+(n-2))+(n-1))+(n)
= 8T(n-3) + 4(n-2) + 2(n-1) + n
Which ends up being something like the summation of 2^i * (n-i), and my book says this ends up being O(2^n). Could anybody explain this to me? I don't understand why it's 2^n and not just O(n), since the (n-i) term only continues for n steps.

This recurrence has already been solved on Math Stack Exchange. As I solve this recurrence, I get:
T(n) = n + 2(T(n-1))
= n + 2(n - 1 + 2T(n-2)) = 3n - 2 + 2^2(T(n-2))
= 3n - 2 + 4(n - 2 + 2(T(n-3))) = 7n - 10 + 2^3(T(n-3))
= 7n - 10 + 8(n - 3 + 2(T(n-4))) = 15n - 34 + 2^4(T(n-4))
= (2^4 - 1)n - 34 + 2^4(T(n-4))
...and so on.
Effectively the recurrence boils down to:
T(n) = 2^(n+1) * T(1) − n − 2
See the Math Stack Exchange link for how we arrive at this solution. Taking T(1) to be constant, the dominating factor in the above closed form is 2^(n+1).
Therefore, the rate of growth of the given recurrence is O(2^n).
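If it helps to see that growth concretely, here is a minimal Python sketch (my own check, not part of the original answer, assuming the base case T(1) = 1) that evaluates the recurrence directly and compares it against the closed form 2^(n+1) - n - 2:

def T(n):
    # T(n) = 2*T(n-1) + n, with the assumed base case T(1) = 1
    return 1 if n == 1 else 2 * T(n - 1) + n

for n in range(1, 11):
    print(n, T(n), 2 ** (n + 1) - n - 2)  # both columns agree: T(n) grows as O(2^n)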

Related

What is the time complexity of the function T(n) = 2T(n/4) + O(1)? (Without the Master theorem)

Can anybody please explain the time complexity of T(n) = 2T(n/4) + O(1) using a recurrence tree? I saw somewhere that it is O(n^(1/2)).
Just expand the equation for a few iterations, and use mathematical induction to prove the observed pattern:
T(n) = 2T(n/4) + 1 = 2(2T(n/4^2) + 1) + 1 = 2^2 T(n/4^2) + 2 + 1
Hence:
T(n) = 1 + 2 + 2^2 + ... + 2^k = 2^(k+1) - 1, which is in O(2^(k+1))
What is k? From the expansion, 4^k = n, so k = (1/2) log(n). Thus, T(n) is in O(2^((1/2) log(n) + 1)) = O(sqrt(n)), because 2^(log(n)) = n and hence 2^((1/2) log(n)) = sqrt(n).
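As a quick numerical sanity check, here is a small Python sketch (my own, assuming integer division and the base case T(1) = 1) that evaluates the recurrence at powers of 4; the ratio T(n)/sqrt(n) stays bounded, as the analysis predicts:

import math

def T(n):
    # T(n) = 2*T(n/4) + 1, using n//4 so the argument stays an integer
    return 1 if n <= 1 else 2 * T(n // 4) + 1

for k in range(1, 9):
    n = 4 ** k                           # exact powers of 4 keep n/4 integral
    print(n, T(n), T(n) / math.sqrt(n))  # the ratio approaches 2: T(n) is O(sqrt(n))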

Solving recurrence using the iteration method

I need help solving T(n) = T(n/4) + T(n/3) + 2n using the iteration method (recursion tree). I am thinking it would be either Θ(2^n) or Θ(n)?
It's straightforward. We have the following two inequalities:
T(n) > 2T(n/4) + 2n
and
T(n) < 2T(n/3) + 2n
Now, try to find the upper and lower bounds by expansion. Based on both cases, you will find that T(n) = Θ(n).
For example, for T'(n) = 2T(n/3) + 2n we have the following expansion:
T'(n) = 2T'(n/3) + 2n = 2^2 T'(n/3^2) + (1 + 2/3) * 2n
By induction we can show that:
T'(n) = 2^(log_3(n)) T'(1) + (1 + 2/3 + 2^2/3^2 + ...) * 2n
< n + 6n = 7n
because 2^(log_3(n)) < 2^(log_2(n)) = n, and (1 + 2/3 + 2^2/3^2 + ...) is a geometric series with ratio 2/3, so it converges to 1/(1 - 2/3) = 3 as n goes to infinity.
You can do the same analysis for the lower bound of T(n).
Therefore, since c_1 * n <= T(n) <= c_2 * n for some constants c_1 and c_2, we can conclude that T(n) is in Θ(n).
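To convince yourself numerically, here is a small Python sketch (my own, assuming integer division and the base case T(n) = 1 for n <= 1); the ratio T(n)/n settles to a constant, consistent with Θ(n):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/4) + T(n/3) + 2n, with the assumed base case T(n) = 1 for n <= 1
    return 1 if n <= 1 else T(n // 4) + T(n // 3) + 2 * n

for n in (10, 100, 1000, 10000, 100000):
    print(n, T(n), T(n) / n)  # the ratio levels off near 2/(1 - 1/4 - 1/3) = 4.8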

Runtime of following algorithm?

A divide and conquer algorithm solves a problem of size n by dividing it into 2
subproblems, each of size n-1, and takes O(n) time to combine their solutions. What is the runtime of this algorithm?
I'm not quite sure how to structure this recurrence relation and determine what the runtime is. Is the following relation correct?
T(n) = 2T(n-1) + O(n)
How can I get the runtime from this, if so?
Thank you so much!
Yes, your recurrence relation correctly describes your problem. To make things concrete, let's say the recurrence relation is T(n) = 2T(n-1) + n (that is, +n rather than +O(n)).
Then, telescoping the recurrence relation (and assuming T(0) = 0):
T(n) = n + 2(n-1) + 4(n-2) + 8(n-3) + ... + 2^n(n-n)
= (1 + 2 + 4 + ... + 2^n)n - (0*2^0 + 1*2^1 + ... + n*2^n)
= n*(2^(n+1)-1) - 2(n*2^n-2^n+1)
= 2^(n+1) - n - 2
Checking this is correct:
2T(n-1) + n
= 2(2^n - (n-1) - 2) + n
= (2^(n+1) - 2n + 2 - 4) + n
= 2^(n+1) - n - 2
= T(n)
Since the exponential term 2^(n+1) dominates, the runtime is O(2^n).
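A quick way to double-check the telescoped sum (my own sketch, assuming T(0) = 0 as above) is to evaluate the expanded series directly in Python:

def telescoped(n):
    # the expanded series: n + 2(n-1) + 4(n-2) + ... + 2^n * (n - n)
    return sum(2 ** i * (n - i) for i in range(n + 1))

for n in range(10):
    assert telescoped(n) == 2 ** (n + 1) - n - 2
print('closed form 2^(n+1) - n - 2 verified for n = 0..9')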

Determining the running time for recurrence relation T(n) = T(n-1)+n

How do I determine the running time (in terms of Big-Theta) for the algorithm of input size n that satisfies recurrence relation T(n) = T(n-1)+n where n >= 1 and with initial condition T(1) = 1?
Edit: I was practicing a past exam paper. Got stuck on this question. Need guidance
Look at it this way: T(n) = T(n-1) + n = T(n-2) + (n-1) + n = T(n-3) + (n-2) + (n-1) + n. This means that, with n >= 1 and T(1) = 1, you get T(n) = 1 + 2 + 3 + ... + n. If you work out the pattern of this series, you will see that it sums to n(n+1)/2. Therefore, T(n) is Θ(n^2).
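For what it's worth, a short Python check (my own, using the stated base case T(1) = 1) confirms the closed form:

def T(n):
    # T(n) = T(n-1) + n with T(1) = 1, i.e. the sum 1 + 2 + ... + n
    return 1 if n == 1 else T(n - 1) + n

for n in (1, 5, 10, 100):
    assert T(n) == n * (n + 1) // 2  # matches n(n+1)/2, hence Theta(n^2)
print('ok')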

Confused on recurrence and Big O

I know that
T(n) = T(n/2) + Θ(1) resolves to O(log n),
and my book says this is a case of binary search.
But how do you know that? Is it just from the fact that binary search cuts the problem in half each time, so it is O(log n)?
And for T(n) = 2T(n/2) + Θ(1),
why is the result O(n) and not O(log n), when this algorithm also divides the problem in half each time?
Then T(n) = 2T(n/2) + Θ(n)
resolves to O(n log n)? I see that the O(n) comes from the Θ(n) term and the O(log n) from the T(n/2) part.
I am really confused about how to determine the big O of an algorithm; I don't even know how to word it properly. I hope my question makes sense.
Thanks in advance!
An intuitive way to solve these problems is to look at the result of unfolding the recursive formula.
Let's assume Θ(1) is actually 1 and Θ(n) is n, for simplicity:
T(n) = T(n/2) + 1 = T(n/4) + 1 + 1 = T(n/8) + 1 + 1 + 1 = ... =
= T(1) + 1 + ... + 1 [log n times] = log n
T'(n) = 2T'(n/2) + 1 = 2(2T'(n/4) + 1) + 1 = 4T'(n/4) + 2 + 1 =
= 8T'(n/8) + 4 + 2 + 1 = ... = 2^(log n) + 2^(log n - 1) + ... + 1 = n + n/2 + ... + 1 =
= 2n-1
T''(n) = 2T''(n/2) + n = 2(2T''(n/4) + n/2) + n = 4T''(n/4) + 2*(n/2) + n =
= 8T''(n/8) + 4*(n/4) + 2*(n/2) + n = ... = n + n + ... + n [log n times] = n log n
To formally prove these equations, you should use induction: assume T(n/2) = X, and use it to prove T(n) = Y, as expected.
For example, for the first formula [T(n) = T(n/2) + 1], assume the base case is T(1) = 0.
The base trivially holds for n = 1.
Assume T(k) <= log k for every k <= n-1, and prove it for k = n:
T(n) = T(n/2) + 1 <= (induction hypothesis) log(n/2) + 1 = log(n/2) + log(2) = log(n/2*2) = log(n)
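If it helps, here is a small Python sketch (my own, with assumed base cases noted in the comments) that evaluates all three recurrences at a power of two and compares them with the unfolded forms:

import math

def T(n):    # T(n) = T(n/2) + 1, assumed base T(1) = 0: unfolds to log2(n)
    return 0 if n <= 1 else T(n // 2) + 1

def Tp(n):   # T'(n) = 2*T'(n/2) + 1, assumed base T'(1) = 1: unfolds to 2n - 1
    return 1 if n <= 1 else 2 * Tp(n // 2) + 1

def Tpp(n):  # T''(n) = 2*T''(n/2) + n, assumed base T''(1) = 0: unfolds to n*log2(n)
    return 0 if n <= 1 else 2 * Tpp(n // 2) + n

n = 1024                         # a power of two keeps the halving exact
print(T(n), math.log2(n))        # 10 vs 10.0
print(Tp(n), 2 * n - 1)          # 2047 vs 2047
print(Tpp(n), n * math.log2(n))  # 10240 vs 10240.0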
I find an easy way to understand these is to consider the time the algorithm spends on each step of the recurrence, and then add them up to find the total time. First, let's consider
T(n) = T(n/2) + O(1)
where n=64. Let's add up how much the algorithm takes at each step:
T(64) = T(32) + 1 ... 1 so far
T(32) = T(16) + 1 ... 2 so far
T(16) = T(08) + 1 ... 3 so far
T(08) = T(04) + 1 ... 4 so far
T(04) = T(02) + 1 ... 5 so far
T(02) = T(01) + 1 ... 6 so far
T(01) = 1 ... 7 total
So, we can see that the algorithm took '1' time at each step. And, since each step divides the input in half, the total work is the number of times the algorithm had to divide the input in two... which is log2 n.
Next, let's consider the case where
T(n) = 2T(n/2) + O(1)
However, to make things simpler, we'll build up from the base case T(1) = 1.
T(01) = 1 ... 1 so far
now we have to do T(01) twice and then add one, so
T(02) = 2T(01) + 1 ... (1*2)+1 = 3
now we have to do T(02) twice, and then add one, so
T(04) = 2T(02) + 1 ... (3*2)+1 = 7
T(08) = 2T(04) + 1 ... (7*2)+1 = 15
T(16) = 2T(08) + 1 ... (15*2)+1 = 31
T(32) = 2T(16) + 1 ... (31*2)+1 = 63
T(64) = 2T(32) + 1 ... (63*2)+1 = 127
So we can see that here the algorithm has done 127 units of work, which is equal to the input multiplied by a constant (2) plus a constant (-1): that is, O(n). Basically, this recursion corresponds to the geometric series n * (1 + 1/2 + 1/4 + 1/8 + ...), which sums to 2n.
Try using this method on T(n) = 2T(n/2) + n and see if it makes more sense to you.
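Taking up that suggestion, here is the same tallying method written as a short Python sketch (my own, assuming the base case T(1) = 1), which lets you watch the n log n growth appear:

def work(n):
    # T(n) = 2T(n/2) + n: the n term is the work done at this level
    total = n
    if n > 1:
        total += 2 * work(n // 2)  # two subproblems of half the size
    return total

for n in (2, 4, 8, 16, 32, 64):
    print(n, work(n))  # matches n*log2(n) + n, i.e. O(n log n)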
One visual way to find T(n) for a recursive equation is to sketch it as a tree; then:
T(n) = (number of nodes) * (time spent at each node)
In your case, T(n) = 2T(n/2) + 1.
I write the 1 in the node itself and expand it into two T(n/2) nodes.
Note that T(n/2) = 2T(n/4) + 1, so I do the same for each of them:
T(n) + 1
/ \
T(n/2)+1 T(n/2)+1
/ \ / \
T(n/4)+1 T(n/4)+1 T(n/4)+1 T(n/4)+1
... ... .. .. .. .. .. ..
T(1) T(1) .......... ............T(1)
In this tree the number of leaves equals
2^(height of tree) = 2^(log n) = n
and a full binary tree has about twice as many nodes as leaves. Then T(n) = (number of nodes) * 1 = 2n - 1 = O(n).
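The same count can be done programmatically. Here is a tiny Python sketch (my own, assuming each T(1) leaf costs 1) that tallies the nodes of the recursion tree above:

def tree_nodes(n):
    if n <= 1:
        return 1                       # a T(1) leaf
    return 1 + 2 * tree_nodes(n // 2)  # this node plus its two T(n/2) subtrees

print(tree_nodes(64))  # 127, about 2n nodes, so T(n) = O(n)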
