A divide and conquer algorithm solves a problem of size n by dividing it into 2
subproblems, each of size n-1, and takes O(n) time to combine their solutions. What is the runtime of this algorithm?
I'm not quite sure how to structure this recurrence relation and determine what the runtime is. Is the following relation correct?
T(n) = 2T(n-1) + O(n)
And if so, how can I get the runtime from it?
Thank you so much!
Yes, your recurrence relation correctly describes your problem. To make things concrete, let's say the recurrence is T(n) = 2T(n-1) + n (that is, +n rather than +O(n)).
Then, telescoping the recurrence (and assuming T(0) = 0):
T(n) = n + 2(n-1) + 4(n-2) + 8(n-3) + ... + 2^n(n-n)
= (1 + 2 + 4 + ... + 2^n)n - (0*2^0 + 1*2^1 + ... + n*2^n)
= n*(2^(n+1)-1) - 2(n*2^n-2^n+1)
= 2^(n+1) - n - 2
Checking this is correct:
2T(n-1) + n
= 2(2^n - (n-1) - 2) + n
= (2^(n+1) - 2n + 2 - 4) + n
= 2^(n+1) - n - 2
= T(n)
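If you want to sanity-check this numerically as well, here is a quick sketch of mine (using T(0) = 0, as above) that compares the recurrence against the closed form:

    # T(n) = 2T(n-1) + n with T(0) = 0,
    # checked against the closed form 2^(n+1) - n - 2.
    def T(n):
        return 0 if n == 0 else 2 * T(n - 1) + n

    for n in range(11):
        assert T(n) == 2 ** (n + 1) - n - 2
    print("closed form matches for n = 0..10")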
Related
Can anybody please explain the time complexity of T(n) = 2T(n/4) + O(1) using a recurrence tree? I saw somewhere that it is O(n^(1/2)).
Just expand the equation for a few iterations, and use mathematical induction to prove the observed pattern:
T(n) = 2T(n/4) + 1 = 2(2T(n/4^2) + 1) + 1 = 2^2 T(n/4^2) + 2 + 1
Hence:
T(n) = 1 + 2 + 2^2 + ... + 2^k = 2^(k+1) - 1 \in O(2^(k+1))
What is k? From the expansion, 4^k = n, so k = (1/2) log(n). Thus, T(n) \in O(2^((1/2) log(n) + 1)) = O(sqrt(n)). Note that 2^(log(n)) = n (logs here are base 2).
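For a concrete check, here is a small sketch of mine (assuming a base case T(1) = 1 and restricting n to powers of 4 so that n/4 stays an integer); the values track sqrt(n) exactly:

    import math

    # T(n) = 2T(n/4) + 1 with the assumed base case T(1) = 1.
    def T(n):
        return 1 if n <= 1 else 2 * T(n // 4) + 1

    for k in range(1, 10):
        n = 4 ** k
        assert T(n) == 2 * math.isqrt(n) - 1   # i.e. T(n) = Theta(sqrt(n))
    print("T(n) = 2*sqrt(n) - 1 for n = 4, 16, ..., 4^9")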
I am trying to solve the following recurrence:
$T(n) = 3T(n^{\frac{2}{3}}) + \log n$
but am not sure how to do so, since the Master theorem does not apply. I tried to draw the recursion tree, but am not sure where to go from there, such as how to figure out the height of the tree or the number of nodes in the last level. Any guidance on how to find the overall big-theta of the recurrence would be appreciated.
Expanding the formula, we have:
T(n) = 3 log(n^{2/3}) + 3^2 log(n^((2/3)^2)) + ... + 3^k log(n^((2/3)^k)) + log(n)
In the above equation, k is the height of the tree. If we suppose n = 2^((3/2)^k), then n^((2/3)^k) = 2 at the bottom level, hence k = log_{3/2}(log(n)). Also, since log(n^a) = a log(n), each term 3^i log(n^((2/3)^i)) equals 2^i log(n):
T(n) = 2 log(n) + 2^2 log(n) + ... + 2^k log(n) + log(n) =
log(n) (1 + 2 + 2^2 + ... + 2^k) =
(2^(k+1) - 1) log(n)
Hence, since 2^k = (log(n))^(log_{3/2}(2)) ≈ log^1.71(n) = O(log^2(n)), we get T(n) = O(log^2(n) * log(n)) = O(log^3(n)).
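To see the bound numerically, here is a small sketch of mine (with an assumed cutoff T(n) = 0 for n <= 2); the ratio T(n) / log^3(n) stays bounded:

    import math

    # T(n) = 3T(n^(2/3)) + log2(n), with an assumed cutoff T(n) = 0 for n <= 2.
    def T(n):
        return 0.0 if n <= 2 else 3 * T(n ** (2 / 3)) + math.log2(n)

    for k in (10, 20, 40, 80):
        n = 2.0 ** k
        print(k, round(T(n) / math.log2(n) ** 3, 3))   # ratio stays bounded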
How can I solve T(n) = T(n-3) + n^2 using iteration? By the master theorem the answer is O(n^3), but I am having trouble solving it by iteration.
By direct resolution of the recurrence:
This is a linear recurrence of the first order. We first solve the homogeneous part,
T(n) = T(n - 3)
which is solved by a constant (more precisely three constants as three intertwined sequences form the solution).
Now for the non-homogeneous part, we use the Ansatz T(n) = an³ + bn² + cn + d, because we know that the difference of a cubic polynomial evaluated at n and at n-3 is a quadratic polynomial.
Then
a(n³ - (n-3)³) + b(n² - (n-3)²) + c(n - (n-3)) = 9an² + 3(-9a + 2b)n + 3(9a - 3b + c) = n²
gives
a = 1/9, b = 1/2, c = 1/2.
Finally
T(n) = (2n³ + 9n² + 9n)/18 + T(0)
and similarly for the two other sequences.
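As a quick numerical double-check (my own sketch, not part of the derivation), the closed form can be compared against direct iteration on the n ≡ 0 (mod 3) subsequence, taking T(0) = 0:

    # T(n) = T(n-3) + n^2 on the subsequence n = 0, 3, 6, ...,
    # checked against (2n^3 + 9n^2 + 9n)/18 + T(0), assuming T(0) = 0.
    def T(n):
        return 0 if n <= 0 else T(n - 3) + n * n

    for n in range(0, 31, 3):
        assert 18 * T(n) == 2 * n ** 3 + 9 * n ** 2 + 9 * n
    print("closed form matches for n = 0, 3, ..., 30")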
Just try to expand the equation:
T(n) = n^2 + (n-3)^2 + (n-6)^2 + ... (down to the base case) = \Theta(n^3), since there are about n/3 terms, each at most n^2, and the first half of them are each at least (n/2)^2.
T(3) = T(0) + 3²
T(6) = T(3) + 6² = T(0) + 3² + 6²
T(9) = T(6) + 9² = T(0) + 3² + 6² + 9²
...
More generally, T(3N) is the sum of T(0) and nine times the sum of the squares of the naturals up to N. The well-known Faulhaber formula gives N(N+1)(2N+1)/6 = Θ(N³) for that sum, and since n = 3N this is Θ(n³).
Similar results hold for T(3N+1) and T(3N+2).
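If you want to verify the pattern, here is a tiny sketch of mine (assuming T(0) = 0) checking the identity T(3N) = T(0) + 9 * (1^2 + 2^2 + ... + N^2):

    # T(3N) = T(0) + 9 * sum of the first N squares, assuming T(0) = 0.
    def T(n):
        return 0 if n <= 0 else T(n - 3) + n * n

    for N in range(1, 11):
        assert T(3 * N) == 9 * sum(i * i for i in range(1, N + 1))
    print("identity holds for N = 1..10")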
How do I determine the running time (in terms of Big-Theta) for the algorithm of input size n that satisfies recurrence relation T(n) = T(n-1)+n where n >= 1 and with initial condition T(1) = 1?
Edit: I was practicing a past exam paper. Got stuck on this question. Need guidance
Look at it this way: T(n) = T(n-1) + n = T(n-2) + (n-1) + n = T(n-3) + (n-2) + (n-1) + n. Continuing down to the base case T(1) = 1, you get T(n) = 1 + 2 + 3 + ... + n. This series sums to n(n+1)/2. Therefore, T(n) = Θ(n^2).
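A quick sketch (mine, not part of the exam question) confirming the arithmetic-series sum:

    # T(n) = T(n-1) + n with T(1) = 1, i.e. T(n) = 1 + 2 + ... + n.
    def T(n):
        return 1 if n == 1 else T(n - 1) + n

    for n in range(1, 21):
        assert T(n) == n * (n + 1) // 2
    print("T(n) = n(n+1)/2 for n = 1..20, hence Theta(n^2)")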
I thought it would be something like this...
T(n) = 2T(n-1) + O(n)
= 2(2T(n-2)+(n-1)) + (n)
= 2(2(2T(n-3)+(n-2))+(n-1))+(n)
= 8T(n-3) + 4(n-2) + 2(n-1) + n
Which ends up being something like the summation of 2^i * (n-i), and my book says this ends up being O(2^n). Could anybody explain this to me? I don't understand why it's 2^n and not just O(n) as the (n-i) will continue n times.
This recurrence has already been solved on Math Stack Exchange. Solving it myself, I get:
T(n) = n + 2(T(n-1))
= n + 2(n - 1 + 2T(n-2)) = 3n - 2 + 2^2(T(n-2))
= 3n - 2 + 4(n - 2 + 2(T(n-3))) = 7n - 10 + 2^3(T(n-3))
= 7n - 10 + 8(n - 3 + 2(T(n-4))) = 15n - 34 + 2^4(T(n-4))
= (2^4 - 1)n - 34 + 2^4(T(n-4))
...and so on.
Effectively the recurrence boils down to:
T(n) = 2^(n+1) * T(1) − n − 2
See the Math Stack Exchange link for how we arrive at this solution. Taking T(1) to be constant, the dominating factor in the above recurrence is 2^(n+1).
Therefore, the rate of growth of the given recurrence is O(2^n).
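To see concretely why the expansion is exponential, here is a short sketch of mine (assuming T(1) = 1) comparing the sum of 2^i * (n - i) terms with the recurrence and the closed form:

    # T(n) = 2T(n-1) + n with T(1) = 1.
    # The expansion sum_{i=0}^{n-1} 2^i * (n - i) (its last term is 2^(n-1) * T(1))
    # equals the closed form 2^(n+1) - n - 2, which is why the growth is O(2^n).
    def T(n):
        return 1 if n == 1 else 2 * T(n - 1) + n

    for n in range(1, 16):
        expanded = sum(2 ** i * (n - i) for i in range(n))
        assert T(n) == expanded == 2 ** (n + 1) - n - 2
    print("sum of 2^i * (n-i) equals 2^(n+1) - n - 2 for n = 1..15")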