Recursion Tree Method to solve a Recurrence equation - algorithm

How to solve the recurrence relation T(n)= 2T(n/2) + nlogn using recursion tree?

After expansion we will have:
T(n) = 2(2T(n/2^2) + n/2 log(n/2)) + nlog(n) = 2^2 T(n/2^2) + n log(n/2) + n log(n)
= 2^2 (2T(n/2^3) + (n/2^2) log(n/2^2)) + n log(n/2) + n log(n)
= 2^3T(n/2^3) + nlog(n/2^2) + n log(n/2) + n log(n)
Hence, using induction, we will have:
T(n) = n (log(n) + log(n/2) + log(n/2^2) + ... + log(n/2^log(n)))
= n * sum of (log(n) - i) for i from 0 to log(n)
= n ((log(n) + 1) log(n) - log(n) (log(n) + 1)/2)
= n log(n) (log(n) + 1)/2
= Theta(n (log(n))^2)
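As a sanity check, the recurrence can be evaluated numerically and compared against the closed form. A minimal sketch, assuming the base case T(1) = 0 and n a power of two:

```python
from math import log2

def T(n):
    # Recurrence T(n) = 2*T(n/2) + n*log2(n), with T(1) = 0 as an assumed base case
    if n == 1:
        return 0
    return 2 * T(n // 2) + n * log2(n)

# Closed form from the derivation above: T(n) = n * (log2(n)^2 + log2(n)) / 2
for n in [2, 16, 256, 4096]:
    k = log2(n)
    assert abs(T(n) - n * (k * k + k) / 2) < 1e-6
```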

Related

Complexity of T(n) = 2T(n/2) + n/2 (without master's theorem)?

I am looking at the best-case running time of merge sort and have found the following recurrence relation: T(n) = 2T(n/2) + n/2. I am aware that merge sort is Theta(n log n) in all cases. In attempting to solve this recurrence relation, I use telescoping:
T(n) = 2*T(n/2) + n/2
T(n) = 2^2*T(n/4) + n/4 + n/2
T(n) = 2^k*T(1) + (n/2 + n/4 + ... + n/2^k)
2^k = n -> log_2(n) = k
T(n) = n + n(1/2 + 1/4 + ... + 1/n)
I am unsure how to solve the summation in the last step, and I'm not even sure it is correct. My thinking is that there would be log_2(n) total terms in the summation. How can I show that T(n) = 2T(n/2) + n/2 is Theta(n log n) without using the Master Theorem?
As pointed out in the comment, your calculation seems to be wrong.
T(n) = 2*T(n/2) + n/2
T(n) = 2*(2*T(n/4) + n/4) + n/2 = 4*T(n/4) + 2*(n/4) + n/2 = 4*T(n/4) + 2*(n/2)
T(n) = 4*(2*T(n/8) + n/8) + 2*(n/2) = 8*T(n/8) + (n/2) + 2*(n/2) = 8*T(n/8) + 3*(n/2)
...
T(n) = 2^k * T(n / 2^k) + k*(n/2), 2^k = n ---> k = log(n)
T(n) = n * T(1) + log(n) * (n/2)
T(n) = n + n*log(n)/2 (taking T(1) = 1)
Therefore, the time complexity of merge sort is O(n*log(n)).
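The corrected telescoping can be double-checked numerically; a small sketch, assuming T(1) = 1 and n a power of two:

```python
from math import log2

def T(n):
    # Best-case merge sort recurrence T(n) = 2*T(n/2) + n/2, with T(1) = 1 assumed
    if n == 1:
        return 1
    return 2 * T(n // 2) + n // 2

# Exact solution from the expansion: T(n) = n*T(1) + (n/2)*log2(n)
for n in [2, 8, 64, 1024]:
    assert T(n) == n + (n // 2) * log2(n)
```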

Time complexity using recursion tree method

I've been trying to solve the given problem using the recursion tree method, but my answer does not come out in the given form.
T(n) = 8T(n/2) + n^2
The answer to the given problem is Theta(n^3).
Try to expand the equation:
T(n) = 8 T(n/2) + n^2
T(n) = 8(8T(n/4) +(n/2)^2) + n^2 = 8^2T(n/4) + n^2 + 8 (n/2)^2
T(n) = 8^3T(n/8) + n^2 + 8 (n/2)^2 + 8^2 (n/4)^2
Now you can generalize the above sum:
T(n) = sum 8^(i) (n/2^i)^2 for i from 0 to log(n)
Simplify:
T(n) = sum 2^(3i) n^2/2^(2i) for i from 0 to log(n)
T(n) = sum 2^i n^2 for i from 0 to log(n)
T(n) = n^2 (sum 2^i for i from 0 to log(n))
T(n) = n^2 * (2^(log(n)+1) - 1) = n^2 * (2n - 1) = Theta(n^3)
In the above, you should be aware that sum 2^i for i from 0 to log(n) is 1 + 2 + 2^2 + ... + 2^(log(n)) = 2^(log(n) + 1) - 1 = 2n - 1.
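The Theta(n^3) result can be verified by evaluating the recurrence directly; a short sketch, assuming T(1) = 1 and n a power of two:

```python
def T(n):
    # Recurrence T(n) = 8*T(n/2) + n^2, with T(1) = 1 assumed as the base case
    if n == 1:
        return 1
    return 8 * T(n // 2) + n * n

# Closed form from the geometric series above: T(n) = n^2 * (2n - 1) = Theta(n^3)
for n in [2, 4, 32, 256]:
    assert T(n) == n * n * (2 * n - 1)
```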

solving this relation using the Recurrence method?

I don't know how to continue with this recurrence because I don't see a pattern. Any help?
T(n) = 2n + T(n/2)
= 3n + T(n/4)
= 7n/2 + T(n/8)
= 15n/4 + T(n/16)
and so on...
As I understand it, this is just a simple recurrence.
T(n) = 2n + T(n/2)
Your notation can be misleading; written out fully, it should be:
T(n) = 2n + T(n/2) ....(1)
T(n/2) = 2(n/2) + T(n/2/2) = n + T(n/4)
T(n) = 2n + n + T(n/4) = 3n + T(n/4) ....(2)
T(n/4) = 2(n/4) + T(n/4/2) = n/2 + T(n/8)
T(n) = 2n + n + n/2 + T(n/8) = 7n/2 + T(n/8) ....(3)
T(n/8) = 2(n/8) + T(n/8/2) = n/4 + T(n/16)
T(n) = 2n + n + n/2 + n/4 + T(n/16) = 15n/4 + T(n/16) ....(4)
T(n/16) = 2(n/16) + T(n/16/2) = n/8 + T(n/32)
T(n) = 15n/4 + n/8 + T(n/32) = 31n/8 + T(n/32) ....(5)
and so on...
This is a common recurrence relation - if you are a CS student, you will soon know the result by heart.
If you want to find the result by hand, make a geometric sum appears from the recurrence:
T(n) = 2n + n + n/2 + ... + n/2^(k+1) + T(0)
= 2n(1 + 1/2+ ... + 1/2^(k+2)) + T(0)
Where k = INT(log2(n))
A geometric series with common ratio 1/2 appears:
1 + 1/2 + ... + 1/2^(k+2) = (1 - 1/2^(k+3)) / (1 - 1/2)
Observing that 2^(k+3) = 8 * 2^(log2(n)) = 8n and simplifying,
T(n) = 4n + T(0) - 1/2 = Theta(n)
In addition to expanding the series down to T(0), as shown by Alexandre Dupriez, you can also apply the Master Theorem to solve it.
For the recurrence equation
T(n) = 2n + T(n/2)
Master Theorem:
For recurrences of form,
T(n) = a T(n/b) + f(n)
where a >= 1 and b > 1
If f(n) is O(n^c), then:
if c < log_b(a), then T(n) = O(n^(log_b(a)))
if c = log_b(a), then T(n) = O(n^c * log n)
if c > log_b(a), then T(n) = O(n^c)
Here we have a = 1, b = 2, c = 1, and c > log_b(a) (case 3), since log_b(a) = log_2(1) = 0.
Therefore, T(n) = O(n^1)
T(n) = O(n)
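As a quick numerical confirmation of the linear bound, the recurrence can be evaluated directly; a sketch assuming T(1) = 0 and n a power of two, in which case the geometric series sums to exactly 4n - 4:

```python
def T(n):
    # Recurrence T(n) = 2n + T(n/2), with T(1) = 0 assumed as the base case
    if n == 1:
        return 0
    return 2 * n + T(n // 2)

# For powers of two, the geometric series sums to exactly 4n - 4, i.e. Theta(n)
for n in [2, 16, 128, 4096]:
    assert T(n) == 4 * n - 4
```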

Finding the complexity T(n) = 4T(n/2) + (n^2)*logn using the iteration method

I need to find complexity of this recursion using the iteration method only:
T(n) = 4T(n/2) + (n^2)*logn
I know that you can solve this using the master method and the complexity is (n^2)(logn)^2, but I tried solving it using the iteration method and I got something else:
T(n) = 4 * T(n/2) + (n^2) * log(n)
T(n/2) = 4 * T (n/4) + ((n/2)^2) * log(n/2)
T(n/4) = 4 * T(n/8) + ((n/4)^2) * log(n/4)
T(n) = 4 * (4 * (4 * T(n/8) + (n/4)^2 * log(n/4)) + (n/2)^2 * log(n/2)) + (n^2) * log(n)
T(n) = 64T(n/8) + 16((n/4)^2) * log(n/4) + 4((n/2)^2) * log(n/2) + (n^2)log(n)
T(n) = (4^i) * T(n/(2^i)) + 4^(i-1) * (n/(2^(i-1)))^2 * log(n/(2^(i-1)))
After substituting i = log(n), I get that the algorithm has a complexity of 2^n, which is incorrect.
If you carefully unwind the recursion, you will get:
T(n) = 4^k T(n/2^k) + n^2 (log(n) + log(n/2) + ... + log(n/2^(k-1)))
Now the complicated sum becomes
n^2 * (k*log(n) - k*(k-1)/2)
This recursion will exhaust itself when n/2^k = 1, or k = log(n). Substituting it back into the equation, you get:
T(n) = c*n^2 + n^2*(log(n)^2 + log(n))/2, where c = T(1).
So everything is dominated by n^2 log^2(n), and this is the complexity of your recursion.
P.S. There is actually no need to approximate the sum; it can be computed exactly with elementary math.
One can go further in distributing equivalent terms on both sides.
T(n)/n^2 = T(n/2)/(n/2)^2 + log(n)
was already found. Now to get a term in log(n) on the left and the same term in log(n/2)=log(n)-1 on the right, consider the squares of both, by the binomial formula
(log(n)-1)^2 = log(n)^2 - 2*log(n) + 1
so that
T(n)/n^2 - log(n)^2/2 - log(n)/2 = T(n/2)/(n/2)^2 - log(n/2)^2/2 - log(n/2)/2
(the extra log(n)/2 term compensates for the cross term -2*log(n) of the binomial). As the same expression now appears on both sides, once in n and once in n/2, it is constant, say C, and therefore
T(n) = n^2 * (1/2*log(n)^2 + 1/2*log(n) + C) = Theta(n^2 * log(n)^2)
This does not match any of the three basic cases of the Master Theorem directly. However, the extended version, for f(n) = Theta(n^c * (log n)^k) with c = log_b(a) (here c = 2, k = 1), gives
T(n) = Theta(n^2 * (log n)^2)
if T(n/2) = 4T(n/(2^2)) + ((n/2)^2)*log (n/2) ----> 1,
T(n/4) = 4T(n/(2^3)) + ((n/4)^2)*log (n/4) ----> 2
and
T(n/8) = 4T(n/(2^4)) + ((n/8)^2)*log (n/8) ----> 3,
T(n) = 4T(n/2) + (n^2)*log n
T(n) = 4[4T(n/(2^2)) + ((n/2)^2)*log (n/2)] + (n^2)*log n ----> replace 1 with T(n/2)
T(n) = (4^2)T(n/4) + (n^2)*log (n/2) + (n^2)*log n
T(n) = (4^2)[4T(n/(2^3)) + ((n/4)^2)*log (n/4)] + (n^2)*log (n/2) + (n^2)*log n ----> replace 2 with T(n/4)
T(n) = (4^3)T(n/8) + (n^2)*log (n/4) + (n^2)*log (n/2) + (n^2)*log n ----> replace 3 with T(n/8)
T(n) = (4^3)[4T(n/(2^4)) + ((n/8)^2)*log (n/8)] + (n^2)*log (n/4) + (n^2)*log (n/2) + (n^2)*log n
T(n) = (4^4)T(n/16) + (n^2)*log (n/8) + (n^2)*log (n/4) + (n^2)*log (n/2) + (n^2)*log n
if this goes on till step k,
T(n) = (4^k)T(n/(2^k)) + (n^2) (log (n/(2^(k-1))) + ... + log (n/4) + log (n/2) + log n)
if n/(2^k) = 1, n = 2^k, k = log n and T(1) = 1,
T(n) = (n^2) + (n^2) ((log n - (k-1)) + ... + (log n - 1) + log n)
T(n) = (n^2) + (n^2) (k*log n - k*(k-1)/2)
T(n) = (n^2) + (n^2) ((log n)^2 + log n)/2
T(n) = O((n^2) (log n)^2)
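The Theta(n^2 log^2 n) bound can be confirmed numerically; a sketch assuming T(1) = 1, logs base 2, and n a power of two:

```python
from math import log2

def T(n):
    # Recurrence T(n) = 4*T(n/2) + n^2 * log2(n), with T(1) = 1 assumed
    if n == 1:
        return 1
    return 4 * T(n // 2) + n * n * log2(n)

# Exact solution for powers of two: T(n) = n^2 * (1 + (log2(n)^2 + log2(n)) / 2)
for n in [2, 4, 64, 1024]:
    k = log2(n)
    assert abs(T(n) - n * n * (1 + (k * k + k) / 2)) < 1e-6
```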

solving recurrence T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2

Need some help on solving this runtime recurrence, using Big-Oh:
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
I don't quite get how to use the Master Theorem here
For n big enough you can assume T(n/2 - 1) ≈ T(n/2), so you can change
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
into
T(n) = 2*T(n/2) + n/2 + 2
And use Master Theorem (http://en.wikipedia.org/wiki/Master_theorem) for
T(n) = a*T(n/b) + f(n)
a = 2
b = 2
f(n) = n/2 + 2
c = 1
k = 0
log_b(a) = log_2(2) = 1 = c
and so you have (case 2, since log_b(a) = c)
T(n) = O(n**c * log(n)**(k + 1))
T(n) = O(n * log(n))
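Since the Master Theorem is applied here after approximating T(n/2 - 1) by T(n/2), a numerical check is reassuring; a sketch with assumed base cases T(0) = T(1) = 1 and floor division for the halving:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2, with floor division
    # and T(0) = T(1) = 1 assumed as base cases
    if n <= 1:
        return 1
    return T(n // 2) + T(n // 2 - 1) + n // 2 + 2

# The ratio T(n) / (n * log2(n)) stays bounded, consistent with O(n log n)
ratios = [T(n) / (n * log2(n)) for n in [64, 256, 1024, 4096]]
assert all(0.4 < r < 1.0 for r in ratios)
```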
