Calculate the complexity of an algorithm using back substitution

I have the function :
int ps(int n)
{
    if (n == 1) return 1;                          /* base case */
    return Extra(n) + ps(n / 4) + ps(n / 4);       /* O(n) work plus two recursive calls on n/4 */
}
Extra(n) is O(n)
I have tried to find T(n) for this function, which I wrote as T(n) = T(n) + 2T(n/4), and using the master theorem I calculated the complexity to be O(n).
But I don't know how to find the complexity using back substitution.

First, your recurrence is wrong: you didn't account for Extra(n), which is O(n), when writing the time complexity. The correct recurrence is T(n) = 2T(n/4) + n. This recurrence is easy to solve by substitution:
T(n) = 2T(n/4) + n = 2(2T(n/16) + n/4) + n = 2^2 T(n/16) + n/2 + n =
2^2 (2T(n/64) + n/16) + n/2 + n = 2^3 T(n/64) + n/4 + n/2 + n
Now, by mathematical induction, if we suppose n = 4^k, after k substitutions you reach T(1) and find that:
T(n) = 2^k T(1) + n (1 + 1/2 + 1/4 + ... + 1/2^(k-1)) <= sqrt(n) + 2n
The first term is 2^k = 2^(log_4(n)) = sqrt(n) (taking T(1) = 1), and the parenthesized sum is a geometric series with ratio 1/2, so it is less than 2. Hence, T(n) is in Theta(n), and in particular in O(n).
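To double-check this result numerically (a quick sketch of my own, not part of the original answer; it assumes T(1) = 1), you can evaluate the recurrence directly in C and watch the ratio T(n)/n, which should stay bounded if T(n) is Theta(n):

#include <stdio.h>

/* Evaluate T(n) = 2*T(n/4) + n with T(1) = 1, straight from the recurrence. */
long long T(long long n)
{
    if (n <= 1) return 1;
    return 2 * T(n / 4) + n;
}

int main(void)
{
    /* If T(n) is Theta(n), T(n)/n stays bounded; here it approaches 2 from below. */
    for (long long n = 4; n <= 1 << 20; n *= 4)
        printf("n = %8lld  T(n) = %9lld  T(n)/n = %.4f\n", n, T(n), (double)T(n) / n);
    return 0;
}

The printed ratio is 2 - 1/sqrt(n), which matches the closed form T(n) = 2n - sqrt(n) derived above.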

Related

What is the time complexity of T(n) = 2T(n/4) + O(1)? (Without the master theorem)

Can anybody please explain the time complexity of T(n) = 2T(n/4) + O(1) using a recurrence tree? I saw somewhere that it is O(n^(1/2)).
Just expand the recurrence for a few iterations, and use mathematical induction to prove the observed pattern:
T(n) = 2T(n/4) + 1 = 2(2T(n/4^2) + 1) + 1 = 2^2 T(n/4^2) + 2 + 1
Hence:
T(n) = 1 + 2 + 2^2 + ... + 2^k = 2^(k+1) - 1, which is in O(2^(k+1)). (The 2^k term is the base case, taking T(1) = 1.)
What is k? From the expansion, 4^k = n, so k = (1/2) log(n). Thus, T(n) is in O(2^((1/2) log(n) + 1)) = O(sqrt(n)). Note that 2^(log(n)) = n (all logs base 2).
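As a sanity check (my own sketch, not from the original answer; it assumes T(1) = 1), the same recurrence can be evaluated in C. If the analysis is right, T(n)/sqrt(n) should settle near a constant:

#include <stdio.h>
#include <math.h>

/* Evaluate T(n) = 2*T(n/4) + 1 with T(1) = 1. */
long long T(long long n)
{
    if (n <= 1) return 1;
    return 2 * T(n / 4) + 1;
}

int main(void)
{
    /* Theta(sqrt(n)) means T(n)/sqrt(n) is bounded; here it approaches 2. */
    for (long long n = 4; n <= 1 << 20; n *= 4)
        printf("n = %8lld  T(n) = %7lld  T(n)/sqrt(n) = %.4f\n",
               n, T(n), T(n) / sqrt((double)n));
    return 0;
}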

Solving recurrence using the iteration method

I need help solving T(n) = T(n/4) + T(n/3) + 2n using the iteration method (recursion tree). I am thinking it would be either Θ(2n) or Θ(n)?
It's straightforward. Assuming T is increasing, we have the following two inequalities:
T(n) > 2T(n/4) + 2n
and
T(n) < 2T(n/3) + 2n
Now try to find the upper and lower bounds by expansion. Combining both cases, you will find that T(n) = Theta(n).
For example, for the upper bound T'(n) = 2T'(n/3) + 2n we have the following expansion:
T'(n) = 2T'(n/3) + 2n = 2^2 T'(n/3^2) + (1 + 2/3) * 2n
By induction we can show that:
T'(n) = 2^(log_3(n)) T'(1) + (1 + 2/3 + 2^2/3^2 + ...) * 2n
< n + 6 * n = 7n
because 2^(log_3(n)) < 2^(log_2(n)) = n (taking T'(1) = 1), and (1 + 2/3 + 2^2/3^2 + ...) is a geometric series with ratio 2/3, whose sum tends to 1/(1 - 2/3) = 3 as n goes to infinity.
You can do the same analysis for the lower bound of T(n).
Therefore, as c_1 * n <= T(n) <= c_2 * n, we can conclude that T(n) is in Theta(n).
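A small numeric check (my own sketch, with the assumptions T(n) = 1 for n <= 1 and integer division) makes the sandwich visible: T(n)/n stays between two positive constants as n grows.

#include <stdio.h>

/* Evaluate T(n) = T(n/4) + T(n/3) + 2n with T(n) = 1 for n <= 1 (integer division). */
double T(long long n)
{
    if (n <= 1) return 1.0;
    return T(n / 4) + T(n / 3) + 2.0 * n;
}

int main(void)
{
    /* Theta(n) predicts that T(n)/n is squeezed between two positive constants. */
    for (long long n = 10; n <= 10000000; n *= 10)
        printf("n = %9lld  T(n)/n = %.4f\n", n, T(n) / n);
    return 0;
}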

Complexity of T(n) = 2T(n/2) + n/2 (without master's theorem)?

I am looking at the best-case running time of merge sort and have found the following recurrence relation: T(n) = 2T(n/2) + n/2. I am aware that merge sort is Θ(n log n) in all cases. In attempting to solve this recurrence relation, I use telescoping:
T(n) = 2*T(n/2) + n/2
T(n) = 2^2*T(n/4) + n/4 + n/2
T(n) = 2^k*T(1) + (n/2 + n/4 + ... + n/2^k)
2^k = n -> log_2(n) = k
T(n) = n + n(1/2 + 1/4 + ... + 1/n)
I am unsure how to solve the summation in the last part, and I'm not even sure it is correct. My thinking is that there would be log_2(n) total terms in the summation? How can I derive that 2T(n/2) + n/2 is Θ(n log n) without using the master theorem?
As pointed out in the comment, your calculation seems to be wrong.
T(n) = 2*T(n/2) + n/2
T(n) = 2*(2*T(n/4) + n/4) + n/2 = 4*T(n/4) + 2*(n/4) + n/2 = 4*T(n/4) + 2*(n/2)
T(n) = 4*(2*T(n/8) + n/8) + 2*(n/2) = 8*T(n/8) + (n/2) + 2*(n/2) = 8*T(n/8) + 3*(n/2)
...
T(n) = 2^k * T(n / 2^k) + k*(n/2), 2^k = n ---> k = log(n)
T(n) = n * T(1) + log(n) * (n/2)
T(n) = n + n*log(n)/2, taking T(1) = 1
Therefore the time complexity of merge sort is Θ(n*log(n)).
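You can confirm the closed form with a short C program (my own check, not part of the original answer; it assumes T(1) = 1 and n a power of two): the recurrence value matches n + (n/2)*log2(n) exactly.

#include <stdio.h>
#include <math.h>

/* Evaluate T(n) = 2*T(n/2) + n/2 with T(1) = 1. */
long long T(long long n)
{
    if (n <= 1) return 1;
    return 2 * T(n / 2) + n / 2;
}

int main(void)
{
    /* The derivation above predicts T(n) = n + (n/2)*log2(n) for powers of two. */
    for (long long n = 2; n <= 1 << 20; n *= 2) {
        long long k = llround(log2((double)n));
        printf("n = %8lld  T(n) = %10lld  n + (n/2)*k = %10lld\n",
               n, T(n), n + n / 2 * k);
    }
    return 0;
}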

Proving the complexity of recursive determinant

I want to prove that the complexity of the recursive (Laplace/cofactor expansion) determinant algorithm is n!. Can anyone prove it for me? I don't see how it could be n!, given that the recurrence T(n) = nT(n-1) + 3n - 1 only involves multiplication and addition.
Try to expand:
T(n) = n T(n-1) + 3n-1 =
n ((n-1)T(n-2) + 3(n-1)-1) + 3n-1 =
n (n-1) T(n-2) + 3 n (n-1) - n + (3n - 1)
Now by induction you can show that (if T(1) = 1):
T(n) = n(n-1)(n-2)···1 + 3(n + n(n-1) + ... + n!) - (1 + n + n(n-1) + ... + n(n-1)···3)
= Theta(n!)
Each parenthesized sum is itself Theta(n!): dividing it by n! gives a sum of factorial reciprocals (1 + 1/2! + 1/3! + ...), which is bounded by a constant. Hence T(n) = Theta(n!).
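To see the Theta(n!) behavior concretely, here is a short iterative check (my sketch, assuming T(1) = 1): the ratio T(n)/n! quickly settles near a constant instead of growing, which is exactly what Theta(n!) means.

#include <stdio.h>

int main(void)
{
    /* Iterate T(n) = n*T(n-1) + 3n - 1 with T(1) = 1 and track T(n)/n!. */
    double T = 1.0, factorial = 1.0;
    for (int n = 2; n <= 20; n++) {
        T = n * T + 3.0 * n - 1.0;
        factorial *= n;
        printf("n = %2d  T(n)/n! = %.6f\n", n, T / factorial);
    }
    return 0;
}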

Time complexity using recursion tree method

I've been trying to solve the given problem using the recursion tree method, but my answer does not come out in the same form:
T(n) = 8T(n/2) + n^2
The answer to the given problem is Theta(n^3).
Try to expand the equation:
T(n) = 8 T(n/2) + n^2
T(n) = 8(8T(n/4) + (n/2)^2) + n^2 = 8^2 T(n/4) + n^2 + 8 (n/2)^2
T(n) = 8^3T(n/8) + n^2 + 8 (n/2)^2 + 8^2 (n/4)^2
Now you can generalize the above sum (taking T(1) = 1, so the i = log(n) term covers the base case):
T(n) = sum 8^i (n/2^i)^2 for i from 0 to log(n)
Simplify:
T(n) = sum 2^(3i) n^2/2^(2i) for i from 0 to log(n)
T(n) = sum 2^i n^2 for i from 0 to log(n)
T(n) = n^2 (sum 2^i for i from 0 to log(n))
T(n) = n^2 * (2^(log(n)+1) - 1) = n^2 * (2n - 1) = Theta(n^3)
In the above, you should be aware that sum 2^i for i from 0 to log(n) is 1 + 2 + 2^2 + ... + 2^(log(n)) = 2^(log(n) + 1) - 1 = 2n - 1.
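Here is a quick numeric confirmation (my sketch, with T(1) = 1 and n a power of two): the ratio T(n)/n^3 approaches 2, matching the closed form n^2 * (2n - 1).

#include <stdio.h>

/* Evaluate T(n) = 8*T(n/2) + n^2 with T(1) = 1. */
long long T(long long n)
{
    if (n <= 1) return 1;
    return 8 * T(n / 2) + n * n;
}

int main(void)
{
    /* The closed form n^2*(2n - 1) gives T(n)/n^3 = 2 - 1/n -> 2. */
    for (long long n = 2; n <= 1 << 12; n *= 2)
        printf("n = %5lld  T(n) = %15lld  T(n)/n^3 = %.4f\n",
               n, T(n), (double)T(n) / ((double)n * n * n));
    return 0;
}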
