Problems Solving Recurrence T(n) = 4T(n/4) + 3 log n - performance

I'm getting really frustrated trying to solve the recurrence above. I was trying to solve it using the Master Method, but I just couldn't get it done...
I have a recursive algorithm that takes 3 log n time (three binary searches) to identify four subproblems, each of size n/4, and then solves them individually until n is smaller than some constant given by input. So I got this recurrence as a result:
T(n) = 4*T(n/4) + 3*log(n)
Base case, if n < c (c = some constant given by program input):
T(n) = 1
I'm trying to find the asymptotic running time of my recursive program and wanted to solve the recurrence using the master theorem. Can anybody tell me whether the master theorem can be applied to this recurrence, and if so, which case of the master theorem applies?
All help is appreciated, thanks.

T(n) = Θ(n): here a = 4 and b = 4, so n^(log_b a) = n^(log_4 4) = n^1, and f(n) = 3 log n is O(n^0.5), with 0.5 < 1. This corresponds to the first case of the Master theorem as described here.
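To sanity-check this numerically, here is a small Python sketch (with an arbitrary cutoff of n < 4 standing in for the constant c, and powers of 4 so the subproblem sizes stay exact):

    # Sketch: evaluate T(n) = 4*T(n/4) + 3*log2(n) directly and watch
    # T(n)/n settle toward a constant, consistent with T(n) = Theta(n).
    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        if n < 4:                      # base case: T(n) = 1 below the cutoff
            return 1.0
        return 4 * T(n // 4) + 3 * math.log2(n)

    for k in range(4, 13):
        n = 4 ** k                     # powers of 4 keep subproblem sizes exact
        print(n, T(n) / n)             # ratio flattens out (around 3.67)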

Related

Solving by the Master theorem and by the recursion tree method gives different answers

Suppose we are given the recurrence relation T(n) = 2T(n/3) + n and need to find its time complexity.
My problem is that the master theorem and the recursion tree method give me different answers.
By the master theorem for an equation of the form T(n) = aT(n/b) + O(n^k · log^p n):
Here a = 2, b = 3, k = 1, p = 0, and since a < b^k (i.e. 2 < 3^1), the master theorem gives T(n) = O(n).
Here is my recursion tree working (image omitted). At the end I seriously don't know what I found out; it looks something like O(n · 2^(log n)).
Surely that's not the answer and I have messed it up somewhere, but I don't see where I went wrong. What is the right approach?
Also, I wanted to ask: are the recurrence relations T(n) = 2T(n/3) + n and T(n) = 2T(n/3) + C the same, where C is a constant in the second equation?
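For what it's worth, the per-level sums of the recursion tree can be checked numerically; this sketch shows they form a decreasing geometric series n·(2/3)^i, so the total is at most 3n and the tree method agrees with the master theorem. (As an aside, T(n) = 2T(n/3) + C is not the same recurrence: there each level's work grows like C·2^i, the leaves dominate, and T(n) = Θ(n^(log_3 2)).)

    # Sketch: per-level work in the recursion tree of T(n) = 2T(n/3) + n.
    # Level i has 2**i nodes, each doing n/3**i work, so the level sum is
    # n * (2/3)**i -- a *decreasing* geometric series; the root dominates.
    n = 3 ** 12
    total, i = 0.0, 0
    while n / 3 ** i >= 1:
        total += 2 ** i * (n / 3 ** i)   # work summed across level i
        i += 1
    print(total / n)                     # < 1/(1 - 2/3) = 3, so Theta(n)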

Find solution to recurrence: T(N) = 2 T(N/4 + √N) + (√10) N

While solving a complex recurrence equation like this one, T(N) = 2T(N/4 + √N) + √10·N with T(1) = 1,
I tried to make a change of variables to simplify it and solve it by the master theorem, but I failed. So I took the dominant term, which gives:
T(N) = 2T(N/4) + √10·N, so T(N) = Θ(N). Is that true or not?
Trying to unroll the recursion or to make a substitution left me nowhere. So the only thing I was able to do is to observe that for any sufficiently large N (above 64), √N < N/8. (You can select any number bigger than 4, not just 8.)
So you end up with
T(N) ≤ 2T(N/4 + N/8) + √10·N = 2T(3N/8) + √10·N.
Solving this with the master theorem (a = 2, b = 8/3, so N^(log_b a) ≈ N^0.71, and f(N) = √10·N grows polynomially faster), the f term dominates and the bound is Θ(N).
Therefore the solution is Θ(N), which is the same as you conjectured.
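A quick numeric check of the original, unsimplified recurrence supports this. The sketch below assumes T(N) = 1 below an arbitrary cutoff; note the cutoff must sit above 16/9, the fixed point of N → N/4 + √N, or the recursion never terminates:

    # Sketch: iterate T(N) = 2*T(N/4 + sqrt(N)) + sqrt(10)*N exactly
    # and check that T(N)/N levels off, i.e. T(N) = Theta(N).
    import math

    def T(N):
        if N <= 4:                       # arbitrary cutoff > 16/9, so we terminate
            return 1.0
        return 2 * T(N / 4 + math.sqrt(N)) + math.sqrt(10) * N

    for k in range(6, 27, 4):
        N = float(2 ** k)
        print(2 ** k, round(T(N) / N, 3))   # ratio flattens near a constant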

Solving recurrence equations with fractions using Recursion Tree Method

I'm trying to figure out how to solve recurrence equations, and I can do them easily using the recursion tree method if the equation is something like this, for example:
T(1) = 1;
T(n) = n + 2T(n/2) for n > 1
But I'm having trouble understanding how to solve equations for which the recurrence is modified by a fraction, like this for example:
T(1) = 1;
T(n) = n + (3/2)·T(0.9n) for n > 1
How can there be 3/2 of a branch in a tree? Is it impossible to solve this using recursion trees? Can anyone explain exactly how this would work in the recursion tree method? Or is there another method that would be easier for this form of equation?
How can there be 3/2 of a branch?
Easy: if you have 4 branches at step x, then at step x + 1 you will have 4 · 3/2 = 6 branches (if the numbers don't divide evenly, use the floor).
Can anyone explain exactly how this would work in the recursion tree method?
You unroll the recursion, write out the resulting (huge) sum, spot the pattern, and evaluate the sum.
Is there another method that would be easier for this form of equation?
Yes. People have done what I described in the previous step for the general recurrence T(n) = a·T(n/b) + f(n) and turned it into a theorem: the master theorem. All you need is to remember it (actually, you need to understand it) and you can solve any recurrence of this form. See the sketch below for what the unrolling looks like on your example.
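Here is a sketch of the unrolling for T(n) = n + (3/2)·T(0.9n); it carries the 3/2 "branches" as a weight instead of flooring. Level i contributes (3/2)^i · (0.9^i · n) = n · 1.35^i, a growing geometric series, so the deepest level dominates and T(n) = Θ(n^p) with p = log(3/2)/log(10/9) ≈ 3.85:

    # Sketch: unroll T(n) = n + (3/2)*T(0.9n) level by level.
    import math

    def T(n):
        total, weight = 0.0, 1.0
        while n > 1:
            total += weight * n          # f(n) work done at this level
            weight *= 1.5                # 3/2 "branches" carried as a weight
            n *= 0.9                     # subproblem shrinks to 0.9n
        return total + weight            # leaves contribute weight * T(1)

    p = math.log(1.5) / math.log(10 / 9) # predicted exponent, ~3.85
    for n in (10**3, 10**4, 10**5, 10**6):
        print(n, T(n) / n ** p)          # roughly constant, up to rounding wobble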

Solving T(n) = √2·T(n/2) + log n using the master theorem

The question is:
T(n) = √2·T(n/2) + log n
I'm not sure whether the master theorem works here, and I'm kind of stuck.
This looks more like a case for the Akra-Bazzi theorem (http://en.wikipedia.org/wiki/Akra%E2%80%93Bazzi_method#The_formula), with k = 1, h = 0, g(n) = log n, a = √2, b = 1/2. Then p = 1/2, and you need to evaluate the integral ∫_1^x log(u)/u^(3/2) du. You can use integration by parts, or a symbolic integrator. Wolfram Alpha gives the indefinite integral -2(log u + 2)/√u + C, so the definite integral is 4 - 2(log x + 2)/√x. Adding 1 and multiplying by √x, we get T(x) = Θ(5√x - 2·log x - 4) = Θ(√x).
The master theorem only has constraints on a and b, and they hold in your case. The fact that a is irrational and that your f(n) is log(n) doesn't matter.
So in your case, c = log_2(√2) = 1/2. Since n^c = √n grows faster than log(n), the complexity of the recursion is Θ(√n).
P.S. Danyal's solution is wrong, as the complexity is not n·log n; Edward Doolittle's solution is correct, though it is overkill in this simple case.
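A numeric check (a sketch, restricted to powers of two and assuming T(1) = 1) agrees with the Θ(√n) answer:

    # Sketch: evaluate T(n) = sqrt(2)*T(n/2) + log2(n) with T(1) = 1
    # and check that T(n)/sqrt(n) approaches a constant.
    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1.0
        return math.sqrt(2) * T(n // 2) + math.log2(n)

    for k in range(10, 41, 10):
        n = 2 ** k
        print(n, T(n) / math.sqrt(n))    # the ratio flattens out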
As per the master theorem, f(n) should be polynomial, but here
f(n) = log n,
which is not a polynomial, so by the standard rules the recurrence cannot be solved with the master theorem. However, I have read about a limited fourth case, which I should mention as well.
It is also discussed here:
Master's theorem with f(n)=log n
However, there is a limited "fourth case" for the master theorem, which allows it to apply to polylogarithmic functions.
If f(n) = O(n^(log_b a) · log^k n), then T(n) = O(n^(log_b a) · log^(k+1) n).
In other words, suppose you have T(n) = 2T(n/2) + n·log n. Here f(n) isn't a polynomial, but f(n) = n·log n = n^(log_2 2) · log^1 n, so k = 1. Therefore T(n) = O(n·log² n).
See this handout for more information: http://cse.unl.edu/~choueiry/S06-235/files/MasterTheorem-HandoutNoNotes.pdf
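A quick check of the fourth-case example above (a sketch): for T(n) = 2T(n/2) + n·log2(n) the claim is T(n) = O(n·log² n), and indeed T(n)/(n·log2(n)²) tends to a constant (1/2):

    # Sketch: T(n) = 2T(n/2) + n*log2(n); compare against n * log2(n)**2.
    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1.0
        return 2 * T(n // 2) + n * math.log2(n)

    for k in range(5, 26, 5):
        n = 2 ** k
        print(n, T(n) / (n * math.log2(n) ** 2))   # tends to 1/2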

Finding the time complexity of the relation T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3

This is somewhat similar to the Fibonacci sequence. The running time of an algorithm is given by
T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3,
T(n) = n otherwise.
What is the order of this algorithm?
If calculated by the characteristic-equation (induction) method:
T(n) = T(n-1) + T(n-2) + T(n-3)
Let us assume T(n) to be some function a^n;
then a^n = a^(n-1) + a^(n-2) + a^(n-3)
=> a^3 = a^2 + a + 1,
which has one real root and a pair of complex roots. According to my calculations, the roots are
a ≈ 1.839286755
a ≈ -0.419643 - 0.606291·i
a ≈ -0.419643 + 0.606291·i
Now, how can I proceed further, or is there another method for this?
If I remember correctly, once you have determined the roots of the characteristic equation, T(n) is a linear combination of powers of those roots:
T(n) = A1·root1^n + A2·root2^n + A3·root3^n
So the dominant term, and hence the complexity, is (maxroot)^n, where maxroot is the root of maximum absolute value. In your case that is ≈ 1.839^n.
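The dominant root can be found without a symbolic solver; here is a small sketch that brackets the real root of a³ = a² + a + 1 by bisection (the two complex roots have modulus ≈ 0.737 < 1, so they do not affect the growth rate):

    # Sketch: find the real root of x^3 - x^2 - x - 1 = 0 by bisection.
    def char(x):
        return x**3 - x**2 - x - 1

    lo, hi = 1.0, 2.0                  # char(1) = -2 < 0 < 1 = char(2)
    for _ in range(60):
        mid = (lo + hi) / 2
        if char(mid) < 0:
            lo = mid
        else:
            hi = mid
    print((lo + hi) / 2)               # ~1.839286755, the "tribonacci constant"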
Asymptotic analysis of a program's running time tells us how the running time grows with the input size.
For recurrence relations like the one you mentioned, we use a two-step process:
Estimate the running time using the recursion tree method.
Validate (confirm) the estimate using the substitution method.
You can find an explanation of these methods in any algorithms text (e.g. Cormen).
A loose bound: each call makes at most three recursive calls, so the total work can be bounded by 3 + 9 + 27 + ... + 3^n, which is O(3^n). This is an upper bound, not a tight one; see the sketch below.
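A table of exact values (a sketch, using the base case T(n) = n for n ≤ 3 from the question) shows T(n)/3^n collapsing to zero while T(n)/1.839...^n stays stable, confirming that O(3^n) is valid but loose:

    # Sketch: exact T(n) vs the loose 3**n bound and the tight ~1.839**n growth.
    t = {1: 1, 2: 2, 3: 3}             # base case: T(n) = n for n <= 3
    for n in range(4, 41):
        t[n] = t[n - 1] + t[n - 2] + t[n - 3]
    for n in (10, 20, 30, 40):
        print(n, t[n] / 3**n, t[n] / 1.839286755**n)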
