I'm trying to figure out how to solve recurrence equations, and I can do them easily using the recursion tree method if the equation is something like this, for example:
T(1) = 1;
T(n) = n + 2T(n/2) for n > 1
But I'm having trouble understanding how to solve equations for which the recurrence is modified by a fraction, like this for example:
T(1) = 1;
T(n) = n + (3/2)T(0.9n) for n > 1
How can there be 3/2 of a branch in a tree? Is it impossible to solve this using recursion trees? Can anyone explain exactly how this would work in the recursion tree method? Or is there another method that would be easier for this form of equation?
How can there be 3/2 of a branch?
Easy: if you have 4 branches at step x, then at step x + 1 you will have 4 * 3/2 = 6 branches (if the numbers don't divide evenly, use the floor).
Can anyone explain exactly how this would work in the recursion tree method?
You unroll the recursion, write out the resulting (huge) sum, spot the pattern, and evaluate the sum.
Is there another method that would be easier for this form of equation?
Yes, people have done what I described in the previous step for the general recurrence T(n) = aT(n/b) + f(n) and turned it into a theorem: the Master theorem. All you need to do is remember it (actually, you need to understand it) and you can solve any recurrence of this form.
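As a concrete illustration of the unrolling idea, here is a small numerical sanity check in Python (a rough sketch, not part of the original answer). It assumes the base case T(n) = 1 for n <= 1 and compares T(n) against n^p, where p = log(3/2) / log(10/9) ≈ 3.85 is the exponent that a Master/Akra-Bazzi style analysis predicts for T(n) = n + (3/2)T(0.9n):

    import math

    def T(n):
        # Evaluate T(n) = n + (3/2) * T(0.9 * n) directly, with T(n) = 1 for n <= 1 (assumed base case)
        if n <= 1:
            return 1.0
        return n + 1.5 * T(0.9 * n)

    # Predicted exponent: the p solving (3/2) * 0.9^p = 1, i.e. p = log(3/2) / log(10/9) ~ 3.85
    p = math.log(1.5) / math.log(10 / 9)

    for n in [10**3, 10**4, 10**5, 10**6]:
        print(n, T(n) / n**p)   # the ratio stays bounded between constants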
Related
Suppose we have been given the recurrence relation T(n) = 2T(n/3) + n and need to find the time complexity.
My problem is that I am getting different answers from the Master theorem and from the recursion tree method.
By the Master theorem for an equation of the form T(n) = aT(n/b) + O(n^k * log^p n):
Here a = 2, b = 3, k = 1, p = 0, and since a < b^k (i.e. 2 < 3^1), the Master theorem gives T(n) = O(n).
Here is my recursion tree method
And at the end I seriously don't know what I have found. It looks something like O(n * 2^(log n)).
Surely that's not the answer and I have messed something up, but I don't see where I went wrong. What is the right approach?
Also, I wanted to ask whether the recurrence relations T(n) = 2T(n/3) + n and T(n) = 2T(n/3) + C are the same, where C is a constant in the second equation.
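A quick numerical check (a rough sketch, assuming a real-valued version of the recurrences with base case T(n) = 1 for n <= 1): T(n) = 2T(n/3) + n grows linearly, matching the Master theorem answer above, while the variant with a constant in place of n grows only like n^(log_3 2) ≈ n^0.63, so the two recurrences are not the same.

    import math

    def T_linear(n):
        # T(n) = 2*T(n/3) + n, with T(n) = 1 for n <= 1 (assumed base case)
        return 1.0 if n <= 1 else n + 2 * T_linear(n / 3)

    def T_const(n, C=1.0):
        # T(n) = 2*T(n/3) + C, with T(n) = 1 for n <= 1 (assumed base case)
        return 1.0 if n <= 1 else C + 2 * T_const(n / 3, C)

    exponent = math.log(2, 3)   # log_3(2) ~ 0.63
    for n in [10**3, 10**5, 10**7]:
        # The first ratio settles near 3 (geometric level sums n, 2n/3, 4n/9, ...);
        # the second stays bounded away from 0 and infinity, i.e. Theta(n^0.63).
        print(n, T_linear(n) / n, T_const(n) / n**exponent)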
While solving a complex recurrence equation like this one: T(N) = 2T(N/4 + √N) + √10·N, with T(1) = 1,
I tried to make a change of variables to simplify it and solve it by the Master theorem, but I failed. So I kept only the dominant term, which gives
T(N) = 2T(N/4) + √10·N, and hence T(N) = Θ(N). Is that true or not?
Trying to unroll the recursion or to make a substitution got me nowhere. The only thing I was able to see is that for any sufficiently large N (above 64) we have √N ≤ N/8, and therefore N/4 + √N ≤ N/4 + N/8 = 3N/8. (You can select any constant here, not just 8, as long as it is bigger than 4, so that 1/4 + 1/c stays below 1/2.)
So you end up with T(N) ≤ 2T(3N/8) + √10·N.
Solving this with the Master theorem, you see that it falls into the first case: with a = 2, b = 8/3 and k = 1, we have a < b^k (2 < 8/3).
Therefore the solution is Θ(N), which is what you guessed.
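Here is a small numerical check of this bound (a sketch only; the base case T(N) = 1 for N <= 4 is an added assumption so the recursion terminates, since N/4 + √N never drops below its fixed point 16/9 ≈ 1.78):

    import math

    def T(N):
        # T(N) = 2*T(N/4 + sqrt(N)) + sqrt(10)*N, with an assumed base case T(N) = 1 for N <= 4
        if N <= 4:
            return 1.0
        return 2 * T(N / 4 + math.sqrt(N)) + math.sqrt(10) * N

    for N in [10**3, 10**5, 10**7, 10**9]:
        print(N, T(N) / N)   # the ratio stays bounded, consistent with T(N) = Theta(N)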
I'm really getting frustrated trying to solve the recurrence below. I tried to solve it using the Master method, but I just couldn't get it done...
I have a recursive algorithm that takes 3·log n time (three binary searches) to identify four sub-problems, each of size n/4, and then solves them individually until n is smaller than some constant given by the input. So I got this recurrence as a result:
T(n) = 4*T(n/4) + 3*log(n)
Base-Case if n < c (c = some constant given by program input):
T(n) = 1
I'm trying to find the asymptotic running time of my recursive program, and wanted to solve it using the Master theorem. Can anybody tell me if it's possible to use the Master theorem with this recurrence, and if so, which case of the Master theorem applies?
All help is appreciated, thanks.
T(n) = O(n), because the logarithm of 4 to base 4 is 1, and 3*log(n) is O(n^0.5) (and 0.5 < 1). This corresponds to the first case of the Master theorem as described here.
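A quick numerical check of that conclusion (a sketch, assuming the base case T(n) = 1 for n < 4 and evaluating at powers of 4 so that n/4 stays an integer):

    import math

    def T(n):
        # T(n) = 4*T(n/4) + 3*log(n), with an assumed base case T(n) = 1 for n < 4
        if n < 4:
            return 1.0
        return 4 * T(n // 4) + 3 * math.log(n)

    for k in [5, 10, 15]:
        n = 4**k
        print(n, T(n) / n)   # the ratio approaches a constant, consistent with T(n) = Theta(n)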
I have a recurrence relation given by:
T(n)=4T(n-1) - 3T(n-2)
How do I solve this?
Any detailed explanation would be appreciated.
What I tried was that I substituted for T(n-1) on the right hand side using the relation and I got this:
= 16T(n-2) - 12T(n-3) - 3T(n-2)
But I don't know where and how to end this.
Not only can you easily get the time complexity of this recurrence, you can even solve it exactly. This is thanks to the extensive theory behind linear recurrence relations, and the one you posted here is a specific case of a homogeneous linear recurrence.
To solve it you need to write the characteristic polynomial, t^2 - 4t + 3, and find its roots, which are t = 1 and t = 3. This means that your solution is of the form:
T(n) = c1 + 3^n * c2.
You can get c1 and c2 if you have boundary conditions, but for your case it is enough to claim O(3^n) time complexity.
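To make the boundary-condition point concrete, here is a small check (a sketch using hypothetical initial values T(0) = 1 and T(1) = 2, which are not given in the question; they yield c1 = c2 = 1/2):

    # Recurrence: T(n) = 4*T(n-1) - 3*T(n-2)
    # Hypothetical initial values (not given in the question): T(0) = 1, T(1) = 2
    T = [1, 2]
    for n in range(2, 15):
        T.append(4 * T[n - 1] - 3 * T[n - 2])

    # With these initial values, c1 = c2 = 1/2, so T(n) should equal (1 + 3**n) / 2
    for n, value in enumerate(T):
        assert 2 * value == 1 + 3**n
    print(T[:6])   # [1, 2, 5, 14, 41, 122] -- growing like 3^n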
While it's obviously O(4^n) (because T(n) <= 4*T(n-1)), it looks like a tighter bound can be proved:
T(n) = 4*T(n-1) - 3*T(n-2)
T(n) - T(n-1) = 3*T(n-1) - 3*T(n-2) = 3*(T(n-1) - T(n-2))
Let D(n) = T(n) - T(n-1); then
D(n) = 3*D(n-1)
D(n) = D(0) * 3^n
If D(0) = 0, then T(n) is constant, i.e. O(1);
otherwise, since the difference grows exponentially, the resulting function will be exponential as well:
T(n) = O(3^n)
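The difference trick is easy to verify numerically; a tiny sketch with arbitrary (hypothetical) starting values:

    # Check that D(n) = T(n) - T(n-1) is multiplied by exactly 3 at every step.
    T = [5, 7]                      # any hypothetical T(0), T(1) with T(1) != T(0)
    for n in range(2, 10):
        T.append(4 * T[n - 1] - 3 * T[n - 2])

    D = [T[n] - T[n - 1] for n in range(1, len(T))]
    print(D)                        # [2, 6, 18, 54, ...] -- each term is 3x the previous
    assert all(D[i] == 3 * D[i - 1] for i in range(1, len(D)))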
NOTE: Generally, these kinds of recurrence relations (where the recursive calls are repeated, e.g. the recurrence for the Fibonacci sequence at value n) result in exponential time complexity.
First of all, your question is incomplete: it does not provide a termination condition (a condition at which the recurrence terminates). I assume it must be
T(n) = 1 for n = 1, and T(n) = 2 for n = 2.
Based on this assumption, I start breaking down the above recurrence relation.
On substituting the recurrence for T(n-1) and T(n-2) into T(n), I get this:
16T(n-2) - 24T(n-3) + 9T(n-4)
This has the shape of a second-power expansion:
{(4^2)T(n-2) - (2·4·3)T(n-3) + (3^2)T(n-4)}
Breaking the recurrence down one step further, we get:
64T(n-3) -144T(n-4) + 108T(n-5) -27T(n-6)
which has the shape of a third-power expansion.
On breaking down the relation n-1 times, we will get:
(4^(n-1))·T(1) - ............. something like that
We can clearly see that in the above expansion all the remaining terms will be less than 4^(n-1), so we can take the asymptotic bound as:
O(4^n)
As an exercise you can expand the recurrence for a few more terms, and also draw the recursion tree, to find out what's actually happening.
Trying T(n) = x^n gives you a quadratic equation: x^2 = 4x - 3. This has solutions x=1 and x=3, so the general form for T(n) is a + b*3^n. The exact values of a and b depend on the initial conditions (for example, the values of T(0) and T(1)).
Depending on the initial conditions, the solution is going to be O(1) or O(3^n).
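If SymPy is available, it can carry out this calculation symbolically; a sketch with hypothetical initial conditions T(0) = 1 and T(1) = 2 (the question does not specify any):

    from sympy import Function, rsolve
    from sympy.abc import n

    T = Function('T')

    # T(n) = 4*T(n-1) - 3*T(n-2), shifted and written as an expression equal to zero
    recurrence = T(n + 2) - 4 * T(n + 1) + 3 * T(n)

    # With these hypothetical initial conditions the result should be 1/2 + 3**n/2
    solution = rsolve(recurrence, T(n), {T(0): 1, T(1): 2})
    print(solution)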
Somewhat similar to the Fibonacci sequence.
The running time of an algorithm is given by
T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3
     = n otherwise.
What is the order of this algorithm?
If calculated by the induction method, then:
T(n) = T(n-1) + T(n-2) + T(n-3)
Let us assume T(n) to be some function aⁿ
then aⁿ = aⁿ⁻¹ + aⁿ⁻² + aⁿ⁻³
=> a³ = a² + a + 1
which also gives complex solutions. The roots of the above equation, according to my calculations, are:
a = 1.839286755
a = -0.419643 - i (0.606291)
a = -0.419643 + i (0.606291)
Now, how can I proceed further or is there any other method for this?
If I remember correctly, once you have determined the roots of the characteristic equation, T(n) can be written as a linear combination of the powers of those roots:
T(n) = A1*root1^n + A2*root2^n + A3*root3^n
So I guess the complexity here will be
O(maxroot^n), where maxroot is the maximum absolute value of your roots. So for your case it is approximately 1.839^n.
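A quick numerical check of this estimate (a sketch using the base case T(n) = n for n <= 3 from the question):

    # T(n) = T(n-1) + T(n-2) + T(n-3) for n > 3, T(n) = n otherwise
    T = [0, 1, 2, 3]                 # T[0] is only a placeholder; base cases T(1)=1, T(2)=2, T(3)=3
    for n in range(4, 40):
        T.append(T[n - 1] + T[n - 2] + T[n - 3])

    # The ratio of consecutive terms approaches the dominant root ~ 1.839286755
    for n in range(35, 40):
        print(n, T[n] / T[n - 1])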
Asymptotic analysis is done on the running times of programs; it tells us how the running time grows with the input size.
For recurrence relations (like the one you mentioned), we use a two-step process:
Estimate the running time using the recursion tree method.
Validate (confirm) the estimate using the substitution method.
You can find an explanation of these methods in any algorithms text (e.g. Cormen).
It can be approximated as 3 + 9 + 27 + ... + 3^n, which is O(3^n).