Difference between two complexity recurrence relations - algorithm

Following are two recurrence relations:
(1) T(n) = T(n/2) + T(n/2) + C
(2) T(n) = T(n/2) * T(n/2) + C
Will both have the same time complexity? Can I write both recurrence relations like this?
(3) T(n) = 2T(n/2) + C

(1) is obviously the same as (3): T(n/2) + T(n/2) = 2T(n/2). That's elementary math.
(1) is not the same as (2), and it shouldn't be difficult to see that the solutions to these relations are completely different. (1) = (3) means that for data that's twice as large, the complexity measure is about twice as large: linear complexity. (2) means that for data that's twice as large, the complexity is squared, which makes T(n) grow exponentially in n.
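To see the difference concretely, here is a minimal sketch in Python (the base cases and C = 1 are illustrative choices, not taken from the question) that evaluates both recurrences:

def t_sum(n):
    # (1): T(n) = T(n/2) + T(n/2) + C, with C = 1 and T(1) = 1.
    if n <= 1:
        return 1
    return 2 * t_sum(n // 2) + 1

def t_prod(n):
    # (2): T(n) = T(n/2) * T(n/2) + C, with C = 1 and T(1) = 2.
    if n <= 1:
        return 2
    return t_prod(n // 2) ** 2 + 1

for n in (2, 4, 8, 16, 32):
    print(n, t_sum(n), t_prod(n))

t_sum roughly doubles each time n doubles, while t_prod is squared at each doubling, i.e. it grows exponentially in n.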


Dropping the less significant terms in the middle of calculating time complexity?

We know that for some algorithm with a time complexity of, let's say, T(n) = n^2 + n + 1, we can drop the less significant terms and say that it has a worst case of O(n^2).
What about when we're in the middle of calculating the time complexity of an algorithm, such as T(n) = 2T(n/2) + n + log(n)? Can we drop the less significant terms here too and say T(n) = 2T(n/2) + n = O(n log(n))?
In this case, yes, you can safely discard the dominated (log n) term. In general, you can do this any time you only need the asymptotic behaviour rather than the exact formula.
When you apply the Master theorem to solve a recurrence relation like
T(n) = a T(n/b) + f(n)
asymptotically, then you don't need an exact formula for f(n), just the asymptotic behaviour, because that's how the Master theorem works.
In your example, a = 2, b = 2, so the critical exponent is c = 1. Then the Master theorem tells us that T(n) is in Θ(n log n), because f(n) = n + log n is in Θ(n^c) = Θ(n).
We would have reached the same conclusion using f(n) = n, because that's also in Θ(n). Applying the theorem only requires knowing the asymptotic behaviour of f(n), so in this context it's safe to discard dominated terms which don't affect f(n)'s asymptotic behaviour.
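As a quick numerical sanity check (a Python sketch; the base case T(1) = 1 is an arbitrary choice), you can compare the full recurrence with the truncated one:

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def t_full(n):
    # T(n) = 2T(n/2) + n + log2(n), with T(1) = 1.
    if n <= 1:
        return 1
    return 2 * t_full(n // 2) + n + log2(n)

@lru_cache(maxsize=None)
def t_trunc(n):
    # The same recurrence with the log term dropped: T(n) = 2T(n/2) + n.
    if n <= 1:
        return 1
    return 2 * t_trunc(n // 2) + n

for k in (10, 15, 20):
    n = 2 ** k
    print(n, t_full(n) / (n * log2(n)), t_trunc(n) / (n * log2(n)))

Both ratios settle toward the same constant, so dropping the log(n) term does not change the Θ(n log n) behaviour.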
First of all, you need to understand that T(n) = n^2 + n + 1 is a closed-form expression; in simple terms, that means you can plug in some value for n and you will get the value of the whole expression.
On the other hand, T(n) = 2T(n/2) + n + log(n) is a recurrence relation: the expression is defined recursively, and to get a closed-form expression you have to solve the recurrence.
Now to answer your question: in general, we drop lower-order terms and coefficients when we can clearly see the highest-order term. In T(n) = n^2 + n + 1 it's n^2. But a recurrence relation has no such highest-order term, because it is not a closed-form expression.
One thing to observe, though, is that the highest-order term in the closed form of a recurrence is the depth of the recursion tree multiplied by the highest-order term of the recurrence itself. In your case that is depth(2T(n/2)) * n = log(n) * n, so in big-O notation it's O(n log n).

Recurrence Relation for Divide and Conquer

Describe the recurrence for the running time T(n) on an input of size n.
A divide and conquer algorithm takes an array of n elements and divides it into three subarrays of size n/4 each, taking Θ(n) time to do the subdivision. The time taken to combine the outputs of the sub-problems is Θ(1).
I came up with this recurrence relation, but it's not correct:
T(n) = 3T(n/4) + Θ(1)
Can someone tell me what I am doing wrong here?
You missed the Θ(n) time taken to do the subdivision.
The relation should include subdivision + work on the smaller parts + combining:
T(n) = Θ(n) + 3T(n/4) + Θ(1) = 3T(n/4) + Θ(n)
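For what it's worth, the Master theorem then gives T(n) = Θ(n): with a = 3 and b = 4, log_4(3) ≈ 0.79 < 1, so the Θ(n) subdivision work dominates. A small Python sketch (with the Θ(n) term taken as exactly n and T(1) = 1, purely for illustration) that checks this numerically:

from functools import lru_cache

@lru_cache(maxsize=None)
def t(n):
    # T(n) = 3T(n/4) + n, with T(1) = 1.
    if n <= 1:
        return 1
    return 3 * t(n // 4) + n

for k in (5, 8, 11):
    n = 4 ** k
    print(n, t(n) / n)

The ratio t(n)/n converges to a constant (about 4), consistent with T(n) = Θ(n).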

Randomized selection complexity

After analyzing the algorithm's complexity, I have a few questions:
For the best-case complexity, the recurrence relation is T(n) = T(n/2) + dn, which implies that the complexity is Θ(n).
By the Master theorem I can clearly see why this is true, but when I draw the algorithm's recursive calls as a tree, I don't fully understand the final result. (It seems like I have one branch of height log(n), and at each level I perform an O(n) partition, so it seems it should be n log(n).)
(For reference: this is very similar to the best case of the mergesort algorithm, except that here we ignore the unwanted sub-array after partitioning.)
Thanks!
It is as Yves Daoust wrote. Imagine it with real numbers, e.g. n = 1024:
T(n) = T(n/2) + dn
T(1024) = T(512) + 1024
T(512) = T(256) + 512
....
T(2) = T(1) + 2 -> this would be the last operation
Therefore you get 1024 + 512 + 256 + ... + 1 <= 2048, which is 2n.
You might think that dn is always as big as n, but in a recurrence relation n is not a global variable; it is local to the call being made.
So there are log(n) calls, but they do not each take n time; they take less and less time.
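The same point in code: a small Python sketch that just sums the per-level partition work n + n/2 + n/4 + ... (taking dn as exactly n for illustration):

def work(n):
    # Total partition work for T(n) = T(n/2) + n: one pass per level.
    total = 0
    while n >= 1:
        total += n
        n //= 2
    return total

for n in (1024, 10**6):
    print(n, work(n), work(n) / n)

The total stays below 2n, so the whole recursion is Θ(n), even though there are log(n) levels.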

Master theorem for subproblems of different sizes

The Master theorem's generic form mentions that:
it is assumed that all subproblems are essentially the same size
The Akra–Bazzi method is applied when:
the sub-problems have substantially different sizes
But what are the criteria for "substantially different"? For example, I have a recurrence relation like:
T(n) = T(n/4) + T(3n/4) + cn
(c is some constant)
Can I still use the master theorem to solve this relation (for instance approximating it as T(n) = 2T(3n/4) + cn)? Or, in other words, are these subproblem sizes "essentially the same" or are they already "substantially different"?
Assuming c is some constant, you have: T(n) = T(n/4) + T(3n/4) + Θ(n)
Solving this with the Akra-Bazzi method: the exponent p satisfies (1/4)^p + (3/4)^p = 1, so p = 1, and the integral of g(u)/u^(p+1) = cu/u^2 from 1 to n is c*ln(n), giving T(n) = Θ(n log n).
Solving it under the assumption T(n) = 2T(3n/4) + Θ(n) with the Master theorem gives Θ(n^(log_{4/3} 2)) = Θ(n^2.4094) (exponent rounded to 4 dp).
So just by trying it out, you can confirm that these subproblem sizes are already "substantially different": the approximation turns a quasi-linear bound into a polynomial of degree about 2.41.
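If you want to sanity-check the Akra-Bazzi result numerically, here is a Python sketch (with c = 1 and T(1) = 1 as illustrative choices):

from functools import lru_cache
from math import log

@lru_cache(maxsize=None)
def t(n):
    # T(n) = T(n/4) + T(3n/4) + n, with T(1) = 1.
    if n <= 1:
        return 1
    return t(n // 4) + t(3 * n // 4) + n

for n in (10**3, 10**4, 10**5, 10**6):
    print(n, t(n) / (n * log(n)))

The ratio settles toward a constant, matching Θ(n log n) rather than the Θ(n^2.41) the equal-sizes approximation predicts.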

Get time complexity of the recursion: T(n)=4T(n-1) - 3T(n-2)

I have a recurrence relation given by:
T(n)=4T(n-1) - 3T(n-2)
How do I solve this?
Any detailed explanation would be appreciated.
What I tried: I substituted for T(n-1) on the right-hand side using the relation, and I got this:
T(n) = 16T(n-2) - 12T(n-3) - 3T(n-2)
But I don't know where and how to end this.
Not only can you easily get the time complexity of this recursion, you can even solve it exactly. This is thanks to the exhaustive theory behind linear recurrence relations; the one you posted here is a specific case of a homogeneous linear recurrence.
To solve it, you write the characteristic polynomial t^2 - 4t + 3 and find its roots, which are t = 1 and t = 3. This means your solution is of the form:
T(n) = c1 + c2 * 3^n
You can get c1 and c2 if you have boundary conditions, but for your case it is enough to claim O(3^n) time complexity.
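A small Python sketch (assuming the illustrative base cases T(1) = 1 and T(2) = 2, which the question does not specify) that fits and verifies this closed form exactly:

from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def t(n):
    # T(n) = 4T(n-1) - 3T(n-2), with illustrative base cases.
    if n == 1:
        return 1
    if n == 2:
        return 2
    return 4 * t(n - 1) - 3 * t(n - 2)

# Fit T(n) = c1 + c2 * 3^n to the first two values:
# c1 + 3*c2 = T(1) and c1 + 9*c2 = T(2).
c2 = Fraction(t(2) - t(1), 6)
c1 = t(1) - 3 * c2

for n in range(1, 16):
    assert t(n) == c1 + c2 * 3**n  # exact match at every n
print(c1, c2)  # 1/2 and 1/6, so T(n) = Θ(3^n) here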
While it's obviously O(4^n) (because T(n) <= 4*T(n-1)), a smaller bound can be proved:
T(n) = 4*T(n-1) - 3*T(n-2)
T(n) - T(n-1) = 3*T(n-1) - 3*T(n-2)
D(n) = T(n) - T(n-1)
D(n) = 3*D(n-1)
D(n) = 3^(n-1) * D(1)
If D(1) = 0, then T(n) = const = O(1).
Otherwise, since the difference is exponential, the resulting function will be exponential as well:
T(n) = O(3^n)
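You can watch the differences triple in a quick Python check (initial values T(1) = 1, T(2) = 2 chosen for illustration):

seq = [1, 2]
for _ in range(6):
    seq.append(4 * seq[-1] - 3 * seq[-2])
diffs = [b - a for a, b in zip(seq, seq[1:])]
print(seq)    # 1, 2, 5, 14, 41, 122, 365, 1094
print(diffs)  # 1, 3, 9, 27, 81, 243, 729: each difference is 3x the previous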
NOTE: Generally, these kinds of recurrence relations (where recursive calls are repeated, e.g. the recurrence relation for the Fibonacci sequence) result in an exponential time complexity.
First of all, your question is incomplete: it does not provide a termination condition (a condition at which the recurrence stops). I assume that it must be
T(n) = 1 for n = 1 and T(n) = 2 for n = 2
Based on this assumption, I start breaking down the above recurrence relation.
On substituting the relation into both T(n-1) and T(n-2), I get this:
16T(n-2) - 24T(n-3) + 9T(n-4)
whose coefficients follow the second-power binomial pattern:
(4^2)T(n-2) - (2*4*3)T(n-3) + (3^2)T(n-4)
Breaking the recurrence down further, we get:
64T(n-3) - 144T(n-4) + 108T(n-5) - 27T(n-6)
which follows the third-power pattern.
Expanding the relation down to the base case, we get:
(4^(n-1))T(1) - ... something like that
We can clearly see that in the above expansion all the remaining terms will be less than 4^(n-1), so we can take the asymptotic notation as:
O(4^n)
As an exercise, you can expand the polynomial for a few more terms and draw the recursion tree to find out what's actually happening.
Trying T(n) = x^n gives you a quadratic equation: x^2 = 4x - 3. This has solutions x=1 and x=3, so the general form for T(n) is a + b*3^n. The exact values of a and b depend on the initial conditions (for example, the values of T(0) and T(1)).
Depending on the initial conditions, the solution is going to be O(1) or O(3^n).
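A tiny Python sketch (with made-up initial values) showing both regimes:

def run(t0, t1, steps=8):
    # Iterate T(n) = 4T(n-1) - 3T(n-2) from two initial values.
    seq = [t0, t1]
    for _ in range(steps):
        seq.append(4 * seq[-1] - 3 * seq[-2])
    return seq

print(run(5, 5))  # equal initial values: the sequence stays constant, O(1)
print(run(0, 1))  # 0, 1, 4, 13, 40, ...: grows like 3^n, so O(3^n)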
