Time complexity of this recurrence - algorithm

Can T(n) = T(n/2) + 2^n be solved with the master method?
How do I determine the time complexity of this recurrence? Thanks.

Well, do a few expansions:
T(n) = 2^n + T(n/2)
= 2^n + 2^(n/2) + T(n/4)
= 2^n + 2^(n/2) + 2^(n/4) + T(n/8)
...etc.
Obviously, the complexity is at least Ω(2^n), since there's a 2^n in the expansion.
Furthermore, the terms added in each expansion get small really fast. You probably already know that:
2^n + 2^(n-1) + 2^(n-2)... = O(2^n)
You have:
2^n + 2^(n/2) + 2^(n/4)...
and each of those terms, except for the first, is no larger than the corresponding term in the sum above (at least until n drops below 2, around the base case), so your sum is O(2^n) as well. In other words, the lower bound is tight and the recurrence is Θ(2^n).
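For a quick numerical sanity check, here is a minimal sketch (my own illustration, not part of the original answer) that evaluates the recurrence with an assumed base case T(1) = 1 and prints T(n)/2^n; the ratio should fall toward 1, consistent with Θ(2^n):

#include <cmath>
#include <cstdio>

// T(n) = T(n/2) + 2^n, with an assumed base case T(1) = 1.
double T(double n) {
    if (n <= 1) return 1;
    return std::pow(2.0, n) + T(n / 2);
}

int main() {
    for (int n = 4; n <= 64; n *= 2)
        std::printf("n=%2d  T(n)/2^n = %.7f\n", n, T(n) / std::pow(2.0, n));
    return 0;
}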

Related

Recursive algorithm time complexity (maximum independent set)

I have an assignment to analyse and implement an algorithm for finding the maximum independent set of a graph, and I've come to the time complexity part of said algorithm. I've deduced it is as follows:
T(n) = T(n-1) + T(n-3) + O(n^3) + O(n^2)
I've figured some other similar recursive algorithms out before, but I got stuck several times with this one. I think it's supposed to turn out to be ~O(2^n).
Can someone help me get the solution?
Edit: I just double-checked; the solution is supposed to be O(1.8^n). Any way to get to that solution?
Since O(n³) + O(n²) = O(n³) = O(f(n)) where f(n) = n³ − 12n² + 30n − 28,
there exists some constant γ > 0 such that we can overestimate the
original recurrence T as a concrete linear non-homogeneous recurrence
T′:
T′(n) = T′(n−1) + T′(n−3) + γf(n).
We follow the usual playbook, focusing on the homogeneous part first
(the same recurrence but without the γf(n) term). The characteristic polynomial is
x³ − x² − 1. Letting α be any zero of this polynomial, the function
α^n satisfies
α^n = α^(n−1) + α^(n−3).
Assuming that the base case values are positive, the growth rate of the
homogeneous part will be Θ(α^n), where α = 1.46557… is the zero
of maximum modulus.
We also have to deal with the non-homogeneous part, but it turns out not
to matter asymptotically. By design we have f(n) = (n−1)³ + (n−3)³ − n³,
so letting U(n) = βα^n − γn³, we can verify that
U(n−1) + U(n−3) + γf(n) = β(α^(n−1) + α^(n−3)) − γ((n−1)³ + (n−3)³ − f(n)) = βα^n − γn³ = U(n),
so U is a solution of the recurrence for T′, keeping us in the same asymptotic class.
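As a quick numerical check (my own sketch, not part of the original answer), the code below finds the real root of x³ − x² − 1 by bisection and compares it with the ratio T(n)/T(n−1) of a bottom-up evaluation of T(n) = T(n−1) + T(n−3) + n³, with assumed base cases T(0) = T(1) = T(2) = 1; both values should come out near 1.4656:

#include <cstdio>
#include <vector>

int main() {
    // Bisection on x^3 - x^2 - 1, which is negative at x = 1 and positive at x = 2.
    double lo = 1.0, hi = 2.0;
    for (int i = 0; i < 60; ++i) {
        double mid = (lo + hi) / 2;
        if (mid * mid * mid - mid * mid - 1 < 0) lo = mid; else hi = mid;
    }
    std::printf("real root of x^3 - x^2 - 1: %.5f\n", lo);

    // Bottom-up T(n) = T(n-1) + T(n-3) + n^3, assumed base cases T(0..2) = 1.
    const int N = 60;
    std::vector<double> T(N + 1, 1.0);
    for (int n = 3; n <= N; ++n)
        T[n] = T[n - 1] + T[n - 3] + (double)n * n * n;
    std::printf("T(%d)/T(%d) = %.5f\n", N, N - 1, T[N] / T[N - 1]);
    return 0;
}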
If we draw the recursion tree for T(n) = T(n-1) + T(n-3), it looks like a binary tree. For simplicity, let every node have 2 child nodes and let the height of the tree be n until it reaches the base case. A binary tree of that height has 2^n - 1 nodes.
So T(n) = T(n-1) + T(n-3) is O(2^n). (This is only an upper bound, because the T(n-3) branch is in fact much shallower.)
Now the overall complexity is O(2^n) + O(n^3) + O(n^2) = O(2^n).

Calculating the Recurrence Relation T(n) = sqrt(n * T(sqrt(n)) + n)

I think the complexity of this recurrence is O(n^(2/3)), by a change of variable and induction, but I'm not sure. Is this solution correct?
This is a fascinating recurrence and it does not solve to Θ(n). Rather, it appears to solve to Θ(n^(2/3)).
To give an intuition for why this isn't likely to be Θ(n), let's imagine that we're dealing with a really, really large value of n. Then since
T(n) = (n T(√n) + n)^(1/2),
under the assumption that T(√n) ≈ √n, we'd get that
T(n) = (n√n + n)^(1/2)
     = (n^(3/2) + n)^(1/2)
     ≈ n^(3/4).
In other words, assuming that T(n) = Θ(n) would give us a different value of T(n) as n gets large.
On the other hand, let's assume that T(n) = Θ(n^(2/3)). Then the same calculation, using T(√n) ≈ (√n)^(2/3) = n^(1/3), gives us that
T(n) = (n T(√n) + n)^(1/2)
     = (n · n^(1/3) + n)^(1/2)
     ≈ (n^(4/3))^(1/2)
     = n^(2/3),
which is consistent with itself.
To validate this, I wrote a short program that printed out different values of T(n) given different inputs and plotted the results. Here's the version of T(n) that I wrote up:
#include <cmath>

double T(double n) {
    if (n <= 2) return n;
    return sqrt(n * T(sqrt(n)) + n);
}
I decided to use 2 as a base case, since repeatedly taking square roots will never let n drop to one. I also decided to use real-valued arguments rather than discrete integer values just to make the math easier.
If you plot the values of T(n), you get this curve:
This doesn't look like what I'd expect from a linear plot. To figure out what this was, I plotted it on a log/log plot, which has the nice property that all polynomial functions get converted to straight lines whose slope is equal to the exponent. Here's the result:
I consulted my Handy Neighborhood Regression Software and asked it to determine the slope of this line. Here's what it gave back:
Slope: 0.653170918815869
R²: 0.999942627574643
That's a very good fit, and the slope of 0.653 is pretty close to 2/3. So that's more empirical evidence supporting that the recurrence solves to Θ(n2/3).
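If you want to reproduce a rough version of this check without plotting, here is a small driver (my own addition, not part of the original answer) that prints T(n)/n^(2/3) for growing n; if the conjecture is right, the ratio should stay roughly constant, drifting only slowly because of lower-order terms:

#include <cmath>
#include <cstdio>

// Same real-valued T as above, with base case T(n) = n for n <= 2.
double T(double n) {
    if (n <= 2) return n;
    return std::sqrt(n * T(std::sqrt(n)) + n);
}

int main() {
    for (double n = 10; n <= 1e9; n *= 100)
        std::printf("n=%.0e  T(n)/n^(2/3) = %.4f\n", n, T(n) / std::pow(n, 2.0 / 3.0));
    return 0;
}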
All that's left to do now is to work out the math. We'll solve this recurrence using a series of substitutions.
First, I'm generally not that comfortable working with exponents in the way that this recurrence uses them, so let's take the log of both sides. (Throughout this exposition, I'll use lg n to mean log2 n).
lg T(n) = lg (n T(√n) + n)^(1/2)
        = (1/2) lg (n T(√n) + n)
        = (1/2) lg (T(√n) + 1) + (1/2) lg n
        ≈ (1/2) lg T(√n) + (1/2) lg n
Now, let's define S(n) = lg T(n). Then we have
S(n) = lg T(n)
     ≈ (1/2) lg T(√n) + (1/2) lg n
     = (1/2) S(√n) + (1/2) lg n
That's a lot easier to work with, though we still have the problem of the recurrence shrinking by powers each time. To address this, let's do one more substitution, which is a fairly common one when working with these sorts of expressions. Let's define R(n) = S(2^n). Then we have that
R(n) = S(2^n)
     ≈ (1/2) S(√(2^n)) + (1/2) lg 2^n
     = (1/2) S(2^(n/2)) + (1/2) n
     = (1/2) R(n/2) + (1/2) n
Great! All that's left to do now is to solve R(n).
Now, there is a slight catch here. We could immediately use the Master Theorem to conclude that R(n) = Θ(n). The problem with this is that just knowing that R(n) = Θ(n) won't allow us to determine what T(n) is. Specifically, let's suppose that we just know R(n) = Θ(n). Then we could say that
S(n) = S(2^(lg n)) = R(lg n) = Θ(log n)
to get that S(n) = Θ(log n). However, we get stuck when trying to solve for T(n) in terms of S(n). Specifically, we know that
T(n) = 2^(S(n)) = 2^(Θ(log n)),
but we cannot go from this to saying that T(n) = Θ(n). The reason is that the hidden coefficient in the Θ(log n) is significant here. Specifically, if S(n) = k lg n, then we have that
2^(k lg n) = 2^(lg n^k) = n^k,
so the leading coefficient of the logarithm will end up determining the exponent on the polynomial. As a result, when solving R, we need to determine the exact coefficient of the linear term, which translates into the exact coefficient of the logarithmic term for S.
So let's jump back to R(n), which we know is
R(n) ≈ (1/2) R(n/2) + (1/2)n.
If we iterate this a few times, we see this pattern:
R(n) ≈ (1/2) R(n/2) + (1/2)n
     ≈ (1/2)((1/2) R(n/4) + (1/4)n) + (1/2)n
     = (1/4) R(n/4) + (1/8)n + (1/2)n
     ≈ (1/4)((1/2) R(n/8) + (1/8)n) + (1/8)n + (1/2)n
     = (1/8) R(n/8) + (1/32)n + (1/8)n + (1/2)n.
The pattern appears to be that, after k iterations, we get
R(n) ≈ (1/2^k) R(n/2^k) + n(1/2 + 1/8 + 1/32 + 1/128 + ... + 1/2^(2k-1)).
This means we should look at the sum
(1/2) + (1/8) + (1/32) + (1/128) + ...
This is
(1/2)(1 + 1/4 + 1/16 + 1/64 + ... )
which, as the sum of a geometric series, solves to
(1/2)(4/3)
= 2/3.
Hey, look! It's the 2/3 we were talking about earlier. This means that R(n) works out to approximately (2/3)n + c for some constant c that depends on the base case of the recurrence. Therefore, we see that
T(n) = 2^(S(n))
     = 2^(S(2^(lg n)))
     = 2^(R(lg n))
     ≈ 2^((2/3) lg n + c)
     = 2^(lg n^(2/3) + c)
     = 2^c · 2^(lg n^(2/3))
     = 2^c · n^(2/3)
     = Θ(n^(2/3)),
which matches the theoretically predicted and empirically observed values from earlier.
This was a very fun problem to work through and I'll admit I'm surprised by the answer! I am a bit nervous, though, that I may have missed something when going from
lg T(n) = (1/2) lg (T(√n) + 1) + (1/2) lg n
to
lg T(n) ≈ (1/2) lg T(√n) + (1/2) lg n.
It's possible that this +1 term actually introduces some other term into the recurrence that I didn't recognize. For example, is there an O(log log n) term that arises as a result? That wouldn't surprise me, given that we have a recurrence that shrinks by a square root. However, I've done some simple data explorations and I'm not seeing any terms in there that look like there's a double log involved.
Hope this helps!
We know that:
T(n) = sqrt(n) * sqrt(T(sqrt(n)) + 1)
Hence:
T(n) < sqrt(n) * sqrt(T(sqrt(n)) + T(sqrt(n)))
where the 1 has been replaced by T(sqrt(n)), which is at least 1. So,
T(n) < sqrt(2) * sqrt(n) * sqrt(T(sqrt(n)))
Now, to find an upper bound we need to solve the following recurrence relation:
G(n) = sqrt(2n) * sqrt(G(sqrt(n)))
To solve this, we expand it (suppose n = 2^(2^k) and the base case G(2) is a constant):
G(n) = (2n)^(1/2) * (2n)^(1/8) * (2n)^(1/32) * ... * (2n)^(1/2^(2k-1))
     = (2n)^(1/2 + 1/8 + 1/32 + ... + 1/2^(2k-1))
If we take a factor 1/2 out of 1/2 + 1/8 + 1/32 + ... + 1/2^(2k-1), we get (1/2) * (1 + 1/4 + 1/16 + ... + 1/4^(k-1)).
As 1 + 1/4 + 1/16 + ... is a geometric series with ratio 1/4, it tends to 4/3, so the exponent tends to (1/2) * (4/3) = 2/3. Therefore G(n) = Theta(n^(2/3)) and T(n) = O(n^(2/3)).
Notice that since sqrt(n) * sqrt(T(sqrt(n))) < T(n), we can show, similarly to the previous case, that T(n) = Omega(n^(2/3)). It means T(n) = Theta(n^(2/3)).

Calculate the time complexity of recurrence relation f(n) = f(n/2) + f(n/3)

How do I calculate the time complexity of the recurrence relation f(n) = f(n/2) + f(n/3)? The base cases are at n=1 and n=0.
How do I calculate the time complexity for the general case, i.e. f(n) = f(n/x) + f(n/y), where x<n and y<n?
Edit-1 (after first answer posted): every number considered is an integer.
Edit-2 (after first answer posted): I like the answer given by Mbo, but is it possible to answer this without using any fancy theorem like the master theorem, for example by building a tree?
However, users are free to answer the way they like and I will try to understand.
In "layman terms" you can get dependence with larger coefficient:
T(n) = T(n/2) + T(n/2) + O(1)
build call tree for n=2^k and see that the last tree level contains 2^k items, higher level 2^k-1 items, next one 2^k-2 and so on. Sum of sequence (geometric progression)
2^k + 2^k-1 + 2^k-2 + ... + 1 = 2^(k+1) = 2*n
so complexity for this dependence is linear too.
Now get dependence with smaller (zero) second coefficient:
T(n) = T(n/2) + O(1)
and ensure in linear complexity too.
Seems clear that complexity of recurrence in question lies between complexities for these simpler examples, and is linear.
In the general case, recurrences with complex branching can be solved with the Akra–Bazzi method (a more general approach than the Master theorem).
I assume that the dependence is
T(n) = T(n/2) + T(n/3) + O(n)
In this case g(u) = u; to find p we numerically solve
(1/2)^p + (1/3)^p = 1
and get p ≈ 0.79, then integrate:
T(x) = Theta(x^0.79 * (1 + Int[1..x]((u/u^1.79)*du)))
     = Theta(x^0.79 * (1 + 4.8*x^0.21 - 4.8))
     = Theta(x^0.79 + 4.8*x)
     = Theta(x)
So the complexity is linear.
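The exponent p used above can be found numerically. Here is a minimal sketch (my own addition, not part of the answer) that solves (1/2)^p + (1/3)^p = 1 by bisection:

#include <cmath>
#include <cstdio>

int main() {
    // h(p) = (1/2)^p + (1/3)^p is decreasing, with h(0) = 2 > 1 and h(2) < 1,
    // so the Akra-Bazzi exponent lies in (0, 2).
    double lo = 0.0, hi = 2.0;
    for (int i = 0; i < 60; ++i) {
        double p = (lo + hi) / 2;
        double h = std::pow(0.5, p) + std::pow(1.0 / 3.0, p);
        if (h > 1) lo = p; else hi = p;
    }
    std::printf("p = %.4f\n", lo);   // approximately 0.79
    return 0;
}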

How to solve recurrence T(n) = T(n-c) +T(c) + n^2

I have the recurrence function
T(n) = T(n-c) + T(c) + n²
Can you explain how can I calculate the height of the recurrence tree when:
c has an indefinite value
c = n/3 => T(n) = T(n - n/3) + T(n/3) + n²
I think in the first case T(n) costs Θ(n³) and in the second case Θ(n²). Is that right?
1) If c is a constant, then you can ignore the T(c) term, and it will indeed be Θ(n³).
2) When it is n/3 or some other constant fraction of n, look for the T() term with the largest coefficient on n; this leads to the longest branch. Then an upper bound on the time complexity is obtained by replacing all the other T terms with this one.
Example: T(n) = T(2n/3) + T(n/3) + n² ≤ 2T(2n/3) + n², and by the Master theorem the right-hand side is Θ(n²); since the n² term is also a lower bound, T(n) = Θ(n²).
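As a quick check of case 1, here is a small sketch (my own addition, with an assumed constant c = 5 and assumed base values T(n) = 1 for n ≤ c) that evaluates T(n) = T(n-c) + T(c) + n² bottom-up and prints T(n)/n³, which should level off around 1/(3c) ≈ 0.067:

#include <cstdio>
#include <vector>

int main() {
    const int c = 5, N = 100000;           // assumed constant c and test range
    std::vector<double> T(N + 1, 1.0);     // assumed base values: T(n) = 1 for n <= c
    for (int n = c + 1; n <= N; ++n)
        T[n] = T[n - c] + T[c] + (double)n * n;
    for (int n = 1000; n <= N; n *= 10)
        std::printf("n=%6d  T(n)/n^3 = %.5f\n", n, T[n] / ((double)n * n * n));
    return 0;
}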

Calculating Big O complexity of Recursive Algorithms

Somehow, I find it much harder to derive Big O complexities for recursive algorithms than for iterative ones. Could you provide some insight into how I should go about solving these two questions?
*assume that submethod has linear complexity
def myMethod(n)
  if (n > 0)
    submethod(n)
    myMethod(n/2)
  end
end

def myMethod(k, n)
  if (n > 0)
    submethod(k)
    myMethod(k, n/2)
  end
end
For your first problem, the recurrence will be:
T(n) = n + T(n/2)
T(n/2) = n/2 + T(n/4)
...
...
...
T(2) = 2 + T(1)
T(1) = 1 + T(0) // assuming 1/2 equals 0 (integer division)
adding up we get:
T(n) = n + n/2 + n/4 + n/8 + ... + 1 + T(0)
     = n(1 + 1/2 + 1/4 + 1/8 + ...) + k   // assuming k = T(0)
     ≤ n * 1/(1 - 1/2) + k   // sum of the geometric series a/(1-r) as the number of terms tends to infinity
     = 2n + k
Therefore, T(n) = O(n). Remember, I have bounded the finite sum by the infinite geometric series, because that is what we do in asymptotic analysis.
For your second problem it's easy to see that we perform k primitive operations each time until n becomes 0, and this happens log(n) times. Therefore, T(n) = O(k*log(n)).
All you need to do is count how many times a basic operation is executed. This is true for analysing any kind of algorithm. In your case, we will count the number of times submethod is called.
You can break down the running time of the call myMethod(n) as 1 + myMethod(n/2), which you can further break down as 1 + (1 + myMethod(n/4)). At some point you will reach the base case, after about log(n) steps. That gives you log(n) calls to submethod.
The second one is no different: since k is constant all the time, it will again make log(n) calls. (This counts submethod calls; if you weight each call by its actual cost, with submethod linear in its argument as stated, you recover the O(n) and O(k*log(n)) bounds above.)
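To make the counting concrete, here is a rough C++ translation (my own sketch; submethod is modeled as a counter that charges m units of work for an argument of m, matching the stated linear cost) that tallies the basic operations for both methods:

#include <cstdio>

long long ops = 0;                          // total units of work charged to submethod

void submethod(long long m) { ops += m; }   // stand-in for a linear-time call

void myMethod1(long long n) {               // first method: T(n) = n + T(n/2) -> O(n)
    if (n > 0) { submethod(n); myMethod1(n / 2); }
}

void myMethod2(long long k, long long n) {  // second method: k work per level, log(n) levels -> O(k*log(n))
    if (n > 0) { submethod(k); myMethod2(k, n / 2); }
}

int main() {
    ops = 0; myMethod1(1 << 20);
    std::printf("myMethod1(2^20): %lld units (about 2n)\n", ops);
    ops = 0; myMethod2(1000, 1 << 20);
    std::printf("myMethod2(k=1000, n=2^20): %lld units (about k*log2(n))\n", ops);
    return 0;
}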

Resources