I've been working on these recurrence relations but I'm stumped on this one.
T(n) = 2T(n/4) + T(n/2) + n^2
I've seen them with one recursive call but not with two.
These sorts of recurrences are solved with the Akra–Bazzi method.
In your case a1 = 2, a2 = 1, b1 = 1/4, b2 = 1/2. So you have to solve the equation:
2/4^p + 1/2^p = 1
whose solution is p = 1. Now because your g(u) = u^2, the integrand g(u)/u^(p+1) equals 1, so the integral from 1 to x is just x - 1. Therefore your complexity is x^p(1 + (x - 1)) = Θ(x^2).
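To make that concrete, here is a small numeric sanity check (my own sketch, not part of the original answer; it assumes an arbitrary base case T(1) = 1 and uses integer division): it evaluates the recurrence and prints T(n)/n^2, which should level off at a constant.

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # assumed base case: T(1) = 1 (any constant works for the asymptotics)
    if n <= 1:
        return 1
    return 2 * T(n // 4) + T(n // 2) + n * n

for k in range(4, 21, 4):
    n = 2 ** k
    print(n, T(n) / n ** 2)   # the ratio settles near a constant, consistent with Theta(n^2)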
Related
T(n) = T(n/2) + T(n/4) + n
I'm not sure if this recurrence relation can be solved by the Master theorem, but I found an easier way is to use the Akra–Bazzi method. Here we have a1 = 1, b1 = 1/2, a2 = 1, b2 = 1/4, so solving for p we have (1/2)^p + (1/4)^p = 1. So p = 1, and then using the formula, the asymptotic behavior can be determined as T(n) = Θ(n + n log n) = Θ(n log n)?
That is not true. Here p = log(1 + sqrt(5))/log(2) - 1 ≈ 0.694, which is less than 1; with g(n) = n the Akra–Bazzi integral then gives T(n) = Theta(n) (see the details here).
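If you want to double-check that value of p, a short bisection does it (my own sketch; the equation is the Akra–Bazzi characteristic equation quoted above):

from math import log2, sqrt

def f(p):
    # (1/2)^p + (1/4)^p - 1 is strictly decreasing in p
    return 0.5 ** p + 0.25 ** p - 1

lo, hi = 0.0, 2.0
for _ in range(100):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

print(lo)                      # ~0.694, not 1
print(log2(1 + sqrt(5)) - 1)   # the closed form from the answer, same value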
How can I solve T(n) = T(n-3) + n^2 using iteration? By the master theorem the answer is O(n^3), but I am having trouble solving it by iteration.
By direct resolution of the recurrence:
This is a linear recurrence of the first order. We first solve the homogeneous part,
T(n) = T(n - 3)
which is solved by a constant (more precisely, three constants, as three intertwined sequences form the solution).
Now for the non-homogeneous part, we use the Ansatz T(n) = an³ + bn² + cn + d, because we know that the difference T(n) - T(n-3) of a cubic polynomial is a quadratic one.
Then
a(n³ - (n-3)³) + b(n² - (n-3)²) + c(n - (n-3)) = 9an² + 3(-9a + 2b)n + 3(9a - 3b + c) = n²
gives
a = 1/9, b = 1/2, c = 1/2.
Finally
T(n) = (2n³ + 9n² + 9n)/18 + T(0)
and similarly for the two other sequences.
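As a quick check of that closed form (a sketch of mine, taking T(0) = 0 for concreteness), one can verify it satisfies T(n) = T(n-3) + n^2 on the multiples of 3:

def closed(n, T0=0):
    # exact for n divisible by 3; the integer division is exact there
    return (2 * n ** 3 + 9 * n ** 2 + 9 * n) // 18 + T0

for n in range(3, 31, 3):
    assert closed(n) == closed(n - 3) + n ** 2
print("the closed form satisfies T(n) = T(n-3) + n^2 for n = 3, 6, ..., 30")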
Just try to expand the equation:
T(n) = n^2 + (n-3)^2 + (n-6)^2 + ... (down to the base case) = Θ(n^3)
T(3) = T(0) + 3²
T(6) = T(3) + 6² = T(0) + 3² + 6²
T(9) = T(6) + 9² = T(0) + 3² + 6² + 9²
...
More generally, T(3N) is the sum of T(0) and nine times the sum of the squared naturals up to N. The well-known Faulhaber formula (the sum of the first N squares is N(N+1)(2N+1)/6) justifies O(N³), i.e. O(n³).
Similar results hold for T(3N+1) and T(3N+2).
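The same unrolling can be checked mechanically (my own sketch, with T(0) = 0 assumed):

def T(n):
    # direct unrolling of T(n) = T(n-3) + n^2 with T(n) = 0 for n <= 0
    return 0 if n <= 0 else T(n - 3) + n ** 2

for N in range(1, 15):
    assert T(3 * N) == 9 * sum(k * k for k in range(1, N + 1))
print("T(3N) = T(0) + 9 * (1^2 + ... + N^2) for N = 1..14, which is Theta(N^3)")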
I'm familiar with solving recurrences with iteration:
t(1) = c1
t(2) = t(1) + c2 = c1 + c2
t(3) = t(2) + c2 = c1 + 2c2
...
t(n) = c1 + (n-1)c2 = O(n)
But what if I had a recurrence with no base case? How would I solve it using the three methods mentioned in the title?
t(n) = 2t(n/2) + 1
For Master Theorem I know the first step, find a, b, and f(n):
a = 2
b = 2
f(n) = 1
But I don't know where to go from here. I'm at a standstill because I'm not sure how to approach the question.
I know of 2 ways to solve this:
(1) T(n) = 2T(n/2) + 1
(2) T(n/2) = 2T(n/4) + 1
now replace T(n/2) from (2) into (1)
T(n) = 2[2T(n/4) + 1] + 1
= 2^2T(n/4) + 2 + 1
T(n/4) = 2T(n/8) + 1
T(n) = 2^2[2T(n/8) + 1] + 2 + 1
= 2^3T(n/8) + 4 + 2 + 1
You would just keep doing this until you can generalize. Eventually you will spot that:
T(n) = 2^k T(n/2^k) + (2^0 + 2^1 + ... + 2^(k-1))
You want T(1), so set n/2^k = 1 and solve for k; you will find that k = lg n.
Substituting lg n for k, you end up with
T(n) = 2^(lg n) T(n/2^(lg n)) + (1 - 2^(lg n)) / (1 - 2)
2^(lg n) = n, so
T(n) = n T(1) + n - 1
Assuming T(1) = 1, this is T(n) = n + n - 1 = 2n - 1, where n is the dominant term.
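Here is the same unrolling as a tiny check (my sketch, assuming T(1) = 1 as above), confirming T(n) = 2n - 1 on powers of two:

def T(n):
    # T(n) = 2T(n/2) + 1 with the assumed base case T(1) = 1
    return 1 if n == 1 else 2 * T(n // 2) + 1

for k in range(11):
    n = 2 ** k
    assert T(n) == 2 * n - 1   # n*T(1) + n - 1 with T(1) = 1
print("T(n) = 2n - 1 for n = 1, 2, 4, ..., 1024, so T(n) = Theta(n)")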
For the Master Theorem it's really fast.
Consider, T(n) = aT(n/b) + n^c for n>1
There are three cases (note that b is the log base)
(1) if logb a < c, T(n)=Θ(n^c),
(2) if logb a = c, T (n) = Θ(n^c log n),
(3) if logb a > c, T(n) = Θ(n^(logb a)).
In this case a = 2, b = 2, and c = 0 (n^0 = 1)
A quick check shows case 3, since log2 2 = 1 > 0 = c.
So by the master theorem this is Θ(n^(log2 2)) = Θ(n).
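Those three cases are easy to encode if you want a quick classifier (a sketch for this simplified n^c form only, not the full CLRS statement; the exact float comparison is just for illustration):

from math import log

def master(a, b, c):
    # T(n) = a*T(n/b) + n^c, where b is the log base
    e = log(a, b)              # log_b(a)
    if e < c:
        return f"Theta(n^{c})"
    if e == c:
        return f"Theta(n^{c} * log n)"
    return f"Theta(n^{e:g})"

print(master(2, 2, 0))   # the recurrence above: Theta(n^1), i.e. Theta(n)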
Apart from the Master Theorem, the Recursion Tree Method and the Iterative Method there is also the so-called "Substitution Method".
Often you will find people talking about the substitution method when in fact they mean the iterative method (especially on YouTube). I guess this stems from the fact that in the iterative method you are also substituting something, namely the (n+1)-th recursive call into the n-th one...
The standard reference work about algorithms (CLRS) defines it as follows:
Substitution Method
Guess the form of the solution.
Use mathematical induction to find the constants and show that the solution works.
As an example, let's take your recurrence equation: T(n) = 2T(ⁿ/₂)+1
We guess that the solution is T(n) ∈ O(n²), so we have to prove that
T(n) ≤ cn² for some constant c.
Also, let's assume that for n=1 you are doing some constant work c.
Given:
T(1) ≤ c
T(n) = 2T(ⁿ/₂)+1
To prove:
∃c > 0, ∃n₀ ∈ ℕ, ∀n ≥ n₀, such that T(n) ≤ cn² is true.
Base Case:
n=1: T(1) ≤ c
n=2: T(2) = T(1) + T(1) + 1 ≤ c + c + 1 ≤ 4c = c·2² (using T(1) ≤ c; the last step holds for c ≥ ½)
Induction Step:
As inductive hypothesis we assume T(m) ≤ cm² for all positive numbers m smaller than n, in particular for m = ⁿ/₂.
Therefore T(ⁿ/₂) ≤ c(ⁿ/₂)², and hence
T(n) ≤ 2c(ⁿ/₂)² + 1 ⟵ Here we're substituting c(ⁿ/₂)² for T(ⁿ/₂)
= (¹/₂)cn² + 1
≤ cn² (for c ≥ 2, and all n ∈ ℕ)
So we have shown that there is a constant c such that T(n) ≤ cn² is true for all n ∈ ℕ.
This means exactly T(n) ∈ O(n²). ∎
(for Ω, and hence Θ, the proof is similar).
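If you want to see the bound numerically (my own spot check, assuming T(1) = 1 and integer division for n/2), the constant c = 2 from the proof above works fine:

def T(n):
    # T(n) = 2T(n/2) + 1 with T(1) = 1
    return 1 if n == 1 else 2 * T(n // 2) + 1

c = 2
assert all(T(n) <= c * n * n for n in range(1, 2000))
print("T(n) <= 2*n^2 holds for n = 1..1999")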
I am not familiar with recurrence-solving techniques outside of the master theorem, recursion trees, and the substitution method. I am guessing that solving the following recurrence for a big-O bound does not utilize one of those methods:
T(n) = T(n-1) + 2T(n-2) + 1
We can make the substitution U(n) = T(n) + 1/2 and then get a recurrence
U(n) = T(n) + 1/2
= T(n-1) + 2T(n-2) + 1 + 1/2
= T(n-1) + 1/2 + 2(T(n-2) + 1/2)
= U(n-1) + 2U(n-2),
which is a little magic, but, as templatetypedef mentions, the magic can be created with the annihilator method. Now we just have to solve the linear homogeneous recurrence. The characteristic polynomial x^2 - x - 2 factors as (x+1)(x-2), so the solutions are U(n) = a(-1)^n + b·2^n, where a and b are constants determined by the initial values. Equivalently, T(n) = a(-1)^n + b·2^n - 1/2, which is Theta(2^n) except in the special case b = 0.
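A quick check of that closed form (my own sketch): with the assumed initial values T(0) = T(1) = 1, solving T(0) = a + b - 1/2 and T(1) = -a + 2b - 1/2 gives a = 1/2 and b = 1, and the formula then matches the recurrence exactly:

def T(n):
    # the original recurrence, with assumed initial values T(0) = T(1) = 1
    if n < 2:
        return 1
    return T(n - 1) + 2 * T(n - 2) + 1

def closed(n, a=0.5, b=1.0):
    return a * (-1) ** n + b * 2 ** n - 0.5

for n in range(20):
    assert T(n) == closed(n)
print("closed form matches for n = 0..19; the b*2^n term dominates")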
This recursion is called a non-homogeneous linear recurrence, and it is solved by converting it to a homogeneous one:
T(n) = T(n-1) + 2T(n-2) + 1
T(n+1) = T(n) + 2T(n-1) + 1
Subtracting (1) from (2) and shifting the index, you get T(n) = 2 T(n-1) + T(n-2) - 2 T(n-3). The corresponding characteristic equation is:
x^3 - 2x^2 - x + 2 = 0
which has solutions x = {-1, 1, 2}. This means that the recursion looks like:
T(n) = c1 * (-1)^n + c2 * 2^n + c3 * 1^n = c1 * (-1)^n + c2 * 2^n + c3
where all these constants can be found knowing T(0), T(1), and T(2) = T(1) + 2T(0) + 1. For your complexity analysis it is clear that this is exponential, O(2^n).
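As a sanity check of the index-shifting step (a sketch, with assumed initial values T(0) = 0 and T(1) = 1), any sequence satisfying the original recurrence also satisfies the homogeneous one:

def T(n):
    # original recurrence with assumed initial values
    if n == 0:
        return 0
    if n == 1:
        return 1
    return T(n - 1) + 2 * T(n - 2) + 1

for n in range(3, 20):
    assert T(n) == 2 * T(n - 1) + T(n - 2) - 2 * T(n - 3)
print("T(n) = 2T(n-1) + T(n-2) - 2T(n-3) holds for n = 3..19")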
I have been trying to solve a recurrence relation.
The recurrence is T(n) = T(n/3)+T(2n/3)+n^2
I solved the recurrence and I got T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
Can anyone tell me the runtime of this expression?
I think this recurrence works out to Θ(n^2). To see this, we'll show that T(n) = Ω(n^2) and that T(n) = O(n^2).
Showing that T(n) = Ω(n^2) is pretty straightforward - since T(n) has an n^2 term in it, it's certainly Ω(n^2).
Let's now show that T(n) = O(n^2). We have that
T(n) = T(n/3) + T(2n/3) + n^2
Consider this other recurrence:
S(n) = S(2n/3) + S(2n/3) + n^2 = 2S(2n/3) + n^2
Since T(n) is increasing and T(n) ≤ S(n), any upper bound for S(n) will also be an upper bound for T(n).
Using the Master Theorem on S(n), we have that a = 2, b = 3/2, and c = 2. Since log_b a = log_{3/2} 2 = 1.709511291... < c, the Master Theorem says that this will solve to O(n^2). Since S(n) = O(n^2), we also know that T(n) = O(n^2).
We've shown that T(n) = Ω(n^2) and that T(n) = O(n^2), so T(n) = Θ(n^2), as required.
Hope this helps!
(By the way - (5/9)^(log n) = (2^(log 5/9))^(log n) = 2^(log n · log 5/9) = (2^(log n))^(log 5/9) = n^(log 5/9). That makes it a bit easier to reason about.)
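A numeric illustration of the Θ(n^2) claim (my own sketch; it assumes an arbitrary base case T(n) = 1 for n < 3 and uses integer division):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + n^2, assumed base case T(n) = 1 for n < 3
    if n < 3:
        return 1
    return T(n // 3) + T(2 * n // 3) + n * n

for k in range(2, 8):
    n = 10 ** k
    print(n, T(n) / n ** 2)   # the ratio stays bounded, consistent with Theta(n^2)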
One can't tell the actual runtime from T(n) or from the time complexity! It is simply an estimate of the running time in terms of the order of the input (n).
One thing I'd like to add:
I haven't solved your recurrence relation, but assuming your derived relation is correct, putting n = 1 in the given recurrence we get
T(1) = T(1/3) + T(2/3) + 1
So either you'll be provided with the values for T(1/3) and T(2/3) in your question, or you have to infer from the problem statement what T(1) should be, as one does for the Tower of Hanoi problem!
For this recurrence, the base case is T(1); by definition its value is as follows:
T(1) = T(1/3) + T(2/3) + 1
Now, since T(n) denotes the runtime function, the runtime of any input that will not be processed is always 0; this includes all terms below the base case, so we have:
T(X < 1) = 0
T(1/3) = 0
T(2/3) = 0
T(1) = T(1/3) + T(2/3) + 1^2
T(1) = 0 + 0 + 1
T(1) = 1
Then we can substitute the value:
T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
T(n) = n + ( 9/5 n^2 (5/9)^(log n) )
T(n) = n^2 (9/5)^(1-log(n)) + n
We can bound (9/5)^(1-log(n)) by 9/5 for an asymptotic upper bound, since (9/5)^(1-log(n)) <= 9/5 for n >= 1:
T(n) <= 9/5 n^2 + n
T(n) = O(n^2)
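That bounding step can be spot-checked numerically (my own sketch; I take log base 2 here as an assumption, since the base is not specified above, and the inequality only needs log(n) >= 0, i.e. n >= 1):

from math import log2

def derived(n):
    # the expression T(n) = n^2 * (9/5)^(1 - log(n)) + n from above
    return n ** 2 * (9 / 5) ** (1 - log2(n)) + n

for n in (2, 10, 100, 10 ** 4, 10 ** 6):
    assert derived(n) <= (9 / 5) * n ** 2 + n
print("the derived expression is bounded by (9/5)*n^2 + n, hence O(n^2)")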