Can someone please help me with this?
Use the iteration method to solve it: T(n) = 2T(n-1) + (2^n), T(0) = 1.
An explanation of the steps would be greatly appreciated.
I tried to unroll the recurrence as follows:
T(n)= 2T(n-1) + 2^n
T(n-1) = 2T(n-2) + 2^(n-1)
T(n-2) = 2T(n-3) + 2^(n-2)
T(n-3) = 2T(n-4) + 2^(n-3)
...
T(0) = 1
Then:
T(n) = 2^k * T(n-k) + ... Here's where I get stuck.
Well, let's compute some values for small n:
T(0) = 1
T(1) = 4
T(2) = 12
T(3) = 32
T(4) = 80
T(5) = 192
The function seems to be exponential; the recurrence has a 2^n term, so let's check whether
T(n) = f(n) * 2^n
for some unknown function f(n). Dividing by 2^n gives f(n) = T(n) / 2^n:
T(0) / 2^0 = 1
T(1) / 2^1 = 2
T(2) / 2^2 = 3
T(3) / 2^3 = 4
Looks quite clear that f(n) = n + 1 and
T(n) = (n + 1) * 2^n
Now let's prove it by induction.
Base: it holds for n = 0: (0 + 1) * 2^0 = 1
Step: from T(n - 1) = n * 2^(n - 1) we have
T(n) = 2 * T(n - 1) + 2^n =
= 2 * n * 2^(n - 1) + 2^n =
= n * 2^n + 2^n =
= (n + 1) * 2^n
So, if T(n - 1) holds, T(n) holds as well.
Q.E.D.
Closed formula for
T(n) = 2T(n-1) + (2^n)
T(0) = 1
Is
T(n) = (n + 1) * 2^n
Cheating way: search OEIS (the On-Line Encyclopedia of Integer Sequences) for the values and you'll find A001787.
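As a sanity check, the closed form can be compared against the recurrence directly; a minimal Python sketch (my own, not part of the original answer):

```python
def t_rec(n):
    """T(n) = 2*T(n-1) + 2**n, T(0) = 1, computed by iteration."""
    t = 1                      # T(0)
    for i in range(1, n + 1):
        t = 2 * t + 2 ** i
    return t

def t_closed(n):
    """Conjectured closed form T(n) = (n + 1) * 2**n."""
    return (n + 1) * 2 ** n

# The two agree on the first values: 1, 4, 12, 32, 80, 192, ...
assert all(t_rec(n) == t_closed(n) for n in range(20))
```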
How do you calculate the time complexity of the recursive tribonacci function F(n) = F(n-1) + F(n-2) + F(n-3) with base cases F(0) = 0, F(1) = F(2) = 1?
It's easier to use induction to prove it's O(1.84^(n-1)).
T(n) = 1 when n <= 2 and T(n) = T(n-1) + T(n-2) + T(n-3) when n > 2.
Base case:
T(3) = 1 + 1 + 1 = 3
1.84^(3-1) = 1.84^2 ≈ 3.39 ≥ 3
so T(3) = O(1.84^(n-1)) holds at n = 3.
Inductive case: assume T(k) ≤ 1.84^(k-1) for all k < n. Then,
T(n) = T(n-1) + T(n-2) + T(n-3)
T(n) ≤ 1.84^(n-2) + 1.84^(n-3) + 1.84^(n-4) = 1.84^(n-4) * (1.84^2 + 1.84 + 1)
Since 1.84^2 + 1.84 + 1 ≈ 6.226 < 1.84^3 ≈ 6.230, this is at most 1.84^(n-1), so
T(n) = O(1.84^(n-1))
If you want it to be exact, use the tribonacci constant instead, but it's tedious to show it is equal. However, I can edit this to show it if you want.
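As a numeric cross-check (my own sketch, using the base cases T(n) = 1 for n ≤ 2 from the answer), the ratio of consecutive terms approaches the tribonacci constant ≈ 1.8393, which is why 1.84 works as a base:

```python
def trib(n):
    """T(n) = T(n-1) + T(n-2) + T(n-3) for n > 2, T(n) = 1 for n <= 2."""
    a, b, c = 1, 1, 1          # T(0), T(1), T(2)
    for _ in range(n - 2):
        a, b, c = b, c, a + b + c
    return c

# T(3) = 3, T(4) = 5, T(5) = 9, ...; T(n) / T(n-1) -> ~1.8393
assert trib(3) == 3
assert abs(trib(40) / trib(39) - 1.8393) < 0.001
```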
I don't know how to continue with this recurrence because I don't see any pattern. Any help?
T(n) = 2n + T(n/2)
= 3n + T(n/4)
= 7n/2 + T(n/8)
= 15n/4 + T(n/16)
and so on...
As I understand it, this is just a simple recurrence.
T(n) = 2n + T(n/2)
Your notation might mislead someone. For me it should be:
T(n) = 2n + T(n/2) ....(1)
T(n/2) = 2(n/2) + T(n/2/2) = n + T(n/4)
T(n) = 2n + n + T(n/4) = 3n + T(n/4) ....(2)
T(n/4) = 2(n/4) + T(n/4/2) = n/2 + T(n/8)
T(n) = 2n + n + n/2 + T(n/8) = 7n/2 + T(n/8) ....(3)
T(n/8) = 2(n/8) + T(n/8/2) = n/4 + T(n/16)
T(n) = 2n + n + n/2 + n/4 + T(n/16) = 15n/4 + T(n/16) ....(4)
T(n/16) = 2(n/16) + T(n/16/2) = n/8 + T(n/32)
T(n) = 15n/4 + n/8 + T(n/32) = 31n/8 + T(n/32) ....(5)
and so on...
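The coefficients in front of n (2, 3, 7/2, 15/4, 31/8, ...) gain one more term of a halving series at each step, i.e. c_k = 4 − 2^(2−k) after k substitutions. A short sketch with exact fractions (my own illustration, not from the answer):

```python
from fractions import Fraction

# After k substitutions, T(n) = c_k * n + T(n / 2**k), where each step
# adds the next term of the halving series: c_k = 2 + 1 + 1/2 + ... = 4 - 2**(2-k)
coeff = Fraction(2)            # c_1 = 2, from the first term 2n
coeffs = [coeff]
for k in range(2, 6):
    coeff += Fraction(1, 2 ** (k - 2))
    coeffs.append(coeff)

assert coeffs == [Fraction(2), Fraction(3), Fraction(7, 2),
                  Fraction(15, 4), Fraction(31, 8)]
```

The coefficients approach 4, which already hints at T(n) ≈ 4n.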
This is a usual recurrence relation - if you are a CS student, you will soon know the result by heart.
If you want to find the result by hand, make a geometric sum appear from the recurrence:
T(n) = 2n + n + n/2 + ... + n/2^(k+1) + T(0)
= 2n(1 + 1/2+ ... + 1/2^(k+2)) + T(0)
Where k = INT(log2(n))
You can see a geometric sum with common ratio 1/2 appear:
1 + 1/2 + ... + 1/2^(k+2) = (1 - 1/2^(k+3)) / (1 - 1/2)
Observing that 2^(k+2) ≈ 4 * 2^(log2(n)) = 4n and simplifying,
T(n) ≈ 4n + T(0) - 1/2 = Θ(n)
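For n a power of two the unrolling is exact: assuming a base case T(1) = 1, the sum telescopes to T(n) = 4n − 4 + T(1), confirming Θ(n). A quick sketch:

```python
def t(n):
    """T(n) = 2n + T(n // 2), with an assumed base case T(1) = 1."""
    return 1 if n == 1 else 2 * n + t(n // 2)

# For powers of two the geometric sum is exact: T(n) = 4n - 4 + T(1)
for m in range(1, 15):
    n = 2 ** m
    assert t(n) == 4 * n - 3
```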
In addition to expanding the series down to T(0) as shown by Alexandre Dupriez, you can also apply the Master Theorem to solve it.
For the recurrence equation
T(n) = 2n + T(n/2)
Master Theorem:
For recurrences of form,
T(n) = a T(n/b) + f(n)
where a >= 1 and b > 1
If f(n) is O(n^c), then:
If c < log_b(a), then T(n) = O(n^(log_b a))
If c = log_b(a), then T(n) = O(n^c * log n)
If c > log_b(a), then T(n) = O(n^c)
Here we have a = 1, b = 2, c = 1, and c > log_b(a) (case 3), since log_2(1) = 0.
Therefore, T(n) = O(n^1)
T(n) = O(n)
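The case analysis above can be written as a small helper. This is just the simple form of the theorem for f(n) = Θ(n^c), a sketch of my own:

```python
import math

def master(a, b, c):
    """Classify T(n) = a*T(n/b) + Theta(n^c) via the simple Master Theorem."""
    crit = math.log(a, b)              # critical exponent log_b(a)
    if c < crit:
        return f"O(n^{crit:g})"        # case 1: recursion dominates
    if c == crit:
        return f"O(n^{c:g} log n)"     # case 2: balanced
    return f"O(n^{c:g})"               # case 3: f(n) dominates

# T(n) = 2n + T(n/2): a = 1, b = 2, c = 1 -> case 3
assert master(1, 2, 1) == "O(n^1)"
```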
I want to find the complexity of an algorithm that involves the recurrence:
T(n) = T(n-2) + T(2) + n
T(n) is the time it takes to solve a problem of size n. I want to use a recursion tree, but my problem is T(2): can we consider that T(2) will be dominated by T(n-2)?
Say you start with
T(n) = T(n - 2) + T(2) + n.
Then
T(n) =
T(n - 2) + T(2) + n =
T(n - 4) + 2T(2) + n + (n - 2) =
T(n - 6) + 3T(2) + n + (n - 2) + (n - 4) =
...
T(k) + Θ(n) * T(2) + ∑_{i = n, n-2, ..., k} i
where k is some constant.
In the last expression,
T(2) is a constant, so Θ(n) * T(2) = Θ(n). Also
∑_{i = n, n-2, ..., k} i = Θ(n^2), since it's an arithmetic series.
Altogether, T(n) = Θ(n^2).
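A quick numeric check of the Θ(n²) claim (my own sketch; it assumes base cases T(1) = T(2) = 1, which only shift the constant): T(n)/n² should settle near 1/4, the constant of the arithmetic series.

```python
def t(n):
    """T(n) = T(n-2) + T(2) + n, computed iteratively; assumed T(1) = T(2) = 1."""
    val = 1                            # assumed base case
    start = 4 if n % 2 == 0 else 3
    for k in range(start, n + 1, 2):
        val = val + 1 + k              # T(k) = T(k-2) + T(2) + k, with T(2) = 1
    return val

# The series n + (n-2) + (n-4) + ... gives T(n) / n**2 -> 1/4
assert abs(t(2000) / 2000 ** 2 - 0.25) < 0.01
```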
I got stuck on this recurrence for a while:
T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1
It seems the master method cannot be applied to it.
We have:
lg(1 + 1/n) = lg((n + 1) / n) = lg(n+1) - lg(n)
Hence:
T(n) - T(n - 1) = lg(n + 1) - lg(n)
T(n-1) - T(n - 2) = lg(n) - lg(n - 1)
...
T(3) - T(2) = lg(4) - lg(3)
T(2) - T(1) = lg(3) - lg(2)
Adding and telescoping, we get:
T(n) - T(1) = lg(n + 1) - lg(2)
or T(n) = 1 + lg((n + 1)/2)
Hence T(n) = O(lg(n))
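The telescoped form can be verified numerically (a sketch of my own; lg taken as log base 2, in which case 1 + lg((n + 1)/2) simplifies to lg(n + 1)):

```python
import math

def t(n):
    """T(n) = T(n-1) + lg(1 + 1/n) with T(1) = 1, where lg = log base 2."""
    val = 1.0                          # T(1)
    for k in range(2, n + 1):
        val += math.log2(1 + 1 / k)
    return val

# Telescoping gives T(n) = 1 + lg((n + 1) / 2) = lg(n + 1): clearly O(lg n)
for n in (1, 10, 1000):
    assert math.isclose(t(n), math.log2(n + 1))
```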
Same answer as the other correct answer here, just proved differently.
All the following equations are created from the given recurrence:
T(n) = T(n-1) + Log((n+1)/n)
T(n-1) = T(n-2) + Log(n/(n-1))
.
.
.
T(2) = T(1) + Log(3/2)
Summing all RHS and LHS in the above equations results in:
T(n) = T(1) + Log(3/2) + Log(4/3) + ... + Log((n+1)/n)
Since Log(a) + Log(b) = Log(ab),
T(n) = 1 + Log((n+1)/2)
T(n) = Log(10) + Log((n+1)/2) = Log(5n + 5), assuming the base was 10 and using 1 = Log_10(10)
Therefore T(n) = O(Log(5n + 5)) = O(Log(n))
It is not linear, as some people claim. It is O(log(n)). Here is the mathematical analysis:
If you start unrolling the recursion you will get:
T(n) = T(n - 1) + lg(1 + 1/n) = T(n - 2) + lg(1 + 1/(n - 1)) + lg(1 + 1/n) = ...
If you do this till the end you will have:
T(n) = T(1) + lg(1 + 1/2) + lg(1 + 1/3) + ... + lg(1 + 1/n)
or in a short form:
T(n) = 1 + ∑_{i = 2..n} lg(1 + 1/i) = 1 + ∑_{i = 2..n} (lg(i + 1) - lg(i))
Once you approximate the sum with an integral of lg(1 + 1/x) (or simply telescope it) and take the limit as n → ∞, you will see that the dominant part is lg(n + 1), which gives you a final solution of lg(n + 1), which is O(log(n)).
Need some help on solving this runtime recurrence, using Big-Oh:
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
I don't quite get how to use the Master Theorem here
For n big enough you can assume T(n/2 - 1) == T(n/2), so you can change
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
into
T(n) = 2*T(n/2) + n/2 + 2
And use Master Theorem (http://en.wikipedia.org/wiki/Master_theorem) for
T(n) = a*T(n/b) + f(n)
a = 2
b = 2
f(n) = n/2 + 2
c = 1
k = 0
log_b(a) = log_2(2) = 1 = c
and so you have (case 2, since log_b(a) = c)
T(n) = O(n**c * log(n)**(k + 1))
T(n) = O(n * log(n))
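Under the same simplification T(n/2 − 1) ≈ T(n/2) and an assumed base case T(1) = 1, the recurrence solves exactly on powers of two to T(n) = (3 + lg(n)/2) * n − 2, which is indeed Θ(n log n). A quick sketch:

```python
def t(n):
    """Simplified recurrence T(n) = 2*T(n // 2) + n // 2 + 2, assumed T(1) = 1."""
    return 1 if n <= 1 else 2 * t(n // 2) + n // 2 + 2

# Exact on powers of two: T(2**m) = (3 + m/2) * 2**m - 2 = Theta(n log n)
for m in range(1, 15):
    n = 2 ** m
    assert t(n) == (3 + m / 2) * n - 2
```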