How to calculate the upper-bound time complexity ("big O") of a recursive function?

Suppose I have a recursive function T, and I want to calculate an upper bound on its time complexity.
T(1) = 3
T(n) = 3T(n/3) + 3.
How can I find the upper bound of the time complexity of T(n)?

Use the Master theorem with a = 3, b = 3, and f(n) = 3 = Θ(n^0), i.e. c = 0. Since log_b(a) = log_3(3) = 1 > c, the first case applies and T(n) = Θ(n).
I highly recommend the MIT lectures on algorithms; you can learn more about the Master theorem in lecture 2.

Assume that n = 3^k and let F(k) = T(3^k). Then:
F(0) = 3
F(k) = 3 * F(k-1) + 3
= 3^2 * F(k-2) + 3^2 + 3
= ...
= 3^k * F(0) + 3^k + 3^(k-1) + ... + 3
= 3^(k+1) + 3^k + ... + 3^2 + 3
= [3^(k+2) - 3] / 2
T(n = 3^k) = F(k) = (9 * n - 3) / 2 = O(n)
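
As a sanity check, here is a minimal Python sketch (the helper name T is mine, just for illustration) that evaluates the recurrence directly and compares it against the derived closed form:

def T(n):
    """T(1) = 3, T(n) = 3*T(n/3) + 3, for n a power of 3."""
    if n == 1:
        return 3
    return 3 * T(n // 3) + 3

for k in range(8):
    n = 3 ** k
    assert T(n) == (9 * n - 3) // 2  # the closed form derived above

print("T(n) = (9n - 3)/2 on all tested powers of 3, hence T(n) = O(n)")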


The Recurrence T(n)= 2T(n-1) + (2^n)

Can someone please help me with this?
Use the iteration method to solve it: T(n) = 2T(n-1) + 2^n, T(0) = 1.
An explanation of the steps would be greatly appreciated.
I tried to unroll the recurrence as follows:
T(n)= 2T(n-1) + 2^n
T(n-1) = 2T(n-2) + 2^(n-1)
T(n-2) = 2T(n-3) + 2^(n-2)
T(n-3) = 2T(n-4) + 2^(n-3)
...
T(0) = 1
Then:
T(n) = 2^k * T(n-k) + ... Here's where I get stuck.
Well, let's compute some values for small n:
T(0) = 1
T(1) = 4
T(2) = 12
T(3) = 32
T(4) = 80
T(5) = 192
The function seems to be exponential; since there is a 2^n term, let's check whether
T(n) = f(n) * 2^n
for some unknown function f(n). Dividing by 2^n gives f(n) = T(n) / 2^n:
T(0) / 2^0 = 1
T(1) / 2^1 = 2
T(2) / 2^2 = 3
T(3) / 2^3 = 4
It looks quite clear that f(n) = n + 1 and
T(n) = (n + 1) * 2^n
Now let's prove it by induction.
Base: it holds for n = 0: (0 + 1) * 2^0 = 1
Step: from T(n - 1) = n * 2^(n - 1) we have
T(n) = 2 * T(n - 1) + 2^n =
= 2 * n * 2^(n - 1) + 2^n =
= n * 2^n + 2^n =
= (n + 1) * 2^n
So, if T(n - 1) holds, T(n) holds as well.
Q.E.D.
So the closed formula for
T(n) = 2T(n-1) + 2^n, T(0) = 1
is
T(n) = (n + 1) * 2^n
Cheating way: try OEIS (the On-Line Encyclopedia of Integer Sequences) and you'll find A001787.
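
If you want to double-check, here is a small Python sketch (the helper is mine, not from the answer) that computes T(n) iteratively from the recurrence and compares it with the closed form:

def T(n):
    """T(0) = 1, T(n) = 2*T(n-1) + 2^n."""
    t = 1
    for i in range(1, n + 1):
        t = 2 * t + 2 ** i
    return t

for n in range(12):
    assert T(n) == (n + 1) * 2 ** n

print("closed formula (n + 1) * 2^n confirmed for n = 0..11")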

What is the recurrence relation and big O for T(n) = 2T(n-1) + O(n)?

I thought it would be something like this...
T(n) = 2T(n-1) + O(n)
= 2(2T(n-2)+(n-1)) + (n)
= 2(2(2T(n-3)+(n-2))+(n-1))+(n)
= 8T(n-3) + 4(n-2) + 2(n-1) + n
Which ends up being something like the summation of 2^i * (n-i), and my book says this ends up being O(2^n). Could anybody explain this to me? I don't understand why it's 2^n and not just O(n), as the (n-i) will continue n times.
This recurrence has already been solved on Math Stack Exchange. As I solve this recurrence, I get:
T(n) = n + 2(T(n-1))
= n + 2(n - 1 + 2T(n-2)) = 3n - 2 + 2^2(T(n-2))
= 3n - 2 + 4(n - 2 + 2(T(n-3))) = 7n - 10 + 2^3(T(n-3))
= 7n - 10 + 8(n - 3 + 2(T(n-4))) = 15n - 34 + 2^4(T(n-4))
= (2^4 - 1)n - 34 + 2^4(T(n-4))
...and so on.
Effectively the recurrence boils down to:
T(n) = (T(1) + 3) * 2^(n-1) - n - 2
See the Math Stack Exchange link for how we arrive at this solution. Taking T(1) to be constant, the dominating term in that closed form is 2^(n-1).
Therefore, the rate of growth of the given recurrence is O(2^n).
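
For completeness, here is a short Python sketch (the helper and the test values are mine) that checks this closed form for several choices of T(1):

def T(n, t1):
    """T(1) = t1, T(n) = 2*T(n-1) + n."""
    t = t1
    for i in range(2, n + 1):
        t = 2 * t + i
    return t

for t1 in (0, 1, 5):
    for n in range(1, 16):
        assert T(n, t1) == (t1 + 3) * 2 ** (n - 1) - n - 2

print("T(n) = (T(1) + 3) * 2^(n-1) - n - 2, so T(n) = O(2^n)")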

solving recurrence T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2

Need some help on solving this runtime recurrence, using Big-Oh:
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
I don't quite get how to use the Master Theorem here
For n big enough you can assume T(n/2 - 1) ≈ T(n/2), so you can change
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
into
T(n) = 2*T(n/2) + n/2 + 2
And use Master Theorem (http://en.wikipedia.org/wiki/Master_theorem) for
T(n) = a*T(n/b) + f(n)
a = 2
b = 2
f(n) = n/2 + 2
c = 1
k = 0
log_b(a) = 1 = c
and so you have (case 2, since log_b(a) = c)
T(n) = O(n^c * log(n)^(k + 1))
T(n) = O(n * log(n))
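
Here is a rough empirical check in Python; note that the base case (T(n) = 1 for n <= 1) and the use of floor division are my assumptions, since the question does not specify them:

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1  # assumed base case
    return T(n // 2) + T(n // 2 - 1) + n // 2 + 2

for k in (10, 14, 18, 22):
    n = 2 ** k
    print(n, T(n) / (n * log2(n)))  # the ratio settles near a constant

The printed ratio levels off as n grows, which is consistent with T(n) = Θ(n log n).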

Solve the recurrence: T(n)=2T(n/2)+n/logn

I can find the sum of each row (n/(log n - i)), and I can draw its recursion tree, but I can't calculate the sum over its rows.
T(n)=2T(n/2)+n/logn
T(1) = 1
Suppose n = 2^k;
We know, from Euler's approximation for the harmonic series, that:
Sum[i = 1 to n](1/i) ~= log(n) [n -> infinity]
t(n) = 2t(n/2) + n/log(n)
= 2(2t(n/4) + n/2/log(n/2)) + n/log(n)
= 4t(n/4) + n/log(n/2) + n/log(n)
= 4(2t(n/8) + n/4/log(n/4)) + n/log(n/2) + n/log(n)
= 8t(n/8) + n/log(n/4) + n/log(n/2) + n/log(n)
= 16t(n/16) + n/log(n/8) + n/log(n/4) + n/log(n/2) + n/log(n)
= n * t(1) + n/log(2) + n/log(4) + ... + n/log(n/2) + n/log(n)
= n(1 + Sum[i = 1 to log(n)](1/log(2^i)))
= n(1 + Sum[i = 1 to log(n)](1/i))
~= n(1 + log(log(n)))
= n + n*log(log(n))
~= n*log(log(n)) [n -> infinity]
When you start unrolling the recursion, you will get:
T(n) = 2^k * t(n/2^k) + Sum[i = 0 to k-1](n / log(n/2^i))
Your base case is T(1) = 1, so this means that n = 2^k. Substituting, and using log(n/2^i) = k - i, you will get:
T(n) = n + n * Sum[i = 1 to k](1/i)
The second sum behaves the same as the harmonic series and therefore can be approximated as log(k). Now that k = log(n), the resulting answer is:
T(n) = Θ(n * log(log(n)))
This can also be solved with the extended Master theorem.
Using the extended Master theorem, T(n) = 2T(n/2) + n/log(n) can be solved easily as follows:
the n/log(n) part can be rewritten as n * (log n)^(-1),
effectively making the value of p = -1.
The extended Master theorem now applies directly; this corresponds to its case 2b, which gives
T(n) = O(n * log(log n))
Follow this for a more detailed explanation:
https://www.youtube.com/watch?v=Aude2ZqQjUI
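
A quick numerical check in Python (a sketch, assuming n is a power of two and T(1) = 1, as in the derivation above):

from math import log2

def T(k):
    """Return T(2^k) for T(n) = 2*T(n/2) + n/log2(n), T(1) = 1, computed bottom-up."""
    t = 1.0  # T(1)
    for i in range(1, k + 1):
        n = 2.0 ** i
        t = 2 * t + n / log2(n)
    return t

for k in (8, 16, 32, 64):
    n = 2.0 ** k
    # the ratio approaches a constant, consistent with Theta(n * log(log n))
    print(k, T(k) / (n * log2(log2(n))))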

time complexity of following recurrence?

Find out the time complexity (Big Oh Bound) of the recurrence T(n) = T(⌊n⌋) + T(⌈n⌉) + 1.
How does the time complexity of this come out to be O(n)?
You probably meant T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + 1.
Let's calculate the first few values of T(n):
T(1) = 1
T(2) = 3
T(3) = 5
T(4) = 7
We can guess that T(n) = 2*n - 1.
Let's prove that by mathematical induction.
Basis
T(1) = 1
T(2) = 3
T(3) = 5
T(4) = 7
Inductive step
T(2*n) = T(⌊2*n/2⌋) + T(⌈2*n/2⌉) + 1
= T(n) + T(n) + 1
= (2*n - 1) + (2*n - 1) + 1
= 4*n - 1
= 2 * (2*n) - 1
T(2*n+1) = T(⌊(2*n+1)/2⌋)+ T(⌈(2*n+1)/2⌉) + 1
= T(n)+ T(n+1) + 1
= (2*n - 1) + (2*(n+1) - 1) + 1
= 4*n + 1
= 2 * (2*n+1) - 1
Since both the basis and the inductive step have been proved, it follows by mathematical induction that T(n) = 2*n - 1 holds for all natural n.
T(n) = 2*n - 1 = O(n)
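
A quick brute-force check of the result in Python (the memoized helper is mine, not part of the proof):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(1) = 1, T(n) = T(floor(n/2)) + T(ceil(n/2)) + 1."""
    if n == 1:
        return 1
    return T(n // 2) + T((n + 1) // 2) + 1  # floor and ceiling of n/2

assert all(T(n) == 2 * n - 1 for n in range(1, 5000))
print("T(n) = 2*n - 1 = O(n) for n = 1..4999")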
The recurrence as you have written it does not make sense. Since n is usually taken to be a natural number, n = ⌊n⌋ = ⌈n⌉. The recurrence then reads: break a problem of size n into two problems of size n, and spend time 1 doing so. The two new problems you just created will be split in turn, and so on - all you are doing is creating more work for yourself.
