Solving the recurrence T(n) = T(n/2) + lg n? [closed] - algorithm

I am having some issues with solving recurrence relations.
T(n) = T(n/2) + log2(n), T(1) = 1, where n is a power of 2
This is a homework problem, so don't just give me the answer. I was just wondering how to start the problem.
In class we went over the Master theorem. But I don't think that would be the best way to solve this particular relation.
I don't really know how to start the problem... should I just be going
T(n) = T(n/2) + log_base2(n)
T(n/2) = [T(n/4)+log_base2(n/2)]
T(n) = [T(n/4)+log_base2(n/2)] + log_base2(n)
And just keep working my way down until I can see a pattern that gives me a basic equation?

This recurrence solves to Θ((log n)^2). Here are two ways to see this.
Some Substitutions
If you know that n is a perfect power of two (that is, n = 2^k), you can rewrite the recurrence as
T(2^k) = T(2^(k-1)) + k
Let's define a new recurrence S(k) = T(2^k). Then we get that
S(k) = S(k - 1) + k
If we expand out this recurrence, we get that
S(k) = S(k - 1) + k
= S(k - 2) + (k - 1) + k
= S(k - 3) + (k - 2) + (k - 1) + k
= S(k - 4) + (k - 3) + (k - 2) + (k - 1) + k
...
= S(0) + 1 + 2 + 3 + ... + k
= S(0) + Θ(k^2)
Assuming S(0) = 1, this recurrence solves to Θ(k^2).
Since S(k) = T(2^k) = T(n), we get that T(n) = Θ(k^2) = Θ((log n)^2).
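If you want a quick numeric sanity check of this (just a sketch, assuming the stated base case T(1) = 1, i.e. S(0) = 1), you can evaluate the recurrence directly and watch T(n)/(lg n)^2 settle toward a constant:

from math import log2

def T(n):
    # T(n) = T(n/2) + lg n with T(1) = 1, for n a power of two
    return 1 if n == 1 else T(n // 2) + log2(n)

for k in (5, 10, 20, 40):
    n = 2 ** k
    print(k, T(n) / log2(n) ** 2)   # approaches 0.5 as k grows

The ratio tends to 1/2, consistent with S(k) ≈ k^2 / 2.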
Iterating the Recurrence
Another option here is to expand out a few terms of the recurrence and to see if any nice patterns emerge. Here’s what we get:
T(n) = T(n / 2) + lg n
= T(n / 4) + lg (n / 2) + lg n
= T(n / 8) + lg (n / 4) + lg (n / 2) + lg n
...
Eventually, after lg n layers, this recurrence bottoms out and we’re left with this expression:
lg n + lg(n / 2) + lg(n / 4) + ... + lg(n / 2^(lg n))
Using properties of logarithms, we can rewrite this as
lg n + (lg n - 1) + (lg n - 2) + (lg n - 3) + ... + (lg n - lg n)
Or, written in reverse, this is the sum
0 + 1 + 2 + 3 + ... + lg n
That sum is Gauss’s sum up to lg n, which evaluates to (lg n)(lg n + 1) / 2 = Θ((log n)^2).
Hope this helps!

If n is a power of 2 then you can just expand out the recurrence and solve exactly, using that lg(a/b) = lg(a) - lg(b).
T(n) = lg(n) + lg(n/2) + lg(n/4) + ... + lg(1) + 1
= (lg(n) - 0) + (lg(n) - 1) + ... + (lg(n) - lg(n)) + 1
= (lg(n) + 1)*lg(n) - lg(n)*(lg(n) + 1)/2 + 1
= lg(n)*lg(n)/2 + lg(n)/2 + 1
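To double-check that algebra, here is a small sketch (assuming T(1) = 1, as in the question) that compares the recurrence with the closed form:

from math import log2

def T(n):
    # T(n) = T(n/2) + lg n with T(1) = 1
    return 1 if n == 1 else T(n // 2) + log2(n)

for k in range(1, 9):
    n = 2 ** k
    print(n, T(n), log2(n) * (log2(n) + 1) / 2 + 1)   # the last two columns agree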

This can be done with the Akra-Bazzi theorem. See the third example in http://people.mpi-inf.mpg.de/~mehlhorn/DatAlg2008/NewMasterTheorem.pdf.
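For this particular recurrence the calculation is short (a sketch of the usual Akra–Bazzi steps): with a_1 = 1, b_1 = 1/2, and g(n) = lg n, the exponent p satisfies a_1 * b_1^p = 1, so p = 0, and the theorem gives T(n) = Θ(n^p * (1 + ∫_1^n g(u)/u^(p+1) du)) = Θ(1 + ∫_1^n (lg u / u) du) = Θ((lg n)^2), in agreement with the other answers.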

This can be solved with the Master theorem. You have a = 1, b = 2, and f(n) = log(n), so c = log_2(1) = 0. Since f(n) = Θ(n^c * log^k(n)) with k = 1, you fall into the second (extended) case, which gives T(n) = Θ(n^c * log^(k+1)(n)).
So the solution is Θ(log^2 n)

Related

How to solve T(n)=T(n-1)+ (n-1) by Iteration Method?

Please can anyone help me with this:
Solve using iteration Method T (n) = T (n - 1) + (n - 1)
And prove that T (n) ∈Θ (n²)
Please, if you can explain step by step I would be grateful.
I solved it an easy way:
T (n) = T (n - 1) + (n - 1)-----------(1)
//now substitute n-1 for n in (1)
T(n-1)=T((n-1)-1)+((n-1)-1)
T(n-1)=T(n-2)+n-2---------------(2)
now substitute (2) into (1) and you get
i.e T(n)=[T(n-2)+n-2]+(n-1)
T(n)=T(n-2)+2n-3 //simplified--------------(3)
now substitute n-2 for n in (3)
T(n-2)=T((n-2)-2)+[2(n-2)-3]
T(n-2)=T(n-4)+2n-7---------------(4)
now substitute (4) into (3) and you get
i.e T(n)=[T(n-4)+2n-7]+(2n-3)
T(n)=T(n-4)+4n-10 //simplified
............
T(n)=T(n-k)+kn-k(k+1)/2
now, assume k=n-1
T(n)=T(n-(n-1))+(n-1)n-(n-1)n/2
T(n)=T(1)+n(n-1)/2
T(1) is just a constant
So, finally O(n^2)
T(n) = T(n - 1) + (n - 1)
= (T(n - 2) + (n - 2)) + (n - 1)
= (T(n - 3) + (n - 3)) + (n - 2) + (n - 1)
= ...
= T(0) + 1 + 2 + ... + (n - 3) + (n - 2) + (n - 1)
= C + n * (n - 1) / 2
= O(n^2)
Hence for sufficiently large n, we have:
n * (n - 1) / 3 ≤ T(n) ≤ n^2
Therefore we have T(n) = Ω(n²) and T(n) = O(n²), thus T(n) = Θ (n²).
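If you want to see this numerically (a sketch; the base case T(0) = 5 below is an arbitrary stand-in for the constant C):

def T(n):
    # iterative evaluation of T(n) = T(n-1) + (n-1), with T(0) = 5 as an arbitrary constant
    total = 5
    for k in range(1, n + 1):
        total += k - 1
    return total

for n in (10, 100, 1000):
    print(n, T(n), n * (n - 1) // 2 + 5)   # the two values agree: T(n) = C + n(n-1)/2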
T(n)-T(n-1) = n-1
T(n-1)-T(n-2) = n-2
By subtraction
T(n)-2T(n-1)+T(n-2) = 1
T(n-1)-2T(n-2)+T(n-3) = 1
Again, by subtraction
T(n)-3T(n-1)+3T(n-2)-T(n-3) = 0
The characteristic equation of the recurrence is
x^3-3x^2+3x-1 = 0
or
(x-1)^3 = 0.
It has a triple root x_1 = x_2 = x_3 = 1,
so the general solution of the recurrence is
T(n) = C_1 * 1^n + C_2 * n * 1^n + C_3 * n^2 * 1^n
or
T(n) = C_1 + C_2 n + C_3 n^2.
So,
T(n) = Θ(n^2).
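As a quick check of that general form (a sketch assuming T(1) = 0 as a concrete base case): a sequence of the form C_1 + C_2*n + C_3*n^2 has constant second differences, which is exactly the relation T(n) - 2T(n-1) + T(n-2) = 1 derived above.

def T(n):
    # T(n) = T(n-1) + (n-1), with an assumed base case T(1) = 0
    return 0 if n == 1 else T(n - 1) + (n - 1)

vals = [T(n) for n in range(1, 10)]
second_diffs = [vals[i + 2] - 2 * vals[i + 1] + vals[i] for i in range(len(vals) - 2)]
print(second_diffs)   # all 1, so 2*C_3 = 1, C_3 = 1/2, and T(n) = Θ(n^2)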

What is the recurrence relation and big O for T(n) = 2T(n-1) + O(N)?

I thought it would be something like this...
T(n) = 2T(n-1) + O(n)
= 2(2T(n-2)+(n-1)) + (n)
= 2(2(2T(n-3)+(n-2))+(n-1))+(n)
= 8T(n-3) + 4(n-2) + 2(n-1) + n
Which ends up being something like the summation of 2^i * (n-i), and my book says this ends up being O(2^n). Could anybody explain this to me? I don't understand why it's 2^n and not just O(n), as the (n-i) will continue n times.
This recurrence has already been solved on Math Stack Exchange. As I solve this recurrence, I get:
T(n) = n + 2(T(n-1))
= n + 2(n - 1 + 2T(n-2)) = 3n - 2 + 2^2(T(n-2))
= 3n - 2 + 4(n - 2 + 2(T(n-3))) = 7n - 10 + 2^3(T(n-3))
= 7n - 10 + 8(n - 3 + 2(T(n-4))) = 15n - 34 + 2^4(T(n-4))
= (2^4 - 1)n - 34 + 2^4(T(n-4))
...and so on.
Effectively the recurrence boils down to:
T(n) = (T(1) + 3) * 2^(n-1) − n − 2
See the Math Stack Exchange link for how we arrive at this solution. Taking T(1) to be a constant, the dominating term above is (T(1) + 3) * 2^(n-1) = Θ(2^n).
Therefore, the rate of growth of the given recurrence is O(2^n).
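A small numeric sketch (assuming T(1) = 1) confirms the closed form and the exponential growth:

def T(n):
    # T(n) = 2*T(n-1) + n, with an assumed base case T(1) = 1
    return 1 if n == 1 else 2 * T(n - 1) + n

for n in range(1, 11):
    closed = (T(1) + 3) * 2 ** (n - 1) - n - 2   # = 2^(n+1) - n - 2 when T(1) = 1
    print(n, T(n), closed)                       # the two columns match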

Solving recurrence for T(n-1) + sqrt(n)

I'm hoping that I'm going about this problem the correct way. It asks to solve the recurrence:
T(n) = T(n-1) + sqrt(n)
So far I have researched and been able to get to this point:
T(n) = T(n-2) + (n-1) + sqrt(n)
T(n) = T(n-3) + (n-2) + (n-1) + sqrt(n)
T(n) = T(0) + 1 + 2 + ... + (n-2) + (n-1) + sqrt(n)
I'm having trouble understanding what the pattern may be to solve for 1+2+...+sqrt(n)
You start by unrolling the recursion, and you should get a sum of square roots. The sum of square roots is a generalized harmonic number, and yours can be approximated with
sqrt(1) + sqrt(2) + ... + sqrt(n) ≈ (2/3) * n^(3/2)
so T(n) = Θ(n^(3/2)).
The second line is already wrong.
If T (n) = T (n - 1) + sqrt (n), then T (n - 1) = T (n - 2) + sqrt (n - 1), therefore
T (n) = T (n - 2) + sqrt (n - 1) + sqrt (n)
T (n) = T (n - 3) + sqrt (n - 2) + sqrt (n - 1) + sqrt (n)
T (n) = T (n - 4) + sqrt (n - 3) + sqrt (n - 2) + sqrt (n - 1) + sqrt (n)
and so on.
The sum of the square roots from 1 to n is about the same as the integral of sqrt(x) from 1 to n, which is (2/3)(n^(3/2) − 1), so T(n) = Θ(n^(3/2)).
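A quick numeric sketch (with an assumed base case T(0) = 0) shows exactly this growth:

from math import sqrt

def T(n):
    # iterative form of T(n) = T(n-1) + sqrt(n), with T(0) = 0 assumed
    total = 0.0
    for k in range(1, n + 1):
        total += sqrt(k)
    return total

for n in (100, 10_000, 1_000_000):
    print(n, T(n) / n ** 1.5)   # approaches 2/3, so T(n) = Θ(n^(3/2))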

solving recurrence T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2

Need some help on solving this runtime recurrence, using Big-Oh:
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
I don't quite get how to use the Master Theorem here
For n big enough you can assume T(n/2 - 1) == T(n/2), so you can change
T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2
into
T(n) = 2*T(n/2) + n/2 + 2
And use Master Theorem (http://en.wikipedia.org/wiki/Master_theorem) for
T(n) = a*T(n/b) + f(n)
a = 2
b = 2
f(n) = n/2 + 2
c = 1
k = 0
log_b(a) = log_2(2) = 1 = c
and so you have (case 2, since log_b(a) = c)
T(n) = O(n**c * log(n)**(k + 1))
T(n) = O(n * log(n))
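If you want to confirm the Θ(n log n) behaviour numerically, here is a sketch; the floor divisions and the base cases T(0) = T(1) = 1 are assumptions of mine and only affect lower-order terms:

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/2) + T(n/2 - 1) + n/2 + 2, with assumed base cases T(0) = T(1) = 1
    if n <= 1:
        return 1
    return T(n // 2) + T(n // 2 - 1) + n / 2 + 2

for k in (10, 16, 22):
    n = 2 ** k
    print(n, T(n) / (n * log2(n)))   # settles near 0.5, i.e. T(n) = Θ(n log n)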

How to solve: T(n) = T(n - 1) + n

I have the following worked out:
T(n) = T(n - 1) + n = O(n^2)
Now when I work this out I find that the bound is very loose. Have I done something wrong or is it just that way?
You also need a base case for your recurrence relation.
T(1) = c
T(n) = T(n-1) + n
To solve this, you can first guess a solution and then prove it works using induction.
T(n) = (n + 1) * n / 2 + c - 1
First the base case. When n = 1 this gives c as required.
For other n:
T(n)
= (n + 1) * n / 2 + c - 1
= ((n - 1) + 2) * n / 2 + c - 1
= ((n - 1) * n / 2) + (2 * n / 2) + c - 1
= ((n * (n - 1) / 2) + c - 1) + (2 * n / 2)
= T(n - 1) + n
So the solution works.
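A tiny numeric check of that solution (the constant c = 7 below is an arbitrary choice for the sketch):

c = 7   # arbitrary base-case constant for this sketch

def T(n):
    # T(1) = c, T(n) = T(n-1) + n
    return c if n == 1 else T(n - 1) + n

for n in (1, 2, 10, 50):
    print(n, T(n), (n + 1) * n // 2 + c - 1)   # the two values agree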
To get the guess in the first place, notice that your recurrence relationship generates the triangular numbers when c = 1:
T(1) = 1:
*
T(2) = 3:
*
**
T(3) = 6:
*
**
***
T(4) = 10:
*
**
***
****
etc..
Intuitively a triangle is roughly half of a square, and in Big-O notation the constants can be ignored so O(n^2) is the expected result.
Think of it this way:
In each "iteration" of the recursion you do O(n) work.
Each iteration has n-1 work to do, until n = base case. (I'm assuming base case is O(n) work)
Therefore, assuming the base case is a constant independant of n, there are O(n) iterations of the recursion.
If you have n iterations of O(n) work each, O(n)*O(n) = O(n^2).
Your analysis is correct. If you'd like more info on this way of solving recursions, look into Recursion Trees. They are very intuitive compared to the other methods.
The solution is pretty easy for this one. You have to unroll the recursion:
T(n) = T(n-1) + n = T(n-2) + (n - 1) + n =
= T(n-3) + (n-2) + (n-1) + n = ... =
= T(0) + 1 + 2 + ... + (n-1) + n
You have an arithmetic progression here and the sum is n*(n+1)/2. Technically you are missing the boundary condition here, but with any constant boundary condition you see that the recursion is O(n^2).
Looks about right, but it will depend on the base case T(1). You will do about n steps to get from T(n) down to the base case, and each time the added term is somewhere between 0 and n, for an average of n/2, so n * n/2 = (n^2)/2 = O(n^2).
