Solve the recurrence: T(n) = 2T(n/2) + n/log n

I can draw the recursion tree and find the sum of each row, n/(log n - i), but I can't evaluate the total over all the rows.
T(n)=2T(n/2)+n/logn
T(1) = 1

Suppose n = 2^k;
We know that the harmonic series satisfies (Euler's approximation):
Sum[i = 1 to n](1/i) ~= log(n) [n -> infinity]
t(n) = 2t(n/2) + n/log(n)
= 2(2t(n/4) + n/2/log(n/2)) + n/log(n)
= 4t(n/4) + n/log(n/2) + n/log(n)
= 4(2t(n/8) + n/4/log(n/4)) + n/log(n/2) + n/log(n)
= 8t(n/8) + n/log(n/4) + n/log(n/2) + n/log(n)
= 16t(n/16) + n/log(n/8) + n/log(n/4) + n/log(n/2) + n/log(n)
= n * t(1) + n/log(2) + n/log(4) + ... + n/log(n/2) + n/log(n)
= n(1 + Sum[i = 1 to log(n)](1/log(2^i)))
= n(1 + Sum[i = 1 to log(n)](1/i))
~= n(1 + log(log(n)))
= n + n*log(log(n))
~= n*log(log(n)) [n -> infinity]
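As a quick sanity check (my addition, not part of the original derivation), here is a small Python sketch that evaluates the recurrence for powers of two, assuming the T(1) = 1 base case and base-2 logs; the ratio against n*log2(log2(n)) stays roughly constant, which is consistent with Θ(n log log n):

import math

def T(n):
    # assumes n is a power of two, matching the n = 2^k substitution above
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n / math.log2(n)

for k in (10, 15, 20):
    n = 2 ** k
    print(n, round(T(n) / (n * math.log2(math.log2(n))), 3))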

When you start unrolling the recursion, you will get:
T(n) = 2^k * T(n/2^k) + Sum[i = 0 to k-1](n / log(n/2^i))
Your base case is T(1) = 1, so this means that n = 2^k. Substituting, you will get:
T(n) = n + n * Sum[i = 1 to k](1/i)
The second sum behaves the same as the harmonic series and therefore can be approximated as log(k). Now that k = log(n), the resulting answer is:
T(n) = Θ(n * log(log(n)))

Follow the Extended Master Theorem below.
Using the Extended Master Theorem, T(n) = 2T(n/2) + n/log n can be solved easily as follows.
Here the n/log n part can be rewritten as n * (log n)^(-1),
effectively making p = -1.
Now the Extended Master Theorem can be applied easily; this corresponds to case 2b:
T(n) = O(n log log n)
Follow this for a more detailed explanation:
https://www.youtube.com/watch?v=Aude2ZqQjUI

Related

Solving a recurrence relation using Smoothness Rule

Consider this recurrence relation: x(n) = x(n/2) + n, for n > 1 and x(1) = 0.
Now here the method of backward substitution struggles for values of n that are not powers of 2, so the recommended approach is to use the smoothness rule: solve for n = 2^k (n a power of 2), which gives the solution x(n) = 2n - 1.
However, if I use the method of backward substitution anyway, this recurrence relation still appears to have a solution!
x(n) = x(n/2) + n = x(n/4) + n/2 + n = x(n/8) + n/4 + n/2 + n = x(n/16) + n/8 + n/4 + n/2 + n = ....
where the pattern is
x(n) = x(n/i) + n/(i/2) + n/(i/4) + n/(i/8) + n/(i/16) + ...
which stops when n/i = 1 (i.e., when i = n), and in this case
x(n) = x(n/n) + n/(n/2) + n/(n/4) + n/(n/8) + n/(n/16) + ... = 1 + 2 + 4 + 8 + 16 + ... = 2^(n+1) - 1
which is two different answers!
I am confused: the textbook (Introduction to the Design and Analysis of Algorithms by Anany Levitin) says we should use the smoothness rule here, but as you can see I have solved it by the method of backward substitution, the very method that was expected to struggle, and nothing went wrong!
The transition 1 + 2 + 4 + 8 + 16 + ... = 2^(n+1) - 1 is false.
That is because the series on the left has about log n elements, so the sum is 2^(log n + 1) - 1, which is exactly 2n - 1.
The reason there are log n elements is that n/(2^i) = 1 (the last element of the series is 1) when i = log n.
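To see the answer's point numerically, here is a short Python check (my addition): for n = 2^k the series 1 + 2 + 4 + ... + n has log2(n) + 1 terms, so it sums to 2n - 1 rather than 2^(n+1) - 1:

for k in range(1, 6):
    n = 2 ** k
    series = [2 ** i for i in range(k + 1)]  # 1, 2, 4, ..., n: log2(n) + 1 terms
    print(n, sum(series), 2 * n - 1)         # the last two columns agree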

How to solve T(n) = T(n-3)+n^2 using iteration?

How can I solve T(n) = T(n-3) + n^2 using iteration? By the master theorem the answer is O(n^3), but I am having trouble solving it by iteration.
By direct resolution of the recurrence:
This is a linear recurrence of the first order. We first solve the homogeneous part,
T(n) = T(n - 3)
which is solved by a constant (more precisely three constants as three intertwined sequences form the solution).
Now for the non-homogeneous part, we use the Ansatz T(n) = an³ + bn² + cn + d, because we know that the difference of two cubic polynomials is a quadratic one.
Then
a(n³ - (n-3)³) + b(n² - (n-3)²) + c(n - (n-3)) = 9an² + 3(-9a + 2b)n + 3(9a - 3b + c) = n²
gives
a = 1/9, b = 1/2, c = 1/2.
Finally
T(n) = (2n³ + 9n² + 9n)/18 + T(0)
and similarly for the two other sequences.
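A quick numeric check of this closed form (a sketch of mine, taking T(0) = 0 and n a multiple of 3, since the three intertwined sequences differ only in their starting constant):

def T(n):
    # iterate T(n) = T(n-3) + n^2 down to T(0) = 0
    t = 0
    for m in range(3, n + 1, 3):
        t += m ** 2
    return t

for n in (3, 30, 300):
    print(n, T(n), (2 * n**3 + 9 * n**2 + 9 * n) // 18)  # columns 2 and 3 match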
Just try to expand the equation:
T(n) = n^2 + (n-3)^2 + (n-6)^2 + ... = Θ(n^3)
T(3) = T(0) + 3²
T(6) = T(3) + 6² = T(0) + 3² + 6²
T(9) = T(6) + 9² = T(0) + 3² + 6² + 9²
...
More generally, T(3N) is the sum of T(0) and nine times the sum of the squared naturals up to N. The well-known Faulhaber formula justifies O(N³).
Similar results hold for T(3N+1) and T(3N+2).
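For completeness, the Faulhaber step can be checked directly (my addition): T(3N) - T(0) = 9 * (1^2 + ... + N^2) = 9 * N(N+1)(2N+1)/6, which is cubic in N:

for N in (1, 10, 100):
    direct = sum((3 * k) ** 2 for k in range(1, N + 1))
    closed = 9 * N * (N + 1) * (2 * N + 1) // 6  # Faulhaber for the sum of squares
    print(N, direct, closed)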

Use substitution to find out the running time of T(1) = 1 ; T(n) = 4T(n/3) + n

I've got to this point 4^logn + 3n[(4/3)^logn -1] and am having trouble finishing it.
Log is base 3. (Wasn't sure how to do subscripts and exponents.)
Thanks.
The master theorem is the general tool for such problems, but if you want the specific solution by substitution, it goes as follows:
T(n) = 4T(n/3) + n
T(n) = 4(4T(n/9) + n/3) + n = 4^2T(n/9) + (4/3)n + n
T(n) = 4^2(4T(n/27) + n/3^2) + (4/3)n + n = 4^3T(n/27) + (4/3)^2n + (4/3)n + n
T(n) = 4^k T(n/3^k) + (4/3)^(k-1)n + (4/3)^(k-2)n + ... + (4/3)n + n
boundary condition
n/3^k = 1, k = log3(n)
T(1) = 1
geometric series summation
T(n) = 4^log3(n) + n((4/3)^log3(n) - 1)/((4/3) - 1)
T(n) = 4^log3(n) + 3n((4/3)^log3(n) - 1)
using log rules
T(n) = n^(log3(4)) + 3n^(1+log3(4/3)) - 3n
T(n) = n^(log3(4)) + 3n^(1+log3(4)-log3(3)) - 3n
T(n) = n^(log3(4)) + 3n^(log3(4)) - 3n
T(n) = O(n^log3(4))
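If it helps, here is a small Python check (my addition): for n = 3^k with T(1) = 1 the sum above collapses to the exact value 4n^(log3 4) - 3n, matching the asker's expression 4^log(n) + 3n[(4/3)^log(n) - 1]:

import math

def T(n):
    # assumes n is a power of three
    return 1 if n == 1 else 4 * T(n // 3) + n

for k in range(1, 8):
    n = 3 ** k
    exact = 4 * n ** math.log(4, 3) - 3 * n  # 4n^(log3 4) - 3n
    print(n, T(n), round(exact))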
This is a perfect spot to use the Master Theorem. In your case, we want to write the recurrence in the form
T(n) = aT(n / b) + O(n^d).
With your recurrence, you get
a = 4
b = 3
d = 1
Since log_b(a) = log3(4) > d, the Master Theorem says that this solves to O(n^(log3 4)).
Hope this helps!

Solving the similar recurrence: T(n) = 3T(n/3) + n/3

Given:
T(n) = 3 for n <= 1
T(n) = 3T(n/3) + n/3 for n > 1
The answer is supposed to be O(n log n). Here's how I did it, and it's not giving me the right answer:
T(n) = 3T(n/3) + n/3
T(n/3) = 3T(n/3^2) + n/3^2
Subbing this into T(n) gives..
T(n) = 3(3T(n/3^2) + n/3^2) + n/3
T(n) = 3(3(3T(n/3^3) + n/3^3) + n/3^2) + n/3
Eventually it'll look like..
T(n) = 3^k (T(n/3^k)) + cn/3^k
Setting k = log3(n)..
T(n) = 3^log3(n) * T(n/3^log3(n)) + cn/3^log3(n)
T(n) = n * T(1) + c
T(n) = 3n + c
But that's O(n). What is wrong with my steps?
T(n) = 3T(n/3) + n/3
T(n/3) = 3T(n/9) + n/9
T(n) = 3(3T(n/9) + n/9) + n/3
= 9T(n/9) + 2*n/3 //statement 1
T(n/9)= 3T(n/27) + n/27
T(n) = 9 (3T(n/27)+n/27) + 2*n/3 // replacing T(n/9) in statement 1
= 27 T (n/27) + 3*(n/3)
T(n) = 3^k* T(n/3^k) + k* (n/3) // eventually
replace k with log n to the base 3.
T(n) = n T(1) + (log n) (n/3);
// T(1) = 3
T(n) = 3*n + (log n) (n/3);
Hence, O(n log n)
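A quick Python check of this answer (my addition, using the T(1) = 3 base case and n a power of 3):

def T(n):
    return 3 if n <= 1 else 3 * T(n // 3) + n // 3

for k in range(1, 7):
    n = 3 ** k
    print(n, T(n), 3 * n + (n // 3) * k)  # k = log3(n); columns 2 and 3 match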
Eventually it'll look like.. T(n) = 3^k (T(n/3^k)) + cn/3^k
No. Eventually it'll look like
T(n) = 3^k * T(n/3^k) + k*n/3
You've expanded the parentheses incorrectly.
These types of problems are easily solved using the master theorem. In your case a = b = 3, c = log3(3) = 1, and because n^c grows at the same rate as your f(n) = n/3, you fall into the second case.
Here the resulting log exponent is 1, and therefore the answer is O(n log(n)).
This question can be solved by the extended Master Theorem:
For a recurrence of the form
T(n) = aT(n/b) + n^k (log n)^p
where a >= 1, b > 1, k >= 0 and p is a real number, then:
1. if a > b^k, then T(n) = Θ(n^(log_b(a)))
2. if a = b^k
a.) if p > -1, then T(n) = Θ(n^k (log n)^(p+1))
b.) if p = -1, then T(n) = Θ(n^k log(log n))
c.) if p < -1, then T(n) = Θ(n^k)
3. if a < b^k
a.) if p >= 0, then T(n) = Θ(n^k (log n)^p)
b.) if p < 0, then T(n) = O(n^k)
So, the above equation
T(n) = 3T(n/3) + n/3
a = 3, b = 3, k =1, p =0
so it falls into case 2a, where a = b^k.
So the answer will be
O(n⋅log(n))
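The case analysis above is mechanical enough to script; here is a minimal Python dispatcher (my sketch, returning the bound as a string) for T(n) = a*T(n/b) + n^k * (log n)^p:

import math

def extended_master(a, b, k, p):
    # returns the asymptotic bound for T(n) = a*T(n/b) + n^k * (log n)^p
    if a > b ** k:
        return f"Theta(n^{math.log(a, b):.3f})"      # n^(log_b a)
    if a == b ** k:
        if p > -1:
            return f"Theta(n^{k} * log^{p + 1}(n))"  # case 2a
        if p == -1:
            return f"Theta(n^{k} * log(log(n)))"     # case 2b
        return f"Theta(n^{k})"                       # case 2c
    return f"Theta(n^{k} * log^{p}(n))" if p >= 0 else f"O(n^{k})"

print(extended_master(3, 3, 1, 0))   # case 2a -> Theta(n^1 * log^1(n))
print(extended_master(2, 2, 1, -1))  # case 2b -> Theta(n^1 * log(log(n)))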

Confused on recurrence and Big O

I know that
T(n) = T(n/2) + θ(1) results in O(log N)
and my book said this is a case of Binary Search.
But, how do you know that? Is it just by the fact that Binary Search cuts the problem in half each time so it is O(Log N)?
And for T(n) = 2T(n/2) + θ(1),
why is the result O(N) and not O(log N), when the algorithm divides the problem in half each time as well?
Then T(n) = 2T(n/2) + θ(n)
results in O(N log N)? I see that the O(N) comes from θ(n) and the O(log N) from T(n/2).
I am so confused about how to determine the Big O of an algorithm that I don't even know how to word my question properly. I hope it makes sense.
Thanks in advance!
An intuitive solution for these problems is to look at the result of unfolding the recursive formula:
Let's assume Theta(1) is actually 1 and Theta(n) is n, for simplicity
T(n) = T(n/2) + 1 = T(n/4) + 1 + 1 = T(n/8) + 1 + 1 + 1 = ... =
= T(1) + 1 + ... + 1 [logn times] = logn
T'(n) = 2T'(n/2) + 1 = 2(2T'(n/4) + 1) + 1 = 4T'(n/4) + 2 + 1 =
= 8T'(n/8) + 4 + 2 + 1 = ... = 2^(logn) + 2^(logn-1) + ... + 1 = n + n/2 + ... + 1 =
= 2n-1
T''(n) = 2T''(n/2) + n = 2(2T''(n/4) + n/2) + n = 4T''(n/4) + 2*(n/2) + n =
= 8T''(n/8) + 4*n/4 + 2*n/2 + n = .... = n + n + .. + n [logn times] = nlogn
To formally prove these equations, you should use induction: assume the bound holds for T(n/2), and use it to prove the bound for T(n).
For example, take the first formula [T(n) = T(n/2) + 1] and assume the base case is T(1) = 0.
Base trivially holds for n = 1
Assume T(k) <= log(k) for every k <= n-1, and prove it for k = n:
T(n) = T(n/2) + 1 <= (induction hypothesis) log(n/2) + 1 = log(n/2) + log(2) = log(n/2*2) = log(n)
I find that an easy way to understand these is to consider the time the algorithm spends on each step of the recurrence, and then add it all up to find the total time. First, let's consider
T(n) = T(n/2) + O(1)
where n=64. Let's add up how much the algorithm takes at each step:
T(64) = T(32) + 1 ... 1 so far
T(32) = T(16) + 1 ... 2 so far
T(16) = T(08) + 1 ... 3 so far
T(08) = T(04) + 1 ... 4 so far
T(04) = T(02) + 1 ... 5 so far
T(02) = T(01) + 1 ... 6 so far
T(01) = 1 ... 7 total
So, we can see that the algorithm took '1' time at each step. And, since each step divides the input in half, the total work is the number of times the algorithm had to divide the input in two... which is log2 n.
Next, let's consider the case where
T(n) = 2T(n/2) + O(1)
However, to make things simpler, we'll build up from the base case T(1) = 1.
T(01) = 1 ... 1 so far
now we have to do T(01) twice and then add one, so
T(02) = 2T(01) + 1 ... (1*2)+1 = 3
now we have to do T(02) twice, and then add one, so
T(04) = 2T(02) + 1 ... (3*2)+1 = 7
T(08) = 2T(04) + 1 ... (7*2)+1 = 15
T(16) = 2T(08) + 1 ... (15*2)+1 = 31
T(32) = 2T(16) + 1 ... (31*2)+1 = 63
T(64) = 2T(32) + 1 ... (63*2)+1 = 127
So we can see that here the algorithm has done 127 units of work, which is the input multiplied by a constant (2) plus a constant (-1), i.e. 2n - 1, which is O(n). Basically this recursion corresponds to the geometric series 1 + 1/2 + 1/4 + 1/8 + ..., which sums to 2.
Try using this method on T(n) = 2T(n/2) + n and see if it makes more sense to you.
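Here is that exercise worked in code (my addition, with T(1) = 1); the printed values track n*log2(n) + n, which is O(n log n):

def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

n = 1
while n <= 64:
    print(f"T({n}) = {T(n)}")  # 1, 4, 12, 32, 80, 192, 448 = n*log2(n) + n
    n *= 2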
One visual way to find T(n) for a recursive equation is to sketch its recursion tree; then
T(n) = (number of nodes) * (time spent at each node).
In your case T(n) = 2T(n/2) + 1,
so I write the 1 in the node itself and expand it into two T(n/2) nodes.
Note T(n/2) = 2T(n/4) + 1, and again I do the same for it.
                    T(n)+1
                   /      \
           T(n/2)+1        T(n/2)+1
           /      \        /      \
    T(n/4)+1 T(n/4)+1 T(n/4)+1 T(n/4)+1
       ...      ...      ...      ...
    T(1)   T(1)   ..............   T(1)
In this tree the number of nodes equals
2^(height of tree) = 2^(log(n)) = n (up to a constant factor)
Then T(n) = n * 1 = n = O(n)
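The node count can also be checked programmatically (a sketch of mine): each node contributes 1 unit of work, and counting nodes for T(n) = 2T(n/2) + 1 gives exactly 2n - 1, which is O(n):

def nodes(n):
    # one unit of work per node; the leaves are the T(1) calls
    return 1 if n == 1 else 1 + 2 * nodes(n // 2)

for n in (1, 2, 4, 8, 16, 1024):
    print(n, nodes(n))  # prints 2n - 1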
