The Recurrence W(n)= 2W(floor(n/2)) + 3 - performance

I have this recurrence:
W(n)= 2W(floor(n/2)) + 3
W(2)=2
My try is as follows:
the tree is like this:
W(n) = 2W(floor(n/2)) + 3
W(n/2) = 2W(floor(n/4)) + 3
W(n/4) = 2W(floor(n/8)) + 3
...
the height of the tree: I assume it's lg n because the problem size is halved at each expansion step, not sure though :S
the cost of the last level: 2^(lg n) * W(2) = 2n
the cost of all levels up to level h-1: 3 * (2^0 + 2^1 + ... + 2^(lg n - 1)), which is a geometric series = 3(n - 1)
So, W(n) = 5n - 3, which belongs to Theta(n).
my question is: Is that right?

I don't think it's exactly 5n - 3 unless n is a power of two (n = 2^t), but your Theta is right. If you look at the Master Theorem there is no need to calculate it exactly (though it's good practice):
assume you have:
T(n) = aT(n/b) + f(n), where a >= 1, b > 1. Then:
if f(n) = O(n^(log_b a - eps)) for some eps > 0, then T(n) = Theta(n^(log_b a)). This is your case, in which a = b = 2 and f(n) = O(1).
if f(n) = Theta(n^(log_b a) * log^k n), then T(n) = Theta(n^(log_b a) * log^(k+1) n).
Otherwise T(n) = Theta(f(n)) (see the details of the regularity condition for this case in CLRS or on Wikipedia).

Well, if you calculate W(4), you find W(4) = 2*W(2) + 3 = 2*2 + 3 = 7, but 5*4 - 3 = 17, so your result for W(n) is not correct. It is close, though; there's just a minor slip in your reasoning (or possibly in the problem statement).
Edit: To be specific, your calculation would work if W(1) were given, but it's W(2) in the question. Either the latter is a typo or you're off by one with the height. (And of course, what Saeed Amiri said.)
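A quick numerical check makes the off-by-one visible. This is a minimal sketch (assuming the base case W(2) = 2 exactly as stated); for n a power of two it matches the closed form (5/2)n - 3 rather than 5n - 3:

    # Recurrence as stated: W(n) = 2*W(floor(n/2)) + 3, with W(2) = 2.
    def W(n):
        if n == 2:
            return 2
        return 2 * W(n // 2) + 3

    # For n = 2^t the exact solution is (5/2)*n - 3 -- still Theta(n).
    for t in range(1, 6):
        n = 2 ** t
        print(n, W(n), 5 * n // 2 - 3)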

Related

Towers of Hanoi Closed Form Solution

So I'm trying to find the closed form solution for the Towers of Hanoi problem. I understand that the recurrence relation is T(n) = 2T(n-1) + 1, because it takes T(n-1) moves to shift the top of the tower off and back again (which is why the term is doubled), and the "+ 1" is to move the base. However, I cannot understand why the closed solution is 2^n - 1.
When I try to solve for the answer using back substitution, I get as far as T(n) = 8T(n-3) + 4 + 2 + 1, which is T(n) = 2^k * T(n - k) + 2^(k-1) + 2^(k-2) + ... + 2 + 1, where k is the step. I know the last part is also a geometric series, which means it is (2^(n+1) - 1)/(2 - 1). But I just can't understand where the answer comes from.
edit:
is it because the geometric series part is not 2^k + 2^(k-1) + ... + 2^(k-k)? Which means that the series is not 2^(n+1) - 1, but rather 2^n - 1. And we use T(0) as the base case, so in T(n - k) we set k = n?
You can easily prove it by induction. Let's assume that T(n) = 2^n - 1 is true for a given n. Then:
T(n+1) = 2*T(n) + 1
= 2*(2^n-1) + 1
= 2^(n+1) - 2 + 1
= 2^(n+1) - 1
Since we know that T(0) = 0 = 2^0 - 1, this proves that the equality T(n) = 2^n - 1 holds for every n.
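The induction is easy to confirm numerically; a minimal sketch (assuming T(0) = 0 as above):

    # Hanoi recurrence T(n) = 2*T(n-1) + 1 with T(0) = 0, checked against 2^n - 1.
    def T(n):
        return 0 if n == 0 else 2 * T(n - 1) + 1

    assert all(T(n) == 2 ** n - 1 for n in range(20))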
A trick which sometimes works is to find another function for which the relation is simpler.
So we start from T(n)=2*T(n-1)+1.
It looks similar to T(n)=2*T(n-1) which has an obvious solution.
So we should transform the equation so that +1 is inside of 2*(...).
In this case it's T(n)+1=2*T(n-1)+2=2*(T(n-1)+1).
So T(n)+1=2^n*(T(0)+1) and T(n)=2^n*(T(0)+1)-1.
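The same trick in code: shifting by 1 turns the recurrence into pure doubling. A small sketch, again assuming T(0) = 0:

    # With S(n) = T(n) + 1 the recurrence becomes S(n) = 2*S(n-1), so S(n) = 2^n.
    def S(n):
        return 1 if n == 0 else 2 * S(n - 1)

    # Hence T(n) = S(n) - 1 = 2^n - 1.
    assert all(S(n) - 1 == 2 ** n - 1 for n in range(20))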

Expanding Recurrence Relation and Finding Closed Form

I have a snippet of an algorithm and must find its worst-case recurrence and closed form. So far I have the worst-case recurrence:
T(n)= 2T(n/4) + C for n > 1.
I tried expanding it, and I have this form currently:
T(n) = 2^k * T(n/4^k) + C*k
with k = log_4(n), i.e. k = (log_2 n)/2.
I have T(1) = 1000.
I am at a loss on what to do next, or how to find its closed form exactly. I still cannot see a pattern in the algorithm or my expansion of T(n). Any insight would be great, thank you.
What you can get is a closed formula when n = 4^k:
T(4^k) = 2^k * 10^3 + C + 2C + ... + 2^(k-1)C
       = 2^k * 10^3 + (2^k - 1)C
where the last equality comes from the geometric series formula.
For all other n, I think the best you can do is to apply the Master Theorem.
Your recurrence falls into case 1 of the theorem (you have a = 2, b = 4, c = 0).
Therefore:
log_b(a) = 1/2
and
T(n) = Theta(sqrt(n))
I'm not sure whether it admits a unique closed form.
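A quick sketch to confirm the power-of-four formula (the value of C is arbitrary here; T(1) = 1000 as in the question):

    # T(n) = 2*T(n/4) + C with T(1) = 1000; check T(4^k) = 2^k*1000 + (2^k - 1)*C.
    C = 7
    def T(n):
        return 1000 if n == 1 else 2 * T(n // 4) + C

    for k in range(1, 10):
        assert T(4 ** k) == 2 ** k * 1000 + (2 ** k - 1) * C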

solving the recurrence T(n) = T(floor[n/2]) + T(ceil[n/2]) + n - 1

I have the following recurrence:
T(n) = c for n = 1.
T(n) = T(floor[n/2]) + T(ceil[n/2]) + n - 1 for n > 1.
It looks like merge sort to me, so I guess that the solution to the recurrence is Θ(n log n). According to the master method I have:
a) Θ(1) for n = 1 (constant time).
b) If we drop the floor and ceil we have (step 1):
T(n) = 2T(n/2) + n - 1 => a = 2, b = 2.
log_b(a) = lg(2) = 1, so n^(log_b a) = n^1 = n.
Having a closer look we see that we have case 2 of the master method:
if f(n) = Θ(n^(log_b a)), the solution to the recurrence is T(n) = Θ(n^(log_b a) * lg n).
The solution is indeed T(n) = Θ(n log n), but f(n) is n - 1, so we are off by the constant 1.
My first question is:
at step 1 we dropped the ceil and floor. Is this correct? The second question is how do I get rid of the constant 1? Do I just drop it? Or should I name it d and prove that n - 1 is still Θ(n) (if so, how do I prove it)? Lastly, is it better to prove it with the substitution method?
Edit: if we use the substitution method we get:
We guess that the solution is O(n). We need to show that T(n) <= cn.
Substituting into the recurrence we obtain
T(n) <= c*floor(n/2) + c*ceil(n/2) + n - 1 = cn + n - 1
So is it not merge sort? What do I miss?
It was a long time ago, but here goes.
Step 1: we dropped the ceil and floor. Is this correct?
I would rather say
T(floor(n/2)) + T(floor(n/2)) <= T(floor(n/2)) + T(ceil(n/2))
T(floor(n/2)) + T(ceil(n/2)) <= T(ceil(n/2)) + T(ceil(n/2))
In case they are not equal, they differ by 1 (and you can ignore any constant).
The second question is how do I get rid of the constant 1?
You ignore it. The reasoning behind this is: even if the constant is huge, say 10^100, it will be small compared to the input once n grows large enough. In real life you can't really ignore very big constants, but that is where real life and theory differ. In any case, 1 makes the smallest possible difference.
Lastly, is it better to prove it with the substitution method?
You can prove it however you like; some ways are just simpler. Simpler is usually better, but other than that, 'better' has no meaning. So my answer is no.
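If in doubt, the recurrence is cheap to evaluate directly. A minimal sketch (assuming c = 0, i.e. T(1) = 0) that compares it against n*lg(n):

    import math
    from functools import lru_cache

    # T(n) = T(floor(n/2)) + T(ceil(n/2)) + n - 1, with T(1) = 0 here.
    @lru_cache(maxsize=None)
    def T(n):
        if n == 1:
            return 0
        return T(n // 2) + T((n + 1) // 2) + n - 1

    for n in (10, 100, 10**4, 10**6):
        print(n, T(n), round(n * math.log2(n)))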

Cannot figure out the complexity of this recurrence

I am refreshing on the Master Theorem a bit and I am trying to figure out the running time of an algorithm that solves a problem of size n by recursively solving 2 subproblems of size n-1 and combining the solutions in constant time.
So the formula is:
T(N) = 2T(N - 1) + O(1)
But I am not sure how I can formulate the conditions of the Master Theorem.
I mean, we don't have T(N/b), so is the b of the Master Theorem formula in this case b = N/(N-1)?
If yes, then since obviously a > b^k (because k = 0), the result would be O(N^z) where z = log 2 taken in base N/(N-1). How can I make sense of that? Assuming I am right so far?
Ah, enough with the hints. The solution is actually quite simple: z-transform both sides, group the terms, and then inverse z-transform to get the solution.
First, look at the problem as
x[n] = a x[n-1] + c
Apply the z-transform to both sides (there are some technicalities with respect to the ROC, but let's ignore those for now):
X(z) = (a X(z) / z) + (c z / (z-1))
Solve for X(z) to get
X(z) = c z^2 / [(z - 1) * (z-a)]
Now observe that this formula can be rewritten as:
X(z) = r z / (z-1) + s z / (z-a)
where r = c/(1-a) and s = - a c / (1-a)
Furthermore, observe that
X(z) = P(z) + Q(z)
where P(z) = r z / (z-1) = r / (1 - (1/z)), and Q(z) = s z / (z-a) = s / (1 - a (1/z))
Apply the inverse z-transform to get:
p[n] = r u[n]
and
q[n] = s exp(log(a)n) u[n]
where log denotes the natural log and u[n] is the unit (Heaviside) step function (i.e. u[n]=1 for n>=0 and u[n]=0 for n<0).
Finally, by linearity of z-transform:
x[n] = (r + s exp(log(a) n))u[n]
where r and s are as defined above.
So, relabeling back to your original problem, if
T(n) = a T(n-1) + c
then
T(n) = (c/(a-1))(-1+a exp(log(a) n))u[n]
where exp(x) = e^x, log(x) is the natural log of x, and u[n] is the unit step function.
What does this tell you?
Unless I made a mistake, T grows exponentially with n. This is an exponentially increasing function under the reasonable assumption that a > 1. The exponent is governed by a (more specifically, by the natural log of a).
One more simplification: note that exp(log(a) n) = exp(log(a))^n = a^n, so
T(n) = (c/(a-1))(-1+a^(n+1))u[n]
so O(a^n) in big O notation.
And now here is the easy way:
put T(0) = 1
T(n) = a T(n-1) + c
T(1) = a * T(0) + c = a + c
T(2) = a * T(1) + c = a*a + a * c + c
T(3) = a * T(2) + c = a*a*a + a * a * c + a * c + c
....
note that this creates a pattern. Specifically:
T(n) = a^n + c * (a^(n-1) + a^(n-2) + ... + a + 1)
Putting c = 1 gives
T(n) = sum(a^j, j=0,...,n)
This is a geometric series, which evaluates to:
T(n) = (1 - a^(n+1))/(1 - a)
     = (1/(1-a)) - (a/(1-a)) * a^n
     = (1/(a-1)) * (-1 + a^(n+1))
for n >= 0.
Note that this formula is the same as given above for c=1 using the z-transform method. Again, O(a^n).
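Either derivation is easy to check by brute force. A minimal sketch with a = 2, c = 1, and T(0) = 1 (matching the "easy way" above):

    # T(n) = a*T(n-1) + c, brute-forced and compared to (1/(a-1))*(a^(n+1) - 1).
    a, c = 2, 1
    def T(n):
        return 1 if n == 0 else a * T(n - 1) + c

    assert all(T(n) == (a ** (n + 1) - 1) // (a - 1) for n in range(20))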
Don't even think about the Master Theorem. You can only use the Master Theorem for recurrences of the general form T(n) = aT(n/b) + f(n) with b > 1.
Instead, think of it this way. Each call does constant O(1) work and decrements the input size n by 1, but it spawns two subproblems, so the recursion tree has depth n and branching factor 2. The input size decrements until it reaches 1. Then you add up the costs of all the calls you made.
How many are there? About 2^n. So this would take O(2^n).
Looks like you can't formulate this problem in terms of the Master Theorem.
A good start is to draw the recursion tree to understand the pattern, then prove it with the substitution method. You can also expand the formula a couple of times and see where it leads.
See also this question which solves 2 subproblems instead of a:
Time bound for recursive algorithm with constant combination time
Maybe you could think of it this way:
when
n = 1, T(1) = 1
n = 2, T(2) = 2
n = 3, T(3) = 4
n = 4, T(4) = 8
n = 5, T(5) = 16
It is easy to see that this is a geometric series 1 + 2 + 4 + 8 + 16 + ..., the sum of which is
first term * (ratio^n - 1)/(ratio - 1). For this series it is
1 * (2^n - 1)/(2 - 1) = 2^n - 1.
The dominating term here is 2^n, therefore the function belongs to Theta(2^n). You can verify it by checking that lim(n->inf) [2^n / (2^n - 1)] is a positive constant.
Therefore the function belongs to Theta(2^n).

Can someone help solve this recurrence relation? [closed]

T(n) = 2T(n/2) + O(1)
T(n) = T(sqrt(n)) + O(1)
For the first one I used the substitution method with n, log n, etc.; all gave me wrong answers.
Recurrence trees: I don't know if I can apply them, as the root will be a constant.
Can someone help?
Let's look at the first one. First of all, you need to know T(base case). You mentioned that it's a constant, but when you do the problem it's important that you write it down. Usually it's something like T(1) = 1. I'll use that, but you can generalize to whatever it is.
Next, find out how many times you recur (that is, the height of the recursion tree). n is your problem size, so how many times can we repeatedly divide n by 2? Mathematically speaking, what's i when n/(2^i) = 1? Figure it out, hold onto it for later.
Next, do a few substitutions, until you start to notice a pattern.
T(n) = 2(2(2T(n/(2*2*2)) + Θ(1)) + Θ(1)) + Θ(1)
Ok, the pattern is that we multiply T() by 2 a bunch of times, and divide n by 2 a bunch of times. How many times? i times.
T(n) = (2^i)*T(n/(2^i)) + ...
For the big-Θ terms at the end, we use a cute trick. Look above where we have a few substitutions, and ignore the T() part. We want the sum of the Θ terms. Notice that they add up to (1 + 2 + 4 + ... + 2^(i-1)) * Θ(1). Can you find a closed form for 1 + 2 + 4 + ... + 2^(i-1)? I'll give you that one: it's (2^i - 1). It's a good one to just memorize, and it also drops out of the geometric series formula.
Anyway, all in all we get
T(n) = (2^i) * T(n/(2^i)) + (2^i - 1) * Θ(1)
If you solved for i earlier, then you know that i = log_2(n). Plug that in, do some algebra, and you get down to
T(n) = n*T(1) + (n - 1)*Θ(1). With T(1) = 1, that is n, plus n times a constant, minus a constant. We drop lower-order terms and constant factors, so it's Θ(n).
Prasoon Saurav is right about using the master method, but it's important that you know what the recurrence relation is saying. The things to ask are: how much work do I do at each step, and how many steps are there for an input of size n?
Use Master Theorem to solve such recurrence relations.
Let a be an integer greater than or equal to 1 and b a real number greater than 1. Let c be a nonnegative real number and d a nonnegative real number. Given a recurrence of the form
T(n) = a*T(n/b) + n^c   if n > 1
T(n) = d                if n = 1
then for n a power of b:
if log_b a < c, T(n) = Θ(n^c);
if log_b a = c, T(n) = Θ(n^c log n);
if log_b a > c, T(n) = Θ(n^(log_b a)).
1) T(n) = 2T(n/2) + O(1)
In this case
a = b = 2;
log_b a = 1; c = 0 (since n^c = 1 => c = 0).
So case (3) applies, and T(n) = Θ(n) :)
2) T(n) = T(sqrt(n)) + O(1)
Let m = log_2 n;
=> T(2^m) = T(2^(m/2)) + O(1)
Now rename K(m) = T(2^m) => K(m) = K(m/2) + O(1)
Apply case (2): K(m) = Θ(log m), so T(n) = Θ(log log n).
For part 1, you can use Master Theorem as #Prasoon Saurav suggested.
For part 2, just expand the recurrence:
T(n) = T(n ^ 1/2) + O(1) // sqrt(n) = n ^ 1/2
= T(n ^ 1/4) + O(1) + O(1) // sqrt(sqrt(n)) = n ^ 1/4
etc.
The series will continue for k terms until n^(1/2^k) shrinks to the base case, i.e. until 2^k = log n, or k = log log n. That gives T(n) = k * O(1) = O(log log n).
Let's look at the first recurrence, T(n) = 2T(n/2) + 1. The n/2 is our clue here: each nested term's parameter is half that of its parent, while the number of subproblems doubles at each level. Therefore, if we start with n = 2^k, we have k levels of expansion before we hit the base case T(1), and the +1 terms contribute 1 + 2 + ... + 2^(k-1) = 2^k - 1 in total. Hence, assuming T(1) = 1, we can say T(2^k) = 2^k + (2^k - 1). Now, since n = 2^k, this is T(n) = 2n - 1, which is Θ(n).
We can apply a similar trick to your second recurrence, T(n) = T(n^0.5) + 1. If we start with n = 2^(2^k) we will have k terms in our expansion, each adding 1 to the total, before reaching the base case T(2). Assuming T(2) = 1, we must have T(2^(2^k)) = k + 1. Since n = 2^(2^k) we must have k = log_2(log_2(n)), hence T(n) = log_2(log_2(n)) + 1.
Recurrence relations, and recursive functions as well, are best solved by starting at the base case. In case 1, T(1) = 1; T(2) = 3; T(4) = 7; T(8) = 15. It's clear that T(n) = 2n - 1, which in O notation is O(n).
In the second case, T(1) = 1; T(2) = 2; T(4) = 3; T(16) = 4; T(256) = 5; T(256 * 256) = 6. It takes little time to find that T(n) = log(log(n)) + 2 (matching the values above), where log is base 2. Clearly this is an O(log(log n)) relation.
Most of the time the best way to deal with a recurrence is to draw the recurrence tree and carefully handle the base case.
However, here I will give you a slight hint for solving with the substitution method.
In the first recurrence, try the substitution n = 2^k.
In the second recurrence, try the substitution n = 2^(2^k).
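To see both substitutions pay off, here is a minimal sketch (the base cases T(1) = 1 and T(2) = 1 are assumptions for illustration):

    import math

    # First recurrence: T(n) = 2*T(n/2) + 1 -> Theta(n) (exactly 2n - 1 here).
    def T1(n):
        return 1 if n <= 1 else 2 * T1(n // 2) + 1

    # Second recurrence: T(n) = T(sqrt(n)) + 1 -> Theta(log log n).
    def T2(n):
        return 1 if n <= 2 else T2(math.isqrt(n)) + 1

    for k in (1, 2, 3, 4):
        n = 2 ** (2 ** k)   # n of the form 2^(2^k) keeps sqrt and halving exact
        print(n, T1(n), T2(n))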
