I've been having trouble with an assignment from the course I am following.
The assignment in question:
Use induction to prove that, when n >= 2 is an exact power of 2, the solution of the recurrence
T(n) = 2              if n = 2,
T(n) = 2T(n/2) + n    if n = 2^k with k > 1
is T(n) = n log(n).
NOTE: the logarithms in the assignment have base 2.
The base case here is obvious: when n = 2, we have T(2) = 2 = 2 log(2).
However, I am stuck on the inductive step and I am not sure how to proceed.
Step. Let us assume that the statement holds for 2^m for all m <= k and let us show it for 2^{k+1}.
Then, T(2^{k+1}) = 2T(2^k) + 2^{k+1}.
By the inductive assumption T(2^k) = 2^k*log(2^k), i.e., T(2^k) = k*2^k (since the logarithms have base 2 here).
Hence, T(2^{k+1}) = 2*k*2^k + 2^{k+1} = 2^{k+1}*(k+1), which can be written as 2^{k+1}*log(2^{k+1}), completing the proof.
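If you want a quick sanity check of the closed form to go with the proof, here is a minimal Python sketch (my own illustration, not part of the assignment) that evaluates the recurrence directly and compares it with n*log2(n):

    from math import log2

    def T(n):
        # the assignment's recurrence: T(2) = 2, T(n) = 2*T(n//2) + n
        if n == 2:
            return 2
        return 2 * T(n // 2) + n

    for k in range(1, 11):
        n = 2 ** k
        assert T(n) == n * log2(n)  # exact when n is a power of 2

Of course this only spot-checks finitely many values; the induction is what proves it.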
I had the following recurrence relations on a test and I got them wrong; I am not sure why.
1. T(n) = 2T(n/4) + O(n^0.5)
Using MT: a = 2, b = 4, f(n) = n^0.5
Comparing n^(log_4(2)) to n^0.5 => n^0.5 == n^0.5
Thus, case 3: Θ(n log n)
Apparently that's wrong; I don't know why.
2. T(n) = 3T(n/4) + O(n^0.75)
Using MT: a = 3, b = 4, f(n) = n^0.75
Comparing n^(log_4(3)) to n^0.75
Thus, case 1: Θ(n^log_4(3))
3. T(n) = T(n/2) + T(n/3) + O(n log n)
This isn't in a form that can be solved with MT and I cannot easily find a p-value without aid. Thus, I took a stab in the dark and was wrong. No clue where to begin with this one.
4. T(n) = 2T(n/2) + n log n
Using MT: a = 2, b = 2, f(n) = n log n
Comparing n^log_2(2) to n log n => n^1 to n log n
Case 2: Θ(n log n)
You may have misread or omitted some details of the Master theorem. I will refer to the Wikipedia article's formulation.
1)
The second case states that if f(n) = Θ(n^c_crit * log^k(n)) for some k >= 0, where c_crit = log_b(a), then T(n) = Θ(n^c_crit * log^(k+1)(n)).
Since c_crit = log_4(2) = 0.5 and k = 0, the final complexity is:
T(n) = Θ(n^0.5 * log n)
You just missed out the exponent on the n in front.
2)
This is correct.
4)
You missed another detail here: f(n) = n log n = Θ(n^1 * log^1(n)), so k = 1, and there needs to be an additional factor of log n:
T(n) = Θ(n * log^2(n))
3)
This is slightly trickier. Using the Akra-Bazzi method:
T(n) = Θ(n^p * (1 + ∫_1^n f(u)/u^(p+1) du)), where p satisfies (1/2)^p + (1/3)^p = 1.
To solve for the exponent p, just use Newton-Raphson on your calculator - it gives p = 0.787885.... With f(u) = u log u the integrand is log(u)/u^p; performing the integration by parts:
∫_1^n u^(-p) log u du = n^(1-p) log(n)/(1-p) - (n^(1-p) - 1)/(1-p)^2
Substituting in:
T(n) = Θ(n^p * n^(1-p) * log n) = Θ(n log n)
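As a sketch of how one might find p numerically (my own illustration; any root-finder works), plain Newton-Raphson in Python on g(p) = (1/2)^p + (1/3)^p - 1:

    from math import log

    def g(p):
        # Akra-Bazzi condition: (1/2)^p + (1/3)^p = 1
        return 0.5 ** p + (1 / 3) ** p - 1

    def g_prime(p):
        # derivative of g with respect to p
        return 0.5 ** p * log(0.5) + (1 / 3) ** p * log(1 / 3)

    p = 1.0  # initial guess
    for _ in range(20):
        p -= g(p) / g_prime(p)

    print(p)  # ~0.787885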
I face a question: T(N) = T(sqrt(N)) + 5.
I am wondering: can I solve it in this way?
T(N) = O(sqrt(N)) + O(5)
Since O(5) = O(1) is a constant, we can ignore it.
So the big O notation of T(N) is O(N^(1/2)).
Or can I just say it is O(N), as there is no big difference between O(N) and O(sqrt(N))?
Thank you!
(For neatness, let's replace the 5 with a constant c.)
Substituting this function into itself multiple times, we can spot a pattern emerging:
T(n) = T(n^(1/2)) + c
     = T(n^(1/4)) + 2c
     = ...
     = T(n^(1/2^k)) + k*c
When do we stop iterating? When the stopping condition is met. Take this to be n = 2 (not 1 as is usually the case, since the argument only tends to 1 asymptotically):
n^(1/2^k) = 2  =>  (1/2^k) * log n = 1  =>  2^k = log n  =>  k = log log n
So the final cost of this function is:
T(n) = T(2) + c * log log n = O(log log n)
Note that the constant c (= 5) does not matter in terms of asymptotic complexity. (And also that the result is not simply log n but log log n.)
EDIT: if you were to choose a different stopping condition n = a, a > 1, then the above step would become:
n^(1/2^k) = a  =>  2^k = log n / log a  =>  k = log log n - log log a
Which only differs by a constant from the original result.
Edit: I made a mistake in the original answer, assuming that n is a power of 2 and reducing the recurrence over 1, 2, 4, ..., n, which is wrong. I apologize for the confusion. Here is the updated answer.
From,
T(n) = T(sqrt(n)) + 5,
we can also write it as:
T(n) = T(n^(1/2)) + 5,
then by recurrence:
T(n^(1/2)) = T(n^(1/4)) + 5,
T(n^(1/4)) = T(n^(1/8)) + 5,
...
T(n^(2^-m)) = T(n^(2^-(m+1))) + 5,
this never bottoms out at a constant where we can stop. Therefore we need to substitute a concrete form for n.
Try:
n = 2^(2^m),
where we have
m = log log n
starting from m = 0, which is n = 2,
then we have:
T(n) = T(2) + 5 + 5 + ... + 5,
how many 5s are there?
We count like this:
2^(2^0), 2^(2^1), 2^(2^2), ... 2^(2^m)
So there are m 5s, where m = log log n. So
T(n) = T(2) + 5 log log n,
which is,
T(n) = O(log log n).
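To see the log log n behaviour concretely, here is a small Python sketch (my own illustration, not part of the answer) that counts how many square roots it takes to reach the stopping point n = 2:

    from math import sqrt

    def sqrt_steps(n):
        # iterate n -> sqrt(n) until we reach the stopping point n = 2
        steps = 0
        while n > 2:
            n = sqrt(n)
            steps += 1
        return steps

    for m in range(1, 6):
        assert sqrt_steps(2 ** (2 ** m)) == m  # m = log log n steps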
I want to get a tighter bound for this recurrence, in which we have two variables m and n.
From my previous answer here, we can derive a binomial summation formula for T(n):
Where
C is such that n = C is the stopping condition for T(n).
In your specific example, the constants are: c1 = 1, c2 = 1, a = 2, b = 4, f(n) = O(m). Since O(m) has no dependence on n, we can simply replace the f term with it.
How do we evaluate the inner sum? Recall the binomial expansion for integer powers:
(a + b)^n = sum_{k=0}^{n} C(n, k) * a^k * b^(n-k)
Setting a = b = 1 we get:
sum_{k=0}^{n} C(n, k) = 2^n
Thus:
I am refreshing on the Master Theorem a bit and I am trying to figure out the running time of an algorithm that solves a problem of size n by recursively solving 2 subproblems of size n-1 and combining the solutions in constant time.
So the formula is:
T(N) = 2T(N - 1) + O(1)
But I am not sure how can I formulate the condition of master theorem.
I mean, we don't have T(N/b) here, so is the b of the Master Theorem formula in this case b = N/(N-1)?
If yes, then since k = 0 we obviously have a > b^k, which gives O(N^z) where z = log 2 with base N/(N-1). How can I make sense of this, assuming I am right so far?
ah, enough with the hints. the solution is actually quite simple. z-transform both sides, group the terms, and then inverse z transform to get the solution.
first, look at the problem as
x[n] = a x[n-1] + c
apply z transform to both sides (there are some technicalities with respect to the ROC, but let's ignore that for now)
X(z) = (a X(z) / z) + (c z / (z-1))
solve for X(z) to get
X(z) = c z^2 / [(z - 1) * (z-a)]
now observe that this formula can be re-written as:
X(z) = r z / (z-1) + s z / (z-a)
where r = c/(1-a) and s = - a c / (1-a)
Furthermore, observe that
X(z) = P(z) + Q(z)
where P(z) = r z / (z-1) = r / (1 - (1/z)), and Q(z) = s z / (z-a) = s / (1 - a (1/z))
apply inverse z-transform to get that:
p[n] = r u[n]
and
q[n] = s exp(log(a)n) u[n]
where log denotes the natural log and u[n] is the unit (Heaviside) step function (i.e. u[n]=1 for n>=0 and u[n]=0 for n<0).
Finally, by linearity of z-transform:
x[n] = (r + s exp(log(a) n))u[n]
where r and s are as defined above.
so relabeling back to your original problem,
T(n) = a T(n-1) + c
then
T(n) = (c/(a-1))(-1+a exp(log(a) n))u[n]
where exp(x) = e^x, log(x) is the natural log of x, and u[n] is the unit step function.
What does this tell you?
Unless I made a mistake, T grows exponentially with n. This is effectively an exponentially increasing function under the reasonable assumption that a > 1. The exponent is governed by a (more specifically, by the natural log of a).
One more simplification: note that exp(log(a) n) = exp(log(a))^n = a^n:
T(n) = (c/(a-1))(-1+a^(n+1))u[n]
so O(a^n) in big O notation.
And now here is the easy way:
put T(0) = 1
T(n) = a T(n-1) + c
T(1) = a * T(0) + c = a + c
T(2) = a * T(1) + c = a*a + a * c + c
T(3) = a * T(2) + c = a*a*a + a * a * c + a * c + c
....
note that this creates a pattern. specifically:
T(n) = a^n + c * sum(a^j, j=0,...,n-1)
put c = 1 gives
T(n) = sum(a^j, j=0,...,n)
this is geometric series, which evaluates to:
T(n) = (1-a^(n+1))/(1-a)
= (1/(1-a)) - (a/(1-a)) a^n
= (1/(a-1))(-1 + a^(n+1))
for n>=0.
Note that this formula is the same as given above for c=1 using the z-transform method. Again, O(a^n).
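As a quick cross-check of both derivations, here is a small Python sketch (my own illustration) comparing the direct recursion T(0) = 1, T(n) = a*T(n-1) + c for c = 1 against the geometric-series closed form:

    def T(n, a=3):
        # T(0) = 1, T(n) = a*T(n-1) + 1
        return 1 if n == 0 else a * T(n - 1, a) + 1

    def closed_form(n, a=3):
        # (1/(a-1)) * (-1 + a^(n+1)); exact in integer arithmetic
        return (a ** (n + 1) - 1) // (a - 1)

    for n in range(10):
        assert T(n) == closed_form(n)

Both grow like a^n, as expected.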
Don't even think about the Master Theorem. You can only use the Master Theorem when the recurrence has the general form T(n) = aT(n/b) + f(n) with b > 1.
Instead, think of it this way. Each call on an input of size n makes two recursive calls on inputs of size n - 1, and does constant O(1) work itself. The input size decrements until it reaches 1, so the recursion tree has depth n and the number of calls doubles at each level. Then you add up the constant costs of all the recursive calls you made.
How many are there? About 2^n. So this would take O(2^n).
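To see the doubling concretely, here is a small Python sketch (my own illustration) that counts how many calls the recursion makes:

    def count_calls(n):
        # calls made by T(n) = 2T(n-1) + O(1), base case n = 1
        if n == 1:
            return 1
        return 1 + 2 * count_calls(n - 1)

    for n in range(1, 11):
        assert count_calls(n) == 2 ** n - 1  # ~2^n calls, each O(1)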
Looks like you can't formulate this problem in terms of the Master Theorem.
A good start is to draw the recursion tree to understand the pattern, then prove it with the substitution method. You can also expand the formula a couple of times and see where it leads.
See also this question, which solves 2 subproblems instead of a general a:
Time bound for recursive algorithm with constant combination time
Maybe you could think of it this way: count the calls at each depth of the recursion tree, each of which does constant work.
depth 0: 1 call
depth 1: 2 calls
depth 2: 4 calls
depth 3: 8 calls
depth 4: 16 calls
It is easy to see that the total work is the geometric series 1 + 2 + 4 + 8 + 16 + ..., the sum of which is
first term * (ratio^n - 1)/(ratio - 1). For this series that is
1 * (2^n - 1)/(2 - 1) = 2^n - 1.
The dominating term here is 2^n. You can verify this by checking that lim(n->inf) [2^n / (2^n - 1)] is a positive constant.
Therefore the function belongs to Θ(2^n).
T(n) = 2T(n/2) + O(1)
T(n) = T(sqrt(n)) + O(1)
For the first one I tried the substitution method with guesses n, log n, etc.; all gave me wrong answers.
Recursion trees: I don't know if I can apply them, as the root will be a constant.
Can someone help?
Let's look at the first one. First of all, you need to know T(base case). You mentioned that it's a constant, but when you do the problem it's important that you write it down. Usually it's something like T(1) = 1. I'll use that, but you can generalize to whatever it is.
Next, find out how many times you recur (that is, the height of the recursion tree). n is your problem size, so how many times can we repeatedly divide n by 2? Mathematically speaking, what's i when n/(2^i) = 1? Figure it out, hold onto it for later.
Next, do a few substitutions, until you start to notice a pattern.
T(n) = 2(2(2T(n/(2*2*2)) + Θ(1)) + Θ(1)) + Θ(1)
Ok, the pattern is that we multiply T() by 2 a bunch of times, and divide n by 2 a bunch of times. How many times? i times.
T(n) = (2^i)*T(n/(2^i)) + ...
For the big-Θ terms at the end, we use a cute trick. Look above where we have a few substitutions, and ignore the T() part. We want the sum of the Θ terms. Notice that they add up to (1 + 2 + 4 + ... + 2^(i-1)) * Θ(1). Can you find a closed form for 1 + 2 + 4 + ... + 2^(i-1)? I'll give you that one; it's (2^i - 1). It's a good one to just memorize, but here's how you'd figure it out: if S = 1 + 2 + ... + 2^(i-1), then 2S = 2 + 4 + ... + 2^i, and subtracting gives S = 2S - S = 2^i - 1.
Anyway, all in all we get
T(n) = (2^i) * T(n/(2^i)) + (2^i - 1) * Θ(1)
If you solved for i earlier, then you know that i = log_2(n). Plug that in, do some algebra, and you get down to
T(n) = n*T(1) + (n - 1)*Θ(1). T(1) = 1, so T(n) = n + (n - 1)*Θ(1). That is n, plus n times a constant, minus a constant. We drop lower order terms and constants, so it's Θ(n).
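As a concrete spot check of that algebra (my own sketch, assuming T(1) = 1 and a cost of 1 per step), for powers of two:

    def T(n):
        # T(1) = 1, T(n) = 2*T(n//2) + 1, n a power of 2
        return 1 if n == 1 else 2 * T(n // 2) + 1

    for k in range(8):
        n = 2 ** k
        assert T(n) == 2 * n - 1  # linear, i.e. Θ(n)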
Prasoon Saurav is right about using the master method, but it's important that you know what the recurrence relation is saying. The things to ask are, how much work do I do at each step, and what is the number of steps for an input of size n?
Use the Master Theorem to solve such recurrence relations.
Let a >= 1 be an integer and b > 1 a real number. Let c be a positive real number and d a nonnegative real number. Given a recurrence of the form
T(n) = a*T(n/b) + n^c .. if n > 1
T(n) = d .. if n = 1
then for n a power of b,
if log_b(a) < c, T(n) = Θ(n^c),
if log_b(a) = c, T(n) = Θ(n^c * log n),
if log_b(a) > c, T(n) = Θ(n^(log_b(a))).
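For illustration, here is a small Python helper (my own sketch, not a standard library function) that applies these three cases for driving functions of the form n^c:

    from math import log

    def master_theorem(a, b, c):
        # classify T(n) = a*T(n/b) + n^c by the three cases above
        crit = log(a, b)  # the critical exponent log_b(a)
        if crit < c:
            return f"Theta(n^{c})"
        if crit == c:  # beware float comparison for non-trivial a, b
            return f"Theta(n^{c} * log n)"
        return f"Theta(n^{crit})"

    print(master_theorem(2, 2, 0))  # Theta(n^1.0), as applied below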
1) T(n) = 2T(n/2) + O(1)
In this case
a = b = 2;
log_b(a) = 1; c = 0 (since n^c = 1 => c = 0)
So Case (3) is applicable. So T(n) = Θ(n) :)
2) T(n) = T(sqrt(n)) + O(1)
Let m = log_2(n);
=> T(2^m) = T(2^(m/2)) + O(1)
Now renaming K(m) = T(2^m) => K(m) = K(m/2) + O(1)
Apply Case (2): K(m) = Θ(log m), i.e. T(n) = Θ(log log n).
For part 1, you can use the Master Theorem as @Prasoon Saurav suggested.
For part 2, just expand the recurrence:
T(n) = T(n ^ 1/2) + O(1) // sqrt(n) = n ^ 1/2
= T(n ^ 1/4) + O(1) + O(1) // sqrt(sqrt(n)) = n ^ 1/4
etc.
The series will continue for k terms until the argument drops to a constant, say n^(1/2^k) <= 2, i.e. 2^k = log n, or k = log log n. That gives T(n) = k * O(1) = O(log log n).
Let's look at the first recurrence, T(n) = 2T(n/2) + 1. The n/2 is our clue here: each nested term's parameter is half that of its parent, so if we start with n = 2^k we have k levels of expansion before we hit the base case T(1). But the leading 2 also doubles the number of terms at each level, so level i contributes 2^i to the total: T(2^k) = 2^k * T(1) + (2^k - 1). Hence, assuming T(1) = 1, we can say T(2^k) = 2^(k+1) - 1. Now, since n = 2^k, T(n) = 2n - 1, which is Θ(n).
We can apply a similar expansion to your second recurrence, T(n) = T(n^0.5) + 1, where this time there is no doubling. If we start with n = 2^2^k we will have k terms in our expansion, each adding 1 to the total. Assuming T(2) = 1, we must have T(2^2^k) = k + 1. Since n = 2^2^k we must have k = log_2(log_2(n)), hence T(n) = log_2(log_2(n)) + 1.
Recurrence relations (and recursive functions as well) are best solved by starting at f(1). In case 1, T(1) = 1; T(2) = 3; T(4) = 7; T(8) = 15. It's clear that T(n) = 2n - 1, which in O notation is O(n).
In the second case T(1) = 1; T(2) = 2; T(4) = 3; T(16) = 4; T(256) = 5; T(256 * 256) = 6. It takes little time to find out that T(n) = log(log(n)) + 1, where log is in base 2. Clearly this is an O(log(log(n))) relation.
Most of the time the best way to deal with a recurrence is to draw the recursion tree and carefully handle the base case.
However, here I will give you a slight hint for solving these with the substitution method.
In the first recurrence, try the substitution n = 2^k.
In the second recurrence, try the substitution n = 2^(2^k).