Recurrence relation problems - algorithm

I had the following recurrence relations on a test and got them wrong; I am not sure why.
1. T(n) = 2T(n/4) + O(n^0.5)
Using MT: a = 2, b = 4, f(n) = n^0.5
Comparing n^(log_4(2)) to n^0.5 => n^0.5 == n^0.5
Thus, case 3: Θ(n log n)
Apparently that's wrong; I don't know why.
2. T(n) = 3T(n/4) + O(n^0.75)
Using MT: a = 3, b = 4, f(n) = n^0.75
Comparing n^(log_4(3)) to n^0.75
Thus, case 1: Θ(n^(log_4(3)))
3. T(n) = T(n/2) + T(n/3) + O(n log n)
This isn't in a form that can be solved with the MT, and I cannot easily find the Akra-Bazzi exponent p without aid. So I took a stab in the dark and was wrong. No clue where to begin with this one.
4. T(n) = 2T(n/2) + n log n
Using MT: a = 2, b = 2, f(n) = n log n
Comparing n^(log_2(2)) to n log n => n^1 to n log n
Case 2: Θ(n log n)

You may have misread or omitted some details of the Master theorem. I will refer to the Wikipedia article's statement of it.
1)
The second case states that if f(n) = Θ(n^c_crit * log^k(n)) for some k >= 0, where c_crit = log_b(a), then:

    T(n) = Θ(n^c_crit * log^(k+1)(n))

Since c_crit = log_4(2) = 0.5 and k = 0, the final complexity is:

    T(n) = Θ(n^0.5 * log n)

You just missed out the exponent on the n in front: the answer is Θ(n^0.5 log n), not Θ(n log n). (And note this is case 2, not case 3.)
2)
This is correct.
4)
You missed another detail here: f(n) = n log n = Θ(n^1 * log^1(n)) with c_crit = log_2(2) = 1, so k = 1, and there needs to be an additional factor of log n:

    T(n) = Θ(n log^2 n)
3)
This is slightly trickier. Using the Akra-Bazzi method: find the exponent p satisfying

    (1/2)^p + (1/3)^p = 1

so that

    T(n) = Θ(n^p * (1 + ∫ from 1 to n of (u log u / u^(p+1)) du))

To solve for the exponent p, just use Newton-Raphson on your calculator - it gives p = 0.787885.... Performing the integration by parts:

    ∫ from 1 to n of u^(-p) log u du
        = [u^(1-p) log u / (1-p)] from 1 to n - (1/(1-p)) ∫ from 1 to n of u^(-p) du
        = Θ(n^(1-p) log n)

Substituting in:

    T(n) = Θ(n^p * n^(1-p) log n) = Θ(n log n)
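If you don't have a calculator handy, the root-finding step is a few lines of Python. This is only a sketch of the Newton-Raphson iteration described above (the function names are mine, not part of any library):

    import math

    # Find the Akra-Bazzi exponent p satisfying (1/2)^p + (1/3)^p = 1
    # via Newton-Raphson: p <- p - f(p)/f'(p).
    def f(p):
        return 0.5 ** p + (1.0 / 3.0) ** p - 1.0

    def f_prime(p):
        return 0.5 ** p * math.log(0.5) + (1.0 / 3.0) ** p * math.log(1.0 / 3.0)

    p = 1.0                 # initial guess
    for _ in range(20):     # converges in a handful of iterations
        p -= f(p) / f_prime(p)

    print(p)                # 0.787885...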

Related

Solution to T(n) = 2T(n/2) + log n

So my recursive equation is T(n) = 2T(n/2) + log n
I used the master theorem and found that a = 2, b = 2 and d = 1,
which is case 2. So the solution should be O(n^1 log n), which is O(n log n).
I looked online and some found it O(n). I'm confused
Can anyone tell me how it's not O(n log n) ?
This should not be case 2, but case 1.
With T(n) = 2T(n/2) + log n, the critical exponent is c_crit = log_2(2) = 1, as you correctly found. But log n is certainly O(n^c) for some c < 1 (indeed for all 0 < c < 1), so case 1 applies and the whole thing is O(n^c_crit) = O(n^1) = O(n).
I'm not familiar with d in the master theorem. The Wikipedia article on the Master Theorem states that you need to find c = log_b(a), the critical exponent; here c = 1. Case 2 would require f(n) = Θ(n^c log^k n) = Θ(n log^k n) for some k >= 0, but in reality we have f(n) = log n. Instead, this problem falls into case 1 (see if you can figure out why!), which means T(n) = Θ(n), as you found elsewhere.
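A quick empirical check supports case 1. This is a small sketch of mine (T(1) = 1 is an assumed base case, and n is restricted to powers of two):

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 2 T(n/2) + log2(n), with assumed base case T(1) = 1
        if n <= 1:
            return 1.0
        return 2 * T(n // 2) + math.log2(n)

    for k in (10, 15, 20):
        n = 2 ** k
        print(n, T(n) / n)  # ratio approaches a constant (3), so T(n) = Θ(n)

If the answer were Θ(n log n), the ratio T(n)/n would keep growing with n; instead it settles.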

Big-O notation of T(sqrt(n)) + 5

I'm facing this question: T(N) = T(sqrt(N)) + 5.
I am wondering, can I solve it this way?
T(N) = O(sqrt(N)) + O(5)
Since O(5) = O(1) is a constant, we can ignore it.
So the big O notation of T(N) is O(N^(1/2)).
Or can I just say its notation is O(N), since there is no big difference between O(N) and O(sqrt(N))?
Thank you!
(For neatness, let's replace the 5 with a constant c.)
Substituting this function into itself multiple times, we can spot a pattern emerging:

    T(n) = T(n^(1/2)) + c
         = T(n^(1/4)) + 2c
         = ...
         = T(n^(1/2^m)) + m*c

When do we stop iterating? When the stopping condition is met. Take this to be n = 2 (not 1 as is usually the case, since the argument only approaches n = 1 asymptotically):

    n^(1/2^m) = 2  =>  (1/2^m) * log2(n) = 1  =>  m = log2(log2(n))

So the final cost of this function is:

    T(n) = T(2) + c * log2(log2(n)) = O(log log n)

Note that the constant c (= 5) does not matter in terms of asymptotic complexity. (And also that the result is not simply log n but log log n.)
EDIT: if you were to choose a different stopping condition n = a, a > 1, then the above step would become:

    n^(1/2^m) = a  =>  m = log2(log_a(n)) = log2(log2(n)) - log2(log2(a))

which only differs by a constant from the original result.
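To see that the choice of stopping condition only shifts the count by a constant, here is a small sketch of mine (not from the answer) comparing two stopping values:

    import math

    def iterations(n, stop):
        # Number of sqrt steps in T(n) = T(sqrt(n)) + c, stopping at n <= stop
        count = 0
        while n > stop:
            n = math.sqrt(n)
            count += 1
        return count

    for m in (3, 4, 5):
        n = 2.0 ** (2 ** m)
        # stopping at 2 vs. stopping at 4: the counts differ by exactly 1
        print(iterations(n, 2), iterations(n, 4))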
Edit: I made a mistake in the original answer, assuming that n is a power of 2 and reducing the recurrence to 1, 2, 4, ..., n, which is wrong. I apologize for the confusion. Here is the updated answer.
From,
T(n) = T(sqrt(n)) + 5,
we can also write it as:
T(n) = T(n^(1/2)) + 5,
then by recurrence:
T(n^(1/2)) = T(n^(1/4)) + 5,
T(n^(1/4)) = T(n^(1/8)) + 5,
...
T(n^(2^-m)) = T(n^(2^-(m+1))) + 5,
this doesn't reach a constant at which we can stop. Therefore we need to substitute for n.
Try:
n = 2^(2^m),
where we have
m = log log n
starting from m = 0, which is n = 2,
then we have:
T(n) = T(2) + 5 + 5 + ... + 5,
how many 5s are there?
We count like this:
2^(2^0), 2^(2^1), 2^(2^2), ... 2^(2^m)
So there are m 5s, where m = log log n. So
T(n) = T(2) + 5 log log n,
which is,
T(n) = O(log log n).
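The counting argument is easy to check numerically. A minimal sketch of mine (stopping at n <= 2, as above), verifying that the number of 5s is m = log log n when n = 2^(2^m):

    import math

    def count_fives(n):
        # Count the additions of 5 in T(n) = T(sqrt(n)) + 5, stopping at n <= 2
        count = 0
        while n > 2:
            n = math.sqrt(n)
            count += 1
        return count

    for m in range(1, 6):
        n = 2.0 ** (2 ** m)       # n = 2^(2^m), so log2(log2(n)) = m
        print(m, count_fives(n))  # both columns agree: there are m fives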

Solving the recurrence T(n) = 2T(sqrt(n))

I would like to solve the following recurrence relation:
T(n) = 2T(√n);
I'm guessing that T(n) = O(log log n), but I'm not sure how to prove this. How would I show that this recurrence solves to O(log log n)?
One idea would be to simplify the recurrence by introducing a new variable k such that 2^k = n. Then, the recurrence relation works out to
T(2^k) = 2T(2^(k/2))
If you then let S(k) = T(2^k), you get the recurrence
S(k) = 2S(k / 2)
Note that this is equivalent to
S(k) = 2S(k / 2) + O(1)
since 0 = O(1). Therefore, by the Master Theorem, we get that S(k) = Θ(k), since we have a = 2, b = 2, and d = 0, with log_b(a) > d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we get that T(n) = Θ(k). Since we picked 2^k = n, this means that k = log n, so T(n) = Θ(log n). This means that your initial guess of O(log log n) is incorrect and that the runtime is only logarithmic, not doubly-logarithmic. If there were only one recursive call being made, though, you would be right that the runtime would be O(log log n).
Hope this helps!
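If you want to double-check this numerically, here is a short sketch of mine (with an assumed base case T(n) = 1 for n <= 2); the ratio T(n)/log2(n) stays constant:

    import math

    def T(n):
        # T(n) = 2 T(sqrt(n)), with assumed base case T(n) = 1 for n <= 2
        if n <= 2:
            return 1
        return 2 * T(math.sqrt(n))

    for m in (3, 4, 5):
        n = 2.0 ** (2 ** m)
        print(T(n) / math.log2(n))  # always 1.0: Θ(log n), not Θ(log log n)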
You can solve this easily by unrolling the recursion:

    T(n) = 2T(n^(1/2)) = 4T(n^(1/4)) = ... = 2^k * T(n^(1/2^k))

Now the recursion will finish when the argument n^(1/2^k) reaches some base value a, and you can find the appropriate a. When a = 0 or 1 it does not make sense, but when a = 2 you will get:

    n^(1/2^k) = 2  =>  2^k = log2(n)  =>  k = log2(log2(n))

Substituting this k into the last part of the first equation, you get T(n) = 2^k * T(2) = Θ(2^k), and since 2^k = log2(n), the complexity is O(log(n)).
Check other similar recursions here:
T(n) = 2T(n^(1/2)) + log n
T(n) = T(n^(1/2)) + Θ(lg lg n)
T(n) = T(n^(1/2)) + 1

Proving a recurrence relation with induction

I've been having trouble with an assignment from the course I am following.
The assignment in question:
Use induction to prove that, when n >= 2 is an exact power of 2, the solution of the recurrence

    T(n) = { 2            if n = 2,
             2T(n/2) + n  if n = 2^k with k > 1 }

is T(n) = n log(n).
NOTE: the logarithms in the assignment have base 2.
The base case here is obvious: when n = 2, we have 2 = 2 log(2).
However, I am stuck on the inductive step and am not sure how to proceed.
Step. Let us assume that the statement holds for 2^m for all m <= k and let us show it for 2^{k+1}.
Then, T(2^{k+1}) = 2T(2^k) + 2^{k+1}.
By the inductive assumption T(2^k) = 2^k*log(2^k), i.e., T(2^k) = k*2^k (since the logarithms have base 2 here).
Hence, T(2^{k+1}) = 2*k*2^k + 2^{k+1} = 2^{k+1}*(k+1), which can be written as 2^{k+1}*log(2^{k+1}), completing the proof.
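To see the closed form confirmed numerically, here is a tiny sketch (mine, not part of the assignment) that evaluates the recurrence next to n log2(n):

    import math

    def T(n):
        # T(2) = 2; T(n) = 2 T(n/2) + n for n = 2^k with k > 1
        if n == 2:
            return 2
        return 2 * T(n // 2) + n

    for k in range(2, 8):
        n = 2 ** k
        print(n, T(n), n * math.log2(n))  # the last two columns agree exactly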

Can someone help solve this recurrence relation? [closed]

T(n) = 2T(n/2) + O(1)
T(n) = T(sqrt(n)) + O(1)
For the first one I used the substitution method with guesses n, log n, etc.; all gave me wrong answers.
Recurrence trees: I don't know if I can apply one, as the root will be a constant.
Can someone help?
Let's look at the first one. First of all, you need to know T(base case). You mentioned that it's a constant, but when you do the problem it's important that you write it down. Usually it's something like T(1) = 1. I'll use that, but you can generalize to whatever it is.
Next, find out how many times you recur (that is, the height of the recursion tree). n is your problem size, so how many times can we repeatedly divide n by 2? Mathematically speaking, what's i when n/(2^i) = 1? Figure it out, hold onto it for later.
Next, do a few substitutions, until you start to notice a pattern.
T(n) = 2(2(2T(n/(2*2*2)) + θ(1)) + θ(1)) + θ(1)
Ok, the pattern is that we multiply T() by 2 a bunch of times, and divide n by 2 a bunch of times. How many times? i times.
T(n) = (2^i)*T(n/(2^i)) + ...
For the big-θ terms at the end, we use a cute trick. Look above where we have a few substitutions, and ignore the T() part. We want the sum of the θ terms. Notice that they add up to (1 + 2 + 4 + ... + 2^(i-1)) * θ(1). Can you find a closed form for 1 + 2 + 4 + ... + 2^(i-1)? I'll give you that one; it's (2^i - 1). It's a good one to just memorize, but here's how you'd figure it out: double the sum and subtract the original, and everything cancels except 2^i - 1.
Anyway, all in all we get
T(n) = (2^i) * T(n/(2^i)) + (2^i - 1) * θ(1)
If you solved for i earlier, then you know that i = log_2(n). Plug that in, do some algebra, and you get down to
T(n) = n*T(1) + (n - 1)*θ(1). T(1) = 1, so T(n) = n + (n - 1)*θ(1), which is n plus a constant times n (minus a constant). We drop lower-order terms and constants, so it's θ(n).
Prasoon Saurav is right about using the master method, but it's important that you know what the recurrence relation is saying. The things to ask are, how much work do I do at each step, and what is the number of steps for an input of size n?
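If you want to check the algebra, a few lines suffice. This sketch (mine) uses the base case T(1) = 1 assumed above:

    def T(n):
        # T(n) = 2 T(n/2) + 1, with T(1) = 1
        if n == 1:
            return 1
        return 2 * T(n // 2) + 1

    for k in (4, 8, 16):
        n = 2 ** k
        print(n, T(n))  # prints 2n - 1 each time, i.e. Theta(n)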
Use the Master Theorem to solve such recurrence relations.
Let a be an integer greater than or equal to 1, b a real number greater than 1, c a positive real number, and d a nonnegative real number. Given a recurrence of the form

    T(n) = a*T(n/b) + n^c   if n > 1
    T(n) = d                if n = 1

then for n a power of b:
if log_b(a) < c, T(n) = Θ(n^c),
if log_b(a) = c, T(n) = Θ(n^c log n),
if log_b(a) > c, T(n) = Θ(n^(log_b(a))).
1) T(n) = 2T(n/2) + O(1)
In this case
a = b = 2; log_b(a) = 1; c = 0 (since n^c = 1 => c = 0).
So Case (3) applies, and T(n) = Θ(n) :)
2) T(n) = T(sqrt(n)) + O(1)
Let m = log_2(n);
=> T(2^m) = T(2^(m/2)) + O(1)
Now renaming K(m) = T(2^m) => K(m) = K(m/2) + O(1)
Apply Case (2): here a = 1, b = 2, c = 0, so log_b(a) = c and K(m) = Θ(log m), which gives T(n) = Θ(log log n).
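The three cases are mechanical enough to put in a helper. This is a sketch of mine for the simplified form T(n) = a*T(n/b) + n^c stated above, not a standard library function:

    import math

    def master_theorem(a, b, c):
        # Simplified Master Theorem for T(n) = a*T(n/b) + n^c.
        # Exact float comparison is fine for the simple exponents used here.
        crit = math.log(a, b)          # critical exponent log_b(a)
        if crit < c:
            return "Theta(n^%g)" % c
        if crit == c:
            return "Theta(n^%g * log n)" % c
        return "Theta(n^%g)" % crit

    print(master_theorem(2, 2, 0))  # Theta(n^1): case (3) above
    print(master_theorem(1, 2, 0))  # Theta(n^0 * log n) = Theta(log n): case (2)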
For part 1, you can use the Master Theorem as @Prasoon Saurav suggested.
For part 2, just expand the recurrence:
T(n) = T(n ^ 1/2) + O(1) // sqrt(n) = n ^ 1/2
= T(n ^ 1/4) + O(1) + O(1) // sqrt(sqrt(n)) = n ^ 1/4
etc.
The series will continue for k terms until n^(1/2^k) <= 2, i.e. 2^k = log n, or k = log log n. That gives T(n) = k * O(1) = O(log log n).
Let's look at the first recurrence, T(n) = 2T(n/2) + 1. The n/2 is our clue here: each nested term's parameter is half that of its parent, so if we start with n = 2^k there are k levels of expansion before we hit the base case T(1). But the leading 2 doubles the number of subproblems at each level, so level i contributes 2^i to the total: T(2^k) = 2^k*T(1) + (2^k - 1). Hence, assuming T(1) = 1, T(n) = 2n - 1, which is θ(n).
We can apply a similar trick to your second recurrence, T(n) = T(n^0.5) + 1. If we start with n = 2^(2^k) we will have k terms in our expansion, each adding 1 to the total. Assuming T(2) = 1, we must have T(2^(2^k)) = k + 1. Since n = 2^(2^k) we must have k = log_2(log_2(n)), hence T(n) = log_2(log_2(n)) + 1.
Recurrence relations, like recursive functions, are best approached by tabulating small values starting at T(1). In the first case: T(1) = 1; T(2) = 3; T(4) = 7; T(8) = 15. It's clear that T(n) = 2*n - 1, which in O notation is O(n).
In the second case: T(1) = 1; T(2) = 2; T(4) = 3; T(16) = 4; T(256) = 5; T(256 * 256) = 6. It takes little time to find that T(n) = log(log(n)) + 2, where log is base 2 (check: T(16) = log(log(16)) + 2 = 2 + 2 = 4). Clearly this is an O(log(log(n))) relation.
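Tabulating small values is easy to automate. A minimal sketch (mine), assuming T(1) = 1 in the first recurrence and treating anything below 2 as the base case in the second:

    def T1(n):
        # T(n) = 2 T(n/2) + 1, with T(1) = 1
        return 1 if n == 1 else 2 * T1(n // 2) + 1

    def T2(n):
        # T(n) = T(sqrt(n)) + 1, base case below n = 2
        return 1 if n < 2 else T2(n ** 0.5) + 1

    print([T1(2 ** k) for k in range(4)])           # [1, 3, 7, 15] -> 2n - 1
    print([T2(2.0 ** (2 ** k)) for k in range(4)])  # [2, 3, 4, 5]  -> log2(log2(n)) + 2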
Most of the time the best way to deal with a recurrence is to draw the recursion tree and carefully handle the base case.
However, here I will give you a slight hint for solving these with the substitution method.
In the first recurrence, try the substitution n = 2^k.
In the second recurrence, try the substitution n = 2^(2^k).
