Solving the recurrence T(n) = 2T(sqrt(n)) - algorithm

I would like to solve the following recurrence relation:
T(n) = 2T(√n);
I'm guessing that T(n) = O(log log n), but I'm not sure how to prove this. How would I show that this recurrence solves to O(log log n)?

One idea would be to simplify the recurrence by introducing a new variable k such that 2^k = n. Then the recurrence works out to
T(2^k) = 2T(2^(k/2))
If you then let S(k) = T(2^k), you get the recurrence
S(k) = 2S(k / 2)
Note that this is equivalent to
S(k) = 2S(k / 2) + O(1)
since 0 = O(1). Therefore, by the Master Theorem, we get that S(k) = Θ(k), since we have a = 2, b = 2, and d = 0, and log_b a > d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we get that T(n) = Θ(k). Since we picked 2^k = n, we have k = log n, so T(n) = Θ(log n). This means your initial guess of O(log log n) is incorrect: the runtime is only logarithmic, not doubly logarithmic. If only one recursive call were being made, though, you would be right that the runtime would be O(log log n).
Hope this helps!
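To sanity-check the Θ(log n) result numerically, here is a small sketch. The base-case cost T(2) = 1 is my own assumption, and n is restricted to values of the form 2^(2^k) so the square roots stay exact:

```python
import math

def T(n):
    # base case: any n <= 2 costs constant work (assumed to be 1 here)
    if n <= 2:
        return 1
    # two recursive calls on sqrt(n), matching T(n) = 2T(sqrt(n))
    return 2 * T(math.sqrt(n))

# for n = 2^(2^k), T(n) comes out to exactly log2(n), not log2(log2(n))
for k in range(1, 6):
    n = 2.0 ** (2 ** k)
    print(n, T(n), math.log2(n))
```

Each level doubles the count of subproblems while the exponent of n halves, which is exactly why the answer is log n rather than log log n.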

You can solve this easily by unrolling the recursion:
T(n) = 2T(n^(1/2)) = 4T(n^(1/4)) = … = 2^k T(n^(1/2^k))
The recursion finishes when the argument reaches some base value a, i.e. when n^(1/2^k) = a, and you can find the appropriate a. When a = 0 or 1 it does not make sense, but when a = 2 you will get 2^k = log n, i.e. k = log log n.
Substituting this k into the last part of the first equation, you get T(n) = 2^k T(2) = log(n) · T(2), which is a complexity of O(log(n)).
Check other similar recursions here:
T(n) = 2T(n^(1/2)) + log n
T(n) = T(n^(1/2)) + Θ(lg lg n)
T(n) = T(n^(1/2)) + 1


Solution to T(n) = 2T(n/2) + log n

So my recursive equation is T(n) = 2T(n/2) + log n
I used the Master Theorem and found that a = 2, b = 2, and d = 1,
which is case 2, so the solution should be O(n^1 log n), which is O(n log n).
I looked online and some found it to be O(n). I'm confused.
Can anyone tell me how it's not O(n log n)?
This should not be case 2, but case 1.
With T(n) = 2T(n/2) + log n, the critical exponent is c_crit = log_2(2) = 1, as you correctly found. But log n is certainly O(n^c) for some c < 1 (in fact, for all 0 < c < 1), so case 1 applies and the whole thing is O(n^c_crit) = O(n^1) = O(n).
I'm not familiar with d in the Master Theorem. The Wikipedia article on the Master Theorem states that you need to find the critical exponent c = log_b a; here c = 1. Case 2 requires f(n) = Θ(n^c) = Θ(n), but in reality f(n) = log n. Instead, this problem falls into case 1 (see if you can figure out why!), which means T(n) = Θ(n), as you found elsewhere.
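A quick numeric check of the Θ(n) conclusion (the base case T(1) = 1 and the restriction to powers of two are my own assumptions for the sketch): the ratio T(n)/n should settle toward a constant rather than grow like log n.

```python
import math

def T(n):
    if n <= 1:
        return 1
    # f(n) = log n work at each level, two subproblems of half the size
    return 2 * T(n // 2) + math.log2(n)

# closed form for n = 2^m: T(n) = 3n - log2(n) - 2, so T(n)/n -> 3
for m in (5, 10, 20):
    n = 2 ** m
    print(n, T(n) / n)
```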

Solve the recurrence equation T(n) = T(n/3) + O(1) using iteration or substitution

I realize that solving this with the Master Theorem gives the answer Θ(log n). However, I want to know more and find the base of the logarithm. I tried reading more about the Master Theorem to find out about the base but could not find more information on Wikipedia (https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms)).
How would I solve this using recursion tree or substitution method for solving recurrences?
You can assume n = 2^K and T(0) = 0.
Don't set n = 2^k but n = 3^k;
thus T(3^k) = T(3^(k-1)) + c.
Writing w_k = T(3^k), the recurrence becomes
w_k = w_(k-1) + c.
Assuming T(1) = 1, so that w_0 = 1,
the general term is w_k = ck + 1.
You conclude T(n) = c·log_3(n) + 1,
and thus T(n) = O(log_3(n)).
T(n) = T(n/3) + O(1) = T(n/9) + O(1) + O(1) = T(n/27) + O(1) + O(1) + O(1) = …
After log_3(n) steps, the term T vanishes and T(n) = O(log n).
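The telescoping above can be checked directly; the sketch below assumes c = 1, T(0) = 0 (as the question allows), and n a power of three so integer division by 3 is exact:

```python
def T(n, c=1):
    if n == 0:
        return 0           # T(0) = 0, per the question
    return T(n // 3) + c   # constant work c per level

# with T(1) = T(0) + c = 1, we get T(3^k) = ck + 1, i.e. c*log_3(n) + 1
for k in range(8):
    print(3 ** k, T(3 ** k))
```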

What is the time complexity of T(n) = 2T(n/2) + O(1)?

I want to know the time complexity of my recursive method:
T(n) = 2T(n/2) + O(1)
I saw a result that says it is O(n), but I don't know why. I solved it like this:
T(n) = 2T(n/2) + 1
T(n-1) = 4T(n-1/4) + 3
T(n-2) = 8T(n-2/8) + 7
...... ………….. ..
T(n) = 2^(n+1) T(n/2^(n+1)) + (2^(n+1) - 1)
I think you have got the wrong idea about recursive relations. You can think as follows:
If T(n) represents the value of the function T() at input n, then the relation says that the output is one more than double the value at half of the current input. So for input n-1, the output T(n-1) will be one more than double the value at half of that input, that is, T(n-1) = 2*T((n-1)/2) + 1.
The above kind of recurrence relation should be solved as answered by Yves Daoust. For more examples of recurrence relations, you can refer to this
Consider that n = 2^m, which allows you to write
T(2^m) = 2T(2^(m-1)) + O(1)
or, by denoting S(m) := T(2^m),
S(m) = 2 S(m-1) + O(1).
Dividing by 2^m,
S(m)/2^m = S(m-1)/2^(m-1) + 2^(-m) O(1),
and finally, with R(m) := S(m)/2^m,
R(m) = R(m-1) + 2^(-m) O(1).
Now by induction,
R(m) = R(0) + (1 - 2^(-m)) O(1),
so that
T(n) = S(m) = 2^m R(m) = 2^m T(1) + (2^m - 1) O(1) = n T(1) + (n - 1) O(1) = O(n).
There are a couple of rules that you might need to remember. If you can remember these easy rules, then the Master Theorem makes recurrence equations very easy to solve. These are the basic rules to remember:
case 1) If n^(log_b a) << f(n), then T(n) = Θ(f(n))
case 2) If n^(log_b a) = Θ(f(n)), then T(n) = Θ(f(n) * log n)
case 3) If n^(log_b a) >> f(n), then T(n) = Θ(n^(log_b a))
Now, let's solve the recurrence using the above rules.
a = 2, b = 2, f(n) = O(1)
n^(log_b a) = n^(log_2 2) = n
This is case 3) in the above rules. Hence T(n) = Θ(n^(log_b a)) = Θ(n).
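Both answers conclude T(n) = Θ(n). A quick numeric sketch confirms the telescoped form T(n) = n*T(1) + (n - 1)*c; the choices of a fixed constant c for the O(1) term, a parameter for T(1), and n a power of two are all assumptions for the sketch:

```python
def T(n, c, t1):
    # T(1) = t1; T(n) = 2 T(n/2) + c, with c modeling the O(1) work
    if n <= 1:
        return t1
    return 2 * T(n // 2, c, t1) + c

# telescoping gives T(n) = n*T(1) + (n - 1)*c for n a power of two: Theta(n)
for m in range(1, 12):
    n = 2 ** m
    assert T(n, c=3, t1=5) == n * 5 + (n - 1) * 3
```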

Deduct time complexity from this Recurrence formula?

I was reading a question about time complexity calculation on SO, but I can't comment there (not enough rep).
What's the time complexity of this algorithm for Palindrome Partitioning?
I have a question about going from the 1st to the 2nd equation here:
Now you can write the same expression for H(n-1), then substitute back
to simplify:
H(n) = 2 H(n-1) + O(n) =========> Eq.1
And this solves to
H(n) = O(n * 2^n) =========> Eq.2
Can someone illustrate how he got Eq.2 from Eq.1? Thank you.
Eq 1. is a recurrence relation. See the link for a tutorial on how to solve these types of equations, but we can solve via expansion as below:
H(n) = 2H(n-1) + O(n)
H(n) = 2*2H(n-2) + 2O(n-1) + O(n)
H(n) = 2*2*2H(n-3) + 2*2O(n-2) + 2O(n-1) + O(n)
...
H(n) = 2^(n-1)*H(1) + 2^(n-2)*O(2) + ... + 2*O(n-1) + O(n)
since H(1) = O(n) (see the original question),
H(n) = 2^(n-1)*O(n) + 2^(n-2)*O(2) + ... + 2*O(n-1) + O(n)
H(n) = O(n * 2^n)
We need to homogenize the equation, in this simple case just by adding a constant to each side. First, designate O(n) = K to avoid dealing with the O notation at this stage:
H(n) = 2 H(n-1) + K
Then add a K to each side:
H(n) + K = 2 (H(n-1) + K)
Let G(n) = H(n) + K, then
G(n) = 2 G(n-1)
This is a well-known homogeneous 1-st order recurrence, with the solution
G(n) = G(0)×2^n = G(1)×2^(n-1)
Since H(1) = O(n), G(1) = H(1) + K = O(n) + O(n) = O(n), so
G(n) = O(n)×2^(n-1) = O(n×2^(n-1)) = O(n×2^n)
and
H(n) = G(n) - K = O(n×2^n) - O(n) = O(n×2^n)
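The homogenization trick can be sanity-checked when the forcing term really is a constant K. The sketch below takes K and H(1) as parameters (my own choice, sidestepping the subtlety that O(n) is not actually a constant):

```python
def H(n, h1, K):
    # H(1) = h1; H(n) = 2 H(n-1) + K
    return h1 if n == 1 else 2 * H(n - 1, h1, K) + K

# homogenized solution: G(n) = H(n) + K doubles each step,
# so H(n) = (H(1) + K) * 2^(n-1) - K
for n in range(1, 15):
    assert H(n, h1=7, K=3) == (7 + 3) * 2 ** (n - 1) - 3
```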
They are wrong.
Let's assume that O refers to a tight bound and substitute O(n) with c * n for some constant c. Unrolling the recursion you will get:
T(n) = 2T(n-1) + cn = 2^2 T(n-2) + 2c(n-1) + cn = ... = 2^i T(n-i) + c * sum_{j=0}^{i-1} 2^j (n-j)
When you finish unrolling the recursion, i = n and b = T(0):
T(n) = 2^n * b + c * sum_{j=0}^{n-1} 2^j (n-j)
Now finding the sum:
sum_{j=0}^{n-1} 2^j (n-j) = n(2^n - 1) - ((n-2) 2^n + 2) = 2^(n+1) - n - 2
Summing up you will get:
T(n) = 2^n * b + c (2^(n+1) - n - 2)
So now it is clear that T(n) is O(2^n), without any extra factor of n.
For people who are still skeptical about the math:
solution to F(n) = 2F(n-1) + n
solution to F(n) = 2F(n-1) + 99n
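The first of these, F(n) = 2F(n-1) + n, has a closed form that is easy to verify in code (assuming F(0) = 0, i.e. c = 1 and b = 0 in the notation above):

```python
def F(n):
    # F(0) = 0; F(n) = 2 F(n-1) + n
    return 0 if n == 0 else 2 * F(n - 1) + n

# closed form: F(n) = 2^(n+1) - n - 2, i.e. Theta(2^n) with no extra n factor
for n in range(20):
    assert F(n) == 2 ** (n + 1) - n - 2
```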

master theorem base case is constant?

Does the Master Theorem assume T(1) is constant? Say I have an algorithm with time complexity T(n) = 2T(n/2) + O(1) and T(1) = O(log n); what is the time complexity of this algorithm?
For the recurrence relation: T(n) = 2T(n/2) + O(1), we have
a = 2
b = 2
O(1) work done outside the recursive calls,
therefore Master Theorem case 1 applies, and we have:
T(n) ∈ Θ(n^(log_2 2)) ⇒
T(n) ∈ Θ(n)
A recurrence relation defines a sequence based on an initial term. If a problem of size 1 is "solved" by T(1) = f(n), where f ∈ O(log n), then the value of T(1) cannot be determined, so this makes no sense as a recurrence relation.
Your statement T(1) = O(log n) does not make any sense. You are basically stating that some value that does not depend on n somehow has logarithmic complexity (and thus depends on n logarithmically).
T(1), T(2), T(532143243) are boundary conditions and cannot depend on any parameter. They should be numbers (5, pi/e, sqrt(5) - i).
Sometimes it's best just to try things out rather than relying on a Theorem.
T(m) = 2T(m/2) + O(1)
T(1) = O(log n)
T(2) = 2T(1) = 2 log(n)
T(4) = 2T(2) = 4 log(n)
T(8) = 2T(4) = 8 log(n)
T(16) = 2T(8) = 16 log(n)
T(32) = 2T(16) = 32 log(n)
...
T(m) = 2T(m/2) = m log(n)
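The pattern above can be confirmed with a tiny sketch; treating the base-case value (standing in for "log n") as a numeric parameter and restricting m to powers of two are my own assumptions:

```python
def T(m, base):
    # T(1) = base, standing in for the "log n" base case
    if m <= 1:
        return base
    return 2 * T(m // 2, base)

# for m a power of two, T(m) = m * base, matching the pattern above
for k in range(1, 8):
    assert T(2 ** k, 5.0) == (2 ** k) * 5.0
```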
In conclusion, your initial question is indeed nonsensical, as others have pointed out, because you are attempting to calculate T(n) while the same n is used in T(1) = O(log n). But we can answer the second question that you added as a comment.
