Big-O notation of T(sqrt(n)) + 5

I face a question: T(N) = T(sqrt(N)) + 5.
I am wondering whether I can solve it this way:
T(N) = O(sqrt(N)) + O(5)
Since O(5) = O(1) is a constant, we can ignore it.
So the big O notation of T(N) is O(N^(1/2)).
Or can I just say it is O(N), since there is no big difference between O(N) and O(sqrt(N))?
Thank you!

(For neatness, let's replace the 5 with a constant c.)
Substituting this function into itself multiple times, we can spot a pattern emerging:
T(n) = T(n^(1/2)) + c
     = T(n^(1/4)) + 2c
     = ...
     = T(n^(1/2^k)) + k*c
When do we stop iterating? When the stopping condition is met. Take this to be n = 2 (not 1 as is usually the case, since n^(1/2^k) only approaches 1 asymptotically):
n^(1/2^k) = 2  =>  2^k = log n  =>  k = log log n
So the final cost of this function is:
T(n) = T(2) + c log log n = O(log log n)
Note that the constant c (= 5) does not matter for the asymptotic complexity. (Note also that the result is not simply log n but log log n.)
EDIT: if you were to choose a different stopping condition n = a, a > 1, then the above step would become:
n^(1/2^k) = a  =>  k = log log n - log log a,
which only differs by a constant from the original result.
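If you want to sanity-check the k = log log n step count numerically, here is a minimal Python sketch (the helper name steps_to_stop is my own, not from the answer):

import math

def steps_to_stop(n):
    # Count how many times sqrt() is applied before n drops to <= 2,
    # the stopping condition chosen above.
    k = 0
    while n > 2:
        n = math.sqrt(n)
        k += 1
    return k

for n in [16, 256, 10**6, 10**12, 10**100]:
    print(n, steps_to_stop(n), math.log2(math.log2(n)))

The printed step counts track log2(log2(n)) to within rounding, as the derivation predicts.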

Edit: I made a mistake in the original answer, assuming that n is a power of 2 and reducing the recurrence to 1, 2, 4, ..., n, which is wrong. I apologize for the confusion. Here is the updated answer.
From,
T(n) = T(sqrt(n)) + 5,
we can also write it as:
T(n) = T(n^(1/2)) + 5,
then by recurrence:
T(n^(1/2)) = T(n^(1/4)) + 5,
T(n^(1/4)) = T(n^(1/8)) + 5,
...
T(n^(2^-m)) = T(n^(2^-(m+1))) + 5,
this doesn't reach a constant at which we can stop, so we need a substitution for n.
Try:
n = 2^(2^m),
where we have
m = log log n
starting from m = 0, which is n = 2,
then we have:
T(n) = T(2) + 5 + 5 + ... + 5,
how many 5s are there?
We count like this:
2^(2^0), 2^(2^1), 2^(2^2), ..., 2^(2^m),
which is m+1 values and hence m halving steps, so there are m 5s, where m = log log n. So
T(n) = T(2) + 5 log log n,
which is,
T(n) = O(log log n).
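A minimal sketch of this substitution in Python (my own illustration; for n of the form 2^(2^m) the recurrence unwinds exactly, and the base value T(2) is chosen arbitrarily):

def T(n):
    if n <= 2:
        return 7                   # T(2): an arbitrary constant base cost
    return T(round(n ** 0.5)) + 5  # the recurrence T(n) = T(sqrt(n)) + 5

for m in range(1, 5):
    n = 2 ** (2 ** m)              # n = 2^(2^m), so m = log log n
    assert T(n) == 7 + 5 * m       # exactly m fives are added
    print(n, T(n))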

Related

What is the time complexity of T(n) = 2T(n/2) + O(1)?

I want to know the time complexity of my recursive method:
T(n) = 2T(n/2) + O(1)
I saw a result that says it is O(n), but I don't know why. I solved it like this:
T(n) = 2T(n/2) + 1
T(n-1) = 4T(n-1/4) + 3
T(n-2) = 8T(n-2/8) + 7
...
T(n) = 2^(n+1) T(n/2^(n+1)) + (2^(n+1) - 1)
I think you have got the wrong idea about recurrence relations. You can think of it as follows:
If T(n) represents the value of the function T at input n, then the relation says that the output is one more than double the value at half the current input. So for input n-1, the output T(n-1) will be one more than double the value at half of that input, that is T(n-1) = 2*T((n-1)/2) + 1.
This kind of recurrence relation should be solved as answered by Yves Daoust. For more examples of recurrence relations, you can refer to this.
Consider that n=2^m, which allows you to write
T(2^m)=2T(2^(m-1))+O(1)
or, denoting S(m) := T(2^m),
S(m) = 2 S(m-1) + O(1).
Divide both sides by 2^m:
2^(-m) S(m) = 2^(-(m-1)) S(m-1) + 2^(-m) O(1),
and finally, writing R(m) := 2^(-m) S(m),
R(m) = R(m-1) + 2^(-m) O(1).
Now by induction,
R(m) = R(0) + (1 - 2^(-m)) O(1),
so that
T(n) = S(m) = 2^m R(m) = 2^m T(1) + (2^m - 1) O(1) = n T(1) + (n - 1) O(1) = O(n).
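A quick numeric check of this closed form (a sketch assuming the O(1) term is a fixed constant c = 1 and T(1) = 1):

def T(n, t1=1, c=1):
    # T(n) = 2T(n/2) + c for n > 1, with T(1) = t1
    return t1 if n == 1 else 2 * T(n // 2, t1, c) + c

for m in range(1, 11):
    n = 2 ** m
    assert T(n) == n * 1 + (n - 1) * 1   # matches n*T(1) + (n - 1)*O(1)
print("closed form holds for n = 2 .. 1024")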
There are a couple of rules that you need to remember, and if you can remember these easy rules, the Master Theorem makes recurrence equations very easy to solve. Compare n^(log_b a) against f(n):
case 1) If n^(log_b a) << f(n) then T(n) = f(n)
case 2) If n^(log_b a) = f(n) then T(n) = f(n) * log n
case 3) If n^(log_b a) >> f(n) then T(n) = n^(log_b a)
Now, let's solve the recurrence using the above rules.
a = 2, b = 2, f(n) = O(1)
n^(log_b a) = n^(log_2 2) = n = O(n)
This is case 3) in the above rules. Hence T(n) = n^(log_b a) = O(n).
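The three rules are easy to mechanize. Here is a small helper (the function name and output strings are mine, not a standard API) that classifies T(n) = aT(n/b) + Θ(n^k):

import math

def master(a, b, k):
    crit = math.log(a, b)              # the critical exponent log_b(a)
    if crit > k:
        return f"Θ(n^{crit:.3g})"      # case 3: recursion dominates
    if crit < k:
        return f"Θ(n^{k})"             # case 1: f(n) = n^k dominates
    return f"Θ(n^{k} log n)"           # case 2: balanced, extra log factor

print(master(2, 2, 0))   # T(n) = 2T(n/2) + Θ(1)  ->  Θ(n^1)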

Recurrence relation problems

I had the following recurrence relations on a test and I got them wrong, I am not sure why.
1. T(n) = 2T(n/4) + O(n^0.5)
Using MT: a = 2, b = 4, f(n) = n^0.5
Comparing n^(log_4(2)) to n^0.5 => n^0.5 == n^0.5
Thus, case 3: Θ(n log n)
Apparently that's wrong; I don't know why.
2. T(n) = 3T(n/4) + O(n^0.75)
Using MT: a = 3, b = 4, f(n) = n^0.75
Comparing n^(log_4(3)) to n^0.75
Thus, case 1: Θ(n^log_4(3))
3. T(n) = T(n/2) + T(n/3) + O(n log n)
This isn't in a form that can be solved with MT, and I cannot easily find the exponent p without aid. Thus, I took a stab in the dark and was wrong. No clue where to begin with this one.
4. T(n) = 2T(n/2) + n log n
Using MT: a = 2, b = 2, f(n) = n log n
Comparing n^log_2(2) to n log n => n^1 to n log n
Case 2: Θ(n log n)
You may have misread or omitted some details of the Master theorem. I will refer to the statement in the Wikipedia article.
1)
The second case states that if f(n) = Θ(n^c_crit log^k n) for some k >= 0, where c_crit = log_b a, then T(n) = Θ(n^c_crit log^(k+1) n).
Since c_crit = log_4 2 = 0.5 and k = 0, the final complexity is:
T(n) = Θ(n^0.5 log n)
You just missed out the exponent on the n in front.
2)
This is correct.
4)
You missed another detail here: f(n) = n log n = Θ(n^1 log^1 n), so k = 1, and there needs to be an additional factor of log n:
T(n) = Θ(n log^2 n)
3)
This is slightly trickier. Using the Akra-Bazzi method, the exponent p satisfies:
(1/2)^p + (1/3)^p = 1
To solve for the exponent p, just use Newton-Raphson on your calculator - it gives p = 0.787885.... Performing the integration by parts:
∫ from 1 to n of (u log u)/u^(p+1) du = (n^(1-p) log n)/(1-p) - (n^(1-p) - 1)/(1-p)^2 = Θ(n^(1-p) log n)
Substituting in:
T(n) = Θ(n^p (1 + ∫ from 1 to n of (u log u)/u^(p+1) du)) = Θ(n log n)
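If you don't have a calculator handy, a few Newton-Raphson iterations in Python find the same exponent (a sketch; the starting guess 1.0 is arbitrary):

import math

def g(p):
    # g(p) = (1/2)^p + (1/3)^p - 1; the Akra-Bazzi exponent is the root of g
    return 0.5 ** p + (1 / 3) ** p - 1

def dg(p):
    # derivative, using d/dp c^p = c^p * ln(c)
    return 0.5 ** p * math.log(0.5) + (1 / 3) ** p * math.log(1 / 3)

p = 1.0
for _ in range(20):
    p -= g(p) / dg(p)   # Newton-Raphson step
print(p)                # ~0.787885, matching the value above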

Time complexity and Master's theorem

I am trying to get a better understanding of Master's Theorem and time complexity. I found some examples online that I am practicing. Is my work correct?
T(N) = 3T(N/3) + O(N)
Will have time complexity Θ(n), because log(base 3) 3 = 1. Thus, Θ(n^1) + O(N) is simplified to Θ(n).
T(N) = 3T(2N/3) + O(1)
This one I don't understand. I know it is the stooge sort algorithm, but if I use the Master theorem, wouldn't a and b both be 3, making log(base 3) 3 = 1 and the answer Θ(n)? I know that is incorrect, but I am having a tough time understanding the Master theorem.
T(N) = 4T(N/2) + O(N)
Will have time complexity Θ(n^2), because log(base 2) 4 = 2. Then, N^(log(base 2) 4) = N^2
T(N) = 2T(N/2) + O(N log(N))
Here I am thinking it is simply O(N log(N)), since log(base 2) of 2 is one.
By the Master theorem: if
T(n) = aT(n/b) + O(n^k),
then
if log a/log b > k, T(n) = O(n^(log a/log b));
if log a/log b < k, T(n) = O(n^k);
else T(n) = O(n^k log n).
1. a = 3, b = 3, k = 1, log 3/log 3 = 1 = k, hence T(n) = O(n log n)
2. a = 3, b = 3/2, k = 0, log 3/log(3/2) ≈ 2.71 > k, hence T(n) = O(n^(log 3/log(3/2)))
3. a = 4, b = 2, k = 1, log 4/log 2 = 2 > k, hence T(n) = O(n^2)
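For the trickiest of these, case 2, a rough numeric check in Python (the names and test values are mine; integer division introduces some wobble, so only the order of growth is meaningful):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3T(2n/3) + 1, the stooge-sort-style recurrence
    return 1 if n <= 1 else 3 * T(2 * n // 3) + 1

e = math.log(3) / math.log(1.5)    # ~2.7095
for n in [10**2, 10**3, 10**4]:
    print(n, T(n) / n ** e)        # stays within a small constant band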
Let's first elaborate on the master theorem and analyze your four cases.
For a recurrence of the form T(n) = aT(n/b) + O(n^d), the recursion tree has a^i subproblems of size n/b^i at level i, so the complexity at each level is:
a^i * (n/b^i)^d = n^d * (a/b^d)^i
And we sum up the computations from all levels and get the total:
n^d * (1 + (a/b^d) + (a/b^d)^2 + ... + (a/b^d)^(log_b n))
Then we only need to analyze this total, which is a geometric series determined by the multiplicative factor (or common ratio) a/b^d.
If the common ratio is bigger than one, there will be exponential growth towards the last term:
n^d * (a/b^d)^(log_b n) = n^(log_b a),
which is the big O when a/b^d > 1, i.e. d < log_b a.
If the common ratio is less than 1, there will be exponential decay starting from the first term, n^d, which is then the dominant one and the big O, when a/b^d < 1, i.e. d > log_b a.
If the common ratio is equal to 1, the series is a constant sequence and we sum up all of its log_b n + 1 terms:
n^d * (log_b n + 1) = O(n^d log n)
In your case 1, where T(N) = 3T(N/3) + O(N), we first see that the common ratio is a/b^d = 3/3^1 = 1, so by the equal-ratio case the big O is n^1 log n = O(n log n).
And for your case 2, T(N) = 3T(2N/3) + O(1), it would be a/b^d = 3/(3/2)^0 = 3 > 1 (where a = 3, b = 3/2 and d = 0), and hence the big O would be n^(log_{3/2} 3) ≈ n^2.71.
For your case 3, T(N) = 4T(N/2) + O(N), a would be 4, b would be 2 and d would be 1, so the common ratio is 4/2^1 = 2 > 1 and the big O is n^(log_2 4) = O(n^2).
Your fourth case, T(N) = 2T(N/2) + O(N log(N)), does not quite fit this framework, because n log n is not a pure power n^d. Expanding the tree directly instead: level i costs 2^i * (n/2^i) * log(n/2^i) = n (log n - i), and summing this over the log n levels gives Θ(n log^2 n), in line with the extended master theorem case with k = 1 (see the sketch below).
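To make the level-by-level sum concrete, here is a small sketch (the function tree_sum and the test value are mine): level i of T(n) = aT(n/b) + f(n) costs a^i * f(n/b^i), and summing the levels for the fourth case exposes the extra log factor:

import math

def tree_sum(a, b, f, n):
    # total work = sum over levels i of a^i * f(n / b^i)
    levels = int(math.log(n, b)) + 1
    return sum(a ** i * f(n / b ** i) for i in range(levels))

n = 2 ** 20
print(tree_sum(2, 2, lambda m: m * math.log2(m), n) / (n * math.log2(n) ** 2))
# ~0.5, i.e. the sum really is Θ(n log^2 n), not Θ(n log n)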
References:
https://www.coursera.org/learn/algorithmic-toolbox

Proving a recurrence relation with induction

I've been having trouble with an assignment from the course I am following.
The assignment in question:
Use induction to prove that when n >= 2 is an exact power of 2, the solution of the recurrence
T(n) = 2 if n = 2,
T(n) = 2T(n/2) + n if n = 2^k with k > 1,
is T(n) = n log(n).
NOTE: the logarithms in the assignment have base 2.
The base case here is obvious: when n = 2, we have 2 = 2 log(2).
However, I am stuck on the inductive step and I am not sure how to solve it.
Step. Let us assume that the statement holds for 2^m for all m <= k and let us show it for 2^{k+1}.
Then, T(2^{k+1}) = 2T(2^k) + 2^{k+1}.
By the inductive assumption T(2^k) = 2^k*log(2^k), i.e., T(2^k) = k*2^k (since the logarithms have base 2 here).
Hence, T(2^{k+1}) = 2*k*2^k + 2^{k+1} = 2^{k+1}*(k+1), which can be written as 2^{k+1}*log(2^{k+1}), completing the proof.
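A quick check of both the recurrence and the closed form in Python (my own verification script, not part of the assignment):

import math

def T(n):
    # T(2) = 2; T(n) = 2T(n/2) + n for n = 2^k with k > 1
    return 2 if n == 2 else 2 * T(n // 2) + n

for k in range(1, 15):
    n = 2 ** k
    assert T(n) == n * math.log2(n)   # exact for powers of two
print("T(n) = n log2(n) holds for n = 2^1 .. 2^14")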

Can someone help solve these recurrence relations?

T(n) = 2T(n/2) + O(1)
T(n) = T(sqrt(n)) + O(1)
For the first one I used the substitution method for n, log n, etc.; all gave me wrong answers.
Recurrence trees: I don't know if I can apply them, as the root will be a constant.
Can someone help?
Let's look at the first one. First of all, you need to know T(base case). You mentioned that it's a constant, but when you do the problem it's important that you write it down. Usually it's something like T(1) = 1. I'll use that, but you can generalize to whatever it is.
Next, find out how many times you recur (that is, the height of the recursion tree). n is your problem size, so how many times can we repeatedly divide n by 2? Mathematically speaking, what's i when n/(2^i) = 1? Figure it out, hold onto it for later.
Next, do a few substitutions, until you start to notice a pattern.
T(n) = 2(2(2T(n/(2*2*2)) + θ(1)) + θ(1)) + θ(1)
Ok, the pattern is that we multiply T() by 2 a bunch of times, and divide n by 2 a bunch of times. How many times? i times.
T(n) = (2^i)*T(n/(2^i)) + ...
For the big-θ terms at the end, we use a cute trick. Look above where we have a few substitutions, and ignore the T() part. We want the sum of the θ terms. Notice that they add up to (1 + 2 + 4 + ... + 2^(i-1)) * θ(1). Can you find a closed form for 1 + 2 + 4 + ... + 2^(i-1)? I'll give you that one: it's (2^i - 1), one less than the next power of 2. It's a good one to just memorize.
Anyway, all in all we get
T(n) = (2^i) * T(n/(2^i)) + (2^i - 1) * θ(1)
If you solved for i earlier, then you know that i = log_2(n). Plug that in, do some algebra, and you get down to
T(n) = n*T(1) + (n - 1)*θ(1). With T(1) = 1, that is T(n) = n + (n - 1)*θ(1): n, plus n times a constant, minus a constant. We drop lower-order terms and constants, so it's θ(n).
Prasoon Saurav is right about using the master method, but it's important that you know what the recurrence relation is saying. The things to ask are, how much work do I do at each step, and what is the number of steps for an input of size n?
Use Master Theorem to solve such recurrence relations.
Let a be an integer greater than or equal to 1 and b be a real number greater than 1. Let c be a positive real number and d a nonnegative real number. Given a recurrence of the form
T(n) = a T(n/b) + n^c .. if n > 1
T(n) = d .. if n = 1
then for n a power of b,
if log_b a < c, T(n) = Θ(n^c),
if log_b a = c, T(n) = Θ(n^c log n),
if log_b a > c, T(n) = Θ(n^(log_b a)).
1) T(n) = 2T(n/2) + O(1)
In this case
a = b = 2;
log_b a = 1; c = 0 (since n^c = 1 => c = 0)
So Case (3) is applicable. So T(n) = Θ(n) :)
2) T(n) = T(sqrt(n)) + O(1)
Let m = log_2 n;
=> T(2^m) = T(2^(m/2)) + O(1)
Now renaming K(m) = T(2^m) => K(m) = K(m/2) + O(1)
Apply Case (2): here a = 1, b = 2, c = 0, so log_b a = 0 = c, giving K(m) = Θ(log m), i.e. T(n) = Θ(log log n).
For part 1, you can use Master Theorem as #Prasoon Saurav suggested.
For part 2, just expand the recurrence:
T(n) = T(n ^ 1/2) + O(1) // sqrt(n) = n ^ 1/2
= T(n ^ 1/4) + O(1) + O(1) // sqrt(sqrt(n)) = n ^ 1/4
etc.
The series will continue for k terms until n^(1/2^k) <= 2, i.e. until 2^k = log n, or k = log log n. That gives T(n) = k * O(1) = O(log log n).
Let's look at the first recurrence, T(n) = 2T(n/2) + 1. The n/2 is our clue here: each nested term's parameter is half that of its parent, so if we start with n = 2^k we will have k levels of expansion before we hit the base case T(1). But the leading factor of 2 means level i contributes 2^i ones to the total, so T(2^k) = 2^k T(1) + (2^k - 1). Assuming T(1) = 1, and since k = log_2(n), this gives T(n) = 2n - 1, i.e. Θ(n).
We can apply a similar trick to your second recurrence, T(n) = T(n^0.5) + 1, where there is no such factor. If we start with n = 2^(2^k) we will have k terms in our expansion, each adding 1 to the total, before reaching the base case T(2). Assuming T(2) = 1, we must have T(2^(2^k)) = k + 1. Since n = 2^(2^k) we must have k = log_2(log_2(n)), hence T(n) = log_2(log_2(n)) + 1.
Recurrence relations, and recursive functions as well, are best solved by starting at T(1). In case 1: T(1) = 1; T(2) = 3; T(4) = 7; T(8) = 15. It's clear that T(n) = 2n - 1, which in O notation is O(n).
In the second case: T(1) = 1; T(2) = 2; T(4) = 3; T(16) = 4; T(256) = 5; T(256 * 256) = 6. It takes little time to find out that T(n) = log(log(n)) + 2 (matching the values from T(2) on), where log is base 2. Clearly this is an O(log(log(n))) relation.
Most of the time the best way to deal with a recurrence is to draw the recursion tree and carefully handle the base case.
However, here I will give you a slight hint for solving them with the substitution method.
In the first recurrence, try the substitution n = 2^k.
In the second recurrence, try the substitution n = 2^(2^k).
