Solving a recurrence equation by substitution - algorithm

I am trying to solve T(n) = sqrt(n)⋅T(sqrt(n)) + sqrt(n) by drawing a recurrence tree and solving it by the substitution method. But I am having a hard time wrapping my head around how the sqrt term affects this process, and I am looking for some pointers if possible.
Much appreciated!

You can write T(n) = sqrt(n)⋅T(sqrt(n)) + sqrt(n) as T(n) = n^(1/2) + n^(3/4) + n^(7/8) + ...
We know Σ_{i=1..∞} 2^(-i) = 1, so every exponent 1 - 2^(-i) is less than 1, and you can say
T(n) = n^(1/2) + n^(3/4) + n^(7/8) + ... < n + n + n + ...
Now you only have to compute the length of the sum by solving n^(2^(-x)) < 2, and you get something like x ≈ log log n.
So the solution is T(n) = O(n ⋅ log log n).
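If it helps, here is a small numerical sanity check (my own sketch, not part of the original answer; the base case T(n ≤ 2) = 1 is an arbitrary assumption) that iterates the recurrence and compares it against n ⋅ log log n:

#include <math.h>
#include <stdio.h>

// Real-valued model of T(n) = sqrt(n)*T(sqrt(n)) + sqrt(n).
// The base case only affects lower-order terms.
double T(double n) {
    if (n <= 2) return 1;   // arbitrary base case
    return sqrt(n) * T(sqrt(n)) + sqrt(n);
}

int main(void) {
    for (double n = 16; n <= 1e18; n *= 1e3) {
        // If T(n) = O(n log log n), this ratio should stay bounded
        // rather than grow with n.
        printf("n = %.0e  T(n)/(n lglg n) = %f\n",
               n, T(n) / (n * log2(log2(n))));
    }
    return 0;
}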
Sorry, you were looking for a solution using the substitution method. I've never done this before, so I read a description on this site.
Assume T(sqrt(n)) = k ⋅ sqrt(n) ⋅ log log sqrt(n) + O(sqrt(n)) for a constant k. Then
T(n) = sqrt(n) ⋅ k ⋅ sqrt(n) ⋅ log log sqrt(n) + sqrt(n) + O(sqrt(n))
= k ⋅ n ⋅ log((1/2) log n) + sqrt(n) + O(sqrt(n))
= k ⋅ n ⋅ (log log n + log(1/2)) + sqrt(n) + O(sqrt(n))
= k ⋅ n ⋅ log log n - k ⋅ n + sqrt(n) + O(sqrt(n))    (log(1/2) = -1 in base 2)
= k ⋅ n ⋅ log log n + O(sqrt(n)),
since the negative k⋅n term absorbs the lower-order positive terms,
= O(n log log n)
I hope this helps.

Related

Calculating the Recurrence Relation T(n) = sqrt(n * T(sqrt(n)) + n)

I think the complexity of this recurrence is O(n^(2/3)), by a change of variable and induction, but I'm not sure. Is this solution correct?
This is a fascinating recurrence and it does not solve to Θ(n). Rather, it appears to solve to Θ(n^(2/3)).
To give an intuition for why this isn't likely to be Θ(n), let's imagine that we're dealing with a really, really large value of n. Then since
T(n) = (n⋅T(√n) + n)^(1/2),
under the assumption that T(√n) ≈ √n, we'd get that
T(n) = (n⋅√n + n)^(1/2)
= (n^(3/2) + n)^(1/2)
≈ n^(3/4).
In other words, assuming that T(n) = Θ(n) would give us a different value of T(n) as n gets large.
On the other hand, let's assume that T(n) = Θ(n^(2/3)). Then T(√n) ≈ (√n)^(2/3) = n^(1/3), and the same calculation gives us that
T(n) = (n⋅T(√n) + n)^(1/2)
= (n ⋅ n^(1/3) + n)^(1/2)
≈ (n^(4/3))^(1/2)
= n^(2/3),
which is consistent with itself.
To validate this, I wrote a short program that printed out different values of T(n) given different inputs and plotted the results. Here's the version of T(n) that I wrote up:
#include <math.h>

// Real-valued model of T(n) = sqrt(n * T(sqrt(n)) + n).
double T(double n) {
    if (n <= 2) return n;
    return sqrt(n * T(sqrt(n)) + n);
}
I decided to use 2 as a base case, since repeatedly taking square roots will never let n drop to one. I also decided to use real-valued arguments rather than discrete integer values just to make the math easier.
If you plot the values of T(n), you get this curve:
[plot of T(n) against n]
This doesn't look like what I'd expect from a linear plot. To figure out what this was, I plotted it on a log/log plot, which has the nice property that all polynomial functions get converted to straight lines whose slope is equal to the exponent. Here's the result:
[log/log plot of T(n)]
I consulted my Handy Neighborhood Regression Software and asked it to determine the slope of this line. Here's what it gave back:
Slope: 0.653170918815869
R²: 0.999942627574643
That's a very good fit, and the slope of 0.653 is pretty close to 2/3. So that's more empirical evidence supporting that the recurrence solves to Θ(n^(2/3)).
All that's left to do now is to work out the math. We'll solve this recurrence using a series of substitutions.
First, I'm generally not that comfortable working with exponents in the way that this recurrence uses them, so let's take the log of both sides. (Throughout this exposition, I'll use lg n to mean log2 n).
lg T(n) = lg (n⋅T(√n) + n)^(1/2)
= (1/2) lg (n⋅T(√n) + n)
= (1/2) lg (T(√n) + 1) + (1/2) lg n
≈ (1/2) lg T(√n) + (1/2) lg n
Now, let's define S(n) = lg T(n). Then we have
S(n) = lg T(n)
≈ (1/2) lg T(√n) + (1/2) lg n
= (1/2) S(√n) + (1/2) lg n
That's a lot easier to work with, though we still have the problem of the recurrence shrinking by powers each time. To address this, let's do one more substitution, which is a fairly common one when working with these sorts of expressions. Let's define R(n) = S(2^n). Then we have that
R(n) = S(2^n)
≈ (1/2) S(√(2^n)) + (1/2) lg 2^n
= (1/2) S(2^(n/2)) + (1/2) n
= (1/2) R(n/2) + (1/2) n
Great! All that's left to do now is to solve R(n).
Now, there is a slight catch here. We could immediately use the Master Theorem to conclude that R(n) = Θ(n). The problem with this is that just knowing that R(n) = Θ(n) won't allow us to determine what T(n) is. Specifically, let's suppose that we just know R(n) = Θ(n). Then we could say that
S(n) = S(2^(lg n)) = R(lg n) = Θ(log n)
to get that S(n) = Θ(log n). However, we get stuck when trying to solve for T(n) in terms of S(n). Specifically, we know that
T(n) = 2^(S(n)) = 2^(Θ(log n)),
but we cannot go from this to saying that T(n) = Θ(n). The reason is that the hidden coefficient in the Θ(log n) is significant here. Specifically, if S(n) = k lg n, then we have that
2^(k lg n) = 2^(lg (n^k)) = n^k,
so the leading coefficient of the logarithm will end up determining the exponent on the polynomial. As a result, when solving R, we need to determine the exact coefficient of the linear term, which translates into the exact coefficient of the logarithmic term for S.
So let's jump back to R(n), which we know is
R(n) ≈ (1/2) R(n/2) + (1/2) n.
If we iterate this a few times, we see this pattern:
R(n) ≈ (1/2) R(n/2) + (1/2) n
≈ (1/2)((1/2) R(n/4) + (1/4) n) + (1/2) n
≈ (1/4) R(n/4) + (1/8) n + (1/2) n
≈ (1/4)((1/2) R(n/8) + (1/8) n) + (1/8) n + (1/2) n
≈ (1/8) R(n/8) + (1/32) n + (1/8) n + (1/2) n.
The pattern appears to be that, after k iterations, we get that
R(n) ≈ (1/2^k) R(n/2^k) + n(1/2 + 1/8 + 1/32 + 1/128 + ... + 1/2^(2k-1)).
This means we should look at the sum
(1/2) + (1/8) + (1/32) + (1/128) + ...
This is
(1/2)(1 + 1/4 + 1/16 + 1/64 + ... )
which, as the sum of a geometric series, solves to
(1/2)(4/3)
= 2/3.
Hey, look! It's the 2/3 we were talking about earlier. This means that R(n) works out to approximately (2/3)n + c for some constant c that depends on the base case of the recurrence. Therefore, we see that
T(n) = 2^(S(n))
= 2^(S(2^(lg n)))
= 2^(R(lg n))
≈ 2^((2/3) lg n + c)
= 2^(lg (n^(2/3)) + c)
= 2^c ⋅ 2^(lg (n^(2/3)))
= 2^c ⋅ n^(2/3)
= Θ(n^(2/3)),
which matches the theoretically predicted and empirically observed values from earlier.
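As a quick numerical check of that 2/3 coefficient (a sketch of mine; the base case R(1) = 0 is an arbitrary choice), one can iterate the approximate recurrence directly:

#include <stdio.h>

// Iterate R(n) = (1/2) R(n/2) + (1/2) n.
// The analysis above predicts R(n)/n -> 2/3.
double R(double n) {
    if (n <= 1) return 0;   // arbitrary base case
    return 0.5 * R(n / 2) + 0.5 * n;
}

int main(void) {
    for (double n = 2; n <= 2e6; n *= 16) {
        printf("n = %9.0f  R(n)/n = %f\n", n, R(n) / n);
    }
    return 0;
}

For powers of two and this base case the ratio works out to exactly (2/3)(1 - 1/n²), so it approaches 2/3 from below very quickly.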
This was a very fun problem to work through and I'll admit I'm surprised by the answer! I am a bit nervous, though, that I may have missed something when going from
lg T(n) = (1/2) lg (T(√n) + 1) + (1/2) lg n
to
lg T(n) ≈ (1/2) lg T(√n) + (1/2) lg n.
It's possible that this +1 term actually introduces some other term into the recurrence that I didn't recognize. For example, is there an O(log log n) term that arises as a result? That wouldn't surprise me, given that we have a recurrence that shrinks by a square root. However, I've done some simple data explorations and I'm not seeing any terms in there that look like there's a double log involved.
Hope this helps!
We know that:
T(n) = sqrt(n) * sqrt(T(sqrt(n)) + 1)
Hence:
T(n) < sqrt(n) * sqrt(T(sqrt(n)) + T(sqrt(n)))
Since T(sqrt(n)) ≥ 1, the 1 can be replaced by T(sqrt(n)). So,
T(n) < sqrt(2) * sqrt(n) * sqrt(T(sqrt(n)))
Now, to find an upper bound, we need to solve the following recurrence relation:
G(n) = sqrt(2n) * sqrt(G(sqrt(n)))
To solve this, we need to expand it (suppose n = 2^(2^k) and G(1) = 1):
G(n) = (2n)^(1/2) * (2n)^(1/8) * (2n)^(1/32) * ... * (2n)^(1/2^(2k-1))
G(n) = (2n)^(1/2 + 1/8 + 1/32 + ... + 1/2^(2k-1))
If we take a factor 1/2 out of 1/2 + 1/8 + 1/32 + ... + 1/2^(2k-1), we get 1/2 * (1 + 1/4 + 1/16 + ... + 1/4^(k-1)).
As 1 + 1/4 + 1/16 + ... is a geometric series with ratio 1/4, it converges to 4/3 as k goes to infinity. The exponent therefore tends to (1/2)(4/3) = 2/3, so G(n) = Theta(n^(2/3)) and hence T(n) = O(n^(2/3)).
Notice that since sqrt(n) * sqrt(T(sqrt(n))) < T(n), we can show, similarly to the previous case, that T(n) = Omega(n^(2/3)). It means T(n) = Theta(n^(2/3)).
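For what it's worth, here is a small numerical cross-check of the upper-bound recurrence (my own sketch; the base case G(n ≤ 2) = 1 is an arbitrary assumption):

#include <math.h>
#include <stdio.h>

// G(n) = sqrt(2n) * sqrt(G(sqrt(n))), as in the upper-bound argument.
double G(double n) {
    if (n <= 2) return 1;   // arbitrary base case
    return sqrt(2 * n) * sqrt(G(sqrt(n)));
}

int main(void) {
    for (double n = 16; n <= 1e18; n *= 1e3) {
        // This ratio should level off at a constant if G(n) = Theta(n^(2/3)).
        printf("n = %.0e  G(n)/n^(2/3) = %f\n",
               n, G(n) / pow(n, 2.0 / 3.0));
    }
    return 0;
}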

Q: Solving the following recurrence: T(n) = 8T(n/8) + n log n

I am trying to solve the following recurrence:
T(n) = 8T(n/8) + n log n.
I currently have done the following but am not sure if I am on the right track:
1. T(n)= 8 T(n/8) + n log n;
2. T(n)= 8^2 T(n/8^2) + n log (n/8) + n log n
3. T(n)= 8^3 T(n/8^3) + n log (n/8^2) + n log (n/8) + n log n
So the general formula I came up with is:
T(n) = 8^k T(n/8^k) + n log(n ⋅ n/8 ⋅ n/8^2 ⋅ ... ⋅ n/8^(k-1)).
And I am not sure how to continue this. I tried to rewrite the argument of the log as
n^k / 8^(k(k-1)/2), but I still don't see the solution.
You are almost there. Set k = log_8(n), then you can solve for T(n).
Let n = 8^m. We rewrite the recurrence as
T(8^m) = 8⋅T(8^(m-1)) + n⋅m⋅log(8).
Now, with T(8^m) = n⋅S(m),
n⋅S(m) = 8⋅(n/8)⋅S(m-1) + n⋅m⋅log(8),
which simplifies to
S(m) = S(m-1) + m⋅log(8),
leading to the triangular numbers: S(m) = log(8)⋅m(m+1)/2 = Θ(m^2), so with m = log_8(n) we get T(n) = n⋅S(m) = Θ(n log^2 n).
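To double-check that Θ(n log² n) conclusion numerically, here is a short sketch of mine (the base case is an arbitrary assumption):

#include <math.h>
#include <stdio.h>

// T(n) = 8 T(n/8) + n log2(n), with base case T(n < 8) = n (arbitrary).
double T(double n) {
    if (n < 8) return n;
    return 8 * T(n / 8) + n * log2(n);
}

int main(void) {
    for (double n = 8; n <= 1e15; n *= 64) {
        // The triangular-number sum predicts T(n) = Theta(n log^2 n),
        // so this ratio should settle near a constant.
        printf("n = %.0e  T(n)/(n lg^2 n) = %f\n",
               n, T(n) / (n * log2(n) * log2(n)));
    }
    return 0;
}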

Calculating the Recurrence Relation T(n)=T(n / log n) + Θ(1)

The question comes from Introduction to Algorithms, 3rd Edition, p. 63, Problem 3-6, where it's introduced as "iterated functions". I rewrite it as follows:
#include <math.h>

int T(int n){
    int count = 0;           // count must survive the loop
    while (n > 2) {
        n = n / log2(n);     // division result truncates back to int
        ++count;
    }
    return count;
}
Then give as tight a bound as possible on T(n).
I can make it O(log n) and Ω(log n / log log n), but can it be tighter?
PS: Using Mathematica, I've learned that for n = 10^3281039, T(n) = 500000,
and at the same time, T(n) = 1.072435⋅log n / log log n,
and the coefficient declines with n, from 1.22943 (n = 2.07126⋅10^235) to 1.072435 (n = 10^3281039).
Maybe this information is helpful.
It looks like the lower bound is pretty good, so I tried to prove that the upper bound is O(log n / log log n).
But let me first explain the other bounds (just for a better understanding).
TL;DR
T(n) is in Θ(log n / log log n).
T(n) is in O(log n)
This can be seen by modifying n := n/log₂n to n := n/2.
It needs O(log₂ n) steps until n ≤ 2 holds.
T(n) is in Ω(log n / log log n)
This can be seen by modifying n := n/log₂(n) to n := n/m, where m is the initial value of log n.
Solving the equation
n / (log n)^x < 2 for x leads us to
log n - x log log n < log 2
⇔ log n - log 2 < x log log n
⇔ (log n - log 2) / log log n < x
⇒ x ∈ Ω(log n / log log n)
Improving the upper bound: O(log n) → O(log n / log log n)
Now let us try to improve the upper bound. Instead of dividing n by a fixed constant (namely 2 in the above proof), we divide n by half of the initial value of log(n) for as long as the current value of log(n) is bigger than that. To make this clearer, have a look at the modified code:
#include <math.h>

// T₂ from the text; only meaningful once log2(n) is large enough
// that the divisor log2(n_old)/2 exceeds 1.
int T2(int n){
    int count = 0;
    double n_old = n;                   // n at the most recent checkpoint
    while (n > 2) {
        n = n / (log2(n_old) / 2);      // divide by half the "old" log
        if (log2(n) <= log2(n_old) / 2) {
            n_old = n;                  // log has halved: new checkpoint
        }
        ++count;
    }
    return count;
}
The complexity of the function T₂ is clearly an upper bound for the function T, since log₂(n_old)/2 < log₂(n) holds the whole time.
Now we need to know how many times we divide by each value of (1/2)⋅log(n_old):
n / (log(sqrt(n)))^x ≤ sqrt(n)
⇔ n / sqrt(n) ≤ (log(sqrt(n)))^x
⇔ log(sqrt(n)) ≤ x⋅log(log(sqrt(n)))
⇔ log(sqrt(n)) / log(log(sqrt(n))) ≤ x
So we get the recurrence formula T₂(n) = T₂(sqrt(n)) + O(log(sqrt(n)) / log(log(sqrt(n)))).
Now we need to know how often this formula has to be expanded until n < 2 holds.
n^(2^(-x)) < 2
⇔ 2^(-x)⋅log n < log 2
⇔ -x⋅log 2 + log log n < log 2
⇔ log log n < log 2 + x⋅log 2
⇔ log log n < (x + 1)⋅log 2
So we need to expand the formula about log log n times.
Now it gets a little bit harder. (Also have a look at Mike_Dog's answer.)
T₂(n) = T₂(sqrt(n)) + log(sqrt(n)) / log(log(sqrt(n)))
= Σ_{k=1..log log n - 1} 2^(-k)⋅log(n) / log(2^(-k)⋅log n)
= log(n) ⋅ Σ_{k=1..log log n - 1} 2^(-k) / (log log n - k)
(1) = log(n) ⋅ Σ_{k=1..log log n - 1} 2^(k - log log n) / k
= log(n) ⋅ Σ_{k=1..log log n - 1} 2^k ⋅ 2^(-log log n) / k
= log(n) ⋅ Σ_{k=1..log log n - 1} 2^k / (k ⋅ log n)
= Σ_{k=1..log log n - 1} 2^k / k
In the line marked with (1) I reordered the sum.
So, at the end we "only" have to calculate Σ_{k=1..t} 2^k / k for t = log log n - 1. At this point Maple solves this to
Σ_{k=1..t} 2^k / k = -I⋅π - 2^t⋅LerchPhi(2, 1, t) + 2^t/t
where I is the imaginary unit and LerchPhi is the Lerch transcendent. Since the result for the sum above is a real number for all relevant cases, we can just ignore all imaginary parts. The Lerch transcendent LerchPhi(2,1,t) seems to be in O(-1/t), but I'm not 100% sure about it. Maybe someone will prove this.
Finally this results in
T₂(n) = -2^t⋅O(-1/t) + 2^t/t = O(2^t/t) = O(log n / log log n)
All together we have T(n) ∈ Ω(log n / log log n) and T(n) ∈ O(log n / log log n),
so T(n) ∈ Θ(log n / log log n) holds. This result is also supported by your example data.
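As an aside, the experiment from the question is easy to reproduce in C (a sketch of mine, using log₂ throughout and doubles so n can go far beyond the integer range):

#include <math.h>
#include <stdio.h>

// Count the iterations of n := n / log2(n) until n <= 2.
int steps(double n) {
    int count = 0;
    while (n > 2) {
        n = n / log2(n);
        ++count;
    }
    return count;
}

int main(void) {
    for (double n = 1e3; n <= 1e300; n *= 1e33) {
        double bound = log2(n) / log2(log2(n));
        // The coefficient should decline slowly with n,
        // as in the question's Mathematica data.
        printf("n = %.0e  steps/(lg n / lglg n) = %f\n",
               n, steps(n) / bound);
    }
    return 0;
}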
I hope this is understandable and it helps a little.
The guts of the problem of verifying the conjectured estimate is to get a good estimate of plugging the value
n / log(n)
into the function
n --> log(n) / log(log(n))
Theorem:
log( n/log(n) ) / log(log( n/log(n) )) = log(n)/log(log(n)) - 1 + o(1)
(in case of font readability issues, that's little-oh, not big-oh)
Proof:
To save on notation, write
A = n
B = log(n)
C = log(log(n))
The work is based on the first-order approximation to the (natural) logarithm: when 0 < y < x,
log(x) - y/(x-y) < log(x - y) < log(x) - y/x
The value we're trying to estimate is
log(A/B) / log(log(A/B)) = (B - C) / log(B - C)
Applying the bounds for the logarithm of a difference (and using log(B) = C) gives
(B-C) / log(B) < (B-C) / log(B-C) < (B-C) / (log(B) - C/(B-C))
that is,
(B-C) / C < (B-C) / log(B-C) < (B-C)² / (C(B-C-1))
Both the recursion we're trying to satisfy and the lower bound suggest we should estimate this with B/C - 1. Pulling that off of both sides gives
B/C - 1 < (B-C) / log(B-C) < B/C - 1 + (B-C)/(C(B-C-1))
and thus we conclude
(B-C) / log(B-C) = B/C - 1 + o(1)
If you take away one idea from this analysis to use on your own, let it be the point of using differential approximations (or even higher order Taylor series) to replace complicated functions with simpler ones. e.g. once you have the idea to use
log(x-y) = log(x) - Θ(y/x) when y = o(x)
then all of the algebraic calculations you need for your problem simply follow directly.
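To see this approximation in action, here is a tiny sketch of mine comparing log(x - y) with the first-order estimate log(x) - y/x, taking y = log(x) as in the theorem:

#include <math.h>
#include <stdio.h>

int main(void) {
    // log(x - y) vs. the first-order estimate log(x) - y/x, for y = o(x).
    // The error shrinks roughly like (y/x)^2.
    for (double x = 1e2; x <= 1e10; x *= 1e2) {
        double y = log(x);
        printf("x = %.0e  error = %.3e\n",
               x, log(x - y) - (log(x) - y / x));
    }
    return 0;
}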
Thanks to AbcAeffchen for their answer.
I'm the owner of the question. Using the master method, which I learned yesterday, the "a little bit harder" part of the proof can be done simply, as follows.
I will start here:
T(n) = T(sqrt(n)) + O(log(sqrt(n)) / log(log(sqrt(n))))
⇔ T(n)=T(sqrt(n)) + O(log n / log log n)
Let
n = 2^k, S(k) = T(2^k)
then we have
T(2^k) = T(2^(k/2)) + O(log 2^k / log log 2^k)
⇔ S(k) = S(k/2) + O(k / log k)
with the master method
S(k) = a⋅S(k/b) + f(k), where a = 1, b = 2, f(k) = k/log k
= Ω(k^(log_2 1 + ε)) = Ω(k^ε),
as long as ε ∈ (0, 1),
so we can apply case 3. Then
S(k) = O(k/log k)
T(n) = S(k) = O(k/log k) = O(log n/ log log n)

What is the tightest asymptotic growth rate

I have solved all of them; however, I have been told there are some mistakes. Can somebody please help me?
n^4 - 10^3 n^3 + n^2 + 4n + 10^6 = O(n^4)
10^5 n^3 + 10^n = O(10^n)
10 n^2 + n log n + 30 √n = O(n^2)
25^n = O(1)
n^2+ n log n + 7 n = O(n^2)
(n^3 + 10) (n log n+ 1) / 3 = O(n^4 log n)
20 n^10 + 4^n = O(4^n)
n^2 log n^3 + 10 n^2 = O(n^2 log n)
10^20 = O(1)
n^2 log (6^2)n = O(n^2 log n)
n log(2n) = O(n log n)
30 n + 100 n log n + 10 = O(n log n)
(n+√n) log n^3 = O(n+√n log n)
n (n + 1) + log log n = O(n^2)
4n log 5^(n+1) = O(n log 5^n)
3^(n+4) = O(3^n)
n^2 log n^2 + 100 n^3 = O(n^3)
(n log n) / (n + 10) = O(n^2 log n)
5n + 8 n log(n) + 10n^2 = O(n^2)
2n^3 + 2n^4 + 2^n + n^10 = O(2^n)
Hints:
if you have n on the left, you should have it on the right
there should not be any + operations on the right
log(x^y) can be simplified
Most of your answers look correct, but 25^n = O(1) looks wrong (unless it's 0.25^n), and (n log n) / (n + 10) = O(n^2 log n) does not look like the tightest possible bound (I'm assuming you want the tightest possible upper-bound function). Also, you should never have to add functions in your big-O, unless your original function takes the sum or max of two functions whose growth rates criss-cross as n goes to infinity, and that very rarely happens.

Solving the recurrence T(n) = T(n / 3) + T(2n / 3) + n^2?

I have been trying to solve a recurrence relation.
The recurrence is T(n) = T(n/3)+T(2n/3)+n^2
I solved the recurrence and got T(n) = nT(1) + [(9/5)(n^2)((5/9)^(log n))].
Can anyone tell me the runtime of this expression?
I think this recurrence works out to Θ(n^2). To see this, we'll show that T(n) = Ω(n^2) and that T(n) = O(n^2).
Showing that T(n) = Ω(n^2) is pretty straightforward - since T(n) has an n^2 term in it, it's certainly Ω(n^2).
Let's now show that T(n) = O(n^2). We have that
T(n) = T(n/3) + T(2n/3) + n^2
Consider this other recurrence:
S(n) = S(2n/3) + S(2n/3) + n^2 = 2S(2n/3) + n^2
Since T(n) is increasing and T(n) ≤ S(n), any upper bound for S(n) should also be an upper-bound for T(n).
Using the Master Theorem on S(n), we have that a = 2, b = 3/2, and c = 2. Since log_b a = log_{3/2} 2 = 1.709511291... < c, the Master Theorem says that this will solve to O(n^2). Since S(n) = O(n^2), we also know that T(n) = O(n^2).
We've shown that T(n) = Ω(n^2) and that T(n) = O(n^2), so T(n) = Θ(n^2), as required.
Hope this helps!
(By the way - (5/9)^(log n) = (2^(log 5/9))^(log n) = 2^((log n)(log 5/9)) = (2^(log n))^(log 5/9) = n^(log 5/9). That makes it a bit easier to reason about.)
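If you want an empirical confirmation as well, here is a short sketch of mine (the base case T(n ≤ 1) = 1 is an arbitrary choice) that evaluates the recurrence directly:

#include <stdio.h>

// T(n) = T(n/3) + T(2n/3) + n^2 with real-valued n.
double T(double n) {
    if (n <= 1) return 1;   // arbitrary base case
    return T(n / 3) + T(2 * n / 3) + n * n;
}

int main(void) {
    for (double n = 10; n <= 1e5; n *= 10) {
        // If T(n) = Theta(n^2), this ratio should converge to a constant.
        printf("n = %6.0f  T(n)/n^2 = %f\n", n, T(n) / (n * n));
    }
    return 0;
}

Summing the recursion tree level by level suggests the ratio should approach 1/(1 - 1/9 - 4/9) = 9/4, since each level contributes 5/9 of the n² work of the level above, and the printed values get close to that quickly.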
One can't tell the runtime from T(n) or from the time complexity! It is simply an estimate of the running time in terms of the order of the input (n).
One thing which I'd like to add is:
I haven't solved your recurrence relation, but assuming your derived relation is correct, putting n = 1 in the given recurrence relation, we get
T(1) = T(1/3) + T(2/3) + 1
So either you'll be provided with the values for T(1/3) and T(2/3) in your question, or you have to work out the base case from the problem statement, like what T(1) should be for the Tower of Hanoi problem!
For a recurrence, the base case is T(1); by definition, its value is as follows:
T(1) = T(1/3) + T(2/3) + 1
Now since T(n) denotes the runtime-function, then the run-time of any input that will not be processed is always 0, this includes all terms under the base-case, so we have:
T(X < 1) = 0
T(1/3) = 0
T(2/3) = 0
T(1) = T(1/3) + T(2/3) + 1^2
T(1) = 0 + 0 + 1
T(1) = 1
Then we can substitute the value:
T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
T(n) = n + ( 9/5 n^2 (5/9)^(log n) )
T(n) = n^2 (9/5)^(1-log(n)) + n
We can approximate (9/5)^(1-log(n)) by 9/5 for an asymptotic upper bound, since (9/5)^(1-log(n)) ≤ 9/5:
T(n) ~ 9/5 n^2 + n
O(T(n)) = O(n^2)
