I'm new to divide and conquer. How do I work out the running time of this calculation? What exactly do I have to pay attention to, and how do I proceed?
For n = 1 the running time is O(1).
So let's see the calculation for this:
T(n) = 4T(n/4) + n * sqrt(n)
Expanding for k steps, it will look like
T(n) = 4^k * [T(n/4^k)] + n * sqrt(n) * {1 + sqrt(1/4) + sqrt(1/16) + ...}
Here {1 + sqrt(1/4) + sqrt(1/16) + ...} = {1 + 1/2 + 1/4 + ...} is a geometric progression with ratio 1/2 (note the leading 1 for the topmost level).
If we take k = log4(n) //here base is 4
then 4^k = n, n/4^k = 1 and 2^k = sqrt(n), so the progression sums to 2 * {1 - [1/sqrt(n)]}:
T(n) = n * [T(1)] + n * sqrt(n) * 2 * {1 - [1/sqrt(n)]}
T(n) = n * [T(1)] + 2 * n * sqrt(n) - 2 * n = O(n * sqrt(n))
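If you want to sanity-check the algebra, here is a minimal Python sketch (my own illustration, not part of the derivation above; it assumes a base case T(1) = 1). By the closed form, the ratio T(n)/n^(3/2) should approach 2:

    from functools import lru_cache
    import math

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = 4*T(n/4) + n*sqrt(n), with the assumed base case T(1) = 1
        if n <= 1:
            return 1.0
        return 4 * T(n // 4) + n * math.sqrt(n)

    # On powers of 4 the division is exact; the ratio tends to 2,
    # matching T(n) = n*T(1) + 2*n*sqrt(n) - 2*n.
    for k in range(2, 12):
        n = 4 ** k
        print(n, T(n) / n ** 1.5)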
You can still use the
Master theorem
T(n) = aT(n/b) + f(n).
If f(n) = Θ(n^d), where d ≥ 0, then
T(n) = Θ(n^d) if a < b^d,
T(n) = Θ(n^d * log n) if a = b^d,
T(n) = Θ(n^(log_b a)) if a > b^d
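As a small illustration (a sketch of my own, not from the original post), the three cases above can be mechanized in a few lines of Python:

    import math

    def master(a, b, d):
        # Classify T(n) = a*T(n/b) + Theta(n^d) by comparing a with b^d
        if a < b ** d:
            return f"Theta(n^{d})"
        if a == b ** d:
            return f"Theta(n^{d} * log n)"
        return f"Theta(n^{math.log(a, b):.3f})"

    print(master(4, 4, 1.5))  # the recurrence above: Theta(n^1.5)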
Yes, the answer is O(n^(3/2)): here a = 4, b = 4 and d = 3/2, so a = 4 < b^d = 4^(3/2) = 8, and the first case gives T(n) = Θ(n^(3/2)).
Related
I have a question about solving this recurrence: T(n) = T(n/4) + T(3n/4) + n*log(n). Can you help me solve it?
You can use the Akra-Bazzi method with the following parameters:
a_1 = a_2 = 1,
b_1 = 1/4, b_2 = 3/4,
p = 1 (the exponent p solves (1/4)^p + (3/4)^p = 1)
T(n) = \Theta(n * (1 + integral(u*log(u)/u^2 du, 1, n))) =
\Theta(n * (1 + log^2(n)/2)) =
\Theta(n * log^2(n))
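A quick numerical illustration of this result (my own sketch; it assumes a base case T(n) = 1 for n < 2 and uses base-2 logs, which only changes constants): the ratio T(n)/(n*log^2(n)) should level off at a constant.

    from functools import lru_cache
    import math

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = T(n/4) + T(3n/4) + n*log(n), assumed base case T(n) = 1 for n < 2
        if n < 2:
            return 1.0
        return T(n // 4) + T(3 * n // 4) + n * math.log2(n)

    for k in range(6, 20, 2):
        n = 2 ** k
        print(n, T(n) / (n * math.log2(n) ** 2))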
Notice that 3n/4 > n/4 and T is increasing. Hence we can see that T(n) <= 2T(3n/4) + n log n.
Now we can apply the Master Theorem to S(n) = 2S(3n/4) + n log n.
We can see that a = 2, b = 4/3 and f(n) = n log n.
We can see that log_(4/3) 2 ≈ 2.41.
Hence f(n) = O(n^(log_b a - ε)), so the first case applies.
Thus by the Master Theorem, we have T(n) = O(n^2.41). (This is only an upper bound; the Akra-Bazzi answer above gives the tight Θ(n*log^2(n)).)
I know how to solve recurrence relations using the Master Method.
I'm also aware of how to solve the recurrences below:
T(n) = sqrt(n)*T(sqrt(n)) + n
T(n) = 2*T(sqrt(n)) + lg(n)
In the above two recurrences there is the same amount of work at each level of the recursion tree, and there are a total of log log n levels in the recursion tree.
I'm having trouble solving this one:
T(n) = 4*T(sqrt(n)) + n
EDIT:
Here n is a power of 2
Suppose that n = 2^k. We have T(2^k) = 4*T(2^(k/2)) + 2^k. Let S(k) = T(2^k); then S(k) = 4S(k/2) + 2^k. By the Master Theorem (third case: 2^k dominates k^(log_2 4) = k^2, and the regularity condition 4*2^(k/2) <= c*2^k holds), we get S(k) = O(2^k). Since S(k) = O(2^k) and S(k) = T(2^k), T(2^k) = O(2^k), which implies T(n) = O(n).
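If you want to see this numerically, here is a minimal Python sketch (my own illustration; it assumes n = 2^k and a base case T(2) = 1). Working with the exponent k directly turns taking sqrt(n) into halving k:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def S(k):
        # S(k) = T(2^k) = 4*S(k/2) + 2^k, assumed base case S(1) = T(2) = 1
        if k <= 1:
            return 1.0
        return 4 * S(k // 2) + 2 ** k

    # If T(n) = Theta(n), then S(k)/2^k should approach a constant.
    for k in [4, 8, 16, 32, 64]:
        print(f"n = 2^{k}:", S(k) / 2 ** k)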
I'm having trouble solving this one: T(n) = 4*T(sqrt(n)) + n
EDIT: Here n is a power of 2
This edit is important. So let's say that the recursion stops once n drops below 2.
The question now is how deep the recursion tree is. Well, that is the number of times that you can take the square root of n before n gets sufficiently small (say, less than 2). If we write
n = 2^(lg n)
then on each recursive call n will have its square root taken. This is equivalent to halving the above exponent, so after k iterations we have that
n^(1/2^k) = 2^(lg n / 2^k)
We want to stop when this is at most 2, giving
2^(lg n / 2^k) = 2
lg n / 2^k = 1
lg n = 2^k
lg lg n = k
So after lg lg n iterations of square rooting the recursion stops. (source)
Each recursive call spawns 4 new branches, so the total number of leaves is 4^(depth of the tree), therefore 4^(lg lg n) = (lg n)^2.
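A quick way to convince yourself of the lg lg n depth (a sketch of mine, not from the original answer) is to count the halvings of the exponent directly; the count comes out as lg lg n plus one, since we stop strictly below 2:

    import math

    def sqrt_depth(k):
        # n = 2^k; taking sqrt(n) halves k; stop once the value drops below 2 (k < 1)
        depth = 0
        e = float(k)
        while e >= 1:
            e /= 2
            depth += 1
        return depth

    for k in [4, 16, 256, 65536]:  # n = 2^4, 2^16, 2^256, 2^65536
        print(f"lg n = {k}: depth = {sqrt_depth(k)}, lg lg n = {math.log2(k):.0f}")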
EDIT:
Source
T(n) = 4 T(sqrt(n)) + n
     = 4 [ 4 T(n^(1/4)) + sqrt(n) ] + n
     = 4^k * T(n^(1/2^k)) + sum of 4^i * n^(1/2^i) over i = 0 .. k-1
     = 4^k * T(2^(L/2^k)) + (that sum)    [Let n = 2^L, L = log n; n is a power of 2]
     = 4^k * T(2) + (that sum)            [Let L = 2^k, k = log L = log log n]
     = c * (log n)^2 + (that sum)         [since 4^(log L) = L^2 = (log n)^2]
In the sum, the i = 0 term is n, and each later term 4^i * n^(1/2^i) shrinks geometrically while n^(1/2^i) is still large; the levels near the bottom contribute only O((log n)^2) in total. So the sum is Θ(n), and
T(n) = Θ(n)
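If you'd like to check the claim about the per-level work, here is a small sketch of mine (assuming n = 2^L): it sums 4^i * n^(1/2^i) over the levels, divided through by n so nothing overflows, and the total stays near 1.

    import math

    def level_work_over_n(L):
        # n = 2^L; level i does 4^i * n^(1/2^i) work; return (sum of levels)/n,
        # computed as 4^i * 2^(L/2^i - L) to avoid overflow
        k = int(math.log2(L))  # depth lg lg n
        return sum(4 ** i * 2.0 ** (L / 2 ** i - L) for i in range(k))

    for L in [16, 64, 256, 1000]:
        print(f"lg n = {L}:", level_work_over_n(L))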
T(n) = 4T(√n) + n
Suppose that n = 2^m. So we have:
T(2^m) = 4T(2^(m/2)) + (2^m)
Now let's name T(2^m) as S(m):
S(m) = 4S(m/2) + 2^m. (Note the additive term is 2^m, not m.) Now with the Master Method we can solve this relation: the third case applies, since 2^m dominates m^(log_2 4) = m^2, and the answer is:
S(m) = Θ(2^m)
Now we step back to T(2^m):
T(2^m) = Θ(2^m)
Now we need m to solve our problem, and we can get it from the second line:
n = 2^m => m = lg n
And the problem is solved:
T(n) = Θ(2^(lg n))
T(n) = Θ(n)
I have been trying to solve a recurrence relation.
The recurrence is T(n) = T(n/3)+T(2n/3)+n^2
I solved the recurrence and I got T(n) = n*T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ].
Can anyone tell me the runtime of this expression?
I think this recurrence works out to Θ(n^2). To see this, we'll show that T(n) = Ω(n^2) and that T(n) = O(n^2).
Showing that T(n) = Ω(n^2) is pretty straightforward - since T(n) has an n^2 term in it, it's certainly Ω(n^2).
Let's now show that T(n) = O(n^2). We have that
T(n) = T(n/3) + T(2n/3) + n^2
Consider this other recurrence:
S(n) = S(2n/3) + S(2n/3) + n^2 = 2S(2n/3) + n^2
Since T(n) is increasing and T(n) ≤ S(n), any upper bound for S(n) should also be an upper bound for T(n).
Using the Master Theorem on S(n), we have that a = 2, b = 3/2, and c = 2. Since log_b a = log_(3/2) 2 = 1.709511291... < c, the Master Theorem says that this will solve to O(n^2). Since S(n) = O(n^2), we also know that T(n) = O(n^2).
We've shown that T(n) = Ω(n^2) and that T(n) = O(n^2), so T(n) = Θ(n^2), as required.
Hope this helps!
(By the way - (5/9)^(log n) = (2^(log 5/9))^(log n) = 2^(log n * log 5/9) = (2^(log n))^(log 5/9) = n^(log 5/9). That makes it a bit easier to reason about.)
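For what it's worth, here is a small numerical check (my own sketch, with an assumed base case T(n) = 1 for n < 3). Summing the per-level work roughly as (5/9)^i * n^2 over the levels suggests the ratio T(n)/n^2 should settle near 1/(1 - 5/9) = 9/4:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = T(n/3) + T(2n/3) + n^2, assumed base case T(n) = 1 for n < 3
        if n < 3:
            return 1.0
        return T(n // 3) + T(2 * n // 3) + n * n

    for k in range(6, 20, 2):
        n = 2 ** k
        print(n, T(n) / n ** 2)  # levels out near 9/4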
One can't tell the actual runtime from T(n) or from the time complexity! It is simply an estimate of the running time in terms of the order of the input (n).
One thing which I'd like to add:
I haven't solved your recurrence relation, but assuming your derived relation is correct, putting n = 1 in the given recurrence relation gives
T(1) = T(1/3) + T(2/3) + 1
So either you'll be provided with the values for T(1/3) and T(2/3) in your question, OR you have to infer from the given problem statement what T(1) should be, as for the Tower of Hanoi problem!
For a recurrence, the base case is T(1); by definition its value is the following:
T(1) = T(1/3) + T(2/3) + 1
Now, since T(n) denotes the runtime function, the runtime of any input that will not be processed is always 0. This includes all terms below the base case, so we have:
T(X < 1) = 0
T(1/3) = 0
T(2/3) = 0
T(1) = T(1/3) + T(2/3) + 1^2
T(1) = 0 + 0 + 1
T(1) = 1
Then we can substitute the value:
T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
T(n) = n + ( 9/5 n^2 (5/9)^(log n) )
T(n) = n^2 (9/5)^(1-log(n)) + n
We can bound (9/5)^(1-log(n)) by 9/5 for an asymptotic upper bound, since (9/5)^(1-log(n)) <= 9/5 for n >= 1:
T(n) <= (9/5) n^2 + n
T(n) = O(n^2)
I'm trying to calculate the following:
f(n) = ∑ (i*log(i)), when i=1 to log(n).
How do I do that?
I have succeeded in doing:
f(n) = ∑ (i*log(i)), when i=1 to n.
Which is: 1*log(1) + 2*log(2) + ... + n*log(n) <= n(n*log(n))
Where in the end: f(n) = ∑ (i*log(i)) = Ω(n^2 log^2(n)) (where i=1 to n)
But I don't know how to do the first one. Any ideas?
Regards
First, you have to remove the ^2 from log^2(n), and since the bound comes from a <=, it is an upper bound; your current result should be
f(n) = ∑ (i*log(i)) <= n(n*log(n)) = O(n^2*log(n))
Then, for the case where i goes from 1 to log(n), just substitute n by log(n).
Let's define
g(n) = ∑ (i*log(i)), when i=1 to log(n) // The result you are looking for
f(n) = ∑ (i*log(i)), when i=1 to n // The result we have
Then
g(n) = f(log(n)) = O(log^2(n)*log(log(n)))
f(n) = Theta(log^2(n) * log(log(n)))
Proof:
Upper bound - there are log(n) terms, each at most log(n)*log(log(n)):
f(n) = 1 * log(1) + 2 * log(2) + ... + log(n) * log(log(n)) <=
<= log(n)*log(log(n)) * log(n) =
= O(log^2(n) * loglog(n))
Lower bound - keep only the top half of the terms (i from log(n)/2 to log(n)) and bound each of them from below by the smallest one:
f(n) = 1 * log(1) + 2 * log(2) + ... + log(n) * log(log(n)) >=
>= (log(n)/2) * log(log(n)/2) + (log(n)/2 + 1) * log(log(n)/2 + 1) + ... + log(n) * log(log(n)) >=
>= (log(n)/2) * log(log(n)/2) + ... + (log(n)/2) * log(log(n)/2) =
= (log(n)/2) * (log(n)/2) * log(log(n)/2) =
= (log^2(n)/4) * (log(log(n)) - 1) =
= Omega(log^2(n)*loglog(n))
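An empirical spot check of this bound (my own sketch; it uses base-2 logs and skips the i = 1 term, which is zero anyway). The ratio against m^2 * lg(m), where m plays the role of log(n), should level off near 1/2, consistent with the integration-based answer below:

    import math

    def f(m):
        # sum of i*lg(i) for i = 1..m, where m stands in for log(n)
        return sum(i * math.log2(i) for i in range(2, m + 1))

    for m in [2 ** 6, 2 ** 10, 2 ** 14, 2 ** 18]:
        print(m, f(m) / (m ** 2 * math.log2(m)))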
If you know some calculus, you can often find the order of growth of such sums by integration.
If f is a positive monotonic function, ∑ f(i) for 1 <= i <= k can be approximated by the integral ∫ f(t) dt (t ranging from 1 to k). So if you know a primitive function F of f (in modern parlance an antiderivative), you can easily evaluate the integral to F(k) - F(1). For growth analysis, the constant term F(1) is irrelevant, so you can approximate the sum (as well as the integral) simply by F(k).
A tool that is often useful in such calculations is partial integration,
∫_a^b f'(t)*g(t) dt = f(b)*g(b) - f(a)*g(a) - ∫_a^b f(t)*g'(t) dt
which follows from the product rule (f*g)' = f' * g + f * g'. It is often helpful to write f as 1*f in order to apply partial integration, for example to find a primitive of the (natural) logarithm,
∫ log t dt = ∫ 1*log t dt = t*log t - ∫ t * (log t)' dt = t*log t - ∫ t*(1/t) dt = t*log t - t
In this case, with f(t) = t*log t, partial integration yields
∫ t*log t dt = 1/2*t^2 * log t - ∫ (1/2*t^2) * (log t)' dt
= 1/2*t^2 * log t - 1/2 ∫ t^2*(1/t) dt
= 1/2*t^2 * log t - 1/4*t^2
Since the second term grows slower than the first, it can be ignored for growth analysis, so you obtain
∑_(i=1)^k i*log i ≈ 1/2*k^2*log k
Since logarithms to different bases only differ by a constant factor, a different choice of logarithm just changes the constant factor, and you see that in all cases
∑_(i=1)^k i*log i ∈ Θ(k^2 * log k)
For your specific problem, k = log n, so the sum is Θ((log n)^2 * log(log n)), as has been derived in a different way by the other answers.
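As a concrete check of this approximation (my own sketch, using natural logs to match the antiderivative above), compare the partial sums against F(k) = 1/2*k^2*ln(k) - 1/4*k^2:

    import math

    def partial_sum(k):
        # sum of i*ln(i) for i = 1..k (the i = 1 term is zero)
        return sum(i * math.log(i) for i in range(2, k + 1))

    def F(k):
        # antiderivative from the partial integration: 1/2*k^2*ln(k) - 1/4*k^2
        return 0.5 * k * k * math.log(k) - 0.25 * k * k

    # The ratio tends to 1 as k grows.
    for k in [10, 100, 10 ** 3, 10 ** 5]:
        print(k, partial_sum(k) / F(k))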
http://img196.imageshack.us/img196/7012/5f1ff74e3e6e4a72bbd5483.png
Now substitute log(n) for n and you'll see that it's very tightly bounded by log^2(n)*log(log(n)).