I am having some trouble finding the lower bound for this series:
S = lg(n-2) + 2lg(n-3) + 3lg(n-4) + ... + (n-2)lg2.
The upper bound, as I have figured out (and explain below), is O(n^2 * lg n).
Could you help me find the lower bound?
My proof for the upper bound goes as follows:
S = lg[(n-2) * (n-3)^2 * (n-4)^3 * ... * 2^(n-2)]
  <= lg[n^(1+2+3+...+(n-2))]    (each factor is at most n)
  = O(n^2 * lg n)
EDIT:
Just a random thought: can I assume the series is closely approximated by int(x*lg(x), 1, n), which happens to be O(n^2 * lg n)? But this too would give only an upper bound, not a lower bound.
lg(n-2) + 2lg(n-3) + ... + (n-2)lg2 > lg(n-2) + 2lg(n-3) + ... + (n/2)lg(n/2)    (drop the tail of the sum)
= lg[(n-2) * (n-3)^2 * ... * (n/2)^(n/2)]
> lg[(n/2) * (n/2)^2 * ... * (n/2)^(n/2)]    (every remaining factor is at least n/2)
= lg[(n/2)^(1+2+...+n/2)]
= (1+2+...+n/2) * lg(n/2)
>= (n^2/8) * lg(n/2)
= Omega(n^2 * lg n)
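As a quick sanity check (a sketch of my own, not part of the proof), you can evaluate S(n) directly in Python: the ratio S(n)/(n^2 * lg n) stays bounded between positive constants and creeps slowly toward 1/2, consistent with Theta(n^2 * lg n).

from math import log2

# S(n) following the pattern k * lg(n-1-k) for k = 1 .. n-2.
# (The series as posted ends at (n-2)*lg 2, an off-by-one that only
# shifts S(n) by O(n) and does not change the growth rate.)
def S(n):
    return sum(k * log2(n - 1 - k) for k in range(1, n - 1))

for n in [10, 100, 1000, 10000]:
    print(n, S(n) / (n * n * log2(n)))   # bounded; tends slowly to 0.5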
I am having trouble solving the problem. I figured out that in the end it comes to n * 7^(log_{3/4}(n)); what should the answer be?
You can rewrite the recurrence like the following:
T(n) = T(n/(4/3)) + 2T(n/2) + n
Now, as 4/3 < 2 (so n/(4/3) = 3n/4 > n/2, and T is non-decreasing), we can say T(n) < 3T(n/(4/3)) + n. Then, using the master theorem, T(n) = O(n^{log_{4/3}(3)}) = O(n^{3.82}). On the other hand, we can find a lower bound as T(n) > 3T(n/2) + n, which gives T(n) = Omega(n^{log_2(3)}) = Omega(n^{1.58}).
To find the exact growth rate, you can use the Akra-Bazzi theorem as well:
a1 = 1, b1 = 3/4
a2 = 2, b2 = 1/2
Find a p such that:
(3/4)^p + 2 * (1/2)^p = 1
Numerically, p ≈ 2.13. So T(n) = Theta(n^p * (1 + int(u/u^{p+1}, 1, n))) = Theta(n^{2.13}), since int(u/u^{p+1}, 1, n) = int(1/u^p, 1, n) = O(1) for p > 1.
In sum:
T(n) = Theta(n^{2.13})
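As a quick numeric check (a sketch of my own; the bisection bracket and the floor-based simulation are assumptions, not part of the theorem), you can solve for p and watch T(n)/n^p settle near a constant:

from functools import lru_cache

# Solve (3/4)^p + 2*(1/2)^p = 1 by bisection; the left side is strictly
# decreasing in p, above 1 at p = 1 and below 1 at p = 4.
def f(p):
    return 0.75 ** p + 2 * 0.5 ** p - 1

lo, hi = 1.0, 4.0
while hi - lo > 1e-12:
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
p = (lo + hi) / 2
print("p =", round(p, 4))            # roughly 2.13

# Simulate the recurrence (floors for non-integer sizes are my own choice).
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1.0
    return T(3 * n // 4) + 2 * T(n // 2) + n

for n in [10**3, 10**4, 10**5, 10**6]:
    print(n, T(n) / n ** p)          # should settle near a constant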
Please help me with the time complexity analysis of the following function.
function(n)
{
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= n; j += i)
        {
            print("*");
        }
    }
}
I feel it should be n + n*log(n).
A single loop is O(n); a loop nested inside a loop is, in general, O(n^2).
In this case, though, for each i the inner loop runs about n/i times, so the total number of inner iterations is n/1 + n/2 + n/3 + ... + n/n = n * H(n) = Theta(n * log(n)); on top of that, the outer loop itself contributes n iterations.
So the final answer is n + n*log(n), i.e. Theta(n * log(n)).
I checked this assumption with a simple test and some numbers, and it looks good.
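Here is that simple test, reconstructed as a small Python sketch (my own transcription of the pseudocode): it counts the inner-loop iterations and compares them with n * H(n); the two agree up to an O(n) additive term from the ceilings.

# Direct transcription of the pseudocode, counting prints instead of printing.
def count_stars(n):
    count = 0
    for i in range(1, n + 1):
        j = 1
        while j <= n:        # runs about n/i times for each i
            count += 1
            j += i
    return count

for n in [100, 1000, 10000]:
    nHn = n * sum(1.0 / k for k in range(1, n + 1))   # n * H(n)
    print(n, count_stars(n), round(nHn))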
I saw the same question here. They have proved the lower bound like this:
log(1) + ... + log(n/2) + ... + log(n) >= log(n/2) + ... + log(n)
>= log(n/2) + ... + log(n/2)
= n/2 * log(n/2)
My doubt is: why can't the lower bound be n*log(n) itself? Or is there some other, tighter lower bound possible? Why is it specifically (n/2) * log(n/2)?
This is used to prove that
log(n!) = log(1) + log(2) + ... + log(n-1) + log(n) = Θ(n·log(n))
To prove this, it is enough to find an upper bound that is O(n·log(n)) and a lower bound that is Ω(n·log(n)).
The lower bound
n/2 * log(n/2)
already corresponds to Θ(n·log(n)): taking log base 2, n/2 * log(n/2) = n/2 * (log(n) - 1) >= n/4 * log(n) for n >= 16. It is easy to obtain and belongs to the Θ class we are interested in; finding a tighter lower bound would be more difficult and is not necessary.
The complete proof is in this question:
Is log(n!) = Θ(n·log(n))?
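If you want to see the numbers (a quick sketch of my own, using Python's lgamma so n! is never computed explicitly), log2(n!) sits squarely between (n/2)*log2(n/2) and n*log2(n):

from math import lgamma, log, log2

# log2(n!) via the log-gamma function: lgamma(n+1) = ln(n!).
def log2_factorial(n):
    return lgamma(n + 1) / log(2)

for n in [10, 100, 10**4, 10**6]:
    lower = (n / 2) * log2(n / 2)
    upper = n * log2(n)
    print(n, round(lower), round(log2_factorial(n)), round(upper))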
There is an algorithm which has the time complexity
T(n) = T(n-1) + 1/n    if n > 1
     = 1               otherwise
I am solving for its asymptotic complexity and getting order n, but the answer given is log n. Is that correct? If it is log n, then why?
It can be easily seen (or proven formally with induction) that T(n) is the sum of 1/k for the values of k from 1 to n. This is the nth harmonic number, Hn = 1 + 1/2 + 1/3 + ... + 1/n.
Asymptotically, the harmonic numbers grow on the order of log(n). This is because the sum is close in value to the integral of 1/x from 1 to n, which equals the natural logarithm of n. In fact, Hn = ln(n) + γ + O(1/n), where γ ≈ 0.5772 is the Euler-Mascheroni constant. From this, it is easy to show that T(n) = Θ(log(n)).
For more details: with H(N) = 1 + 1/2 + 1/3 + ... + 1/N, the function x -> 1/x is decreasing, so for every k >= 1:
int(1/x, k, k+1) <= 1/k <= int(1/x, k-1, k)
Summing the left inequality for k from 1 to N, and the right one for k from 2 to N (then adding the first term, 1), we get:
ln(N+1) <= H(N) <= 1 + ln(N)
This implies H(N)/ln(N) -> 1, hence H(N) = Θ(log(N)).
(From http://fr.wikipedia.org/wiki/S%C3%A9rie_harmonique#.C3.89quivalent_de_Hn)
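You can watch the constant γ appear by unrolling the recurrence directly (a quick sketch of my own in Python):

from math import log

gamma = 0.5772156649          # Euler-Mascheroni constant
T = 1.0                       # T(1) = 1
n = 1
for target in [10, 1000, 100000]:
    while n < target:
        n += 1
        T += 1.0 / n          # T(n) = T(n-1) + 1/n
    print(n, round(T, 4), round(log(n) + gamma, 4))   # nearly equal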
I am looking to prove, using only induction, that T(n) = T(n/2) + sqrt(n), with T(1) = 1, is O(sqrt(n)).
It is easy to solve using the master theorem, but that is not allowed here.
I tried to assume
T(n/2) < c*sqrt(n/2)
but didn't get very far with the rest of the proof.
Thank you all in advance for your answers.
Edit:
My line of solution (after the assumption above) is:
T(n) <= c*sqrt(n/2) + sqrt(n) = sqrt(n)*(c/sqrt(2) + 1) <= sqrt(n)*(c + 1)
I don't know how to get from this to the required
T(n) <= c*sqrt(n)
OK, you're close. As I mentioned in the comment, the base case is simple. For the induction case, you want to show that T(n) is O(sqrt(n)) given that T(n/2) is O(sqrt(n/2)).
So, it goes like this:
T(n) = T(n/2) + sqrt(n)         ; this is just your recurrence
     < c*sqrt(n/2) + sqrt(n)    ; induction hypothesis: T(n/2) < c*sqrt(n/2)
                                ; WLOG, assume c > 4
     = c*sqrt(n)/sqrt(2) + sqrt(n)
     = (c/sqrt(2) + 1)*sqrt(n)
Observe that for c > 4, c/sqrt(2) + 1 < c, so
(c/sqrt(2) + 1)*sqrt(n) < c*sqrt(n)
and therefore
T(n) < c*sqrt(n)
so T(n) is O(sqrt(n)).
So there are a couple of key points here that you missed.
The first is that you can always increase c to whatever value you want. This is because big O only requires the bound to hold for some constant: if T(n) < c*f(n), then also T(n) < d*f(n) for any d > c.
The second is to note that the line y = c/sqrt(2) + 1 intersects the line y = c at c = sqrt(2)/(sqrt(2) - 1) = 2 + sqrt(2) ≈ 3.414, so all you have to do is force c to be greater than this value in order to get c/sqrt(2) + 1 < c. 4 certainly works, and that's where the 4 comes from.
In retrospect, I should have given the key points as hints. My fault. Sorry!
One line of thinking which may help is to expand the recurrence recursively. You get
T(n) = sqrt(n) + sqrt(n/2) + sqrt(n/4) + ... + sqrt(n/(2^k)) + ... + sqrt(1)
= sqrt(n) + sqrt(n)/sqrt(2) + sqrt(n)/sqrt(4) + ... + sqrt(n)/sqrt(2^k) + ... + sqrt(1)
= sqrt(n) * (1 + sqrt(1/2) + sqrt(1/2)^2 + ... + sqrt(1/2)^k + ...)
<= sqrt(n) * ∑(k=0 to ∞) sqrt(1/2)^k
= sqrt(n) * 1/(1 - sqrt(1/2))
Since 1/(1 - sqrt(1/2)) is a finite constant (about 3.414, the same threshold as in the other answer), T(n) must be O(sqrt(n)). You can use this observation to guide the standard induction proof.
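Both answers predict the same constant, and it is easy to confirm numerically (a sketch of my own; reading T(n/2) as integer halving n // 2 is an assumption): T(n)/sqrt(n) approaches 1/(1 - 1/sqrt(2)) = 2 + sqrt(2) ≈ 3.414.

from math import sqrt
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1.0                 # T(1) = 1
    return T(n // 2) + sqrt(n)     # T(n) = T(n/2) + sqrt(n)

for n in [10**2, 10**4, 10**6, 10**8]:
    print(n, round(T(n) / sqrt(n), 4))

print(round(1 / (1 - 1 / sqrt(2)), 4))   # 2 + sqrt(2) ≈ 3.4142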