Is lower bound for log (n!) also nlogn [closed] - algorithm

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
I saw the same question here. They proved the lower bound like this:
log(1) + ... + log(n/2) + ... + log(n) >= log(n/2) + ... + log(n)
>= log(n/2) + ... + log(n/2)
= n/2 * log(n/2)
My doubt is: why can't the lower bound be n log n itself? Or is there any other, tighter lower bound possible? Why is it specifically n/2 * log(n/2)?

This is used to prove that
log(n!) = log(1) + log(2) + ... + log(n-1) + log(n) = Θ(n·log(n))
To prove this it is enough to find an upper bound and a lower bound that both lie in Θ(n·log(n)).
The lower bound
n/2 * log(n/2)
already lies in Θ(n·log(n)): since log(n/2) = log(n) - log(2) >= log(n)/2 for n >= 4, we have n/2 * log(n/2) >= n/4 * log(n). It is easy to obtain and belongs to the Θ class we are interested in. A tighter lower bound would be harder to derive and is not necessary.
The complete proof in this question:
Is log(n!) = Θ(n·log(n))?
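The sandwich used in that proof can be sanity-checked numerically. A minimal Python sketch (the names `log_factorial`, `lower`, and `upper` are mine) verifying that log(n!) sits between (n/2)·log(n/2) and n·log(n):

```python
import math

def log_factorial(n):
    # log(n!) = log(1) + log(2) + ... + log(n)
    return sum(math.log(k) for k in range(1, n + 1))

n = 1000
lower = (n / 2) * math.log(n / 2)  # the lower bound from the proof
upper = n * math.log(n)            # upper bound: each of the n terms is <= log(n)
value = log_factorial(n)
assert lower <= value <= upper
```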

Related

Solving a recurrence T(n) = 2T(n/2) + sqrt(n) [closed]

Closed 9 years ago.
Need a little help! This is what I have so far using backward substitution:
T(n) = 2T(n/2) + sqrt(n), where T(1) = 1, and n = 2^k
T(n) = 2[2T(n/4) + sqrt(n/2)] + sqrt(n) = 2^2T(n/4) + 2sqrt(n/2) + sqrt(n)
T(n) = 2^2[2T(n/8) + sqrt(n/4)] + 2sqrt(n/2) + sqrt(n)
= 2^3T(n/8) + 2^2sqrt(n/4) + 2sqrt(n/2) + sqrt(n)
In general
T(n) = 2^kT(1) + 2^(k-1) x sqrt(2^1) + 2^(k-2) x sqrt(2^2) + ... + 2^1 x sqrt(2^(k-1)) + sqrt(2^k)
Is this right so far? If it is, I cannot figure out how to simplify it and reduce it down to a general formula.
I'm guessing something like this? Combining the terms
= 2^k + 2^(k-(1/2)) + 2^(k-(2/2)) + 2^(k-(3/2)) + ... + 2^((k+1)/2) + 2^(k/2)
And this is where I'm stuck. Maybe a way to factor out a 2^k?
Any help would be great, thanks!
You're halfway there.
Since n = 2^k, each cost term 2^(k-j) * sqrt(2^j) equals 2^(k - j/2), so the sum is a geometric series:
T(n) = 2^k + 2^(k - 1/2) + 2^(k - 1) + ... + 2^(k/2) = n * (1 + sum of (1/sqrt(2))^j for j = 1..k) <= n * (2 + sqrt(2)) = O(n)
If you want just a big-O bound, the Master Theorem is fine.
If you want an exact expression, a recursion tree is a good tool. The right-hand side of the tree is the cost per level: level h has 2^h subproblems of size n/2^h, so its cost is 2^h * sqrt(n/2^h) = sqrt((2^h) * n). Summing the level costs (including the base level, whose 2^k leaves contribute n = sqrt((2^k) * n)) gives T(n).
According to the Master Theorem this is case 1, since f(n) = sqrt(n) is polynomially smaller than n^(log_2 2) = n, so T(n) = Θ(n).
According to the recursion tree, the exact form is sqrt(n) * (sqrt(2n) - 1) * (sqrt(2) + 1), which agrees with the big-O result.
EDIT:
The recursion tree is just a visualized form of the so-called backward substitution. If you sum up the right-hand side, i.e. the cost, you get the generalized form of T(n). All these methods can be found in Introduction to Algorithms (CLRS).
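The recurrence and the exact form given in the answer can be checked against each other. A small Python sketch (function names `T` and `closed_form` are mine):

```python
import math

def T(n):
    # T(n) = 2*T(n/2) + sqrt(n), with T(1) = 1 and n a power of two
    if n == 1:
        return 1
    return 2 * T(n // 2) + math.sqrt(n)

def closed_form(n):
    # the exact form from the answer: sqrt(n) * (sqrt(2n) - 1) * (sqrt(2) + 1)
    return math.sqrt(n) * (math.sqrt(2 * n) - 1) * (math.sqrt(2) + 1)

# the two agree exactly for every power of two
for k in range(11):
    n = 2 ** k
    assert math.isclose(T(n), closed_form(n), rel_tol=1e-9)
```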

Big Oh Classification [closed]

Closed 9 years ago.
Say that I have a function that has the following growth:
2000n^2 + log n
Is it possible for me to conclude that the function is part of the set of functions that fall into the category of O(n)?
O(some function) describes the limiting behavior of a function.
Does there exist some C such that C*n is an upper bound of the function you described for all sufficiently large n?
Look closely at your function: for any constant C, the term 2000n^2 exceeds C*n as soon as n > C/2000, so no such C can exist.
So no, it is not O(n).
No. Since log n = O(n^x) for any fixed x > 0, 2000n^2 + log(n) = Θ(n^2).
An easier way to see this: since log(n) <= n^2 for n >= 1, we have
2000n^2 + log(n) <= 2000n^2 + n^2 = 2001n^2 = O(n^2),
and since the function contains the term 2000n^2, it is at least n^2, so 2000n^2 + log(n) = Ω(n^2).
Together these give 2000n^2 + log(n) = Θ(n^2).
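The Θ(n^2) conclusion is easy to confirm empirically; a quick Python sketch (the function name `f` is mine):

```python
import math

def f(n):
    return 2000 * n ** 2 + math.log(n)

for n in [10, 100, 1000, 10000]:
    assert abs(f(n) / n ** 2 - 2000) < 1   # f(n)/n^2 stays near the constant 2000
    assert f(n) / n >= 2000 * n            # f(n)/n grows without bound, so f is not O(n)
```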

Asymptotic complexity of T(n)=T(n-1)+1/n [closed]

Closed 9 years ago.
There is an algorithm which has the time complexity
T(n)=T(n-1)+1/n if n>1
=1 otherwise
I am solving for its asymptotic complexity and getting order 'n', but the given answer is 'log n'. Is that correct? If it is log n, why?
It can be easily seen (or proven formally with induction) that T(n) is the sum of 1/k for the values of k from 1 to n. This is the nth harmonic number, Hn = 1 + 1/2 + 1/3 + ... + 1/n.
Asymptotically, the harmonic numbers grow on the order of log(n). This is because the sum is close in value to the integral of 1/x from 1 to n, which is equal to the natural logarithm of n. In fact, Hn = ln(n) + γ + O(1/n) where γ is a constant. From this, it is easy to show that T(n) = Θ(log(n)).
For more details:
With H(N) = 1 + 1/2 + 1/3 + ... + 1/N:
the function x :-> 1/x is decreasing, so
(integral of 1/x from k to k+1) <= 1/k <= (integral of 1/x from k-1 to k), the right inequality for k >= 2.
Summing the left inequality for k = 1 to N, and the right one for k = 2 to N (then adding the first term, 1), we get:
ln(N+1) <= H(N) <= 1 + ln(N)
This implies H(N)/ln(N) -> 1, hence H(N) = Θ(log(N)).
(from http://fr.wikipedia.org/wiki/S%C3%A9rie_harmonique#.C3.89quivalent_de_Hn)
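The asymptotics above can be verified directly by unrolling the recurrence. A small Python sketch (the name `T` matches the recurrence; `gamma` is the Euler-Mascheroni constant):

```python
import math

def T(n):
    # T(n) = T(n-1) + 1/n for n > 1, T(1) = 1 -- this is the n-th harmonic number H(n)
    total = 1.0
    for k in range(2, n + 1):
        total += 1.0 / k
    return total

gamma = 0.5772156649  # Euler-Mascheroni constant
n = 100000
assert abs(T(n) - (math.log(n) + gamma)) < 1e-4    # H(n) = ln(n) + gamma + O(1/n)
assert math.log(n + 1) <= T(n) <= 1 + math.log(n)  # the integral sandwich
```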

Lower bound : resource required by an algorithm for some class of input size n [closed]

Closed 8 years ago.
I've been asked to find the lower bound of the following :
T(n)= 23n^3-n^2-n.
So here is how I proceeded, and I don't know whether I'm tackling it the proper way:
T(n) >= c(23n^2 - n^2) for all n >= n0
23n^3-n^2-n >=(22n^2) for all n>=2.
T(n)>=c|n^2| for all n>=2
c=22 n0=22.
T(n) is in Big Omega n^2
HELP PLEASE!
Note that n^3 >= n^2 for n >= 1. So, -n^3 <= -n^2 for n >= 1.
Note that n^3 >= n for n >= 1. So, -n^3 <= -n for n >= 1.
So
23n^3 - n^2 - n >= 23n^3 - n^3 - n^3 = 21n^3.
Thus, 21n^3 is a decent lower bound.
Intuitively this makes sense as 23n^3 - n^2 - n is clearly cubic in nature, and thus should have lower bound and upper bound of cn^3 for some c (different c for the lower bound from the c for the upper bound).
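The sandwich 21n^3 <= T(n) <= 23n^3 is easy to check by brute force; a quick Python sketch (the name `T` is mine):

```python
def T(n):
    return 23 * n ** 3 - n ** 2 - n

# 21n^3 <= T(n) <= 23n^3 for every n >= 1, so T(n) = Theta(n^3)
for n in range(1, 1001):
    assert 21 * n ** 3 <= T(n) <= 23 * n ** 3
```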

Logarithmic series. Lower bound [closed]

Closed 11 years ago.
I am having some trouble finding the lower bound for this series:
S = lg(n-2) + 2lg(n-3) + 3lg(n-4) + ... + (n-2)lg2.
The upper bound, as I have figured out (and explain below), is O(n^2 * lg n).
Could you help me in finding out the lower bound on this.
My proof for the upper bound goes as :
S = lg[(n-2) * (n-3)^2 * (n-4)^3 * ... * 2^(n-2)]
<= lg(n^(1+2+3+...+(n-2)))
= O(n^2 * lg n)
EDIT:
Just a random thought: can I assume the series is closely approximated by the integral of x*lg(x), which happens to be O(x^2 * lg x)? But this too would give only an upper bound, not a lower bound.
lg(n-2) + 2lg(n-3) + ... + (n-2)lg2 >= lg(n-2) + 2lg(n-3) + ... + (n/2)lg(n/2)   (drop the tail terms, which are all positive)
= lg[(n-2) * (n-3)^2 * ... * (n/2)^(n/2)] >= lg[(n/2) * (n/2)^2 * ... * (n/2)^(n/2)]   (each remaining factor is at least n/2)
= lg[(n/2)^(1+2+...+n/2)] = ((n^2)/8 + n/4) * lg(n/2) = Ω(n^2 * lg n)
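Both bounds can be checked numerically. A Python sketch (the names `S`, `lower`, `upper` are mine; the upper bound uses the crude estimate that every lg factor is at most lg n):

```python
import math

def S(n):
    # S = lg(n-2) + 2*lg(n-3) + ... + (n-2)*lg(2); coefficient k pairs with
    # lg(n-1-k), and the final k = n-2 term is lg(1) = 0, contributing nothing
    return sum(k * math.log2(n - 1 - k) for k in range(1, n - 1))

n = 1024
lower = (n ** 2 / 8) * math.log2(n / 2)  # the Omega bound derived above
upper = (n ** 2 / 2) * math.log2(n)      # crude upper bound: every factor <= n
assert lower <= S(n) <= upper            # so S = Theta(n^2 * lg n)
```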
