To prove lg(n!) = Θ(n lg n)

I was trying to prove that lg(n!) = Θ(n lg n).
I used the following inequality to prove it:
0 <= c1(n lg n) <= lg(n!) <= c2(n lg n) (equation 1)
Using lg(n!) <= c2(n lg n) from equation 1, I could prove that lg(n!) = O(n lg n).
However, to prove lg(n!) = Ω(n lg n), I need the other part of equation 1, which is:
c1(n lg n) <= lg(n!) (equation 2)
Can anyone help me with how to establish equation 2 to prove the Ω bound? The hint I was given is to use integration, but I'm not able to do it. Kindly help me out here.
Thank you very much.
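As a sanity check (not a proof), here is a small Python sketch of the claimed bounds. The constants are assumptions chosen for illustration: c1 = 1/4 comes from the standard "top half" trick, lg(n!) >= (n/2) lg(n/2) >= (n/4) lg(n) for n >= 4, and c2 = 1 from lg(n!) <= lg(n^n) = n lg n.

    import math

    def lg(x):
        return math.log2(x)

    def lg_factorial(n):
        # lgamma(n + 1) = ln(n!); convert from natural log to base 2
        return math.lgamma(n + 1) / math.log(2)

    c1, c2 = 0.25, 1.0  # assumed constants, valid for n >= 4
    for n in (4, 16, 256, 10**4, 10**6):
        lower = c1 * n * lg(n)
        upper = c2 * n * lg(n)
        print(n, lower <= lg_factorial(n) <= upper)  # True for each n

The same "top half" argument, or the integration hint (compare the sum of lg(i) with the integral of lg(x) from 1 to n, which is valid since lg is increasing), yields the Ω bound rigorously.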

Related

Comparing big theta values

I am trying to order these different big theta values from largest to smallest:
Θ(n^2)
Θ(2n log n)
Θ(n log n^2)
Θ(2n^2)
Θ(log n)
Θ(n log 2n)
Θ(k^2)
Θ(22n)
Θ(n^3)
Θ(n)
Θ(2n)
Θ(n^1.5)
Θ(√n)
Θ(2n^2)
and some of the values are equivalent. In particular, I want to know whether a constant factor makes one big-theta value larger than the identical big-theta term without the constant (for example, are Θ(22n) and Θ(n) equivalent?).
Θ(log n)
Θ(√n) = Θ(n^(1/2))
Θ(n) = Θ(2n) = Θ(22n)
Θ(n log n) = Θ(2n log n) = Θ(n log n^2) = Θ(n log 2n)
Θ(n^1.5)
Θ(n^2) = Θ(2n^2)
Θ(n^3)
Considering your comment:
n log 2n = n (log 2 + log n) = n log 2 + n log n
log 2 is a constant non-zero value, so:
Θ(n log 2n) = Θ(n log 2 + n log n) = Θ(n + n log n) = Θ(n log n)
See the sum and multiplication-by-a-constant properties of the big-O/Θ/Ω notations.
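A quick numerical illustration of those properties, with the sample points chosen arbitrarily: the ratio of each expression in the Θ(n log n) group to n log n settles at a constant.

    import math

    def nlogn(n):
        return n * math.log2(n)

    exprs = {
        "2 n log n": lambda n: 2 * n * math.log2(n),
        "n log n^2": lambda n: n * math.log2(n ** 2),  # = 2 n log n
        "n log 2n":  lambda n: n * math.log2(2 * n),   # = n + n log n
    }
    for name, f in exprs.items():
        # each list of ratios approaches a constant (2, 2 and 1)
        print(name, [round(f(n) / nlogn(n), 3) for n in (10**3, 10**6, 10**9)])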
If you try replacing n with a huge value, you can figure out the ordering yourself without even asking the forum (a small sketch of this follows the list):
o(1)
O(log log n)
O(log n)
O(n^c)
O(n)
O(n log n)
O(n^2)
O(c^n)
O(n!)
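Along the same lines, here is a small sketch that compares the classes by lg(f(n)) at a single large n, so that 2^n and n! do not overflow. The choices n = 10^6, c = 0.5 in n^c, and c = 2 in c^n are assumptions for illustration.

    import math

    n = 10**6
    lg = math.log2
    lg_of = {
        "log log n":  lg(lg(lg(n))),
        "log n":      lg(lg(n)),
        "n^c, c=0.5": 0.5 * lg(n),
        "n":          lg(n),
        "n log n":    lg(n) + lg(lg(n)),
        "n^2":        2 * lg(n),
        "c^n, c=2":   n * lg(2),
        "n!":         math.lgamma(n + 1) / math.log(2),  # lg(n!)
    }
    for name, v in sorted(lg_of.items(), key=lambda kv: kv[1]):
        print(f"{name:12} lg(f(n)) = {v:.3e}")  # printed in growth order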

How to find cross-over point of running times of 2 algorithms?

I'm trying to compare two sorting algorithms. Suppose that for all inputs of size n, the first algorithm runs in 8n^2 seconds, while the second algorithm runs in 64n lg n seconds. For which value of n does the first algorithm beat the second algorithm?
The answer is: 8n^2 < 64n lg n.
2 <= n <= 43.
How do I derive this from the question? Why isn't it
8n^2 > 64n lg n
or 8n^2 = 64n lg n?
And how do I get the values 2 <= n <= 43? Sorry, I'm new to this. Can anyone explain it to me?
You want n such that
8n^2 < 64n lg n
=> 8n^2 - 64n lg n < 0
We solve h(n) = 8n^2 - 64n lg n for its roots and discover that it has roots at n_1 ~= 1.100 and n_2 ~= 43.559. If we plot this function, we see that it is positive when n < n_1 and when n > n_2.
Thus, the quadratic algorithm's runtime exceeds the linearithmic algorithm's when n < n_1 or when n > n_2. Therefore, the quadratic algorithm beats the linearithmic one for n in [n_1, n_2], which implies 2 <= n <= 43 since n must be an integer. For all other n, the quadratic algorithm is slower than the linearithmic algorithm.
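Here is a sketch that locates those roots numerically and confirms the integer range; plain bisection is used, and the bracketing intervals [1, 2] and [40, 50] are assumptions read off a rough plot.

    import math

    def h(n):
        # h(n) < 0 exactly where 8n^2 < 64 n lg n, i.e. where the
        # quadratic algorithm is faster
        return 8 * n * n - 64 * n * math.log2(n)

    def bisect(f, lo, hi, iters=60):
        # assumes f changes sign exactly once on [lo, hi]
        for _ in range(iters):
            mid = (lo + hi) / 2
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    print(bisect(h, 1.0, 2.0))    # n_1 ~= 1.100
    print(bisect(h, 40.0, 50.0))  # n_2 ~= 43.559

    wins = [n for n in range(2, 60) if h(n) < 0]
    print(min(wins), max(wins))   # 2 43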
If memory serves me correctly (and trust me, it's been a while), all you really have to do is graph these curves to find the answer. To understand the question better, graph a basic log function: it climbs quickly at the beginning and levels off as x grows, while the x^2 curve keeps accelerating. If you have a graphing calculator, looking at the graph will help you understand it better.
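For instance, a quick plot (assuming numpy and matplotlib are available) makes the two crossings near n = 1.1 and n = 43.6 easy to see.

    import numpy as np
    import matplotlib.pyplot as plt

    n = np.linspace(1, 60, 500)
    plt.plot(n, 8 * n**2, label="8 n^2")
    plt.plot(n, 64 * n * np.log2(n), label="64 n lg n")
    plt.xlabel("n")
    plt.ylabel("running time (seconds)")
    plt.legend()
    plt.show()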

Solving the recurrence T(n) = T(n / 2) + O(1) using the Master Theorem?

I'm trying to solve a recurrence relation to find the complexity of an algorithm using the Master Theorem. How can I prove that:
T(n) = T(n/2)+O(1)
is
T(n) = O(log(n)) ?
Any explanation would be appreciated!
Your recurrence is
T(n) = T(n / 2) + O(1)
The Master Theorem applies to recurrences of the form
T(n) = a T(n / b) + O(n^c)
In this case you have
a = 1
b = 2
c = 0
Since c = log_b(a) (because 0 = log_2(1)), you are in case two of the Master Theorem, which solves to Θ(n^c log n) = Θ(n^0 log n) = Θ(log n).
Hope this helps!
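As a sanity check, you can iterate the recurrence directly and compare against lg n. The sketch below assumes the O(1) term is exactly 1 and that T(1) = 1; both choices only shift the result by a constant.

    import math

    def T(n):
        steps = 1            # assumed base case: T(1) = 1
        while n > 1:
            n //= 2          # one O(1) unit of work per halving
            steps += 1
        return steps

    for n in (2, 32, 1024, 2**20):
        print(n, T(n), math.log2(n))  # T(n) = lg n + 1 for powers of two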

Big Oh Classification

Say that I have a function that has the following growth:
2000n^2 + log n
Is it possible for me to conclude that the function is part of the set of functions that fall into the category of O(n)?
O(some function) describes the limiting behavior of a function.
Does there exist some constant C such that C*n is an upper bound on the function you described for all sufficiently large n?
If you look closely at your function, its dominant term is 2000*n^2, and for any fixed C, 2000*n^2 eventually exceeds C*n, since their ratio 2000*n/C grows without bound.
So no, it is not O(n).
No. Since log n grows more slowly than n^x for any fixed x > 0, O(2000n^2 + log(n)) = O(n^2).
An easier way to see this: because log(n) < n^2 for large n,
O(2000n^2 + log(n)) <= O(2000n^2 + n^2) = O(2001n^2) = O(n^2).
And since 2000n^2 + log(n) contains an n^2 term, it is at least as big as n^2, giving O(2000n^2 + log(n)) >= O(n^2).
Combining the two bounds, O(2000n^2 + log(n)) = O(n^2).
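Both directions can be seen numerically in a small sketch: f(n) / n^2 settles at the constant 2000, while f(n) / n grows without bound.

    import math

    def f(n):
        return 2000 * n**2 + math.log2(n)

    for n in (10**2, 10**4, 10**6):
        # first ratio -> 2000; second ratio keeps growing
        print(n, f(n) / n**2, f(n) / n)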

Show that n^2 is not O(n*log(n))?

Using only the definition of O()?
You need to prove it by contradiction. Assume that n^2 is O(n*log(n)). By definition, there is then a fixed, finite real constant c such that
n^2 <= c * n * log(n)
for every n bigger than some finite number n0.
Dividing both sides by n * log(n), you arrive at c >= n / log(n), and as n -> INF this forces c >= INF, which is obviously impossible for a constant.
So you conclude that n^2 is not O(n*log(n)).
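To make the contradiction concrete, a few evaluations show that n / log(n), which any constant c would have to dominate, is unbounded.

    import math

    for n in (10**2, 10**4, 10**8, 10**16):
        print(n, n / math.log2(n))  # grows without bound, so no fixed c works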
Alternatively, you can calculate the limit of
(n * log(n)) / (n ^ 2) = log(n) / n
as n approaches infinity. The limit is 0, because log(n) grows more slowly than n; hence n*log(n) = o(n^2), and therefore n^2 cannot be O(n*log(n)).
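The same limit, checked numerically:

    import math

    for n in (10**2, 10**4, 10**8):
        print(n, math.log2(n) / n)  # (n log n) / n^2 = log(n) / n -> 0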
