Prove that (log n)! = O(n^k)

I need help proving this:
(log n)! = O(n^k)
I started with n log n <= c*n^k but could not arrive at the desired solution.

Hint: n^k = (e^k)^(log n) (taking log to be the natural logarithm), and a factorial grows faster than any exponential: it is a product of growing factors versus a product of constant factors. So the relation actually goes the other way: n^k = O((log n)!).
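To see this concretely, here is a minimal numeric sketch (an illustration, not a proof; k = 3 is an arbitrary exponent chosen for the demo). It compares the two functions in log space so nothing overflows; the difference eventually turns positive, consistent with n^k = O((log n)!):

    import math

    k = 3  # arbitrary fixed exponent, for illustration only
    for n in [10**6, 10**12, 10**24, 10**48, 10**96]:
        x = int(math.log(n))              # x = floor(ln n)
        log_fact = math.lgamma(x + 1)     # ln(x!) via the log-gamma function
        log_poly = k * math.log(n)        # ln(n^k)
        print(f"ln n ~ {x:4d}: ln((log n)!) - ln(n^k) = {log_fact - log_poly:+10.1f}")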

Related

O(log n!) and O((log n)!)

To find their relation I substituted log n = x and used log n! ≈ n log n, so with base a, O(log n!) became O(a^x * x) and (log n)! became x(x-1)(x-2)...
Now I think the first one grows faster, but can you help me find their relation to O(n^2)?
Actually x(x-1)(x-2)... is on the order of x^x, because the product has x factors. This means O((log n)!) has the higher growth rate.
Also, if log(n) := x, then n = 2^x, and n^2 becomes (2^x)^2 = 2^(2x), which grows more slowly than x^x.
Summary
O(log n!) < O(n^2) < O((log n)!)
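A minimal numeric sketch of that summary (an illustration, not a proof), comparing the natural logs of the three functions so n! stays representable; the claimed order only settles in for large n, roughly from n = 10^9 on:

    import math

    for n in [10**9, 10**12, 10**15, 10**18]:
        x = math.log(n)                      # x = ln n
        a = math.log(math.lgamma(n + 1))     # ln(log n!), using ln(n!) = lgamma(n+1)
        b = 2 * x                            # ln(n^2)
        c = math.lgamma(x + 1)               # ln((log n)!)
        print(f"n=1e{round(math.log10(n)):2d}: ln(log n!)={a:6.1f}  ln(n^2)={b:6.1f}  ln((log n)!)={c:6.1f}")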

Difference between complexity log(n) and log(sqrt(n))

Considering
log(sqrt(n)) = (1/2)log(n)
and that for asymptotic analysis we don't consider constant factors:
is O(log(sqrt(n))) as good as O(log(n))?
As per my understanding, log(sqrt(n)) will grow more slowly than log(n) as n increases, but I am not able to understand the trick of moving the power (1/2) out front.
Is it just that the factor 1/2 only slows down the rate?
Also consider the case where log(n*n) is represented as 2log(n), versus log(n).
It is the same asymptotically:
O(log(sqrt(n))) = O(log(n^(1/2))) = O((1/2) log(n)) = O(log(n))
You are right, O(log(sqrt(n))) is the same as O(log(n)) by the reasoning given in your question.
Time(A) = log n
Time(B) = log sqrt(n) = log n^(1/2) = 1/2 log n
Asymptotically the same
O(Time(A)) = O(log n)
O(Time(B)) = O(1/2 log n) = O(log n)
O(Time(A)) = O(Time(B))
Insignificantly different
Time(A) = 1 * log n
Time(B) = 1/2 * log n
Time(A) > Time(B)
Time(A) = 2 * Time(B)
Conclusion
log n = 2 log sqrt(n)
Although the difference between log n and log sqrt(n) is asymptotically insignificant, log n will always take double the time log sqrt(n) takes.
Visual: (plot comparing log n and log sqrt(n) omitted)
The big-O notation ignores any constant multiplier.
O(500000*N) is O(N) and is O(0.00001*N).
For the same reason, O(log(sqrt(N))) is O((1/2)*log(N)) is O(log(N)), in any base.
The big-O notation is not about the speed of your program, it is about the growth of the running time as N increases.
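A tiny sketch making the constant-factor point concrete: at every n the ratio log(n) / log(sqrt(n)) is exactly 2, which is precisely the multiplier that big-O discards.

    import math

    for n in [10, 10**3, 10**6, 10**9]:
        a = math.log(n)
        b = math.log(math.sqrt(n))
        print(f"n=1e{round(math.log10(n))}: log n = {a:7.3f}  log sqrt(n) = {b:7.3f}  ratio = {a/b:.1f}")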

When c > 0, log(n) = O(n^c)? Not sure why it isn't O(log n)

In my homework, the question asks to determine the asymptotic complexity of n^0.99999 * log(n). I figured it would be closer to O(n log n), but the answer key suggests that when c > 0, log n = O(n^c). I'm not quite sure why that is; could someone provide an explanation?
It's also true that lg n = O(n^k) (in fact, it is o(n^k); did the hint actually say that, perhaps?) for any constant k > 0, not just 1. Now consider k = 0.00001. Then n^0.99999 * lg n = O(n^0.99999 * n^0.00001) = O(n). Note that this bound is not tight, since I could choose an even smaller k, so it's perfectly fine to say that n^0.99999 * lg n is O(n^0.99999 * lg n), just as we say n lg n is O(n lg n).
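A symbolic check of that hint, assuming sympy is available (a numeric demo would be misleading here, since n^0.00001 overtakes lg n only for astronomically large n):

    # For any fixed k > 0, log(n)/n^k -> 0 as n -> oo, i.e. log n = o(n^k);
    # with k = 0.00001 this yields n^0.99999 * log n = O(n).
    from sympy import symbols, log, limit, oo, Rational

    n = symbols('n', positive=True)
    k = Rational(1, 100000)
    print(limit(log(n) / n**k, n, oo))   # prints 0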

O(n^2) vs O(n(logn)^2)

Which time complexity is better, O(n^2) or O(n(logn)^2)?
I know that when we simplify it, it becomes
O(n) vs O((logn)^2)
and log n < n, but what about (log n)^2?
n is only less than (log n)^2 for values of n less than about 0.49 (using the natural log)...
So in general (log n)^2 is better for large n...
But since these O(something)-notations always leave out constant factors, in your case it might not be possible to say for sure which algorithm is better...
Here's a graph (image omitted; the blue line is n and the green line is (log n)^2).
Notice how the difference for small values of n isn't so big and might easily be dwarfed by the constant factors not included in the Big-O notation.
But for large n, (log n)^2 wins hands down.
For each constant k, asymptotically log(n)^k < n.
The proof is simple: take the log of both sides of the inequality, and you get:
k*log(log(n)) < log(n)
It is easy to see that asymptotically, this is correct.
Semantic note: log(n)^k here means log(n) * log(n) * ... * log(n) (k times), and NOT log(log(...log(n)...)) (k times), as the notation is sometimes also used.
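A quick numeric look at the logged inequality (a sanity check, not a proof), with k = 2 to match the question; the left side falls ever further behind:

    import math

    k = 2  # matches (log n)^2 from the question
    for n in [10**2, 10**4, 10**8, 10**16, 10**32]:
        lhs = k * math.log(math.log(n))
        rhs = math.log(n)
        print(f"n=1e{round(math.log10(n)):2d}: k*log(log n) = {lhs:6.2f}  <  log n = {rhs:6.2f}")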
O(n^2) vs. O(n*log(n)^2)
<=> O(n) vs. O(log(n)^2) (divide by n)
<=> O(sqrt(n)) vs. O(log(n)) (take square roots)
<=> polynomial vs. logarithmic
Logarithmic wins.
(log n)^2 is better: if you change variables with n = e^m, then (log n)^2 becomes m^2 while n becomes e^m, and m^2 grows more slowly than e^m.
(log n)^2 is also < n.
Take an example:
n = 5
log n = 0.6989... (base 10)
(log n)^2 = 0.4885...
You can see, (log n)^2 is smaller still.
Even if you take a bigger value of n, e.g. 100,000,000, then
log n = 8
(log n)^2 = 64
which is far less than n.
O(n(logn)^2) is better (faster) for large n!
Take the log of both sides:
log(n^2) = 2 log(n)
log(n*(log n)^2) = log(n) + 2 log(log(n))
lim_{n -> infinity} (log(n) + 2 log(log(n))) / (2 log(n)) = 0.5 (use l'Hôpital's rule: http://en.wikipedia.org/wiki/L'H%C3%B4pital's_rule)
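For reference, here is that limit as a short worked derivation, substituting u = log n (the only l'Hôpital step is on log(u)/u):

    \lim_{n\to\infty} \frac{\log n + 2\log\log n}{2\log n}
      = \lim_{u\to\infty} \frac{u + 2\log u}{2u}         % substitute u = log n
      = \frac{1}{2} + \lim_{u\to\infty} \frac{\log u}{u}
      = \frac{1}{2} + \lim_{u\to\infty} \frac{1/u}{1}    % l'Hôpital on log(u)/u
      = \frac{1}{2}

Since the limit is 1/2 < 1, log(n(log n)^2) grows more slowly than log(n^2), so n(log n)^2 is the better complexity.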

Ordering functions by Big O complexity

I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^logN = O(N^2) (with log base 2, 4^(log N) = N^2, so it is polynomial)
N! = O(N!)
You made just one small mistake: (log(N))^2 grows more slowly than 2N and 5N, so it belongs before them. The list above is the right order.
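To the verification question above: one way is to evaluate ln(f(N)) for each function at growing N and watch the ranking settle (a sketch, not a proof; note that the constant 3^100 ≈ 5*10^47 only drops toward the front once N is astronomically large):

    import math

    # Each entry computes ln(f(N)) so even N! stays representable as a float.
    funcs = [
        ("3^100",       lambda N: 100 * math.log(3)),
        ("log(log(N))", lambda N: math.log(math.log(math.log(N)))),
        ("2N",          lambda N: math.log(2) + math.log(N)),
        ("5N",          lambda N: math.log(5) + math.log(N)),
        ("(log(N))^2",  lambda N: 2 * math.log(math.log(N))),
        ("4^(log(N))",  lambda N: math.log(4) * math.log(N)),
        ("N!",          lambda N: math.lgamma(N + 1)),
    ]

    for N in (10.0**6, 10.0**50, 10.0**100):
        order = " < ".join(name for name, f in sorted(funcs, key=lambda p: p[1](N)))
        print(f"N=1e{round(math.log10(N)):>3}:", order)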
