What are O(log(n!)) and O(n!)? I believe they are O(n log(n)) and O(n^n), but why?
I think it has to do with Stirling's approximation, but I don't understand the explanation very well.
Am I wrong about O(log(n!)) = O(n log(n))? How can the math be explained in simpler terms? Really, I just want an idea of how this works.
O(n!) isn't equivalent to O(n^n). It is asymptotically less than O(n^n).
O(log(n!)) is equal to O(n log(n)). Here is one way to prove that:
Note that by using the log rule log(mn) = log(m) + log(n) we can see that:
log(n!) = log(n*(n-1)*...*2*1) = log(n) + log(n-1) + ... + log(2) + log(1)
Proof that O(log(n!)) ⊆ O(n log(n)):
log(n!) = log(n) + log(n-1) + ... + log(2) + log(1)
Which is less than:
log(n) + log(n) + log(n) + log(n) + ... + log(n) = n*log(n)
So O(log(n!)) is a subset of O(n log(n))
Proof that O(n log(n)) ⊆ O(log(n!)):
log(n!) = log(n) + log(n-1) + ... + log(2) + log(1)
Which is greater than what you get by keeping only the first (larger) half of those terms and replacing each of them with log(n/2):
log(n/2) + log(n/2) + ... + log(n/2) = floor(n/2)*log(n/2), which is in Ω(n log(n)) (for n ≥ 4 it is at least (n/6)*log(n))
So O(n log(n)) is a subset of O(log(n!)).
Since O(n log(n)) ⊆ O(log(n!)) ⊆ O(n log(n)), they are equivalent big-Oh classes.
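As a quick numeric sanity check of those two bounds (not part of the proof, just a minimal sketch using natural logarithms):

    import math

    # Check that floor(n/2)*log(n/2) <= log(n!) <= n*log(n) and that
    # log(n!) tracks n*log(n) as n grows.
    for n in [10, 100, 1000, 10000]:
        log_fact = math.lgamma(n + 1)            # log(n!), computed without overflow
        upper = n * math.log(n)
        lower = (n // 2) * math.log(n / 2)
        print(n, lower <= log_fact <= upper, round(log_fact / upper, 3))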
By Stirling's approximation,
log(n!) = n log(n) - n + O(log(n))
For large n, the right side is dominated by the term n log(n). That implies that O(log(n!)) = O(n log(n)).
More formally, one definition of "Big O" is that f(x) = O(g(x)) if and only if
lim sup|f(x)/g(x)| < ∞ as x → ∞
Using Stirling's approximation, it's easy to show that log(n!) ∈ O(n log(n)) using this definition.
A similar argument applies to n!. Exponentiating both sides of Stirling's approximation shows that, for large n, n! grows like sqrt(2*pi*n) * (n/e)^n. Dividing by n^n leaves sqrt(2*pi*n) / e^n, which tends to 0 as n → ∞, so n! ∈ O(n^n), but O(n!) is not equivalent to O(n^n): there are functions in O(n^n) that are not in O(n!), such as n^n itself.
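A minimal numeric illustration of both points (a sketch, assuming natural logarithms; the choice of base only changes constant factors):

    import math

    # log(n!)/(n*log n) tends to 1, while n!/n^n tends to 0, illustrating
    # log(n!) ∈ O(n log n) and n! ∈ O(n^n) (but n^n is not in O(n!)).
    for n in [10, 100, 1000]:
        log_fact = math.lgamma(n + 1)                  # log(n!)
        print(n,
              round(log_fact / (n * math.log(n)), 4),  # -> 1
              math.exp(log_fact - n * math.log(n)))    # n!/n^n -> 0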
Can the following recursion:
T(n) = 2T(n/4)+ n^3 + n^2
be solved using Master Theorem?
It meets the preconditions that f(n) is positive, a >= 1, b > 1, and f(n) = n^3 + n^2 is polynomially larger than n^(log_4 2) = n^(1/2). Would f(n) = O(n^3)?
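Not an answer, but a rough numerical check is possible (assuming a hypothetical base case such as T(1) = 1, which the question does not specify): if case 3 of the Master Theorem applies, T(n)/n^3 should settle toward a constant.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # Hypothetical base case; it only affects constant factors.
        if n <= 1:
            return 1
        return 2 * T(n // 4) + n**3 + n**2

    for n in (4**k for k in range(2, 10)):
        print(n, T(n) / n**3)   # the ratio appears to level off near a constant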
I am practicing problems on asymptotic analysis and I am stuck with this problem.
Is log(n!) = O((log(n))^2) ?
I am able to show that
log(n!) = O(n*log(n))
(log 1 + log 2 + .. + log n <= log n + log n + ... + log n)
and
(log(n))^2 = O(n*log(n))
(log n <= n => (log n)^2 <= n*logn )
I am not able to proceed further. Any hint or intuition on how to proceed? Thanks.
According to Stirling's approximation:
log(n!) = n*log(n) - n + O(log(n))
So clearly the upper bound for log(n!) is O(n log(n)).
A lower bound can be obtained by dropping the first half of the sum and then rounding each remaining term down to log(n/2):
log(1) + ... + log(n/2) + ... + log(n) >= log(n/2) + ... + log(n)
                                       >= log(n/2) + ... + log(n/2)
                                       = n/2 * log(n/2)
So the lower bound is also of order n log(n), i.e. log(n!) = Θ(n log(n)). Since (log(n))^2 grows strictly more slowly than n log(n), the answer is clearly NO: log(n!) is not O((log(n))^2).
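A quick numeric comparison (just a sketch, using natural logs) makes the gap visible:

    import math

    # log(n!) grows like n*log(n), which quickly leaves (log n)^2 behind,
    # so log(n!) cannot be O((log n)^2).
    for n in [10, 100, 10**4, 10**6]:
        log_fact = math.lgamma(n + 1)            # log(n!)
        log_sq = math.log(n) ** 2
        print(n, round(log_fact / log_sq, 1))    # the ratio grows without bound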
I think I got the answer to my own question. We will prove the following facts:
1) n*log(n) is a tight bound for log(n!)
2) n*log(n) is an upper bound for (log(n))^2
3) n*log(n) is not a lower bound for (log(n))^2
For a proof of (1) see this.
Proofs of (2) and (3) are provided in the question itself.
The growth rate of log(n) is less than the growth rate of n.
So the growth rate of (log(n))^2 is less than the growth rate of n*log(n).
So (log(n))^2 = o(n*log(n)) (here I have used little-o to denote that the growth rate of n*log(n) is strictly greater than the growth rate of (log(n))^2).
So the conclusion is that log(n!) = little-omega((log(n))^2), and hence log(n!) is not O((log(n))^2).
Correct me if I have made any mistake
Studying for a test and getting this question:
Comparing two algorithms with asymptotic complexities O(n) and O(n + log(n)),
which one of the following is true?
A) O(n + log(n)) dominates O(n)
B) O(n) dominates O(n + log(n))
C) Neither algorithm dominates the other.
O(n) dominates log(n), correct? So in this case do we just take the O(n) from both and deduce that neither dominates the other?
[C] is true, because of the summation property of Big-O
Summation
O(f(n)) + O(g(n)) -> O(max(f(n), g(n)))
For example: O(n^2) + O(n) = O(n^2)
In Big-O, you only care about the fastest-growing term and ignore all the other additive terms.
Edit: originally I put [A] as the answer; I just didn't pay enough attention to all the options and misinterpreted option [A]. Here is a more formal argument:
O(n) ~ O(n + log(n)) <=>
O(n) ~ O(n) + O(log(n)) <=>
O(n) ~ O(n).
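A concrete way to see why the log(n) term can be ignored (a minimal sketch): the ratio (n + log(n))/n tends to 1, so the two expressions describe the same growth class.

    import math

    # (n + log n) / n -> 1, so n + log(n) and n differ only by a
    # vanishing relative amount; neither class dominates the other.
    for n in [10, 1000, 10**6, 10**9]:
        print(n, (n + math.log(n)) / n)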
Yes, that's correct. If a runtime is the sum of several runtimes, the one with the largest order of magnitude dominates.
Assuming that big-O notation is used in the sense of asymptotic tight bound, which really should be denoted with a big-Theta, then I would answer C), because Theta(n) = Theta(n + log(n)). (Because log(n) is dominated by n).
If I am formally (mathematically) correct, then I would say that none of these answers is correct, because O(n) and O(n+log(n)) only give upper bounds, but not lower bounds on the asymptotic behaviour:
Let f(n) in O(n) and g(n) in O(n + log(n)). Then there are the following contra examples:
For A): Let f(n) = n in O(n) and g(n) = 1 in O(n + log(n)). Then g(n) does not dominate f(n).
For B): Let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then f(n) does not dominate g(n).
For C): Let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then g(n) does dominate f(n).
As this would be a very tricky question, I assume that you use the more common sloppy definition, which would give the answer C). (But you might want to check your definitions for big-O).
If my answer confuses you, then you probably didn't use the formal definition and you should probably ignore my answer...
This is an interview question:
Given: f(n) = O(n)
g(n) = O(n²)
find f(n) + g(n) and f(n)⋅g(n)?
What would be the answer for this question?
When this answer was prepared, f(n) was shown as Ω(n) and g(n) as Θ(n²).
From f(n) = Ω(n) and g(n) = Θ(n²) you get a lower bound of Ω(n²) for f(n) + g(n), but you don't get an upper bound on f(n) + g(n) because no upper bound was given on f(n). [Note: in the above, Θ is big-theta and Ω is big-omega.]
For f(n)·g(n), you get a lower bound of Ω(n³) because Θ(n²) implies lower and upper bounds of Ω(n²) and O(n²) for g(n). Again, no upper bound on f(n)·g(n) is available, because f(n) can be arbitrarily large; for f(n), we only have the Ω(n) lower bound.
With the question modified to give only upper bounds on f and g, as f(n) = O(n) and g(n) = O(n²), we have that f(n)+g(n) is O(n²) and f(n)·g(n) is O(n³).
To show this rigorously is a bit tedious, but it is quite straightforward. E.g., for the f(n)·g(n) case, suppose that by the definitions of O(n) and O(n²) we are given C, X, K, Y such that n > X ⇒ C·n > f(n) and n > Y ⇒ K·n² > g(n). Let J = C·K and Z = max(X, Y). Then n > Z ⇒ J·n³ > f(n)·g(n), which proves that f(n)·g(n) is O(n³).
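Here is that argument instantiated with made-up witnesses (hypothetical f, g, C, K chosen only for illustration) and checked numerically:

    # Hypothetical functions satisfying f(n) = O(n) and g(n) = O(n^2):
    f = lambda n: 2 * n + 5        # f(n) < 3*n   for n > 10
    g = lambda n: n ** 2 + 7       # g(n) < 2*n^2 for n > 10
    C, X = 3.0, 10
    K, Y = 2.0, 10
    J, Z = C * K, max(X, Y)        # J = 6, Z = 10, as in the proof above

    for n in [20, 100, 1000]:
        assert n > Z and f(n) * g(n) < J * n ** 3
        print(n, f(n) * g(n) / n ** 3)   # stays below J = 6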
O(f(n) + g(n)) = O(max{f(n), g(n)})
So for the first:
f(n) + g(n) = O(max{n, n^2}) = O(n^2)
and for f(n) ⋅ g(n) we will have
O(f(n) ⋅ g(n)) = O(n ⋅ n^2) = O(n^3)
Think about it this way.
f(n) = c.n + d
g(n) = a.n^2 + b.n + p
Then,
f(n) + g(n) = a.n^2 + (lower powers of n)
And,
f(n).g(n) = x.n^3 + (lower powers of n)
It follows that O(f(n) + g(n)) = O(n^2)
and O(f(n).g(n)) = O(n^3)
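A small numeric sketch with arbitrary example coefficients shows the leading terms taking over:

    # Arbitrary example coefficients, for illustration only.
    f = lambda n: 3 * n + 5            # some f(n) in O(n)
    g = lambda n: 2 * n**2 + n + 7     # some g(n) in O(n^2)

    for n in [10, 100, 10000]:
        print(n,
              (f(n) + g(n)) / n**2,    # -> 2, the leading coefficient of g
              (f(n) * g(n)) / n**3)    # -> 6 = 3*2, so f(n).g(n) is O(n^3)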
This question can be understood like this:
f(n) = O(n) means it takes O(n) time to compute f(n). Similarly, g(n) requires O(n^2) time.
So P(n) = f(n) + g(n) would take O(n) + O(n^2) + O(1) time (the O(1) is for the addition, once you know the values of both f and g). Hence, this new function P(n) would require O(n^2) time.
The same is the case for Q(n) = f(n)*g(n), which requires O(n^2) time.
I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on Wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = N^(log 4), a polynomial of degree log 4 > 1 (equal to N^2 when the log is base 2)
N! = O(N!)
You made just one small mistake; this is the right order.
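One rough way to check the ordering empirically (a sketch only, and a heuristic rather than a proof, since constant factors and small N can mislead): compare the growth factor f(2N)/f(N) for each function. A constant gives 1, Θ(N) gives 2, 4^(log N) = N^2 (with log base 2) gives 4, N! explodes, and the slower-growing functions stay closer to 1, in the same order as the list above.

    import math

    funcs = [
        ("3^100",       lambda N: 3.0 ** 100),
        ("log(log N)",  lambda N: math.log(math.log(N))),
        ("(log N)^2",   lambda N: math.log(N) ** 2),
        ("2N",          lambda N: 2 * N),
        ("5N",          lambda N: 5 * N),
        ("4^(log N)",   lambda N: 4.0 ** math.log2(N)),   # = N^2 when the log is base 2
        ("N!",          lambda N: math.factorial(N)),
    ]

    # Growth factor f(2N)/f(N); N kept small so the N! ratios stay finite floats.
    for N in [8, 32, 128]:
        print(N, [(name, round(f(2 * N) / f(N), 3)) for name, f in funcs])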