Is O(log(n!)) the complexity of this nested loop?
The inner loop executes log(i) times for each value of i.
So it becomes:
log(1) + log(2) + log(3) + log(4) + ... + log(n)
Which is O(n * log(n)).
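For concreteness, here is one loop shape that produces exactly this count (a sketch of my own, since the original loop isn't shown; the doubling inner loop is my assumption):

import math

# My reconstruction, not the original code: the inner loop doubles j,
# so it runs about log2(i) times for each i, and the total work is
# log(1) + log(2) + ... + log(n).
def nested(n):
    steps = 0
    for i in range(1, n + 1):
        j = 1
        while j <= i:   # about floor(log2(i)) + 1 iterations
            j *= 2
            steps += 1
    return steps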
Related
I have an algorithm; the pseudocode is below:
def foo(n):
    if n == 0:
        return
    # The loop below takes O(n)
    for i in range(n):
        ...
    foo(n - 1)
The idea is that each call does up to n work, and there are n recursive calls.
The total time should be 1 + 2 + 3 + 4 + 5 + ... + n.
Can this be proved to be O(n*n)?
Yes, it is O(n^2).
The sum of the first n natural numbers is n * (n+1) / 2, which differs from n^2 only by a constant factor and a lower-order term, so O(n * (n+1) / 2) == O(n^2).
First, you have n iterations in the for loop, then the function will repeat with n-1, n-2, ..., 0.
It's easy to see that n + (n-1) + (n-2) + ... + 1 = (n+1) * n/2 = (n^2 + n)/2 = O(n^2).
To evaluate Big O, that is, the complexity of the worst case, remember that you have to ignore all the coefficients, constants, and lower-power terms:
(n^2 + n)/2 = (1/2) * (n^2 + n)
O( (1/2) * (n^2 + n) ) = O(n^2 + n) = O(n^2)
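A quick way to convince yourself numerically (my own sketch, not part of the original answer) is to count the loop iterations across all recursive calls of foo and compare against the closed form:

# Count the total loop iterations performed across all recursive calls
# of foo(n): n at the top level, then n-1, n-2, ..., 1.
def count_iterations(n):
    if n == 0:
        return 0
    return n + count_iterations(n - 1)  # n iterations here, then recurse

for n in (10, 100, 500):
    assert count_iterations(n) == n * (n + 1) // 2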
I have a recurrence of the form,
T(n) = T(n-1) + O(n)
which should be equivalent to
T(n) = T(1) + O(2) + O(3) + ... + O(n-1) + O(n)
so my solution depends on the value of
O(n) + O(n-1) + O(n-2) + ... + O(1)
Since n + (n-1) + (n-2) + ... + 1 = n*(n+1)/2, I feel this should be O(n^2), but I am not sure how to use the Big-O math to arrive at this result.
I mean,
c * O(n) is O(n)
but
n * O(n) is O(n^2)
How do I conclude that
O(n) + O(n-1) + O(n-2) + ... + O(1) = O(n^2)
Edit: after reading the comments, maybe this is a simplification, or simply wrong. I would be interested to know exactly why, though.
O(n) + O(n-1) + O(n-2) + ... + O(1)
= O(n) + O(1) + O(n-1) + O(2) + ... + O(n-k+1) + O(k)
where k = n/2
now, for any a <= n,
O(n - a) = O(n)
O(n) + O(a) = O(n)
so each pair collapses to O(n), leaving n/2 terms of:
O(n) + O(n) + ...
so the total is O(n * n/2) = O(n^2)
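A tiny numeric illustration of the pairing (my own, assuming n is even):

# Pair the largest remaining term with the smallest; every pair sums
# to n + 1, and there are n/2 pairs, so the whole sum is n*(n+1)/2.
n = 10
pairs = [(n - j, j + 1) for j in range(n // 2)]  # (10, 1), (9, 2), (8, 3), ...
assert all(a + b == n + 1 for a, b in pairs)
assert sum(a + b for a, b in pairs) == n * (n + 1) // 2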
Why is the average case time complexity of tree sort O(n log n)?
From Wikipedia:
Adding one item to a binary search tree is on average an O(log n) process (in big O notation), so adding n items is an O(n log n) process.
But we don't each time add an item to a tree of n items. We start with an empty tree, and gradually increase the size of the tree.
So it looks more like
log 1 + log 2 + ... + log n = log(1 * 2 * ... * n) = log n!
Am I missing something?
The reason why O(log(n!)) = O(n log(n)) is a two-part answer. First, expand log(n!):
log(1) + log(2) + ... + log(n)
We can both agree that log(1), log(2), and all the terms up to log(n-1) are each at most log(n). Therefore, the following inequality can be made:
log(1) + log(2) + ... + log(n) <= log(n) + log(n) + ... + log(n)
Now the other half of the answer depends on the fact that half of the numbers from 1 to n are at least n/2, so their logarithms are each at least log(n/2). Dropping the first half of the sum only makes it smaller, and the remaining n/2 terms are each at least log(n/2):
log(1) + log(2) + ... + log(n) >= log(n/2) + log(n/2) + ... + log(n/2)   (n/2 terms on the right)
So log(n!) >= (n/2) * log(n/2), which grows on the order of n*log(n).
With these two inequalities, (n/2) * log(n/2) <= log(n!) <= n * log(n), so log(n!) is bounded above and below by multiples of n*log(n), which proves O(log(n!)) = O(n log(n)).
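As a numerical sanity check (my own, not from the answer), the ratio log(n!) / (n log n) climbs toward 1 as n grows, consistent with log(n!) = Θ(n log n):

import math

# math.lgamma(n + 1) computes ln(n!) without ever forming the huge
# number n!, so the ratio can be checked for large n.
for n in (10, 100, 1000, 10000):
    log_factorial = math.lgamma(n + 1)
    print(n, log_factorial / (n * math.log(n)))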
O(log(n!)) = O(n log(n)); this follows directly from Stirling's approximation:
https://en.wikipedia.org/wiki/Stirling%27s_approximation
I am trying to count the cost of the following algorithm in terms of a function of n.
for i := 1 to n do
    for j := i to n do
        k := 0
I understand that the inner for loop will iterate (n-1) + (n-2) + ... + (n-n) times, but I don't know how to express this mathematically in a simpler form. How can I do this?
(n-1) + (n-2) + ... + (n-n) is equal to the sum of all integers from 0 to N-1, so it is the (N-1)th triangular number. The nth triangular number is given by the formula
Tn = n * (n+1) / 2
which expands to (1/2)*n^2 + (1/2)*n.
When calculating Big O complexity, you discard constant multipliers and all but the fastest-growing component, so an algorithm that takes (1/2)*n^2 + (1/2)*n steps to execute runs in O(n^2) time.
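If you want to double-check by brute force (my own sketch): counting the loop exactly as written, with j running from i to n inclusive, gives the nth triangular number; the asker's sum starting at n-1 gives the (n-1)th, and both are Θ(n^2):

# Count the inner-loop iterations of "for i := 1 to n: for j := i to n"
# and compare against the triangular-number formula n*(n+1)/2.
def count_steps(n):
    steps = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):
            steps += 1
    return steps

for n in (5, 50, 200):
    assert count_steps(n) == n * (n + 1) // 2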
The inner loop iterates about ½n times on average.
In "Big O" notation, you only care about the largest factor.
That is, for example, if you have:
n³ + n + log(n) + 1234
then the only thing that matters is the n³ term, so it is O(n³).
So in your case:
½n × n = ½n²
which is O(n²) because the ½ doesn't matter.
What is the running time of this algorithm:
for i = 1 to n^2
    for j = 1 to i
        // some constant time operation
I want to say O(n^4) but I can't be certain. How do you figure this out?
n^4 is correct. The inner loop runs (n^2)/2 times on average, because i goes up to n^2 linearly, and the outer loop starts it n^2 times.
You are correct, it is N^4.
Do the substitution M = N^2. Now your loops change to this:
for i in 0..M
    for j in 0..i
This is your familiar O(M^2), hence the result is O((N^2)^2) = O(N^4) after the reverse substitution.
The constant time operation is run:
1 + 2 + 3 + ... + n^2 (n^2 addends)
times which is less than:
n^2 + n^2 + ... + n^2 (n^2 addends)
= n^2 * n^2
= n^4
So, it's obviously O(n^4)
To prove it's Θ(n^4), you can use a little math:
1 + 2 + 3 + ... + n^2
= n^2 * (n^2 + 1) / 2
= n^4 / 2 + n^2 / 2
>= n^4 / 2
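A brute-force check of that closed form (my own sketch, not from the answers):

# Count the constant-time operations of "for i = 1 to n^2: for j = 1 to i"
# and compare against n^2 * (n^2 + 1) / 2, which is Theta(n^4).
def ops(n):
    count = 0
    for i in range(1, n * n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

for n in (2, 5, 10):
    assert ops(n) == n * n * (n * n + 1) // 2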
With nested loops, the Big-O run times are multiplicative: the Big-O of the outer loop (N^2) is multiplied by the Big-O of the inner loop (N^2, taking its worst case). Therefore the Big-O is (N^2 * N^2), and if you remember how to add exponents with the same base, you get N^(2+2), or N^4.
Using Sigma notation, you get the order of growth methodically: the outer loop runs n^2 times, and on its ith pass the inner loop runs i times, so the total count is the sum
n^2 + (n^2 - 1) + (n^2 - 2) + ... + 1 = n^2 * (n^2 + 1) / 2
which is Θ(n^4). (Note that the inner counts are summed across the outer iterations, not multiplied by the outer count.)