Why isn't heapsort lg(n!)? - algorithm

I'm reading CLRS and it says heapsort is
HEAPSORT(A):
BUILD-MAX-HEAP(A);
for (i = A.length; i >= 2; i--)
{
exchange A[1] with A[i];
A.heap-size = A.heap-size - 1;
MAX-HEAPIFY(A,1);
}
MAX-HEAPIFY is O(lg n).
The book says it runs MAX-HEAPIFY n times thus it is O(n lg n) time.
But if the heap is shrinking in size by 1 each iteration shouldn't it be O(lg n!) ?
It would be lg 1 + lg 2 ... + lg(n-1) + lg (n) = lg(n!), right ?

You're right that the total is lg 1 + lg 2 + ... + lg n = lg(n!). But by Stirling's approximation,
O(log(n!)) = O(n log n)
so the two bounds are the same; the book's O(n lg n) is just the more familiar way to write it.
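This can be checked numerically: a small Python sketch (the function name `lg_factorial` is just for illustration) sums lg 1 + ... + lg n and compares it to n lg n. The ratio approaches 1, which is what Stirling's approximation predicts.

```python
import math

def lg_factorial(n):
    """Sum lg(1) + lg(2) + ... + lg(n) = lg(n!), the cost model from the question."""
    return sum(math.log2(i) for i in range(1, n + 1))

n = 1_000_000
ratio = lg_factorial(n) / (n * math.log2(n))
print(f"lg(n!) / (n lg n) = {ratio:.4f}")  # close to 1 for large n
```

For n = 1,000,000 the ratio is already above 0.9, and it keeps climbing toward 1 as n grows.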

Related

What will be the time complexity of this code?

I've this code :
int[] arr;
QuickSort.sort(arr); // O(log n)
for (int i = 0; i < arr.length - 1; i++) {
    print(i); // O(n)
}
What is the time complexity of this? Is it O(n * log n) or O(n + log n)?
O(log n) + O(n) =
O(n + log n) =
O(n)
In complexity analysis you keep only the dominant term. Since n grows faster than log n, the log n term disappears: its growth is negligible compared to n's as n goes to infinity.
On the other hand, the quicksort algorithm is O(n²) in the worst case (O(n log n) on average), so I guess the real complexity should be:
O(n²) + O(n) =
O(n² + n) =
O(n²)
Where n is the size of the array.
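The dominant-term rule above is easy to see numerically; a minimal sketch (using the worst-case cost model n² + n from the answer):

```python
# As n grows, the n term contributes a vanishing fraction of n^2 + n,
# which is why O(n^2 + n) collapses to O(n^2).
for n in (10, 1_000, 100_000):
    total = n * n + n
    print(n, total / (n * n))  # ratio tends to 1
```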

What is this function time complexity?

Is this function's complexity O(n log n)?
find(n){
for(i=0; i<=n-1; i++){
for(j=1; j<=i; j=j*2)
print("*");
}
}
The complexity of the inner loop is O(log2(i)). The outer loop runs i from 0 to n - 1, so the total is log2(1) + log2(2) + ... + log2(n-1) = log2((n-1)!).
So the complexity is O(log2(n!)).
This is because log(a) + log(b) + log(c)... = log(abc...).
In your case it's log(1) + log(2) + log(3)... + log(n-1) = log(1*2*3*...*(n-1)) = log((n-1)!).
O(log n!) approximates to O(n log n) (by Stirling's approximation), so your complexity is O(n log n).
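The counting argument can be checked empirically with a Python translation of the pseudocode (`find` here counts the prints instead of printing; the exact count also includes one extra step per outer iteration, which doesn't change the asymptotics):

```python
import math

def find(n):
    """Runnable version of the pseudocode: returns how many stars would be printed."""
    stars = 0
    for i in range(n):       # i = 0 .. n-1
        j = 1
        while j <= i:        # doubles j: about floor(log2(i)) + 1 iterations
            stars += 1
            j *= 2
    return stars

n = 10_000
print(find(n) / math.log2(math.factorial(n - 1)))  # ratio near 1
```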

Is O(K + (N-K)logK) equivalent to O(K + N log K)?

Can we say O(K + (N-K) log K) is equivalent to O(K + N log K) for 1 <= K <= N?
The short answer is that they are not equivalent, and it depends on the value of K. If K is equal to N, then the first complexity is O(N), while the second is O(N + N log N), which is equivalent to O(N log N). However, O(N) is not equivalent to O(N log N).
Moreover, any function in O(K + (N-K) log K) is also in O(K + N log K) (for every positive K), and the proof of this is straightforward.
Yes, because under your constraint 1 <= K <= N, (N-K) log K is at most N log K.
Not exactly.
If they are equivalent, then every function in O(k + (n-k)log k) is also in O(k + n log k) and vice-versa.
Let f(n,k) = n log k
This function is certainly in O(k + n log k), but not in O(k + (n-k)log k).
Let g(n,k) = k + (n-k)log k
Then as x approaches infinity, f(x,x)/g(x,x) grows without bound, since:
f(x,x) / g(x,x)
= (x log x) / x
= log x
See the definition of big-O notation for multiple variables: http://mathwiki.cs.ut.ee/asymptotics/04_multiple_variables
Wikipedia provides the same information, but in less accessible notation:
https://en.wikipedia.org/wiki/Big_O_notation#Multiple_variables
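The diagonal argument above can be verified with a few lines of Python (a sketch; `f` and `g` are the two expressions from the answer):

```python
import math

def f(n, k):
    return n * math.log2(k)

def g(n, k):
    return k + (n - k) * math.log2(k)

# Along the diagonal n = k, g(x, x) collapses to x, so the ratio
# f(x, x) / g(x, x) = log2(x) grows without bound.
for x in (2**10, 2**20, 2**30):
    print(x, f(x, x) / g(x, x))  # prints 10.0, 20.0, 30.0
```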

Big O Notation queries

int k = n;
while (k > 0)
{
    for (int j = 0; j < n; j++)
    {
        System.out.println("Inside the inner loop");
    }
    k = k / 2;
}
Hi, for this question I came up with two answers: O(N^2 log N), or O(N * N/2) = O(N^2/2). I'm not sure if they are the same or different? My lecturer also mentioned to choose the upper one. Therefore, for this question, there is one O(N^2) and one O(log N), so the answer should be O(N^2) according to my lecturer? Please help. Thanks.
The answer is O(N * log N)
Since you divide k in half each time, the outer loop runs O(log N) times, and each of those iterations does N inner iterations, so the total is O(N * log N).
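This count can be confirmed directly; a Python sketch (the helper name `count_inner_prints` is just for illustration) counts the println calls instead of printing:

```python
import math

def count_inner_prints(n):
    """Counts the println calls made by the nested loop from the question."""
    count = 0
    k = n
    while k > 0:              # about floor(log2(n)) + 1 outer iterations
        for j in range(n):    # N inner iterations each time
            count += 1
        k //= 2
    return count

for n in (8, 1024):
    # total = N * (floor(log2 N) + 1), i.e. Theta(N log N)
    print(n, count_inner_prints(n), n * (math.floor(math.log2(n)) + 1))
```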

What is the tightest asymptotic growth rate

I have solved all of them, however I have been told there are some mistakes. Can somebody please help me?
n^4 - 10^3 n^3 + n^2 + 4n + 10^6 = O(n^4)
10^5 n^3 + 10^n = O(10^n)
10 n^2 + n log n + 30 √n = O(n^2)
25^n = O(1)
n^2+ n log n + 7 n = O(n^2)
(n^3 + 10) (n log n+ 1) / 3 = O(n^4 log n)
20 n^10 + 4^n = O(4^n)
n^2 log n^3 + 10 n^2 = O(n^2 log n)
10^20 = O(1)
n^2 log (6^2)n = O(n^2 log n)
n log(2n) = O(n log n)
30 n + 100 n log n + 10 = O(n log n)
(n+√n) log n^3 = O(n+√n log n)
n (n + 1) + log log n = O(n^2)
4n log 5^(n+1) = O(n log 5^n)
3^(n+4) = O(3^n)
n^2 log n^2 + 100 n^3 = O(n^3)
(n log n) / (n + 10) = O(n^2 log n)
5n + 8 n log(n) + 10n^2 = O(n^2)
2n^3 + 2n^4 + 2^n + n^10 = O(2^n)
Hints:
if you have n on the left, you should have it on the right
there should not be any + operations on the right
log(x^y) can be simplified
Most of your answers look correct, but 25^n = O(1) looks wrong (unless it's 0.25^n), and (n log n) / (n + 10) = O(n^2 log n) is not the tightest possible bound, assuming that is what you want (it is actually O(log n)). Also, you should never need a + inside your big-O unless the original function takes the sum or max of two functions whose growth rates criss-cross at different values of n as n goes to infinity, and that very rarely happens.