What will be the time complexity of this code?

I have this code:
int[] arr;
quickSort(arr); // O(log n)
for (int i = 0; i < arr.length - 1; i++) {
    print(i); // O(n)
}
What is the time complexity of this? Is it O(n * log n) or O(n + log n)?

O(log n) + O(n) =
O(n + log n) =
O(n)
In complexity analysis you keep only the dominant term. Since n grows faster than log n, the log n term disappears: its growth is negligible compared to n's as n goes to infinity.
On the other hand, quicksort is an O(n²) algorithm in the worst case (O(n log n) on average), so the real worst-case complexity should be:
O(n²) + O(n) =
O(n² + n) =
O(n²)
Where n is the size of the array.
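To make this concrete, here is a minimal runnable sketch of the snippet above, assuming java.util.Arrays.sort stands in for the QuickSort call (it is O(n log n) on average for primitives). The sort dominates the subsequent O(n) loop, so the total is O(n log n):

```java
import java.util.Arrays;

public class SortThenScan {
    public static void main(String[] args) {
        int[] arr = {5, 3, 8, 1, 9, 2};
        Arrays.sort(arr);                    // O(n log n) on average
        for (int i = 0; i < arr.length; i++) {
            System.out.println(arr[i]);      // the whole loop is O(n)
        }
        // Total: O(n log n) + O(n) = O(n log n)
    }
}
```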


Time Complexity in asymptotic analysis log n and log (n+m)

Just some interesting discussion inspired by a conversation in my class.
There are two algorithms: one has time complexity log n and the other log(n + m).
Am I correct to argue that, for average cases, the log(n + m) one will perform faster, while they make no difference in running time when considered asymptotically? Because taking the limit of f1'/f2' results in a constant, therefore they have the same order of growth.
Thanks!
As far as I can see from the question, n and m are independent variables. So a statement like
O(log(m + n)) = O(log(n))
would have to hold for any m, which it does not: the counterexample is m = exp(n), for which
O(log(m + n)) = O(log(n + exp(n))) = O(log(exp(n))) = O(n) > O(log(n))
That's why in the general case we can only say that
O(log(m + n)) >= O(log(n))
The interesting case is when O(log(m + n)) = O(log(n)) does hold. That happens when m grows no faster than some polynomial in n, i.e. O(m) <= O(P(n)):
O(log(m + n)) = O(log(P(n) + n)) = O(log(P(n))) = k * O(log(n)) = O(log(n))
In the case of (multi)graphs we seldom have so many edges that O(m) > O(P(n)): even the complete graph Kn contains only m = n * (n - 1) / 2 = P(n) edges, which is why
O(log(m + n)) = O(log(n))
holds for ordinary graphs (no parallel/multiple edges, no loops).
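The counterexample above can be checked numerically: with m = exp(n), the quantity log(m + n) grows like n itself, not like log n. A small sketch (values chosen here for illustration):

```java
public class LogGrowth {
    public static void main(String[] args) {
        // With m = exp(n), log(m + n) ~ n, far above log(n),
        // so O(log(m + n)) = O(log n) cannot hold for arbitrary m.
        for (int n = 10; n <= 30; n += 10) {
            double m = Math.exp(n);
            double lhs = Math.log(m + n);   // log(exp(n) + n), approximately n
            System.out.printf("n=%d  log(m+n)=%.6f  log(n)=%.6f%n",
                              n, lhs, Math.log(n));
        }
    }
}
```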

If there are two O(n^2) snippets and one O(n) snippet in a single program, what is the complexity of the program?

What would be the time complexity, and how do I calculate it? Thanks in advance.
T(n) = O(n^2) + O(n^2) + O(n)
= O(2n^2) + O(n) // add terms
= O(2n^2) // n^2 is dominant over n
= O(n^2) //remove constant factor
If they are combined sequentially (i.e. each of the snippets is executed once), then the total complexity is the sum, which you can simplify as follows:
O(n^2) + O(n) = O(n^2 + n) = O(n^2)
since n^2 dominates n for large n.
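The sum rule can be seen directly by counting operations. This sketch (hypothetical loops standing in for the three snippets) tallies exactly 2n² + n basic steps, matching T(n) above:

```java
public class OpCount {
    public static void main(String[] args) {
        int n = 100;
        long ops = 0;
        // first O(n^2) snippet
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        // second O(n^2) snippet
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        // the O(n) snippet
        for (int i = 0; i < n; i++)
            ops++;
        System.out.println(ops);  // 2*n*n + n = 20100 for n = 100
    }
}
```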

Time Complexity Analysis of the Given Code

How do I find time complexity as a function of the problem size n?
sum = 0;
if (EVEN(n)) {
    for (i = 0; i < n; i++) {
        if (i % 2 == 0) {
            // an O(log n) operation
        } else {
            sum++;
        }
    }
} else {
    sum = sum + n;
}
The answer is: O(N log N)
Considering the worst-case scenario, where EVEN(n) is true, the for loop executes N times, i.e. in O(N) time.
The worst-case complexity of the code inside the for loop is O(log N).
You then multiply the loop's complexity by the complexity of its body:
O(N) * O(log N) = O(N log N).
EDIT: With regards to the code inside the for loop...
Since the O(log N) work runs only when i % 2 == 0, it executes on every other iteration of the loop. Strictly, that makes the body's average cost O(0.5 * log N), but since constant factors are dropped in complexity analysis it is still O(log N), and the final answer is still O(N log N).
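This count can be verified by instrumenting the code. In the sketch below a halving loop stands in for the unspecified O(log n) operation; for n = 16 the stand-in body runs n/2 = 8 times at log2(n) = 4 steps each, i.e. (n/2)·log2(n) steps total:

```java
public class EvenLoopCount {
    public static void main(String[] args) {
        int n = 16;                  // even, so the expensive branch is taken
        long logOps = 0;
        int sum = 0;
        if (n % 2 == 0) {            // EVEN(n)
            for (int i = 0; i < n; i++) {
                if (i % 2 == 0) {
                    // stand-in for the O(log n) work: ~log2(n) halving steps
                    for (int k = n; k > 1; k /= 2) logOps++;
                } else {
                    sum++;
                }
            }
        } else {
            sum += n;
        }
        System.out.println(logOps);  // (n/2) * log2(n) = 8 * 4 = 32
    }
}
```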

Analysis of algorithm and finding its time complexity

If T(n) = n√n, then:
1. T(n) = O(n)
2. T(n) = O(n log n)
3. T(n) = O(n^2)
4. None of the above
Which of the above options is correct? How to find the order in this case?
The order is O(n^1.5), since n√n = n · n^0.5 = n^1.5.
O(n) < O(n log n) < O(n^1.5) < O(n^2).
So, treating the options as tight bounds, the answer is 4. (Strictly as an upper bound, n^1.5 is of course also O(n^2).)
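The ordering of those growth rates is easy to confirm numerically. A quick sketch evaluating each function at a large n (value chosen for illustration) shows n^1.5 sitting strictly between n log n and n²:

```java
public class GrowthOrder {
    public static void main(String[] args) {
        double n = 1_000_000;
        double nLogN   = n * (Math.log(n) / Math.log(2)); // ~2.0e7
        double nSqrtN  = n * Math.sqrt(n);                // n^1.5 = 1.0e9
        double nSquared = n * n;                          // 1.0e12
        // n log n < n^1.5 < n^2 for large n
        System.out.println(nLogN < nSqrtN && nSqrtN < nSquared);
    }
}
```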

Why isn't heapsort lg(n!)?

I'm reading CLRS and it says heapsort is
HEAPSORT(A):
BUILD-MAX-HEAP(A);
for (i = A.length; i >= 2; i--)
{
    exchange A[1] with A[i];
    A.heap-size = A.heap-size - 1;
    MAX-HEAPIFY(A, 1);
}
MAX-HEAPIFY is O(lg n).
The book says it runs MAX-HEAPIFY n times, so the loop is O(n lg n) time.
But if the heap is shrinking in size by 1 each iteration shouldn't it be O(lg n!) ?
It would be lg 1 + lg 2 ... + lg(n-1) + lg (n) = lg(n!), right ?
Actually, the two answers agree. By Stirling's approximation:
O(log(n!)) = O(n log n)
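The agreement can be seen numerically: summing lg 1 + lg 2 + ... + lg n gives log(n!), and its ratio to n log n slowly approaches 1 as n grows (a sketch with an illustrative n):

```java
public class StirlingCheck {
    public static void main(String[] args) {
        // log(n!) = log 1 + log 2 + ... + log n; by Stirling's approximation
        // this is Theta(n log n), so the ratio log(n!) / (n log n) tends to 1.
        int n = 100_000;
        double logFactorial = 0;
        for (int i = 2; i <= n; i++) logFactorial += Math.log(i);
        double ratio = logFactorial / (n * Math.log(n));
        System.out.println(ratio);  // ~0.91 for n = 100000, approaching 1
    }
}
```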
