Merge sort's time complexity is O(n log n), and here n dominates log n, so is merge sort O(n)?
Thanks
O(n log n) is the best you can get from traditional comparison-based sorting algorithms.
You can't say that O(n log n) == O(n), even if n dominates log n, because the two terms are multiplied, not added.
If you had n + log n, where n dominates log n, then you could say it is O(n).
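A minimal sketch of the difference, using hypothetical Python loop counters (not from the original question):

```python
import math

def additive(n):
    # Two loops run one after the other: n + log n steps total.
    # The n term dominates the sum, so this is O(n).
    steps = 0
    for _ in range(n):                      # O(n)
        steps += 1
    for _ in range(int(math.log2(n))):      # O(log n)
        steps += 1
    return steps

def multiplicative(n):
    # A log n loop nested inside an n loop: n * log n steps total.
    # Neither factor can be dropped from a product, so this is O(n log n).
    steps = 0
    for _ in range(n):                      # O(n) iterations...
        for _ in range(int(math.log2(n))):  # ...each doing O(log n) work
            steps += 1
    return steps
```

For n = 1024, additive does 1034 steps (essentially n), while multiplicative does 10240, and that gap keeps widening as n grows.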
I'm writing a report where I need to present some results using Big O notation. Since I have not used Big O notation before, I'm a bit unsure of how to write it.
I understand that if you have O(n) * O(n) then the result becomes O(n^2). For example, a loop in a loop.
And O(n) * O(log n) equals O(n log n). For example, if you call a function that searches a balanced binary tree once per iteration of a loop.
But what if I have to loop over a function with time complexity O(n log n)? How do I write O(n) * O(n log n) correctly?
It's just normal multiplication of whatever is inside the O.
n * (n log n) = n^2 log n
So it's:
O(n^2 log n)
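As a hedged illustration (hypothetical Python, not from the question): Python's built-in sorted() is an O(k log k) comparison sort, so calling it on n rows of length roughly n costs n * (n log n) = O(n^2 log n) overall.

```python
def sort_each_row(rows):
    # One O(n log n) sort per iteration of an O(n) loop:
    # O(n^2 log n) total when there are n rows of length about n each.
    return [sorted(row) for row in rows]
```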
If I sort in O(m + n) time and then merge sort in O(n log n) time, is the overall complexity the sum of the two, or just the most significant term?
They are independent of each other, so the costs add: (m + n) + n log n. The n term is absorbed because n log n grows faster than linear time, which leaves O(m + n log n); and if m is itself O(n log n) (for example, m <= n), the whole thing simplifies to O(n log n).
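A quick numeric illustration of that domination (my own sketch, which assumes m <= n):

```python
import math

# Ratio of the full cost (m + n) + n*lg(n) to n*lg(n) alone.
# Under the assumption m <= n it is at most 1 + 2/lg(n), which
# tends to 1 -- the n log n term alone decides the Big O class.
for n in (10**3, 10**6, 10**9):
    m = n                                   # assumption for the demo
    total = (m + n) + n * math.log2(n)
    print(n, total / (n * math.log2(n)))
```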
To build a suffix array on a string of n characters,
we first generate the n suffixes, which is O(n),
and then sort them, which is O(n log n),
so the total time complexity appears to be O(n) + O(n log n) = O(n log n).
But I am reading that it is O(n^2 log n) and could not understand how. Can someone please explain?
First of all, your simplification of the sum is right: O(n) + O(n log n) = O(n log n), not O(n).
Second, and the reason why you are confused: comparing two suffixes is not constant time. As each suffix is a string of length up to n, comparing two suffixes is itself O(n). Thus sorting n suffixes is O(n * log(n) * n) = O(n^2 * log(n)).
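A direct way to see it is the naive construction itself; a minimal Python sketch (mine, not the answerer's):

```python
def naive_suffix_array(s):
    n = len(s)
    # Sorting the n suffix start positions performs O(n log n)
    # comparisons, but each comparison is between two strings of
    # length up to n, so it costs O(n) by itself.
    # Total: O(n * log n * n) = O(n^2 log n).
    return sorted(range(n), key=lambda i: s[i:])

print(naive_suffix_array("banana"))  # [5, 3, 1, 0, 4, 2]
```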
I'm working on a problem for which I came up with two algorithms: one takes O(n lg n) time but requires extra space, and the other takes O(n + n lg n) time. So I just wanted to ask: is O(n lg n) an improvement over O(n + n lg n), or are the two considered equal, given that n lg n is the dominant term?
They are the same:

n + n lg n <= 2 n lg n   -- for n >= the base of the logarithm, since then n <= n lg n
           = O(n lg n)
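A quick numeric check of that sandwich bound (my own sketch):

```python
import math

# For n at or above the log base (so lg n >= 1), n <= n lg n, and
# therefore n lg n <= n + n lg n <= 2 n lg n: the same Big O class.
for n in (2, 16, 1024, 2**20):
    cost = n + n * math.log2(n)
    print(n, n * math.log2(n) <= cost <= 2 * n * math.log2(n))  # True
```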
I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on Wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = N^(log 4) = O(N^2) (taking logs base 2)
N! = O(N!)
You made just one small mistake: (log N)^2 grows more slowly than the linear terms 2N and 5N, so it belongs before them. This is the right order.
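One way to verify it numerically (a sketch of my own, with two caveats baked in: 3^100 is a constant of about 5.2e47, so its raw value would not drop below the linear terms until N is astronomically large, and N! overflows floats quickly, hence everything is compared on a log scale):

```python
import math

# ln of each remaining function; a correct ordering means these values
# must come out sorted for large N.
log_funcs = [
    ("log(log(N))", lambda N: math.log(math.log(math.log(N)))),
    ("(log(N))^2",  lambda N: 2 * math.log(math.log(N))),
    ("2N",          lambda N: math.log(2) + math.log(N)),
    ("5N",          lambda N: math.log(5) + math.log(N)),
    ("4^(log(N))",  lambda N: math.log2(N) * math.log(4)),  # = N^2 for base-2 logs
    ("N!",          lambda N: math.lgamma(N + 1)),          # ln(N!)
]

for N in (10**6, 10**9):
    values = [f(N) for _, f in log_funcs]
    assert values == sorted(values), f"order violated at N={N}"
    print(N, "order holds")
```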