Ordering functions by Big O complexity

I'm trying to order the following functions by Big O complexity, from low to high: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
My attempt:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on Wikipedia. Is there a way of verifying the answer?

3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = N^(log 4) = O(N^2) (taking logs base 2)
N! = O(N!)
You made just one small mistake: (log(N))^2 grows more slowly than the linear functions, so it belongs before 2N and 5N. This is the right order.
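One way to verify the ordering numerically: evaluate each function at a few growing values of N and check that the ratio of each entry to the next one shrinks (or stays bounded for 2N vs 5N, which are in the same class). A minimal Python sketch, assuming base-2 logarithms (any base gives the same ordering):

import math

# The corrected order, from slowest- to fastest-growing.
funcs = [
    ("3^100",       lambda n: 3**100),
    ("log(log(N))", lambda n: math.log2(math.log2(n))),
    ("(log(N))^2",  lambda n: math.log2(n) ** 2),
    ("2N",          lambda n: 2 * n),
    ("5N",          lambda n: 5 * n),
    ("4^(log(N))",  lambda n: 4 ** math.log2(n)),  # = N^2 with base-2 logs
    ("N!",          lambda n: math.factorial(n)),
]

# If f is asymptotically below g, f(N)/g(N) should shrink as N grows.
# (2N/5N stays at the constant 0.4: both are in the same O(N) class.)
for n in (16, 32, 64):
    print("N =", n)
    for (name_f, f), (name_g, g) in zip(funcs, funcs[1:]):
        print("  %s / %s = %.3g" % (name_f, name_g, f(n) / g(n)))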

Related

How to compute the big O of sort and then merge

If I sort in O(m + n) and then merge sort in O(n log n), is the overall complexity the sum, or just the most significant term?
The two steps are independent of each other, so the costs add. Then (m + n) + n log n = O(n log n), since n log n grows faster than the linear terms (assuming m = O(n log n)).
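A quick numeric check of this (a sketch; taking m = n purely for illustration) shows the (m + n) term becoming negligible next to n log n:

import math

# Ratio of the total cost (m + n) + n*log2(n) to n*log2(n) alone.
# It tends to 1, so the n log n term dominates the sum.
for n in (10**2, 10**4, 10**6):
    m = n  # illustrative assumption: m comparable to n
    total = (m + n) + n * math.log2(n)
    print(n, total / (n * math.log2(n)))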

Why Merge Sort time complexity is not O(N)?

Merge sort's time complexity is O(n log n). Here n dominates log n, so is merge sort O(n)?
Thanks
O(n log n) is the best you can get from comparison-based sorting algorithms.
You can't say that O(n log n) == O(n), even if n dominates log n, because the two factors are multiplied, not added.
If you had n + log n, where n dominates log n, then you could say the complexity is O(n).
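One way to see the difference between the multiplied and the added case, as a rough sketch:

import math

# Multiplied: (n * log2(n)) / n = log2(n) grows without bound,
# so n log n is NOT O(n).
# Added: (n + log2(n)) / n tends to 1, so n + log n IS O(n).
for n in (10**2, 10**4, 10**6):
    print(n, (n * math.log2(n)) / n, (n + math.log2(n)) / n)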

Algorithm domination

Studying for a test, I came across this question:
Comparing two algorithms with asymptotic complexities O(n) and O(n + log(n)),
which one of the following is true?
A) O(n + log(n)) dominates O(n)
B) O(n) dominates O(n + log(n))
C) Neither algorithm dominates the other.
n dominates log(n), correct? So do we just take O(n) from both and deduce that neither dominates?
[C] is true, because of the summation property of Big-O
Summation
O(f(n)) + O(g(n)) -> O(max(f(n), g(n)))
For example: O(n^2) + O(n) = O(n^2)
In Big-O, you only care about the fastest-growing term and ignore all the other addends.
Edit: I originally answered [A]; I didn't pay enough attention to the options and misinterpreted option [A]. Here is a more formal proof:
O(n) ~ O(n + log(n)) <=>
O(n) ~ O(n) + O(log(n)) <=>
O(n) ~ O(n).
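To make the summation rule concrete, here is a minimal numeric check of the O(n^2) + O(n) = O(n^2) example:

# The lower-order term n becomes negligible next to n^2:
# (n^2 + n) / n^2 tends to 1, so n^2 + n = O(n^2).
for n in (10, 1000, 100000):
    print(n, (n**2 + n) / n**2)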
Yes, that's correct. If the runtime is a sum of several runtimes, the term with the largest order of magnitude dominates.
Assuming that big-O notation is used in the sense of an asymptotically tight bound, which really should be denoted with a big-Theta, I would answer C), because Theta(n) = Theta(n + log(n)) (log(n) is dominated by n).
To be formally (mathematically) precise, none of these answers is correct, because O(n) and O(n + log(n)) only give upper bounds, not lower bounds, on the asymptotic behaviour:
Let f(n) be in O(n) and g(n) be in O(n + log(n)). Then there are the following counterexamples:
For A): let f(n) = n in O(n) and g(n) = 1 in O(n + log(n)). Then g(n) does not dominate f(n).
For B): let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then f(n) does not dominate g(n).
For C): let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then g(n) does dominate f(n).
As this would be a very tricky question, I assume that you are using the more common, sloppier definition, which gives the answer C). (But you might want to check your definitions of big-O.)
If my answer confuses you, then you probably didn't use the formal definition and you should probably ignore my answer...

Is O(n lg n) the same as O(n + n lg n) in terms of computational complexity?

I'm working on a problem for which I came up with two algorithms: one takes O(n lg n) time but requires extra space, and the other takes O(n + n lg n) time. So I just wanted to ask: is O(n lg n) an improvement over O(n + n lg n), or are they considered equal, given that n lg n is the biggest term?
They are the same:
n + n lg n <= 2 n lg n -- for n >= base of logarithm
= O(n lg n)
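You can sanity-check the inequality numerically (a sketch assuming base-2 logarithms, so it should hold for all n >= 2):

import math

# n + n*lg(n) <= 2*n*lg(n) holds whenever lg(n) >= 1, i.e. n >= 2.
for n in (2, 10, 1000, 10**6):
    lhs = n + n * math.log2(n)
    rhs = 2 * n * math.log2(n)
    print(n, lhs <= rhs, lhs / rhs)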

When c > 0, log(n) = O(n^c)? Not sure why it isn't O(log n)

In my homework, the question asks to determine the asymptotic complexity of n^0.99999 * log(n). I figured that it would be closer to O(n log n), but the answer key suggests that when c > 0, log(n) = O(n^c). I'm not quite sure why that is; could someone provide an explanation?
It's also true that lg n = O(n^k) (in fact, it is o(n^k); did the hint actually say that, perhaps?) for any constant k, not just 1. Now consider k = 0.00001. Then n^0.99999 lg n = O(n^0.99999 * n^0.00001) = O(n). Note that this bound is not tight, since I could choose an even smaller k, so it's perfectly fine to say that n^0.99999 lg n is O(n^0.99999 lg n), just as we say n lg n is O(n lg n).
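As a numeric illustration: with k = 0.00001 the ratio lg(n) / n^k only starts shrinking at astronomically large n, so this sketch uses k = 0.5, where the same effect is visible at ordinary sizes:

import math

# lg(n) = o(n^k) for any constant k > 0; the ratio tends to 0.
k = 0.5
for n in (10**2, 10**4, 10**8, 10**12):
    print(n, math.log2(n) / n**k)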
