Do you know any example when space complexity is O(n log n)?

I just started to learn algorithms and I wasn't able to find the answer.

A skip list has worst-case space complexity of O(n log n).
There is a good summary of algorithms and data structures and their space/time complexity at http://bigocheatsheet.com/
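For a concrete picture of O(n log n) allocation (a minimal Java sketch, not the skip list itself): a naive top-down merge sort that copies its halves into fresh arrays allocates O(n) per level of recursion across O(log n) levels, so its total allocation is O(n log n). Production merge sorts reuse a single O(n) buffer instead.

import java.util.Arrays;

// Naive top-down merge sort: every call copies its halves into fresh
// arrays. Each recursion level copies O(n) elements and there are
// O(log n) levels, so total allocation over the run is O(n log n).
static int[] mergeSort(int[] a) {
    if (a.length <= 1) return a;
    int mid = a.length / 2;
    int[] left = mergeSort(Arrays.copyOfRange(a, 0, mid));
    int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
    int[] out = new int[a.length];
    int i = 0, j = 0, k = 0;
    while (i < left.length && j < right.length)
        out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
    while (i < left.length) out[k++] = left[i++];
    while (j < right.length) out[k++] = right[j++];
    return out;
}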

Related

What is Big O of n^2 x logn?

Is it n^2 log n or n^3? I know both of these act as upper bounds; I'm just torn between choosing a tighter but more complex bound (option 1) or a looser yet simpler bound (option 2).
Are there general rules for big-O expressions, such as that a bound should never be too complex or a product of two functions?
You already seem to have an excellent understanding that big-O notation is an upper bound, and that a function with runtime complexity n^2 log n falls in both O(n^2 log n) and O(n^3), so I'll spare you the mathematics of that. It's immediately clear (from the fact that n^2 log n is in O(n^3)) that O(n^2 log n) is a subset of O(n^3), so the former is at least as good a bound. It turns out to be a strictly tighter bound (as can be seen with some basic algebra), which is a definite point in its favor. I do understand your concern about the complexity of bounds, but I wouldn't worry about that. Mathematically, it's best to favor accuracy over simplicity when the two are at odds, and n^2 log n is not that complex an expression. So in my mind, O(n^2 log n) is a much better bound to state.
Other examples of similar or greater complexity:
As indicated in the comments, merge sort and quicksort have average time complexity O(n log n).
Interpolation search has an average time complexity of O(log log n).
The average case of Dijkstra's algorithm is stated on Wikipedia to be the absolute mouthful O(E + V log(E/V) log V).
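To make the n^2 log n shape concrete, here is a minimal Java sketch (the function and data are made up for illustration): two nested loops over n keys with a binary search in the innermost step run in Θ(n^2 log n).

import java.util.Arrays;

// n * n ordered pairs (i, j), with an O(log n) binary search for each
// pair => Theta(n^2 log n) overall.
static int countPairSums(int[] sortedValues, int[] keys) {
    int hits = 0;
    for (int i = 0; i < keys.length; i++)          // O(n)
        for (int j = 0; j < keys.length; j++)      // O(n)
            if (Arrays.binarySearch(sortedValues, keys[i] + keys[j]) >= 0)  // O(log n)
                hits++;
    return hits;
}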

Time complexity of an algorithm with multiple loops

Let us assume we have an algorithm with the following structure:
A for-loop with O(n) complexity.
Another for-loop with O(n) complexity.
Inside this loop is a search algorithm with O(log n) complexity executed in every iteration of the for-loop.
Now, what time complexity does this algorithm have? Is it O(n^2), O(n), O(n log n) or something else?
The solution would be O(n + n log n), which is O(n log n).
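A minimal Java sketch of that structure (the names are illustrative): the first loop contributes O(n), the second contributes n iterations of an O(log n) search, and the larger term dominates.

import java.util.Arrays;

// First loop: O(n). Second loop: n iterations, each doing an O(log n)
// binary search => O(n log n). Total: O(n + n log n) = O(n log n).
static long tally(int[] data, int[] sortedLookup) {
    long count = 0;
    for (int x : data)            // O(n)
        count += x;
    for (int x : data)            // n iterations...
        if (Arrays.binarySearch(sortedLookup, x) >= 0)  // ...each O(log n)
            count++;
    return count;
}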
If you want to learn about big-O notation, I recommend this book: Introduction to Algorithms
Link: https://mitpress.mit.edu/books/introduction-algorithms-third-edition

Big O and Big Theta Equality

For example, suppose I am asked the asymptotic complexity of building a binary heap (the type of algorithm is arbitrary). If I say an algorithm is Θ(log n), would it also be correct to say that it is O(n)?
As long as you're measuring the same quantity, anything that is Θ(log n) is also O(n). If the runtime is Θ(log n), then it's also O(log n) (that's part of the definition of Θ notation), and anything that's O(log n) is also O(n).
The case where you might have to be careful is if these are implicitly measuring different quantities. For example, if an algorithm's best-case runtime is Θ(log n), it doesn't necessarily follow that the algorithm's worst-case runtime will be O(n).
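Spelled out from the definitions (a quick sketch): if the runtime f(n) is Θ(log n), then for some constant c and all sufficiently large n,

    f(n) \le c \log n \le c \cdot n

so f(n) is O(log n), and therefore also O(n).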

Complexity of algorithms - Competitive Analysis

For example, we have an algorithm with complexity O(n log n). An online algorithm for the same problem is 5-competitive. What is the complexity of the online algorithm?
In my opinion the result should be something like O(5 * n log n). Did I understand this correctly?
Big-O notation refers to the asymptotic complexity of a function. The simplest way to explain this is that no constants are included in the notation. That means that n log n, 5n log n, and even 10^6 * n log n all fall into the big-O class O(n log n).
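That follows directly from the definition of big-O (a quick sketch): f is in O(g) whenever f(n) ≤ c · g(n) for some constant c and all large enough n, so a constant competitive factor is simply absorbed into c:

    5 \cdot n \log n \le c \cdot n \log n \quad \text{with } c = 5, \text{ so } 5 n \log n \in O(n \log n)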

N log(N) or N clarification

Will performing an O(log N) algorithm N times give O(N log N)? Or is it O(N)?
e.g. Inserting N elements into a self-balancing tree.
// insert N items; each insert into a self-balancing tree is O(log N)
for (int i = 0; i < N; i++) {
    insert(itemsToInsert[i]);
}
It's definitely O(N log N). It COULD also be O(N), if you could show that the sequence of calls, as a total, grows slowly enough (because while SOME calls are O(log N), enough others are fast enough, say O(1), to bring the total down).
Remember: O(f) means the algorithm is no SLOWER than f, but it can be faster (even if just in certain cases).
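A classic instance of that "faster in total" effect is bottom-up heap construction: a single sift-down is O(log N) in the worst case, but most nodes sit near the leaves and only sift a few levels, so the N operations together cost O(N). A minimal Java sketch:

// Bottom-up heapify: each siftDown is O(log N) in isolation, yet the
// whole build is O(N), because deep (expensive) sifts are rare.
static void buildMaxHeap(int[] a) {
    for (int i = a.length / 2 - 1; i >= 0; i--)
        siftDown(a, i, a.length);
}

static void siftDown(int[] a, int i, int n) {
    while (2 * i + 1 < n) {
        int child = 2 * i + 1;                   // left child
        if (child + 1 < n && a[child + 1] > a[child])
            child++;                             // pick the larger child
        if (a[i] >= a[child]) break;             // heap property holds
        int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
        i = child;                               // continue down the tree
    }
}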
N times O(log(N)) leads to O(N log(N)).
Big-O notation describes the asymptotic behavior of the algorithm. The cost of each additional step here is O(log N); for an O(N) algorithm, the cost of each additional step would be O(1), and asymptotically the bound on the cost function would be a straight line.
Therefore O(N) is too low a bound; O(N log N) seems about right.
Yes and no.
Calculus really helps here. The first iteration costs log(1), the second log(2), etc., until the Nth iteration, which costs log(N). Rather than thinking of the problem as a multiplication, think of it as a sum bounded by an integral (written out below).
This happens to come out as O(N log N), but that is kind of a coincidence.
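Written out (a quick sketch of that sum-to-integral step):

    \sum_{i=1}^{N} \log i = \log(N!) \approx \int_{1}^{N} \log x \, dx = N \log N - N + 1 = \Theta(N \log N)

so the N insertions genuinely total Θ(N log N), even though the early iterations are cheap.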
